London Student

Report condemns university rankings as “not only subjective but misleading”

A new report published by the Higher Education Policy Institute (HEPI) scrutinising university rankings has condemned them as “not only subjective but misleading.”

The report, titled “International university rankings: For good or ill?,” focuses on what it identifies as the four main international rankings:

  • THE World University Rankings (Times Higher Education)
  • QS World University Rankings (Quacquarelli Symonds Ltd)
  • Academic Ranking of World Universities (Shanghai Jiao Tong University)
  • U-Multirank (Centre for Higher Education)

The report condemns the rankings because they do not measure what they claim to: they focus on research-related criteria rather than on teaching.

The author of the report, HEPI President Bahram Bekhradnia, said: “They do not match the claims made for them. They fail to identify the ‘best’ universities in the world, given the numerous functions universities fulfil that do not feature in the ranking.

“Indeed, what is arguably their most important activity – educating students – is omitted.” According to the report, this renders the rankings useless to the very students they claim to serve.

Bekhradnia also criticises the extent of the rankings’ influence, not just on universities and their policies but also on politicians and governments. According to the report, the rankings sway major decisions over education funding in countries such as France, Germany and the UK.

This is dangerous because the data the rankings draw on is unreliable, according to HEPI. The main reason is that the data used for rankings such as the THE ranking is submitted by the universities themselves.

This can lead to anomalous results, as an incident cited in the report shows: “A case in point is provided by the most recent THE rankings, where before publication Trinity College Dublin concerned that its position had deteriorated, investigated and discovered that on a key measure it had misplaced the decimal point in its returns – an error of a factor of 10! [sic]”

The author of the report also makes suggestions for improving the validity of the rankings. Bekhradnia suggests that a radar diagram would be a better way of presenting the data than an ordinal list.

Another idea put forward in the report is that the top-ranked universities should boycott the rankings by refusing to provide their data. This, Bekhradnia says, should lead to a broadening and general improvement of the methodologies.

Marie Segger