Thousands of high school students are currently deliberating over which university to attend next year. But which are the best? A study published in the open access journal BMC Medicine warns against using international rankings of universities to answer this question. They are misleading and should be abandoned, the study concludes.

The study focuses on the published 2006 rankings of the Times Higher Education Supplement "World University Rankings" and the Shanghai Jiao Tong University "Academic Ranking of World Universities". It found that only 133 institutions were shared between the top-200 lists of the Shanghai and Times rankings; four of the top-50 in the Shanghai list did not even appear among the first 500 universities of the Times ranking.

The study's authors argue that such discrepancies stem from poor methodology and inappropriate indicators, making the ranking systems invalid.

The Shanghai system, for example, measures research excellence in part by counting Nobel laureates and Fields medalists among an institution's alumni and staff. However, few universities have laureates on their staff, and their presence does not necessarily translate into better undergraduate education. Furthermore, prize-winning staff have usually performed their ground-breaking work at another institution, so the indicator really measures an institution's ability to attract prestigious awardees rather than its role as the place where the ground-breaking work was done.

The Times ranking, on the other hand, places great emphasis on the results of a survey sent to more than 190,000 researchers, who are asked to list what they consider the top 30 universities in their field. Yet this survey is entirely opinion-based, and with a response rate below 1% it may contain significant bias.

"There are flaws in the way that almost every indicator is used to compile these two popular rankings," says John Ioannidis, who led the analysis team. "I don't disagree that excellence is important to define, measure, interpret and improve, but the existing ranking criteria could actually harm science and education."

The study's authors call for global collaboration to standardise data on key aspects of universities and other institutions, and they argue that any flaws in the data should be openly acknowledged rather than underestimated. "Evaluation exercises should not force spurious averages and oversimplified rankings for the many faces of excellence," says Ioannidis. "And efforts to improve institutions should not focus just on the numbers being watched."

Article: "International ranking systems for universities and institutions: a critical appraisal", John PA Ioannidis, Nikolaos A Patsopoulos, Fotini K Kavvoura, Athina Tatsioni, Evangelos Evangelou, Ioanna Kouri, Despina G Contopoulos-Ioannidis and George Liberopoulos, BMC Medicine
