Leiden Ranking makes method fully public
Photo: Marc Kolle


Hoger Onderwijs Persbureau,
31 January 2024 - 12:22

Has the University of Amsterdam really produced so many well-cited publications? Anyone who does not trust the new Leiden Ranking can now check it for themselves.

For 15 years, the research center CWTS has been producing a world ranking of universities: the Leiden Ranking. It lets you choose your own criteria and thereby create your own variants of the list.
 
The idea behind it is that other world rankings flatten the performance of universities into a single outcome, whereas there are all kinds of ways of looking at it. One criterion is not necessarily better than another.
 
Now the creators are going one step further. They are launching an “open edition.” Not only can you now choose your own criteria, you can also check the underlying data.

 

Black box
Previously, the Leiden Ranking was a kind of “black box,” explains director Ludo Waltman. But with the Open Edition, the data and algorithms used are public. In principle, anyone can now make their own ranking. Waltman says: “Everyone can decide for themselves what they consider important for measuring a university’s performance.”
 
For the regular Leiden Ranking, which still exists alongside the new one, CWTS uses data from the Web of Science, a database that is not freely accessible. The new edition relies on data from OpenAlex, which anyone can download.


This creates differences between the open edition and the old edition. Take, for example, the proportion of scientific articles that were among the world's best-cited 10 percent in recent years: for the University of Amsterdam, it is 14.9 or 15.7 percent, depending on which edition you look at.
 
Shifts
So the “rankings” vary a bit, too. In both versions, the UvA is at the top, but in the old edition Erasmus University Rotterdam is sixth, and in the open edition it’s third. At least, if you look at this one criterion. You could also choose the best-cited one percent, or the percentage of open-access publications, for example.
 
Rankings are under fire. Dutch universities would like to reduce their focus on them, if only because the criteria are debatable. For example, how heavily do you weigh the reputation of institutions against their scientific impact?
 
Besides, a university produces more value than the number of citations it receives. Articles about earthquakes in Groningen or Dutch health care will not easily make it into the world's most prestigious journals, but does that make them less valuable? And if a scientist devotes time to teaching, does that harm a university's position in the world rankings?
 
Recognizing and valuing
In the context of “recognizing and valuing,” universities want to put more emphasis on the different tasks that employees can perform: researching, teaching, disseminating knowledge, leading, and so on. A focus on rankings does not accommodate this.
 
Utrecht University is therefore missing from the rankings of the British magazine Times Higher Education. The university no longer provides its data to the researchers and is therefore no longer included in the list. But Dutch universities are not all on the same page. The other institutions still participate.
