You can verify the data yourself
Leiden Ranking fully discloses its method
For fifteen years, the research centre CWTS has been publishing its own world ranking of universities: the Leiden Ranking. It lets you select your own criteria, thereby creating alternative versions of the list.
The idea behind it: other world rankings reduce the achievements of universities to a single score, while there are all kinds of possible perspectives. One criterion isn’t necessarily better than another.
Now the makers have taken things a step further: they’ve launched an ‘open edition’. This means you can not only choose your own criteria, but also verify the underlying data.
‘Black box’
The Leiden Ranking used to be a kind of ‘black box’, says director Ludo Waltman in an explanatory note. But in the Open Edition, the data and algorithms used are public. In principle, anyone can now build their own ranking. Waltman: “Everyone can decide for themselves what they think is important to measure a university’s performance.”
For the regular Leiden Ranking, which still exists alongside it, CWTS uses data from the ‘Web of Science’, a database that isn’t accessible to everyone. The new edition is based on data from OpenAlex, which anyone can download.
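To give an idea of what that openness means in practice, here is a minimal sketch in Python that looks up an institution in OpenAlex’s public REST API and pages through its publications for one year. The endpoints, filters, and cursor paging follow OpenAlex’s public documentation; the email address is a placeholder, the institution name is just an example from this article, and this is an illustration, not the CWTS pipeline.

```python
import requests

API = "https://api.openalex.org"
MAILTO = "you@example.org"  # placeholder: OpenAlex asks polite clients to identify themselves

def find_institution_id(name: str) -> str:
    """Return the OpenAlex ID of the first institution matching a name search."""
    r = requests.get(f"{API}/institutions", params={"search": name, "mailto": MAILTO})
    r.raise_for_status()
    return r.json()["results"][0]["id"]  # e.g. "https://openalex.org/I..."

def fetch_works(institution_id: str, year: int) -> list:
    """Page through all works by one institution in one publication year."""
    works, cursor = [], "*"  # "*" starts cursor-based paging
    while cursor:
        r = requests.get(f"{API}/works", params={
            "filter": f"authorships.institutions.id:{institution_id},publication_year:{year}",
            "per-page": 200,
            "cursor": cursor,
            "mailto": MAILTO,
        })
        r.raise_for_status()
        page = r.json()
        works.extend(page["results"])
        cursor = page["meta"].get("next_cursor")  # None once the last page is reached
    return works

inst = find_institution_id("University of Amsterdam")
works = fetch_works(inst, 2020)
print(f"{len(works)} works, {sum(w['cited_by_count'] for w in works)} citations in total")
```

Paging through a whole university’s output one request at a time is slow; OpenAlex also offers complete snapshot downloads, which is closer to what a ranking builder would actually work from.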
That leads to differences between the open edition and the old one. Take, for example, the share of scientific articles that belonged to the world’s best-cited ten percent in recent years: for the University of Amsterdam, that figure is 14.9 or 15.7 percent, depending on which edition you consult.
Shifts
This means the ‘charts’ shift a bit too. In both versions, the University of Amsterdam takes the top position among Dutch universities, but Erasmus University Rotterdam is sixth in the old edition and third in the open one. At least, on this particular criterion: you could also choose the best-cited one percent, or the percentage of open-access publications, for instance.
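For readers who want to see what such a criterion amounts to, here is a toy sketch of the arithmetic behind a ‘best-cited ten percent’ style indicator, assuming you already have per-publication citation counts (for instance from the OpenAlex sketch above) and a global cutoff. The actual Leiden indicator normalises by field and publication year before comparing against the world distribution; this version skips that step, and the numbers below are invented.

```python
def share_in_top_decile(citations: list, world_threshold: int) -> float:
    """Percentage of an institution's publications at or above the citation
    count marking the world's best-cited ten percent (field and year
    normalisation, which the Leiden Ranking applies, is omitted here)."""
    if not citations:
        return 0.0
    top = sum(1 for c in citations if c >= world_threshold)
    return 100 * top / len(citations)

# Toy numbers, not real data: 40 papers, assumed global top-10% cutoff of 25 citations.
example = [3, 0, 41, 12, 25, 7, 88, 2] * 5
print(f"{share_in_top_decile(example, 25):.1f}% in the global top ten percent")
```

Swap in a different criterion, say the open-access percentage, and the ordering of universities can change, which is exactly the point the makers are making.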
Rankings come in for a lot of criticism. Dutch universities have decided to place less emphasis on their ranking scores, if only because the criteria raise questions: how much weight should you assign, for instance, to an institution’s reputation compared to its scientific impact?
What’s more, there’s more to a university than citation counts. Articles on earthquakes in Groningen or on Dutch healthcare are less likely to appear in the world’s most prestigious journals, but does that make them less valuable? And if a researcher shifts their focus to education, does that harm a university’s position in a world ranking?
Recognition and rewards
In the context of ‘Recognition and Rewards’, universities want to give more weight to the full range of tasks that staff carry out: not just research, but also teaching, sharing knowledge and managing people. Those other tasks barely count in rankings, which is why Utrecht University is missing from the ranking made by the British magazine Times Higher Education: the university no longer supplies its own data to the makers. But Dutch universities are not all on the same page; the other institutions still participate.