UU catalyst in discussion about recognising and rewarding scientists
“The impact factor doesn’t accurately show the quality of each individual scientist. We have a strong belief that something has to change, and abandoning the impact factor is one of those changes,” stated Utrecht professor Paul Boselie in June, in an interview with the scientific journal Nature. The impact factor is based on the number of articles published in renowned scientific journals, as well as the average number of times other scientists cite those publications.
Boselie is one of the architects of UU's Recognition and Rewards programme, which aims to change the way scientists are assessed, starting in 2022. From then on, the impact factor will no longer be included in assessments, nor will it be part of the recruitment process for new researchers.
But Utrecht University isn’t the only university working on a change. A working group within the Association of Dutch Universities (VSNU, in the Dutch acronym) is currently developing a new system for all universities. Dutch research organisations like NWO and ZonMw are doing something similar.
The Nature article caused quite a commotion this past summer. Although UU was praised for the initiative, it received its fair share of criticism, too. In July, 175 scientists – 142 of them professors – penned an open letter (in Dutch) arguing that, by letting go of the impact factor, the Netherlands could lose its top position internationally. After all, the number of publications and citations obtained by each scientist is a concrete and objective metric, while the new criteria soon to be introduced are, in their view, vague and subjective. The new system would be disastrous, they declare, especially for the exact, medical, and life sciences. Young researchers, who would no longer be able to compete internationally, would be affected the most.
What's more, the critics feel that the picture being painted of the quality of articles published in renowned journals is much too negative. After all, those articles are assessed by renowned experts, so they do say something about the quality of the researcher. Not to mention that, in certain disciplines, institutions don’t just count publications in renowned journals, but publications in more specialised journals as well.
The letter also states that an assessment based on the criteria soon to be adopted would make it laborious to compare scientists. As an example, they mention the applications for the prestigious NWO grants Veni, Vidi, and Vici, in which criteria deemed "objective", such as publications, citations, and lectures, have been replaced by a "narrative" and a list of no more than ten publications. Applicants have to explain why their research is relevant to science and society. According to the letter, the first experiences have been alarming. “Committee members have no clue how to compare candidates, and will often turn to Google to find specific achievements.”
But that's not all. The letter's authors also feel that some of the new criteria, like the public accessibility of data (open science), the composition of the research team, and proof of leadership, aren’t scientifically driven, but rather political in nature. Besides, they're hard to measure.
One of the signatories is Gerard van Koten, an emeritus professor of Organic Chemistry and Catalysis with an outstanding track record in terms of publications and administrative positions. “Collaboration is extremely important, and the university should embed that into the entire programme and interdisciplinary research. But that doesn’t mean we should disregard publications in renowned journals, or citations. Experience has taught me how important they are. If I must assess scientists, the citations of the past five years are a reliable measure. That goes for young scientists, too. If we'd let go of this metric, limiting ourselves to skills like collaboration instead, we'd run the risk of not granting opportunities to brilliant scientists who may be more individualistic. I do think we should look at collaboration and the societal relevance of a scientist's work, but let's not lose sight of the brilliant nerd, or the scientist who conducts fundamental research that doesn’t have immediate social impact. I travel to China often, and the fact that UU is ranked high in the Shanghai ranking is truly impactful. It makes institutes open to collaborating with us, and it makes young, talented scientists aspire to come to Utrecht.”
A group of nearly 400 young scientists reacted (also in Dutch) to the critical letter, stating that the number of publications is all too often seen as the most important thing in a scientist's career, even though science is about much more than that. That's why they support the proposed criteria of the Recognition & Rewards programme. “The past few decades, the definition of talent has been way too narrow for the way contemporary academia functions”, reads the reaction letter. Its authors think it takes guts to find new paths for assessing a scientist's work. At the same time, they would like UU to maintain its international renown: therefore, the number of publications in esteemed journals and the number of citations should still be considered in the new system, alongside the new criteria. The 400 academics also refer to the Declaration on Research Assessment (DORA). Penned in 2012, it aims to improve the assessment of scientists and their publications by disregarding impact factors.
Martijn Huysmans, Assistant Professor at the UU School of Economics and a member of the Utrecht Young Academy, who is also active in the Recognition & Rewards working group, is one of the people who signed the reaction letter. In his view, an assessment based on renowned journals and citations is "pretend objectivity".
“A perverse system has developed for scoring in this world. Prestigious referees and editors at these journals encourage citations to strengthen each other’s reputations, making it a self-fulfilling prophecy,” he states. Huysmans thinks young researchers like himself don't benefit from this system. “Institutions that have a lot of money to generate data, for instance, score more easily. That doesn’t say much about the quality of the individual scientist.”
That's why he’s much more enthusiastic about assessing scientists in a way that takes the context of their research into account. “Perhaps it takes reviewers more time to compare researchers, but it’s a much better method than just looking at citations or publications.”
The chairs of VSNU's Recognition & Rewards working group also penned a response (in English) to the professors’ letter. Rianne Letschert, President of Maastricht University, and Jeroen Geurts, President of ZonMw, state that UU's initiative reflects a global trend, so it is unlikely to affect Dutch science disproportionately. They acknowledge that a culture shift takes time, but see innovation in science as a positive thing. Looking beyond publications, they say, benefits research diversity, allowing for a greater emphasis on social impact. Lastly, they do not see open science as a political argument, but rather as a way to make science more transparent.
UU president Anton Pijpers agrees. “We must acknowledge that publishing in well-known journals isn’t the only thing that counts. We put emphasis on scientists who are also good leaders, have an eye for the social impact of their research, and focus on collaboration. We, members of the Executive Board, used to receive letters requesting us to appoint a professor, in which the first sentence – right after the name – would mention the h-index in parentheses (someone’s publication and citation score, ed.). As though that would be reason enough to appoint someone. We’re not doing that anymore. Instead, we now work with the TRIPLE model, which looks at multiple aspects, like teamwork, leadership, and research qualities.”