PLOS BLOGS The Official PLOS Blog

The Responsible Research Assessment Initiative

Note: PLOS is delighted to once again partner with the Einstein Foundation Award for Promoting Quality in Research. The awards program honors researchers whose work reflects rigor, reliability, robustness, and transparency. The Einstein Foundation received dozens of stellar submissions. We asked this year’s finalists to write about their research in the run-up to the ceremony on March 14th in Berlin. This is the second blog in our 5-part series.

Author: Anne Gärtner is a postdoctoral researcher in personality psychology at the TUD Dresden University of Technology. She is principal investigator in the Collaborative Research Centre (CRC) 940 (“Volition and Cognitive Control”) and representative for junior scientists in the Executive Committee of the German Psychological Society (DGPs). She received her PhD in 2019 at the TUD Dresden University of Technology. In her research, she uses neuroimaging and neurobiological methods to improve our understanding of how people perceive and regulate emotions. Furthermore, her work aims to enhance the transparency, credibility, and reproducibility of psychological science by studying how to reform research assessment towards quality evaluation.

In recent years, a consensus has emerged in the scientific community that evaluating scientific performance solely on quantitative indicators (such as number of publications, number of first authorships, h-index, or journal impact factors) is inadequate. The shift towards prioritizing research quality, transparency, robustness, and reproducibility is evident in initiatives like the San Francisco Declaration on Research Assessment (DORA) and the Coalition for Advancing Research Assessment (CoARA).

Despite these efforts, hiring and promotion procedures at universities, as well as funding decisions, still heavily favor easily measurable quantitative indicators over assessing the quality of scientific work. A survey within the German Psychological Society highlights the disproportionate emphasis on publication counts and funding, neglecting crucial quality criteria (Abele-Brehm & Bühner, 2016).

Relying solely on quantity in assessing scientific performance is problematic due to its questionable validity (Brembs et al., 2013; Dougherty & Horne, 2022; Kepes et al., 2022; Opthof, 1997). For example, research has shown that the correlation between journal rank (as measured via journal impact factors) and the methodological quality of papers published in a journal is low or even negative (Brembs, 2018). Moreover, academia is a competitive work environment, and incentives in hiring and promotion processes can influence individuals to prioritize prolific publishing over crucial aspects like research transparency and leadership skills. These incentive systems can have undesirable effects on the entire science system, especially without effective mechanisms for quality control and self-correction (Vazire & Holcombe, 2022).

Dr. Anne Gärtner, psychologist and neuroscientist at Dresden University of Technology, aims to shift hiring and promotion procedures from a focus on research quantity to a robust consideration of research quality. She was awarded the 2023 Early Career Award by the Einstein Foundation Berlin and advocates for incentivizing quality in research through the Responsible Research Assessment Initiative. She plans to establish criteria incorporating qualitative aspects such as integrity, transparency, robustness, and methodological rigor using a scoring system (Gärtner et al., 2022; Schönbrodt et al., 2022). For example: Are all research data and materials documented in FAIR format and openly accessible? Are statistical analyses accompanied by comprehensible metadata and code that is publicly available? Can the research be replicated and independently verified? Was the research pre-registered and the methodology disclosed before publication? Do formulated theories adhere to the principles of formal logic?
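To make the idea of a criteria-based scoring system concrete, here is a minimal illustrative sketch in Python. The criterion names, the equal weighting, and the yes/no scoring are assumptions made for illustration only; they are not the actual instrument proposed by Gärtner et al. (2022) or Schönbrodt et al. (2022), which is more differentiated.

```python
# Hypothetical sketch of a quality-criteria checklist, loosely based on the
# example questions above. Criteria, weights, and scoring are illustrative
# assumptions, not the initiative's actual assessment instrument.

CRITERIA = [
    "data_and_materials_fair_and_open",   # FAIR, openly accessible data/materials
    "analysis_code_and_metadata_public",  # comprehensible metadata, public code
    "independently_verifiable",           # replicable and independently verifiable
    "preregistered",                      # methodology disclosed before publication
    "theory_formally_coherent",           # theory adheres to formal logic
]

def quality_score(answers: dict) -> float:
    """Return the fraction of quality criteria met (0.0 to 1.0)."""
    met = sum(1 for criterion in CRITERIA if answers.get(criterion, False))
    return met / len(CRITERIA)

# A submission meeting three of the five criteria:
example = {
    "data_and_materials_fair_and_open": True,
    "analysis_code_and_metadata_public": True,
    "preregistered": True,
}
print(quality_score(example))  # 3 of 5 criteria met -> 0.6
```

A rubric like this trades nuance for transparency: committees can see exactly which quality dimensions drove a score, rather than relying on an opaque composite such as the h-index.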

Collaborating since 2020, the initiative has published its proposal in several articles and received more than 40 commentaries from the academic community in response. Since then, the endeavor has turned into a community-driven effort, with multiple bottom-up working groups developing field-specific expansion sets. The next goal of the initiative is to conduct interviews with experts and various stakeholders to refine the criteria and pave the way for their application in appointment procedures. Furthermore, the initiative is currently developing an online tool that integrates qualitative assessment with the responsible use of quantitative indicators to support hiring and promotion committees.

While the focus currently lies on research output assessment, Anne Gärtner’s long-term vision is to develop a more comprehensive set of metrics that also covers the remaining types of academic contributions commonly evaluated in hiring and promotion procedures: teaching quality, leadership skills, academic governance, and social impact. The shift away from metrics of publication quantity in hiring and promotion procedures could ultimately become a blueprint for the entire academic system and, for example, be transferred to the distribution of research funding, scholarships, and awards.

“Prioritizing quality in research requires a fundamental change in the incentives of the academic system.”

Anne Gärtner, Postdoctoral research fellow at the Faculty of Psychology at the Dresden University of Technology

Young scientists face a dilemma: stick to the old system and publish as many papers as possible, or invest the time and effort to produce high-quality research, which reduces output and can thereby harm their careers. Anne Gärtner hopes the project fosters a future in which publishing more signifies more high-quality and valuable research.

“I hope that our project contributes to a future where we’ll see not just more research being published every year, but more high-quality and high-value research,” says Anne Gärtner.

You can read more about the general principles of responsible research assessment, and a specific proposal for hiring and promotion, in the two preprints listed below (these articles will soon be published in Meta-Psychology together with 15 commentaries).

References
Abele-Brehm, A. E., & Bühner, M. (2016). Wer soll die Professur bekommen?: Eine Untersuchung zur Bewertung von Auswahlkriterien in Berufungsverfahren der Psychologie. Psychologische Rundschau, 67(4), 250–261.

Brembs, B. (2018). Prestigious Science Journals Struggle to Reach Even Average Reliability. Frontiers in Human Neuroscience, 12, 37.

Brembs, B., Button, K., & Munafò, M. (2013). Deep impact: Unintended consequences of journal rank. Frontiers in Human Neuroscience, 7.

Dougherty, M. R., & Horne, Z. (2022). Citation counts and journal impact factors do not capture some indicators of research quality in the behavioural and brain sciences. Royal Society Open Science, 9(8), 220334.

Gärtner, A., Leising, D., & Schönbrodt, F. D. (2022). Responsible Research Assessment II: A specific proposal for hiring and promotion in psychology [Preprint]. PsyArXiv.

Kepes, S., Keener, S. K., McDaniel, M. A., & Hartman, N. S. (2022). Questionable research practices among researchers in the most research‐productive management programs. Journal of Organizational Behavior, 43(7), 1190–1208.

Opthof, T. (1997). Sense and nonsense about the impact factor. Cardiovascular Research, 33(1), 1–7.

Schönbrodt, F., Gärtner, A., Frank, M., Gollwitzer, M., Ihle, M., Mischkowski, D., Phan, L. V., Schmitt, M., Scheel, A. M., Schubert, A.-L., Steinberg, U., & Leising, D. (2022). Responsible Research Assessment I: Implementing DORA for hiring and promotion in psychology.

Vazire, S., & Holcombe, A. O. (2022). Where Are the Self-Correcting Mechanisms in Science? Review of General Psychology, 26(2), 212–223.
