PLOS has received a grant from the Alfred P. Sloan Foundation to study how researchers evaluate both the credibility and impact of research outputs (e.g. articles, preprints, data, and code). We will be conducting this research in partnership with the American Society for Cell Biology (ASCB).
We’ll be looking at this in two contexts: (1) when researchers are discovering and reading these outputs in the process of their own research, and (2) when they are assessing these outputs while participating in grant application review panels and hiring committees.
We are interested in characterizing the steps that researchers go through to form judgments of both credibility and impact in these two contexts.
Why is this new research needed?
Previous research has explored the factors that influence trust and how researchers decide which articles to read and cite, showing that personal inspection, social cues, and peer review are important. Recently, preprints have created new challenges for researchers, who must evaluate large amounts of new information outside the traditional framework of journal peer review. A survey by the Center for Open Science suggested that cues related to Open Science content (e.g. signaling open availability of data) and independent verification are important for judging the credibility of preprints.
Yet the current system for assessing research is dominated by signifiers of impact. Since true impact cannot be known until time has passed, various proxies that signal perceived or potential impact have been used instead, such as publication in a high Impact Factor journal.
PLOS and other organizations, in particular the multi-stakeholder organization DORA, have stressed the negative consequences of such a narrow focus on proxies of impact and the need for reform. Most problematically, researchers are pushed to place a higher priority on pursuing these proxies than on making their research credible, reproducible, and reusable.
Many have suggested that this focus on proxies for impact in research assessment is at least in part due to the practical limitations in evaluating credibility and impact. By understanding what truly matters to researchers in forming these judgments, we hope to find insights that can inform better practice in research assessment, inspire better tools for researchers, and help us evolve how we signal the markers of credibility in the articles we publish.
Collaboration with ASCB
We are not undertaking this project alone! We are excited to be conducting this research in collaboration with the ASCB. This partnership strengthens the project's connection to the research community. As the organizational sponsor of DORA, ASCB is a scientific association demonstrably dedicated to advocating for sound research policies and improving research culture.
What comes after the research?
We will publish a report on the insights obtained through this research. The report will be released under a CC BY license and will include anonymized, aggregated data.
Future research might include a quantitative initiative to validate the findings with a broader group of researchers. Ultimately, we would like to understand whether there are opportunities to serve researchers' needs for assessing and discovering new research better than currently available methods do. We also hope that our initial report will prompt further research by others, which we believe is important for the research and scholarly communication communities. We are very thankful to the Alfred P. Sloan Foundation for their interest in this research project and their dedicated support of scholarly communication issues in general.
We are confident that improvements to research assessment culture and practice are possible. And we believe there has never been a more important moment for this work, as we experience a global crisis during which unbiased, rigorous, and credible research will play an unprecedented role.
The successful research proposal was inspired and improved by conversations with many representatives of the research, funding, and publishing communities. We wish to acknowledge and thank, in particular, Jessica Polka and Ron Vale for insightful discussions.