Written by Marcel LaFlamme
Engaging with article metrics: own your impact narrative
As we wrote back in March, PLOS was an early adopter and advocate of metrics at the article level. The move to article-level metrics (often known as altmetrics) is “defined by its belief that research impact is best measured by diverse real time data about how a research artifact is being used and discussed.”
Article-level metrics and altmetrics have always been presented as alternatives to relying solely on traditional shorthand assessments like total citation counts and the reductive Journal Impact Factor. Hopefully, a detailed critique of the perils and misuse of journal-level impact metrics is no longer needed in this day and age; suffice it to say that it is unscientific to give one researcher’s article more credit or reward than another’s based solely on the Journal Impact Factor of the venue where it was published.
We’re hopeful that the culture is changing, thanks to the efforts of DORA and other organizations that advocate for, and now provide concrete guidance on, the responsible use of metrics in research assessment. Some funders, like the Howard Hughes Medical Institute (HHMI), are also making explicit how they review their HHMI Investigators, clearly encouraging them to tell the full story of their impact beyond a simple publication record.
Researchers like you can help. The purpose of this post is to empower you to engage with a variety of accessible altmetrics, and with signals of rigorous research practice, to define your research article’s unique “impact narrative” and showcase its strengths more effectively.
What metrics and signals can articles offer?
Knowing and highlighting your article-level metrics helps facilitate and normalize more appropriate assessment. When relying on metrics to describe your work, use detailed metrics that are specific to your article, such as:
- Citations over time
- Usage over time
- Attention (beyond citations) over time (e.g. from news outlets, policies, social media) often measured as “altmetrics”
A PLOS article’s “Metrics” tab shows how many times the article has been viewed, cited, saved, and discussed, and clicking on any section opens up the details (e.g. which articles have cited this one).
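If you want to pull these article-specific numbers yourself, public scholarly APIs make this straightforward. Below is a minimal Python sketch, not an official PLOS tool, that retrieves a total citation count from the Crossref REST API and citations-over-time from OpenAlex; the DOI shown is a hypothetical placeholder, so substitute your own article’s DOI.

```python
import requests  # third-party; install with `pip install requests`

DOI = "10.1371/journal.pone.0000000"  # hypothetical placeholder: use your own DOI

# Crossref: total citation count for one specific article.
r = requests.get(f"https://api.crossref.org/works/{DOI}", timeout=30)
r.raise_for_status()
print("Citations (Crossref):", r.json()["message"]["is-referenced-by-count"])

# OpenAlex: the same article's citations broken down by year.
r = requests.get(f"https://api.openalex.org/works/doi:{DOI}", timeout=30)
r.raise_for_status()
for row in r.json()["counts_by_year"]:
    print(f'{row["year"]}: {row["cited_by_count"]} citations')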
Metrics can also be complemented by other verifiable signals that convey specific qualities of your article. Some of the most general of these are listed below:
- Peer-review status
- Open availability of peer review report/decision letters
- Open availability of data, code, protocols, and materials
- Adherence to relevant ethical standards for research
- Adherence to standards for rigorous research reporting (e.g. those of EQUATOR network, ARRIVE, or MDAR)
At PLOS, to showcase peer-review status, every article displays the Editor who handled the submission, the peer review timeline, and, if the authors chose this option, the published peer review report.
When publishing your research, you can make choices that demonstrate your commitment to rigor and credibility. When publishing with PLOS, you commit to sharing your underlying data. You may also choose to share your protocols and code, to pre-register your study, and to publish the peer review history of your papers. These important choices affirm your commitment to transparency and make your results available for the scientific community to build upon. You can highlight these choices too, in addition to metrics!
Which metrics and signals should I use to tell my impact narrative?
To establish an effective impact narrative, showcase the altmetrics that are most relevant to the intentions of your research. Below are some ways this could be done. The examples are not mutually exclusive; a researcher could elect to promote all of these qualities:
Impact narratives: examples
- If citations are still the most useful measure of impact for your purposes, list your article with its most up-to-date citation counts (including the details) rather than stating the Impact Factor of the journal you published in (especially since that score is calculated from past articles that are not your own!)
- If you intended your research to get the attention of the public in a particular geographic region or community, showcase the media mentions and social media coverage you received in those target areas or communities. Each article has a summary page of this attention, or you can highlight specific data points within it (one way to pull these counts programmatically is sketched after this list). Explore!
- If having your article openly peer reviewed was of particular importance to you, ensure you link directly to the published reviewer reports.
- If having your protocol readily available to be built upon is of particular importance, highlight this.
- If ensuring your data is open and reusable was an impact you were striving for, always highlight that your data is openly available (your PLOS article will always show this in the Data Availability Statement). If your data is published on a data publication platform (example datasets connected to a PLOS Genetics article), highlight that unique usage too! In general, remember to demonstrate your impact in ways beyond the article itself: by showcasing any reuse or extension of your data, code, protocols, etc.
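As one illustration for the media-attention example above, here is a hedged Python sketch against Altmetric’s free details endpoint. The endpoint and the field names shown reflect the public v1 API and may change over time, and the DOI is again a hypothetical placeholder.

```python
import requests  # third-party; install with `pip install requests`

DOI = "10.1371/journal.pone.0000000"  # hypothetical placeholder: use your own DOI

resp = requests.get(f"https://api.altmetric.com/v1/doi/{DOI}", timeout=30)
if resp.status_code == 404:
    # Altmetric returns 404 when it has no attention data for a DOI.
    print("No attention data tracked for this DOI yet.")
else:
    resp.raise_for_status()
    data = resp.json()
    # .get() guards against fields that are absent for a given article.
    print("Altmetric attention score:", data.get("score"))
    print("News outlets:", data.get("cited_by_msm_count", 0))
    print("X/Twitter accounts:", data.get("cited_by_tweeters_count", 0))
    print("Summary page:", data.get("details_url"))
```

The details_url field points to the article’s attention summary page, the kind of page mentioned in the example above. The free endpoint is rate-limited, so cache responses rather than querying repeatedly.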
Conclusion
We encourage showcasing the signals of rigorous research practices, and we encourage engagement, but not obsession, with metrics. Article-level metrics and altmetrics are intended to be useful, and metrics themselves are neutral, but any metric can be misused, over-interpreted, over-engineered, and gamed by actors invested in a system. Therefore, here at PLOS, we believe the key is to always:
- Be transparent (e.g. show the most detailed and accurate data available, even when only an average or total is displayed or used)
- Engage with metrics and signals so that you own the way they relate to you (e.g. craft your own impact narrative)
Further reading
Reimagining academic assessment: stories of innovation and change
Case studies of universities and national consortia highlight key elements of institutional change to improve academic career assessment.
Research Culture: Changing how we evaluate research is difficult, but not impossible
eLife 2020;9:e58654.
This article outlines a framework for driving institutional change that was developed at a meeting convened by DORA and the Howard Hughes Medical Institute. The framework has four broad goals: understanding the obstacles to changes in the way research is assessed; experimenting with different approaches; creating a shared vision when revising existing policies and practices; and communicating that vision on campus and beyond.
Résumé for Researchers
Royal Society. The Résumé for Researchers is intended to be a flexible tool that can be adapted to a range of different processes that require a summative evaluation of a researcher, recognising that their relative importance will be context-specific.
Measuring Up: Impact Factors Do Not Reflect Article Citation Rates
Official PLOS Blog