Inspired by the nutrition-facts labels that have been displayed on US food packaging since the 1990s, John Willinsky wants academic publishing to take a similar approach, helping to inform readers about how closely a paper meets scholarly standards.
A team at the Public Knowledge Project, a non-profit organization run by Willinsky and his colleagues at Simon Fraser University in Burnaby, Canada, has been investigating how such a label might be standardized in academic publishing¹.
Willinsky spoke to Nature Index about what he hopes to achieve with the initiative.
Why should academic papers have publication-facts labels?
I, like many others, have grown concerned about research integrity. Through transparency, we want to show how closely journals and authors are adhering to the scholarly standards of publishing. We want to help readers, including researchers, the media and the public, to decide whether an article is worth reporting on or citing.
The facts that we have selected for the label include publisher and funder names, the journal’s acceptance rate and the number of peer reviewers. The label also shows whether the paper includes a competing-interests statement and an editor list, where the journal is indexed and whether the data have been made publicly available. Averages for other participating journals are listed, for comparison.
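For illustration, these elements amount to a small, structured record attached to each article. The sketch below shows one way such a record might look; the field names are hypothetical and are not the project's actual schema.

```typescript
// Hypothetical sketch of the facts a label might carry, based on the
// elements described above; field names are illustrative only.
interface PublicationFactsLabel {
  publisher: string;
  funders: string[];                    // funder names, if any
  journalAcceptanceRate: number;        // e.g. 0.32 for 32%
  peerReviewerCount: number;            // reviewers for this article
  competingInterestsStatement: boolean; // statement included?
  editorListAvailable: boolean;         // journal publishes an editor list?
  indexedIn: string[];                  // services in which the journal is indexed
  dataPubliclyAvailable: boolean;       // underlying data shared?
  // Averages across other participating journals, shown for comparison
  peerAverages: {
    journalAcceptanceRate: number;
    peerReviewerCount: number;
  };
}
```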
It’s important that such information is readily available. When we conducted an exercise with secondary-school students, asking them to find these facts for a single academic article online, many of them took 30 minutes to do so. Some couldn’t find the information. This finding justifies the need for the label: it shouldn’t take half an hour to establish that a journal adheres to scholarly standards.
How did you create the label?
The US nutrition-facts label has been proved to change people’s behaviour, specifically their food-purchasing habits². Given that so much work went into the label’s development, I thought it would be wise to build on its design.
On the basis of our early consultations with researchers, editors, science journalists, primary-school teachers and others, we created a prototype with eight elements that reflect scholarly publishing standards. We’re now gathering feedback, and might decide to change some of the facts, or to add others. Some people, for example, suggested that we include the number of days that the peer-review process took to complete.
We’ve built in ways to automatically generate the label, to ensure that the format is standardized across journals and articles and to make the label available in several languages. We have created a third-party verification system, too, to ensure that authors’ identities are not revealed to peer reviewers and vice versa. This relies on authors, reviewers and editors using ORCID, the service that provides unique identifiers with which to identify researchers.
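As a minimal sketch of what automatic generation could involve, label fields might be assembled from a journal's workflow metadata, with ORCID iDs checked along the way. The metadata record and function names below are assumptions for illustration, not part of the actual plug-in.

```typescript
// Hypothetical metadata record drawn from a journal's workflow system.
interface ArticleMetadata {
  authorOrcids: string[];
  reviewerOrcids: string[];
  editorOrcids: string[];
  reviewerCount: number;
  dataAvailabilityUrl?: string;
}

// Assemble a fixed set of label fields from workflow metadata, so that
// every journal renders the same facts in the same order.
function generateLabelFields(meta: ArticleMetadata): Record<string, string> {
  // ORCID iDs are 16-character identifiers in four hyphenated groups,
  // ending in a digit or 'X'.
  const orcidPattern = /^\d{4}-\d{4}-\d{4}-\d{3}[\dX]$/;
  const allHaveOrcid = [
    ...meta.authorOrcids,
    ...meta.reviewerOrcids,
    ...meta.editorOrcids,
  ].every((id) => orcidPattern.test(id));

  return {
    'Peer reviewers': String(meta.reviewerCount),
    'ORCID-verified participants': allHaveOrcid ? 'Yes' : 'No',
    'Data publicly available': meta.dataAvailabilityUrl ? 'Yes' : 'No',
  };
}
```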
The label will be displayed on the article landing page of the journal website and will be included in the article PDF.
How are you trialling the label’s use?
We’ve completed work with ten focus groups involving journal editors and authors in the United States and Latin America. We also interviewed 15 science journalists about what kinds of fact they’d want to see at a glance.
We built the label specifically for journals using the scholarly publishing workflow system Open Journal Systems (OJS), run by the Public Knowledge Project. By the middle of the year, we hope to launch a pilot programme involving more than 100 journals that use OJS. The goal is to explore the prospects of industry-wide implementation of the label by next year.
How could journals be compelled to display such a label?
Unlike the nutrition-facts label, which was mandated by the US government, the publication-facts label is the result of voluntary concern about research integrity in the publishing industry.
Although many groups, such as the International Association of Scientific, Technical and Medical Publishers and the Committee on Publication Ethics, manage concerns about research integrity by releasing guidelines on best practices and accumulating tools to flag suspicious activity, we feel that they have not addressed the fact that open access is public access. We need to adapt our practices to cater to the needs of different audiences, not just those in academia.
Although we’re initially building the label for OJS journals, it is an open-source plug-in that other publishing platforms will easily be able to adapt. The software is currently listed as being ‘under development’ on GitHub and will be shared there on release.
We want to show the publishing industry that we’ve piloted this in our own environment and that it is readily adaptable. We want to show that, although you could build your own label, for the sake of comprehensibility, it’s better to have a common format.
This interview has been edited for length and clarity.