Research Publishing Integrity
By Dr. Sven Fund (Managing Director, Fullstopp GmbH, Berlin, Germany; Phone: +49 (0) 172 511 4899) <sven.fund@fullstopp.com> www.fullstopp.com
Scholarly publishing is a branded business, with “integrity” at its core. Indeed, many in the industry and beyond think first of prestigious journal brands, esteemed book series, and famous publisher names when describing this space. There is one unifying factor that all these diverse, hopefully distinct, and even unique brands have in common: None of them would be worth much more than the paper they use for printing without the guarantee of the quality they promise. Quality is the bedrock of publishing, the assumption that all actors in science and publishing adhere to a set of values and principles when conducting their research and disseminating results.
This invisible contract between members of the scientific community is often inaccurately referred to as “research integrity,” although the term “publication integrity” would be more accurate. The latter term locates responsibilities more clearly among the actors that have an impact on, and an interest in, an intact system.
This issue of Against the Grain is devoted in part to exploring the roles, mechanisms, and responsibilities of “publishing integrity.” We do not limit the conversation to owners of journals, book series, or other types of content: This issue aims to widen the horizon and include libraries and technology providers.
In this issue, we are interested not only in what goes wrong but also in why things go wrong and how different players in the ecosystem can collaborate to protect the heart of the publishing system as it currently exists. We also endeavor to provide an international perspective, especially as we live in a world where the standards discussed here are diverse.
Digitization, Volume, and Integrity
The digitization of academic publishing over the past two decades is intrinsically linked to a massive increase in the volume of published research. Notably, based on Lens.org data, the number of scholarly works published in 2022 is more than five times the number published in 2000. Thanks to the evolution of the “Publish or Perish” paradigm into a KPI-ization of the publication process, there is no end in sight to the inflation of article numbers. Researchers have to publish constantly and share their discoveries as rapidly as possible in order to gain career recognition, promotion, and/or fewer teaching responsibilities from their universities.
Science publishers have adapted perfectly to this trend for volume. This includes the big traditional publishers as well as a range of newcomers who have created business models mainly dependent on volume — and volume increase. The infamous Big Deal, in which libraries were offered “all you can eat” options for scholarly content, led to heated debates around library choice and price discounts.
In an open access world, the Big Deal has survived, though in the much more acceptable form of Transformative Agreements (TAs). These combined read and publish agreements also offer the simplicity of unlimited use of the products scholarly publishers deliver, but now in two dimensions. Such agreements make it easy for all parties involved to account for the unpredictable publishing and reading behavior of an institution’s faculty and its students.
A Changing Value Creation
Alongside this increase in volume, facilitated by a business model that can hardly go wrong, publishers have changed their approach to creating value. Many publishers today outsource a large part of the work on manuscripts to vendors around the world, employing relatively inexpensive labor and increasingly sophisticated technology. This specialization in the value chain has left little but quality assurance to publishers. Brand names, decorated with impressive impact factors, altmetrics scores, and other quality signals, are tightly controlled by publishers as, it seems, the last bastion justifying their existence. As publishers retrench from the value chain, innovative technology companies have developed alongside them, filling the gap with tech-based services to support them.
Publishing Integrity, Not Research Integrity
The integrity of the publication process is often, and euphemistically, confused with research integrity. There is overlap between the two, but the core of the problem requires clarity, not verbal confusion. The interests of academic publishers are simply too closely aligned with the publication process for them to push fraudulent behavior back into the science system. Indeed, publishers have proven over the centuries that they can safeguard the process of publishing key research in an ethical way. We’d argue that they should not throw a spanner in the works now that they see growing challenges.
“Bad Actors” On The Offense?
Listening to conversations around publishing integrity, and the general lack thereof, today, it feels as if academia has fallen into the hands of a new sort of mafia. Prestigious journals are being tricked into publishing research articles, and everything from obvious nonsense to manipulated datasets is being thrown at editorial offices. We only hear about these issues when things go wrong.
Unfortunately, it is not all that simple. Misaligned incentives offered to researchers climbing the career ladder contribute a great deal, and what counts as “wrong” is often a function of institutions trying to be objective.
Yet an explanation based on dysfunctional incentives alone is too simplistic. If it were true, we would have to assume that bankers commit fraud simply because they handle other people’s money, or that doctors traffic in patients’ organs simply because they have access to them. A dense network of laws and compliance rules prevents mass abuse and criminal behavior in these professions, and the same is true for scholarly publishing. What sounds most promising beyond defensive strategies is a long-overdue reworking of the incentive system governing academic careers. At the same time, publishers and research institutions need to have an interest in safeguarding long-term success over short-term gains.
A Proposed Ecosystem Approach
Publishers are not the only ones with responsibility to avoid fraudulent and unethical behavior in an ecosystem with increasingly distributed responsibilities. This issue of Against the Grain brings together voices from all parts of this landscape, with thought leaders contributing insights from their daily work and future plans.
In this issue, Adya Misra from Sage takes a look at publishers as guardians of publishing quality, highlighting the importance of collaboration as well as the critical role organizations like the Committee on Publication Ethics (COPE) can play in this.
Gareth Dyke, my colleague from Reviewer Credits, starts even earlier. Early career researchers (ECRs) often highlight the fact that many of them have never been trained in conducting peer review. This noteworthy contradiction, a lack of training in what is a core element of research publishing, was also discussed with much rigor at this year’s Academic Publishing in Europe (APE) conference. Gareth has some thought-provoking advice on how to tackle the issue within the workflow of research articles.
Samantha Green and Sami Benchekroun from Morressier spell out challenges as well as potential solutions around human phenomena (plagiarism, conflicts of interest, ethical violations) and technically induced ones such as data fabrication and AI-generated content. Their view of the tech landscape provides us with a number of tangible approaches for the future.
ATG would not be complete without libraries’ views on their critical contributions to publishing integrity early on in the publication cycle. Academic institutions operate in a much broader setting of science and its code of conduct than tech providers or publishers. Dirk Pieper from Bielefeld University Library takes a closer look at the role he and his colleagues play.
Vigilant, Not Alarmist
Despite a heightened level of nervousness around publishing integrity, given the drive for volume on the one hand and much-enhanced technological possibilities on the other, it seems that the bedrock for a well-functioning immune response is intact: Players across the spectrum not only agree on the importance of quality control in scholarly publishing, they are also developing responses to new challenges and threats to the system, and these changes are happening at a speed adequate to address them.
Make no mistake, the systemic issues publishing integrity is facing will not be resolved by pointing to anonymous bad actors or single cases. The volume of fraudulent “papers” is just too high, the degree of organization too widespread, and the technology at hand too sophisticated for business as usual.
The solutions presented in this ATG issue give a sense of how players in the ecosystem respond to a potentially existential threat to their operating model. This remains a topic of shared responsibility, and that does not make things easier. Technological innovations will also put additional pressure on the system, with fast-learning large language models serving as convenient companions for the mass compilation of content. A few still smile at their obvious shortcomings; most observers see the potential they will hold for all of us in just a blink of an eye from now.
Quality — Well Defined
To some observers, a one-size-fits-all approach to quality assurance no longer makes sense, if it ever did. More sophisticated technology, it seems, calls for three responses: visibility, transparency, and a graded approach.
• Visibility: Fraudulent behavior grows best in the shade, and visibility creates accountability for authors, reviewers, publishers, and institutions. Academia and publishing are thus best served when the steps of quality assurance become as much a part of the publicly visible academic record as the end result itself. This is a central challenge for many existing infrastructures and some business models.
• Transparency: The days of ensuring quality in research publishing in smoky, dimly lit gentlemen’s clubs are definitely over; transparency wins over blind meritocracy.
• A graded approach: COVID research has shown us how important speed of publication can be, even at the expense of traditional patterns of quality assurance. Instant publication in repositories has no doubt saved lives, and there is no reason not to rely on a staggered approach in many other disciplines and publication formats.
With potentially game-changing new technological capabilities and a system already under pressure, the quality assurance system in academic publishing, and hence the concept of publishing integrity, needs an overhaul. The articles compiled in this issue of ATG have one uniting theme: Systemic challenges can only be addressed by players in the ecosystem working together. The concepts are out there, the technology is at hand, and most arguments have been exchanged. What is needed now is an effort, primarily from researchers themselves, to redefine the “level of quality” and to determine which signals around the publication process are needed to serve science and research best.