
ATG Special Report — Training Can Solve the Peer Reviewer Diversity Crisis
By Gareth Dyke (ReviewerCredits) <gareth.dyke@Reviewercredits.com>
Crisis in Peer Review
One recent study found that two-thirds of journal review requests get turned down. And during the pandemic, journal editors have had increasing trouble finding suitable peer reviewers for journal submissions. Where did it all go wrong? Or was it inevitably wrong from the outset?
Who Really Enjoys the Peer Review Process? Anyone?
Peer review is usually a daunting, exhausting process for both authors and reviewers. No one has a good time or, worse, gains much from the process. Authors wait in breathless trepidation (and often fear) for reviewers' comments on their papers, relayed via distant ivory-tower journal editors who themselves often struggle to find reviewers willing to work on articles in the first place.
When the wheels finally turn and an article is eventually published, authors tend to feel an enormous sense of relief: “I’ve survived peer review!” I wrote an eBook entitled “Peer Review Survival Skills” on just this topic (I had images of Bear Grylls battling through the wilderness in my head when I wrote it).
One of the key issues here is that limited training is provided to editors, peer reviewers, and submitting authors. Best practices are out there, but are they being communicated to the key players in the peer review process? Often, participants (as we will see) are expected to “learn on the job,” and so mistakes happen, compromising the process and publishing integrity as a whole.
Members of one of these key groups, “reviewing researchers,” are often confused about the peer review process (common FAQs: Can I suggest peer reviewers, and if so, how? How long does the process take? How do I respond effectively when comments are returned?). Editors, if they are not professionals directly employed by publishers, are usually working academics who run keyword searches for peer reviewers and then click buttons within manuscript processing systems like Editorial Manager and ScholarOne.
They’ll often have little insight into the peer reviewers they are selecting (especially if a paper is not directly within their own research area). These “reviewers” might be bad actors or set-ups by authors, again compromising the integrity of the process.
Mistakes happen. Articles get poorly peer reviewed, or worse: research papers end up published without being properly assessed. We’ve seen this recently: Clarivate has delisted a range of journals because of peer review failures on the publisher side. It will be informative to see how aggressive Open Access publishing models, often built around invited volumes of thematic articles and conference proceedings, emerge from all of this.
Peer Reviewers
Spare a thought for peer reviewers, our focus in this article. These research authors (and they are always research authors, “reviewing researchers”) are often very unsure of how to peer review articles. With some notable exceptions, including the increasing use of open preprint servers, reviews (as part of article revision history) are often not published openly, so academics have little to go on. It’s very difficult to effectively review a new piece of research while simultaneously remaining objective and constructive. What are the key areas to focus on? Is it sufficient to comment on the language and presentation? What kind of review response should be provided if a methodology is clearly flawed or results are incorrectly presented?
Effective peer reviewing does develop with experience, of course, but as any journal editor will tell you, some “reviewing researchers” do a good and thorough job, while others are cursory, time-wasting, or downright rude. These are my Seven Dwarfs of Peer Review. Journal editors seek only one kind: Peer reviewers who are “in-depth, thoughtful and positive in their comments.” Training embedded within the peer review process can help.
I’ll never forget the first time I was asked by a journal to peer review a paper. I was working on my PhD in the UK, and the article in question was co-authored by a more senior colleague whom I’d met several times at conferences. It was also quite a prestigious journal, as I recall. I was not sure what to do.
Arrogance kicks in a little. I’ve made it! The International Journal of X and Y has invited me to work as a peer reviewer. But there’s also imposter syndrome: “I’m not good enough — I don’t know enough — to get this done properly.”
My university did not provide any peer review training or support at all. Mind you, I did not go looking for it either, and I did not talk to my supervisor. I learned on the job, and my only metric for “performing effective peer review” has been that journals have since asked me to work on additional papers for them.
A Solution: Peer Review Training Within The Workflow
The bulk of existing peer review training is made available on demand by publishers, editor associations, author services providers, or researcher development organizations like Vitae. The key words here are “on demand.” I have to go looking for it, or I can get “accredited” to review for a particular journal or publisher by completing one of their course bundles. But where’s the incentive for me, a researcher? Many working academics take a dim view of publisher-run peer review: “Why should I work for free for a journal which will then make a profit by selling the work on subscription?” Understandable. There’s no clear career development benefit for researchers.
Of course, publishers want to educate their peer reviewers to develop a pool of effective go-to researchers for their journals that they can dip into again and again. There is another way: for us to work with peer reviewers and show them that this process is actually a huge development opportunity. Learning to comment constructively and effectively on the work of others can be beneficial to your own career development.
Being positive about the work of others is a key transferable skill, but it is often missing from the peer review process. It’s human nature to think “let’s identify the issues with this article” when it’s placed in front of you for review, rather than “how can I help these authors improve and get their work published?” It’s a different mindset. Many researchers who work as journal peer reviewers will not remain in academia: With attrition rates above 80% after PhDs and Postdocs, it’s key for us to share career-enhancing transferable skills. What if your future boss gives you a document to comment on? You’d not want to be negative and find ways to bury it; you’d want to give comments that aid improvement.
This is the focus of our training at ReviewerCredits. Accrediting and identifying peer reviewers is one key aim, but we need to support researchers as they embark on article assessment, developing a pool of resources that they can use to become more effective. Learning about how peer review works, how it can be performed effectively, and what journals are looking for when it comes to a “good” review report will help authors better develop their own research articles and steer them more successfully through this process to acceptance.
Of course, every journal and area of study is different, but the fundamental basis of peer review remains the same: To quality control the academic content that appears in our journals. A byproduct of the process is improved article content, yet researchers often don’t feel their papers have been improved by peer review; they’ve had to “survive” it. Changing this mindset, from both publisher and researcher perspectives, is another key goal of ReviewerCredits.
Our innovative training programmes, embedded in our accreditation process, provide an overview of publishing and peer review for early career researchers (doctoral students and postdoctoral researchers) and help them develop transferable skills that can be applied within and beyond academia. The programme can be tailored to the profile and learning needs of the participants, with a pre-workshop webinar (a free-to-view video on the ReviewerCredits platform), workshop facilitation from highly experienced trainers (encompassing journal editors, peer-review best practice experts, and active researchers), as well as post-workshop review, evaluation, and accreditation.
Learning to be an effective peer reviewer means learning to assess the work of others critically in a positive and meaningful way. Positivity towards the work of others is key to collaboration and is also a transferable leadership skill.
Impacts On Research Integrity
Poor-quality, poorly performed, or missing peer review has a massive knock-on effect on research integrity as well as on publishers’ bottom lines. One recent analysis found that the cost of a single article retraction exceeded $700,000. Clearly, publishers need to invest more in training researchers to act as effective, thoughtful, and constructive peer reviewers, but also in building communities and engagement with their pools of quality control assessors.
We feel we have the solution at ReviewerCredits: A pathway for researchers to learn about the process of peer review, what journals need from them, and how they can be constructive and positive when commenting on the work of others. At the same time, peer reviewers are identified, accredited, and badged by the platform based on their skill level and number of successfully completed reviews for journals, with a positive, community-based experience.
Publishers should work with researchers to enhance the peer review process. One way they can do this is by working with ReviewerCredits. We are living through a peer reviewer diversity crisis: By providing training and career development opportunities for researchers, we’ll enhance the pool of verified peer reviewers who are who they say they are; have interest in the journal editor connections on offer; and can provide robust, repeatable reviews for journals.
