After exploring how AI supports editorial workflows (Fabio Di Bello) and how it performs in practice (Lucia Steele), this final post broadens the lens. Sven Fund, Managing Director at Reviewer Credits, tackled the systemic issues plaguing peer review: reviewer fatigue, uneven recognition, and broken incentives. He explored how AI could be part of the solution, not just for publishers but for the people who make peer review work: the reviewers themselves.
Flipping the Frame: From Publisher Tools to Reviewer Support
“If you asked a taxi driver whether they support autonomous vehicles, what would they say?” Sven opened with this analogy to make a simple point: whether reviewers adopt AI depends on whether it’s designed to help them, not replace them.
Sven’s argument was compelling: peer review doesn’t just need technical optimization; it needs a cultural shift. And that means shifting our perspective from what AI can do for publishers to what it can do for reviewers.
The Mess We’re In: Complexity, Clutter, and Cracks
The current peer review system, as Sven framed it, suffers from five overlapping problems:
- Volume – Rising submission rates outpace reviewer capacity.
- Inappropriate Workload – Editors and reviewers spend time on tasks better handled by machines.
- Broken Workflows – Clunky platforms and inconsistent processes slow everything down.
- Sophisticated Fraud – From papermills to image manipulation, bad actors are evolving.
- Recognition Deficit – Reviewers are invisible, under-credited, and burning out.
In short, peer review is under pressure, and not just logistically. Trust, fairness, and sustainability are on the line.
AI as Force Multiplier, Not Replacement
Sven echoed previous speakers in rejecting AI replacement fantasies. Instead, he sees AI as a force multiplier: a way to handle the scale and complexity of modern peer review without crushing the people who make it work.
Key use cases include:
- Pre-screening for misconduct: IOPP, for example, rejected 8.5% of manuscripts before peer review in 2024 based on ethical concerns flagged by AI tools.
- Reviewer assistance: AI can suggest reviewers, check citations for relevance, detect fraud, and even help reviewers structure their reports.
- Workflow optimization: AI can untangle the “spaghetti code” of publishing systems, creating smoother, faster editorial experiences.
But the goal, Sven emphasized, should be more than speed. It should be to make peer review more rewarding, meaningful, and manageable for those doing the work.
What Do Reviewers Actually Want?
Reviewer Credits surveyed researchers about their attitudes toward AI in peer review:
- 57% supported AI as an assistant to expert reviewers.
- 14% preferred using AI tools before peer review to catch basic issues.
- 29% rejected AI involvement altogether.
These results echoed a broader trend seen in a Nature survey: most researchers haven’t tried AI tools in peer review, but many are open to doing so. The tipping point, Sven suggested, will come with better tools, better training, and better transparency.
The Reviewer Is the Resource
Sven made a powerful argument: reviewers, not just editors or platforms, are the true scarce resource in peer review. And AI offers a chance to finally respect their time and effort:
- Standardized, AI-supported templates could reduce hours of redundant work.
- Citation checking tools could eliminate the need to manually verify 50 references.
- Automated formatting checks could free reviewers to focus on scientific critique.
- Recognition systems, like Reviewer Credits, can finally give reviewers tangible, verifiable credit for their labor.
He proposed a provocative benchmark: if AI could cut peer review time in half, it would transform the system. Reviewers would reclaim time for research, authors would get faster feedback, and the entire ecosystem would be more humane and efficient.
Tool Proliferation—and the Need for Curation
Sven shared a curated list of AI peer review tools (from Reviewer Credits’ internal research), highlighting the explosion of point solutions in this space. But Sven raised a warning flag: without shared standards or transparency, the ecosystem risks fragmentation. He called for more interoperability, clearer evaluation frameworks, and even suggested a “hackathon” where tools could be tested side-by-side using the same manuscript.
Don’t Just Fix the Tech, Fix the Culture
Beyond tools, Sven urged the audience to rethink the values embedded in peer review. We shouldn’t just ask: How can AI improve speed? We should ask:
- Does this help reviewers feel respected?
- Does this reduce cognitive load and burnout?
- Does it make peer review fairer and more transparent?
He advocated for KPIs that center the reviewer: time-to-completion, satisfaction, efficiency gains—not just time-to-publication or editorial throughput. If we want peer review to survive and evolve, we must build it around the people doing the work.
Post 3 Afterword: A Reviewer-First Future
Sven Fund’s presentation closed the series with a shift in mindset. While publishers and editors are critical to peer review’s success, the system ultimately rests on the goodwill, expertise, and time of its reviewers. AI offers an opportunity, not just to make peer review faster, but to make it fairer, more humane, and more sustainable.
The future of peer review isn’t fully automated. It’s augmented. It’s collaborative. And it begins by putting reviewers first.
Series Conclusion: Toward Responsible, Reviewer-Centric AI in Peer Review
From Fabio Di Bello’s workflow deep dive, to Lucia Steele’s real-world experiments, to Sven Fund’s systemic reframing, one thing is clear: AI is here to stay in peer review. But responsible adoption requires more than clever tools; it requires clarity, accountability, and a deep respect for the human labor at the heart of the system.
As this series has shown, the path forward is not one of replacement, but of augmentation. The most promising uses of AI in peer review support editors, empower reviewers, and protect the integrity of science.
Stay curious. Stay ethical. And as always—credit your reviewers.
– By Tony Alves