Feature

Could the Surgisphere Lancet and NEJM retraction debacle happen again?

AI-assisted peer review

A series of investigative reports by The Guardian raised questions about Sapan Desai, the CEO of Surgisphere, including the finding that hospitals said to have contributed data to Surgisphere had never heard of the company.

However, peer reviewers are not expected to be investigative reporters, explained Dr. Malički.

“In an ideal world, editors and peer reviewers would have a chance to look at raw data or would have a certificate from the academic institution the authors are affiliated with that the data have been inspected by the institution, but in the real world, of course, this does not happen,” he said.

Artificial intelligence software is being developed and deployed to assist in the peer review process, Dr. Malički noted. In July 2020, the publisher Frontiers debuted its Artificial Intelligence Review Assistant (AIRA) to help editors, reviewers, and authors evaluate the quality of a manuscript. The program can make up to 20 recommendations, including “the assessment of language quality, the detection of plagiarism, and identification of potential conflicts of interest.” The program is now in use in all 103 journals published by Frontiers. Preliminary software is also available to detect statistical errors.

Another system under development is FAIRware, an initiative of the Research on Research Institute in partnership with the Stanford Center for Biomedical Informatics Research. The partnership’s goal is to “develop an automated online tool (or suite of tools) to help researchers ensure that the datasets they produce are ‘FAIR’ at the point of creation,” said Dr. Malički, referring to the findability, accessibility, interoperability, and reusability (FAIR) guiding principles for data management. The principles aim to increase the ability of machines to automatically find and use the data, as well as to support its reuse by individuals.

He added that these advanced tools cannot replace human reviewers, who will “likely always be a necessary quality check in the process.”

Greater transparency needed

Another limitation of peer review is the reviewers themselves, according to Dr. Malički. “It’s a step in the right direction that The Lancet is now requesting a peer reviewer with expertise in big datasets, but it does not go far enough to increase accountability of peer reviewers,” he said.

Dr. Malički is the co–editor-in-chief of the journal Research Integrity and Peer Review, which has “an open and transparent review process – meaning that we reveal the names of the reviewers to the public and we publish the full review report alongside the paper.” The publication also allows authors to make public the original version of the manuscript they submitted.

Dr. Malički cited several advantages to transparent peer review, particularly the increased accountability that results from placing potential conflicts of interest under the microscope.

As for the concern that identifying the reviewers might soften the review process, “there is little evidence to substantiate that concern,” he added.

Dr. Malički emphasized that making reviews public “is not a problem – people voice strong opinions at conferences and elsewhere. The question remains, who gets to decide if the criticism has been adequately addressed, so that the findings of the study still stand?”

He acknowledged that, “as in politics and on many social platforms, rage, hatred, and personal attacks divert the discussion from the topic at hand, which is why a good moderator is needed.”

A journal editor or a moderator at a scientific conference may be tasked with “stopping all talk not directly related to the topic.”
