Posts tagged generative AI
Manuscripts and Metadata: Peer Review in the Digital Age

Contrary to the common caricature, the academic world has long been defined more by tech than by tweed. From the typewriting of manuscripts in the 1960s to the launch of the first online humanities journal, Postmodern Culture, in 1990, and the founding of JSTOR in 1994, scholarship has consistently embraced contemporary innovations in ways that enhance its rigour and reach.

Read More
A Technology Wishlist from the Sage Research Integrity Team

Interest in research integrity and ethics in academic publishing has skyrocketed in recent years. With the rise of paper mills and AI, and a news story almost every week about data manipulation scandals, it's no surprise that the academic community is increasingly focused on these issues. This newfound attention has led to a rise in research integrity startups: companies creating research integrity tools in the hope of combating these modern threats.

Read More
False Positives and Red Flags: Navigating AI in Peer Review

Since the launch of ChatGPT in 2022, educators, authors, editors, and reviewers have worried about the use of Large Language Models (LLMs) in writing research papers. An influx of "AI detectors" appeared online seemingly overnight, claiming to identify content not written by an actual human. These new tools sparked a wave of emails from reviewers and authors asserting, based on the detectors' results, that the paper or review in front of them was written by AI.

Read More
Three Laws of ChatGPT

Much has been written in the short time since ChatGPT's launch about the development of generative AI tools and the use of ChatGPT in teaching, research, and academic publishing. It has also sparked many philosophical conversations within the Research Integrity Group at SAGE: about the ethics of using generative AI to 'write' (in quote marks) articles, about the potential risks of publishing articles not written (not in quote marks) by a human, and about whether bots qualify as authors.

Read More