Last month, a millipede expert in Denmark received an email notifying him that one of his publications had been mentioned in a new manuscript on Preprints.org. But when the researcher, Henrik Enghoff, downloaded the paper, he learned that it cited his work for something off-topic.
Retraction Watch’s story highlights why researchers shouldn’t treat systems like ChatGPT as research assistants to which tasks can be safely delegated. In this case, the artificial intelligence system invented references that superficially seemed credible but were, in fact, entirely fictitious. The results appeared to match the search criteria the system was given, but they were a digital hallucination. The consequences, as this case shows, can be serious enough to devastate a career, and they should serve as a warning to researchers considering using ChatGPT in their work.
“I’ve never had anything like this happen before,” Enghoff, a professor at the Natural History Museum of Denmark, in Copenhagen, told Retraction Watch.
Flabbergasted, Enghoff reached out to David Richard Nash at the University of Copenhagen. A few months prior, Nash had been experimenting with OpenAI’s ChatGPT, an artificial-intelligence chatbot, to see if it could be used to find scientific literature. He asked the bot to provide him with recent references on the butterfly species he works with. “It came back with 10 plausible-looking papers,” only one of which existed, Nash told Retraction Watch.
After learning of Enghoff’s case, Nash emailed Preprints.org, a free preprint server owned by the scientific publisher MDPI. He explained that he had looked up five random references in the preprint and found that all of them were fictitious. He also hinted that generative artificial intelligence such as ChatGPT could have been at work, adding:
I suggest that you contact the authors directly and ask for an explanation (and hopefully a retraction and apology to the affected “authors” of these fake references), and also review your policies regarding accepting AI-generated textxs [sic] and references.