Occurred: August 2023
Danish biologist and academic Henrik Enghoff was falsely cited by ChatGPT in a scientific paper about millipedes.
The citation resulted in the paper's withdrawal and raised further questions about the generative AI tool's tendency to 'hallucinate', or produce plausible-sounding falsities.
Enghoff first noticed something strange when he saw that the paper, written by academics from Ethiopia and China, cited his work on a topic he does not write about and referenced two papers he knew he had not authored, which turned out not to exist.
The paper was first taken down in June 2023 by the preprint archive Preprints.org, after David Richard Nash, a University of Copenhagen colleague of Enghoff's, identified ChatGPT as the likely culprit and notified the editors of the errors.
The paper subsequently resurfaced on the preprint platform Research Square, which later withdrew it and blacklisted the 'authors'.
In July 2023, Kahsay Tadesse Mawcha of Ethiopia's Aksum University admitted to the Danish newspaper Weekendavisen that he had used ChatGPT when writing the paper, adding that he only later realised the tool was 'not recommended' for the task.
Hallucination (artificial intelligence)
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.
Source: Wikipedia
Kahsay Tadesse Mawcha (2023). From Beneficial Arthropods to Soil-Dwelling Organisms: A Review on Millipedes in Africa
Kahsay Tadesse Mawcha (2023). From Beneficial Arthropods to Soil-Dwelling Organisms: A Review on Millipedes in Africa - v2 (pdf)
Page info
Type: Incident
Published: September 2023
Last updated: November 2023