Cosmos Magazine AI-generated science explainers prompt backlash
Occurred: August 2024
Australian science publication Cosmos Magazine faced a strong backlash for its decision to publish AI-generated articles.
Cosmos used a Walkley Foundation grant to develop a custom AI service based on OpenAI's GPT-4 large language model that generates explainer articles for the magazine's website.
However, the resulting articles contained inaccuracies and oversimplified scientific information. In one instance, an article titled "What happens to our bodies after death?" incorrectly described rigor mortis and autolysis.
The inaccuracies raised concerns about potential harm to public trust in the publication. Contributors, former editors, and co-founders of Cosmos complained that they had not been consulted about the project and argued that it undermined the role of journalists.
It also transpired that Cosmos' AI service had been trained on content from contributors who were neither informed of the project nor asked for their consent. As freelancers, they are likely to have retained copyright over their work.
Cosmos responded that its AI-generated content is fact-checked by a trained science communicator and edited by its publishing team, and that it plans to keep reviewing the use of the AI service throughout the experiment, which is scheduled to run until February 2025.
The controversy sparked a broader debate about the use of AI in journalism, with critics arguing that such experiments should be conducted with transparency to maintain trust in scientific reporting.
System 🤖
Unknown
Operator: Cosmos
Developer: Cosmos
Country: Australia
Sector: Media/entertainment/sports/arts; Technology
Purpose: Generate articles