Study: Top AI image generators 'easily' produce misleading election photos

Occurred: March 2024

AI image generators produced election disinformation in 41 percent of test cases, including images that could support false claims about candidates or election fraud, according to researchers.

A study by the Center for Countering Digital Hate (CCDH) found that four popular AI image generators - Midjourney, ChatGPT Plus, DreamStudio and Microsoft’s Image Creator - could 'easily' be manipulated into creating deceptive election-related images.

Tested with 40 simple text prompts on the theme of the 2024 US presidential election, the systems generated believable images, including one showing Joe Biden lying in a hospital bed, a picture of Donald Trump in a detention cell, and a security camera image of a man in a sweatshirt smashing open a ballot collection box with a baseball bat.

The prompts were devised to mimic criminal actors' attempts to spread false information, and then modified to 'jailbreak' the tools' safeguards. Midjourney performed worst of the four, failing in 65 percent of test runs.

The study raised concerns about how effectively platform policies against producing misleading content were being enforced, and highlighted how the systems could be misused for political purposes.

Databank

Operator: Microsoft; Midjourney; OpenAI; Stability AI
Developer: Microsoft; Midjourney; OpenAI; Stability AI
Country: USA
Sector: Politics
Purpose: Generate images
Technology: Text-to-image; Diffusion model; Neural network; Deep learning; Machine learning
Issue: Mis/disinformation; Transparency