Google, Microsoft image searches list nonconsensual deepfake porn

Occurred: May 2024

Google and Microsoft's Bing surfaced nonconsensual deepfake porn in top image search results, alongside links to tools advertising the ability to create such material.

NBC News found that deepfake pornographic images featuring the likenesses of 36 female celebrities were among the first images Google and other top search engines surfaced in searches combining many of the women's names with the word "deepfakes", as well as in searches for general terms such as "deepfake porn" and "fake nudes".

A review of the results found nonconsensual deepfake images and links to deepfake videos in the top Google results for 34 of those searches, and in the top Bing results for 35 of them. More than half of the top results were links to a popular deepfake website or one of its competitors.

The two search engines also returned links to multiple apps and programmes for creating and viewing nonconsensual deepfake porn within the first six results of searches for "fake nudes".

The incident raised questions about the two companies' ability to detect and manage deepfakes. It also prompted commentators to point to the degradation of both search engines and of the information ecosystem more generally.

Operator: NBC News
Developer: Alphabet/Google; Microsoft
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Determine reliability
Technology: 
Issue: Accuracy/reliability; Mis/disinformation
Transparency: Governance