West Midlands police use fake AI output to ban Israeli fans from attending football match
Occurred: November 2025
Page published: February 2026
The UK's West Midlands Police used false, AI-generated information to justify banning Israeli fans from a high-profile football match, sparking high-level criticism and a political backlash.
The UK's West Midlands Police recommended a total ban on Maccabi Tel Aviv fans attending a UEFA Europa League match against Aston Villa at Villa Park in Birmingham on November 6, 2025. The force's intelligence report classified the event as "high risk," citing a history of violence involving the Israeli club's supporters.
Specifically, the report included details of a "2023 match between West Ham and Maccabi Tel Aviv" where significant disorder allegedly occurred. This match never took place.
Additionally, the report "conflated" and exaggerated accounts of incidents in Amsterdam, claiming that 500-600 fans had thrown innocent people into rivers and that 5,000 police officers had been deployed. Both figures were later debunked by Dutch authorities and independent inspectors.
The ban remained in place despite criticism from UK Prime Minister Keir Starmer. The match was played without the away fans, and a subsequent investigation revealed that parts of the "intelligence" had been fabricated by an AI tool.
The core error was that inaccurate, fabricated information generated by Microsoft Copilot was treated as credible intelligence without adequate verification.
Officers, including the force's chief constable, initially misled Parliament about the use of AI, indicating a lack of transparency and oversight over how such technologies are integrated into official processes.
Independent reviews cited confirmation bias: police selected intelligence that supported a predetermined decision to recommend a ban rather than weighing all available evidence objectively.
For football fans: Israeli fans were denied the right to support their team on the basis of fabricated information, while the local Jewish community felt alienated by the force's initial (and false) claim that the community had supported the ban.
For policing: The force faced criticism from MPs, and the UK Home Secretary publicly declared a loss of confidence in its leadership, leading to the resignation of Chief Constable Craig Guildford. The episode also resulted in the suspension of Microsoft Copilot use across the force, and it stands as a landmark case of how "shadow AI" (unauthorised use of AI by staff) can lead to legal and human rights violations.
For society: This event highlights the risk of "automation bias", where authorities place more trust in machine-generated output than in human due diligence. It sparked a national debate in the UK over the need for strict statutory guardrails on AI in public safety and the justice system, to prevent digital misinformation from becoming state-sanctioned discrimination.
Developer: Microsoft
Country: UK
Sector: Govt - police; Media/entertainment/sports/arts
Purpose: Assess violence risk
Technology: Generative AI
Issue: Accountability; Accuracy/reliability; Automation bias; Bias/discrimination; Mis/disinformation; Transparency
AIAAIC Repository ID: AIAAIC2182