DOGE uses faulty AI to cut Veterans Affairs contracts
Occurred: June 2025
Page published: September 2025
The US Department of Government Efficiency (DOGE) developed a fundamentally flawed AI tool to identify and cut supposedly non-essential contracts at the Department of Veterans Affairs (VA), raising serious concerns about transparency and accountability, and about its impact on veterans' services.
An AI-powered tool created by DOGE to review nearly 90,000 VA contracts at speed and identify which could be canceled made critical mistakes, such as misreading contract values - inflating some worth tens of thousands of dollars to USD 34 million.
Created under intense time pressure in response to a presidential order, the tool was developed by Department of Government Efficiency staffer Sahil Lavingia to identify contracts not "directly supporting patient care".
Contracts identified as unneeded were labeled as "munchable."
Non-profit news outlet ProPublica found at least 24 flagged contracts had been canceled using the tool, including contracts to maintain a gene sequencing device used to develop better cancer treatments and to analyze blood samples in support of a VA research project.
The system used outdated models, reviewed only the first 2,500 words of each contract, and had inadequate expert oversight, resulting in glaring, avoidable errors.
Notably, Lavingia’s prompts also failed to include any context about how the VA operates, what kinds of contracts are essential or which ones are required by US federal law, resulting in essential services being mistakenly flagged for termination.
Lavingia, who had no medical or government experience, acknowledged the system produced errors, but said they were later corrected by VA staff. He was later fired for giving an interview to Fast Company magazine about his work with DOGE.
Independent experts agreed AI was the wrong technology for the job.
The flaws in Lavingia's AI system raised serious concerns about its impact on VA healthcare operations and on veterans' health.
The incident was also seen to highlight the dangers of deploying unvetted AI in sensitive government functions without clear governance and safeguards.
Two US senators called for investigations into the use of AI in the VA's contract review process to ensure better accountability and prevent further harm.
The lawmakers argued that the use of AI to identify contracts for termination “adds an entire new level of unease connected to the decision-making, security, governance, and quality control of the entire process.”
MUNCHABLE
Developer: Department of Government Efficiency
Country: USA
Sector: Government
Purpose: Identify unneeded contracts
Technology: Large language model; Machine learning
Issue: Accountability; Accuracy/reliability; Transparency
Incident no: AIAAIC2020