Deloitte Australia fined for AI error-strewn government report
Occurred: September 2025
Page published: October 2025
Deloitte Australia produced an error-strewn report for the Australian government caused by the sloppy and opaque use of generative AI, raising concerns about the company's governance and the credibility of government reports.
Deloitte Australia is partially refunding the Australian government after producing a report filled with errors, many apparently caused by the use of generative AI.
The report, which assessed the Targeted Compliance Framework (TCF) of the country's welfare system, contained fabricated academic references, fake citations, and even a made-up quote from a Federal Court judgment.
Sydney University researcher Chris Rudge alerted the media that the report was “full of fabricated references,” prompting accusations of inappropriate use of AI from politicians, academia and the media.
Despite the errors, the Department of Employment and Workplace Relations (DEWR) confirmed that the substantive recommendations of the report remained unchanged.
Deloitte used AI tools (specifically a large language model based on Azure OpenAI GPT-4o) without adequate safeguards, leading to "hallucinations" in which the AI generated false or misleading information.
Deloitte did not disclose its use of AI in the original report, though it did include it in a revised version.
The company agreed to repay the final instalment of the AUD 440,000 contract as a voluntary penalty. However, some politicians criticised Deloitte's handling of the matter and called for a full refund and better oversight of suppliers' use of AI.
The incident highlights the risks of relying on generative AI in high-stakes analytical work without proper human review and transparency, not least in government, where errors can have broad policy and social implications.
It also erodes the government's trust in consulting firms and underscores the need for rigorous verification when AI is involved.
Deloitte's fumble comes as AI puts pressure on consultancies, calling into question the quality of their advice, their ability to harness AI, and their business model.
The consultancy's woes come at a poor time. The same day its Australian government report mishap was revealed, the company announced a deal with Anthropic to create compliance products and features for regulated industries.
What the impact will be on the consultancy's service quality and on the jobs of its employees remains to be seen.
GPT-4o
Developer: OpenAI
Country: Australia
Sector: Govt - welfare
Purpose: Generate report
Technology: Generative AI; Machine learning
Issue: Accuracy/reliability; Mis/disinformation; Transparency