Haringey Council homeless application cites fake law cases
Occurred: August 2024-
A High Court judge condemned the legal team representing a homeless claimant against Haringey Council after they submitted five fictitious, AI-generated legal cases in court documents, leading to findings of professional misconduct and regulatory referrals.
In a judicial review case where a homeless man (Mr Ayinde) challenged Haringey Council's refusal to provide him with accommodation, his legal team - comprising barrister Sarah Forey and solicitors at Haringey Law Centre - submitted court documents citing five non-existent legal cases, including a purported Court of Appeal authority.
When challenged, the lawyers failed to produce copies of these cases and initially dismissed the issue as “minor citation errors” or “cosmetic errors”.
The judge found this conduct to be improper, unreasonable, and negligent, qualifying as professional misconduct. The court ordered both the barrister and solicitors to be referred to their regulators and to pay wasted costs.
The incident undermined the integrity of legal proceedings and risked misleading the court, potentially impacting the fairness of the process for all parties involved.
The judge found that Ms Forey had deliberately included the fake cases in the pleadings, apparently indifferent to whether they existed, and had drawn them from an unidentified source that was neither a recognised law report nor a legitimate legal database.
Haringey's counsel suggested that AI tools might have been misused to generate the fake citations, but the court could not make a definitive finding on this point for lack of direct evidence.
Regardless of the source, the legal team failed in their duty to verify the authenticity of the authorities cited, and their subsequent attempts to minimise the seriousness of the issue aggravated their professional failings.
The case ultimately resulted in a settlement and provision of accommodation, but the legal team’s misconduct risked undermining the claimant's case and could have jeopardised his access to justice.
For the lawyers involved, the incident has led to regulatory referrals and reputational damage, with potential disciplinary consequences.
More widely, this kind of conduct erodes trust in the legal profession and the judicial system. It highlights the dangers of unverified legal research - especially as AI tools become more widely used - and reinforces the need for rigorous professional standards to safeguard the integrity of court proceedings.
Operator:
Developer: OpenAI
Country: UK
Sector: Business/professional services; Govt - municipal
Purpose: Cite law cases
Technology: Generative AI; Machine learning
Issue: Accountability; Accuracy/reliability; Authenticity; Mis/disinformation; Transparency
Page info
Type: Incident
Published: May 2025