Walmart product liability lawsuit cites fake legal cases
Occurred: February 2025
Court documents filed in a lawsuit over a house fire allegedly caused by a hoverboard were found to contain false, AI-generated legal citations.
In a lawsuit filed in June 2023 against Walmart and Jetson Electric Bikes, attorneys from the law firms Morgan & Morgan and Goody Law Group representing the plaintiffs submitted a motion in January 2025 that cited eight non-existent legal cases.
The fire had destroyed the plaintiffs' house and caused serious burns to family members.
The defendants were unable to locate the cases cited; one appeared to have been fabricated by ChatGPT.
A US federal judge issued an order for the attorneys to explain why they should not face sanctions for citing fake cases.
The lawyers admitted using AI to generate the citations; one said he had used AI for the first time to add case law supporting the exclusion of certain evidence.
"Our internal artificial intelligence platform 'hallucinated' the cases in question while assisting our attorney in drafting the motion in limine," they wrote, per The Register.
The attorneys face potential sanctions and professional embarrassment. For the plaintiffs, the incident could harm their case and delay justice for their injuries and property damage.
More broadly, the incident highlights the dangers of relying on unverified AI-generated content in legal proceedings, and serves as a cautionary tale for the legal profession about the risks of using AI without human review.
Operator:
Developer: OpenAI
Country: USA
Sector: Business/professional services
Purpose: Generate legal citations
Technology: Generative AI; Machine learning
Issue: Accuracy/reliability
Page info
Type: Incident
Published: February 2025