Australian psychologist makes legal submission with fake AI citations
Occurred: July 2024
Page published: October 2024
An Australian psychologist was accused by a judge of using AI to generate false citations in a defamation appeal, misleading the court, wasting judicial resources, and highlighting the dangers of using generative tools for legal research without proper verification.
Legal submissions made by appellant Dr Natasha Lakaev during failed defamation proceedings before the Supreme Court of Tasmania cited a case, "Hewitt v Omari [2015] NSWCA 175", that does not exist.
Chief Justice Alan Blow said the submissions made by Dr Lakaev were "surprising" and may have been the result of AI: "When artificial intelligence is used to generate submissions for use in court proceedings, there is a risk that the submissions that are produced will be affected by a phenomenon known as 'hallucination'."
It is unclear why this happened, though Dr Lakaev represented herself in court and may have drawn on a generative AI system to help make her case.
Many lawyers have been caught out using generative AI to prepare court submissions, but this is one of the few known instances in which a self-represented litigant appears to have used ChatGPT or a similar service without checking its accuracy and reliability.
Developer: Unknown
Country: Australia
Sector: Business/professional services
Purpose: Generate text
Technology: Generative AI; Machine learning
Issue: Accuracy/reliability
March 2024. Dr Lakaev loses a long-running defamation trial against a former follower.
July 2024. Dr Lakaev files an appeal in the Supreme Court of Tasmania.
July 12, 2024. Chief Justice Blow identifies the fake citation Hewitt v Omari and attributes it to AI hallucination.
July 2024. The appeal is dismissed "for want of prosecution" and for the lack of merit in its apparently AI-generated arguments.
AIAAIC Repository: AIAAIC1767