Psychologist makes legal submission with fake AI citations
Occurred: July 2024
An Australian psychologist was accused by a judge of using AI to generate false citations in defamation proceedings.
Legal submissions made by appellant Dr Natasha Lakaev during failed defamation proceedings before the Supreme Court of Tasmania cited a case, “Hewitt v Omari [2015] NSWCA 175”, that does not exist.
Justice Alan Michael Blow said the submissions made by Dr Lakaev were “surprising” and may have been the result of AI: “When artificial intelligence is used to generate submissions for use in court proceedings, there is a risk that the submissions that are produced will be affected by a phenomenon known as ‘hallucination’.”
It is unclear exactly how the fabricated citation arose, though Dr Lakaev appears to have represented herself in court and may have drawn on a generative AI system to help prepare her case.
Many lawyers have been caught out using generative AI to prepare court submissions, but this is one of the few known instances in which a self-represented litigant appears to have used ChatGPT or a similar service without checking its accuracy and reliability.
Hallucination (artificial intelligence)
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.
Source: Wikipedia
System: Unknown
Operator: Dr Natasha Lakaev
Developer:
Country: Australia
Sector: Business/professional services
Purpose: Generate text
Technology: Generative AI; Machine learning
Issue: Accuracy/reliability
Page info
Type: Incident
Published: October 2024