LEAP AI invents Melbourne family court legal case citations
Occurred: July 2024
Page published: October 2024
An Australian lawyer was referred to a legal complaints authority after AI legal software he had used generated fictitious case citations that were then presented during a family court case.
During a hearing on July 19, 2024, a Melbourne-based lawyer submitted a list of purported case precedents that neither the judge nor her staff could verify.
Upon investigation, it was revealed that the citations had been fabricated by LEAP's LawY AI tool and that the lawyer had not confirmed their accuracy before presenting them in court, leading to a postponement of the hearing and the lawyer's referral to the legal complaints authority.
The lawyer admitted to using LEAP's AI capabilities without understanding the software's propensity to hallucinate.
He ultimately issued an apology and offered to cover costs incurred by the other party due to the adjournment.
The incident raises concerns about the accuracy and reliability of LEAP's generative AI software, and highlights the importance of verifying AI-generated information.
It is part of a broader trend in which multiple lawyers have faced repercussions for relying on unreliable generative AI systems without properly checking their outputs, raising questions about the ethical implications of using AI in legal practice.
Similar cases have emerged globally, including instances in Canada and the UK where legal professionals cited fabricated cases generated by tools like ChatGPT.
This is yet another example of a lawyer either failing to understand that generative AI systems are prone to producing inaccurate and unreliable information, or understanding those limitations but being unwilling to take the time to check the system's outputs.
Lawyers are often handed ethics mandates on the strength of their supposed understanding of legal and ethical questions - an assumption that instances such as this call into question.
More broadly, perhaps lawyers using generative AI to cut corners should consider cutting their fees.
Hallucination (artificial intelligence)
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.
Source: Wikipedia
LawY
Developer: LEAP
Country: Australia
Sector: Business/professional services
Purpose: Provide legal answers
Technology: Generative AI; Machine learning
Issue: Accuracy/reliability