ChatGPT invents legal citations in Avianca court case

Occurred: May 2023


An experienced lawyer who used ChatGPT to conduct legal research for a lawsuit against Colombian airline Avianca was assured by the chatbot that the six legal cases it cited were real.

According to the New York Times, Avianca customer Roberto Mata sued the airline after a serving cart injured his knee during a flight, only for his lawyer, Steven Schwartz of Levidow, Levidow & Oberman, to use ChatGPT to 'supplement his own findings'.

The bot returned six legal cases, including 'Varghese v. China Southern Airlines Co., Ltd', all of which it claimed were real but which turned out to be fabricated.

Schwartz later said he was 'unaware of the possibility that [ChatGPT's] content could be false.' The judge ordered (pdf) another hearing to 'discuss potential sanctions' for Schwartz in response to this 'unprecedented circumstance', and subsequently fined the two lawyers USD 5,000.

The incident raised questions about the accuracy and marketing of the OpenAI system, and was seen to make the lawyer and his employers appear unprofessional and out of touch.

Hallucination (artificial intelligence)

In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.

Source: Wikipedia

System

Operator: OpenAI; Levidow, Levidow & Oberman

Developer: OpenAI

Country: USA

Sector: Business/professional services

Purpose: Provide information, communicate

Technology: Chatbot; Generative AI; Machine learning

Issue: Accuracy/reliability; Anthropomorphism; Mis/disinformation

Transparency: Marketing

Page info
Type: Incident
Published: May 2023
Last updated: November 2023