Air Canada found liable for chatbot's poor advice

Occurred: 2022-2024


Air Canada was forced to pay damages after a court ruled the airline was liable for the wrong information its chatbot gave a customer before he booked a flight.

Following the death of his grandmother, Air Canada's chatbot told Jake Moffatt that if he purchased a normal-price ticket he would have up to 90 days to claim back a bereavement discount - a special low rate for people traveling due to the loss of an immediate family member.

Moffatt took the airline to a small-claims tribunal for negligence after it refused to honour the discount, even though he had submitted the correct documents within the 90-day window. Air Canada argued it should not be held liable for its chatbot's faulty outputs.

In his ruling, tribunal member Christopher Rivers said Air Canada had failed to take 'reasonable care to ensure its chatbot was accurate.' He also said that 'It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.'

The incident was seen as a reminder that companies need to be aware of the risks of using AI, including the legal risks.


Operator: Air Canada
Developer: Air Canada
Country: Canada
Sector: Travel/hospitality
Purpose: Support customers
Technology: Chatbot
Issue: Accuracy/reliability; Liability
Transparency: Governance