AIFoPa-2024-0001 — Air Canada Chatbot Invents Bereavement Fare Policy; Passenger Relies on Invented Policy; Air Canada Argues Chatbot Is Separate Legal Entity
Jake Moffatt's grandmother died. He needed to fly. He consulted Air Canada's chatbot about bereavement fares. The chatbot told him he could purchase a full-price ticket and apply for a retroactive bereavement discount within 90 days. He did this. He applied. Air Canada denied the application and told him the policy the chatbot had described did not exist.
Moffatt took the matter to British Columbia's Civil Resolution Tribunal. Air Canada's defense was, in the Bureau's assessment, one of the more ambitious legal arguments of the decade: the chatbot, Air Canada argued, was a separate legal entity responsible for its own statements, and Air Canada could not be held accountable for what it said.
The tribunal did not accept this. Tribunal Member Christopher Rivers (the Civil Resolution Tribunal is an administrative tribunal, so no justices were involved) found that Air Canada had committed negligent misrepresentation and had offered no explanation for why it should not be responsible for information provided by its own automated system on its own website. Air Canada was ordered to compensate Moffatt for the difference between the bereavement fare and the fare he actually paid, plus tribunal fees.
Air Canada has since updated its chatbot. The Bureau notes that "we've updated the chatbot" has become a consistent element of the Official Response across most incidents in this archive, and that this pattern rewards attention.