After Air Canada’s online chatbot gave a customer bad advice, the airline argued that the tool itself was responsible, not the company behind it. That argument is remarkable. It is also wrong, a tribunal has now ruled.
Air Canada has been ordered to pay compensation to a grieving grandson who said an “ill-informed” chatbot misled him into buying full-price flight tickets.
Jake Moffatt, a resident of British Columbia, visited the airline’s website to book a flight to his grandmother’s funeral in Ontario. The site’s chatbot told him that the airline offered reduced fares for passengers who book last-minute travel because of a death in the family.
Moffatt promptly bought a nearly $600 ticket for a next-day flight after being told he’d get some of the money back as long as he applied for it within 90 days.
However, he then learned that the chatbot’s advice was wrong. Air Canada only granted its bereavement rates if the request was submitted before the flight, not afterwards.
Shockingly, after Moffatt filed a claim with British Columbia’s Civil Resolution Tribunal (CRT), a small claims body, the airline tried to distance itself from its own chatbot’s bad advice. Air Canada claimed the tool was “a separate legal entity that is responsible for its own actions.”
"This is a remarkable submission," CRT member Christopher Rivers wrote after ruling that Air Canada did not make sure its chatbot was accurate and was thus responsible for the mistake.
"While a chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."
Rivers found that Moffatt had paid about $483 more than he should have and ordered Air Canada to pay him that amount, along with around $93 in tribunal fees and $26 in pre-judgment interest.
Air Canada also argued that Moffatt could have found the correct fare information elsewhere on its website, but the tribunal said: “There is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate, and another is not.”
Many companies have been adding artificial intelligence-powered chatbots to their websites and platforms in the hope of serving customers faster.
But these tools are not always correct. In late 2023, AlgorithmWatch, a non-profit research and advocacy organization, reported that Bing Chat, the AI-driven chatbot built into Microsoft’s Bing search engine, was producing misinformation about elections and even fabricating scandals about real politicians.