Tribunal Orders Air Canada to Pay Vancouver Man $483 Over Chatbot's Promise

Vancouver, Canada – A Canadian tribunal ruled Wednesday that Air Canada must pay a Vancouver man a partial refund for his flight ticket, in what could be a landmark case for the use of artificial intelligence in business.

Jake Moffatt had asked the airline’s artificial intelligence support chatbot whether the airline offered bereavement fares following the death of his grandmother. The chatbot told him the airline did offer discounted fares and that Moffatt could claim the discount up to 90 days after flying by submitting a request. However, the airline’s actual bereavement policy does not allow refunds after travel, and specifically states that the discount must be approved beforehand.

Moffatt booked and flew from Vancouver to Toronto and back for about $1,200, and later requested the promised discount of roughly half off, but was told by the airline’s support staff that the chatbot’s replies were incorrect and non-binding. Air Canada argued before the civil tribunal that the chatbot is a “separate legal entity” from the company, and that the airline could not be held responsible for what it told customers.

However, tribunal member Christopher Rivers ruled in favor of Moffatt on Wednesday, determining that the airline committed “negligent misrepresentation” and must honor the chatbot’s promised discount. Rivers ordered Air Canada to pay Moffatt the promised $483 refund plus interest and fees.

“I find Air Canada did not take reasonable care to ensure its chatbot was accurate,” Rivers wrote. “While Air Canada argues Mr. Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled ‘Bereavement travel’ was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website.”

The ruling sets a precedent for businesses using AI in customer service, underscoring that information provided by AI chatbots must be accurate and reliable. Companies may need to ensure their AI systems give customers correct information, and may be held accountable for any misinformation or misleading statements those systems produce. The impact of the decision could extend to other industries and jurisdictions as businesses increasingly rely on AI for customer interactions.