After months of resisting, Air Canada was forced to give a partial refund to a grieving passenger who was misled by an airline chatbot inaccurately explaining the airline’s bereavement travel policy.
On the day Jake Moffatt’s grandmother died, Moffatt immediately visited Air Canada’s website to book a flight from Vancouver to Toronto. Unsure of how Air Canada’s bereavement rates worked, Moffatt asked Air Canada’s chatbot to explain.
The chatbot provided inaccurate information, encouraging Moffatt to book a flight immediately and then request a refund within 90 days. In reality, Air Canada’s policy explicitly stated that the airline would not provide refunds for bereavement travel after the flight was booked. Moffatt dutifully attempted to follow the chatbot’s advice and request a refund, but was shocked when Air Canada rejected the request.
Moffatt tried for months to convince Air Canada that a refund was owed, sharing a screenshot from the chatbot that clearly claimed:
If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.
Air Canada argued that because the chatbot response elsewhere linked to a page with the actual bereavement travel policy, Moffatt should have known bereavement rates could not be requested retroactively. Instead of a refund, the best Air Canada would do was to promise to update the chatbot and offer Moffatt a $200 coupon to use on a future flight.
Unhappy with this resolution, Moffatt refused the coupon and filed a small claims complaint in Canada’s Civil Resolution Tribunal.
This story originally appeared on Ars Technica, a trusted source for technology news, tech policy analysis, reviews, and more. Ars is owned by WIRED’s parent company, Condé Nast.
According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot’s misleading information because, Air Canada essentially argued, “the chatbot is a separate legal entity that is responsible for its own actions,” a court order said.
Experts told the Vancouver Sun that Moffatt’s case appeared to be the first time a Canadian company tried to argue that it wasn’t liable for information provided by its chatbot.
Tribunal member Christopher Rivers, who decided the case in favor of Moffatt, called Air Canada’s defense “remarkable.”
“Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives—including a chatbot,” Rivers wrote. “It does not explain why it believes that is the case” or “why the webpage titled ‘Bereavement travel’ was inherently more trustworthy than its chatbot.”
Further, Rivers found that Moffatt had “no reason” to believe that one part of Air Canada’s website would be accurate and another would not.
Air Canada “does not explain why customers should have to double-check information found in one part of its website on another part of its website,” Rivers wrote.
In the end, Rivers ruled that Moffatt was entitled to a partial refund of $650.88 CAD (about $482 USD) off the original fare of $1,640.36 CAD (about $1,216 USD), as well as additional damages to cover interest on the airfare and Moffatt’s tribunal fees.
Air Canada told Ars it will comply with the ruling and considers the matter closed.
When Ars visited Air Canada’s website on Friday, there appeared to be no chatbot support available, suggesting that Air Canada has disabled the chatbot.
Air Canada did not respond to Ars’ request to confirm whether the chatbot is still part of the airline’s online support offerings.
Last March, Air Canada’s chief information officer, Mel Crocker, told the Globe and Mail that the airline had launched the chatbot as an AI “experiment.”
Initially, the chatbot was used to ease the load on Air Canada’s call center during unexpected flight delays or cancellations.
“So during a snowstorm, if a new boarding pass has not yet been issued to you and you simply want to verify whether a seat is available on another flight, that’s the kind of task we can readily manage with AI,” Crocker told the Globe and Mail.
Over time, Crocker said, Air Canada intended for the chatbot to “grow the ability to tackle even more intricate customer service problems,” with the airline’s ultimate goal being to automate every service that did not require a “human touch.”
If Air Canada can use “technology to remedy something that can be automated, we shall proceed to do so,” Crocker said.
Crocker told the Globe and Mail that Air Canada had invested heavily in AI experimentation. The initial outlay for customer service AI technology, he said, was significantly higher than the cost of continuing to hire staff to answer basic inquiries. But Crocker considered the investment worthwhile, saying the airline’s spending on automation and machine learning technology would reduce its costs and, more importantly, deliver a better customer experience.
For at least one customer, however, the chatbot appears to have made the experience more frustrating.
Experts told the Vancouver Sun that Air Canada might have avoided liability in Moffatt’s case if its chatbot had warned customers that the information it provided might not be accurate.
Air Canada apparently failed to take that precaution, and Rivers found that “Air Canada did not take reasonable care to ensure its chatbot was accurate.”
“It should be obvious to Air Canada that it is responsible for all the information on its website,” Rivers wrote. “It makes no difference whether the information comes from a static page or a chatbot.”