A tribunal in Canada has ruled that consumers who receive misleading information from AI programs can be entitled to compensation.
The Civil Resolution Tribunal of British Columbia made the ruling after a Canadian flyer who was promised a bereavement discount by an AI chatbot won a case against Air Canada.
AI chatbot blunders
The tribunal heard that in November 2022, Jake Moffat booked a flight with Air Canada, intending to travel from Vancouver to Toronto to attend his grandmother’s funeral.
While researching flights, he consulted an AI chatbot on the airline’s website, asking whether the airline would consider a compassionate fare significantly lower than regular airfares.
The chatbot informed Moffat that he could buy a normal ticket and then seek a “partial refund” under Air Canada’s bereavement policy within three months, according to the civil resolution issued by the tribunal in February. He took the advice and bought a full-price ticket.
But when Moffat pursued a refund totalling CA$880, Air Canada staff told him the chatbot had given him “misleading” information and that he was not eligible for a refund.
Moffat took the matter to the tribunal, seeking redress. He relied on screenshots of his conversation with the chatbot, which, according to the resolution, read as follows:
“Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family,” the chatbot said.
“If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”
In its defence, the airline said that though the AI chatbot misled the customer, Moffat “did not follow proper procedure to request bereavement fares and cannot claim them retroactively.”
The airline even tried to dissociate itself from the AI, arguing that the chatbot was a separate legal entity responsible for its own actions. Air Canada said it “cannot be held liable for information provided by one of its agents, servants, or representatives, including a chatbot.”
‘Big red flag’
Air Canada provides certain accommodations, such as reduced fares, for passengers travelling due to the death of an immediate family member, according to its website.
However, it says, in part, that the bereavement policy doesn’t apply to requests for refunds made after travel.
Despite the airline’s defence, tribunal member Christopher C. Rivers said the company was responsible for ensuring that all information on its website was accurate and useful to the customer. The tribunal ruled that Air Canada owes a duty of care to users of its chatbot.
Moffat was awarded a partial refund of CA$650 (about US$480), in addition to damages, interest, and tribunal fees.
As more companies adopt AI in customer service, legal challenges have also started to mount, not least the lawsuit against ChatGPT creator OpenAI for allegedly using copyrighted material like books to train its popular conversational AI chatbot.
Jesse Brody, a partner at New York law firm Manatt, says the Moffat v. Air Canada case shows that companies can be held accountable for inaccuracies produced by their AI, even if the software is provided by a third-party vendor.
“This case highlights the increasing accountability companies will face for the actions of their AI systems, especially those that are consumer-facing,” Brody said, as reported by JD Supra.
“The fact that a chatbot was able to entirely fabricate a company’s refund policy should be a big red flag for companies that are relying on chatbots to provide accurate information about the company’s policies to their customers,” he added.