Company claimed its chatbot ‘was responsible for its own actions’ when giving wrong information about bereavement fare
Canada’s largest airline has been ordered to pay compensation after its chatbot gave a customer inaccurate information, misleading him into buying a full-price ticket.
Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was “responsible for its own actions”.
Amid a broader push by companies to automate services, the case – the first of its kind in Canada – raises questions about the level of oversight companies have over the chat tools.
In 2022, Jake Moffatt contacted Air Canada to determine which documents were needed to qualify for a bereavement fare, and if refunds could be granted retroactively.
According to Moffatt’s screenshot of a conversation with the chatbot, the British Columbia resident was told he could apply for the refund “within 90 days of the date your ticket was issued” by completing an online form.
When Moffatt later applied, Air Canada refused the refund, saying bereavement rates could not be claimed retroactively. Moffatt then sued for the fare difference, prompting Air Canada to issue what the tribunal member Christopher Rivers called a “remarkable submission” in its defense.
Air Canada argued that, despite the error, the chatbot was a “separate legal entity” and was therefore responsible for its own actions.
While Air Canada argued correct information was available on its website, Rivers said the company did “not explain why the webpage titled ‘Bereavement Travel’ was inherently more trustworthy” than its chatbot.