Chatbot’s Mistake Costs Air Canada Hundreds Of Dollars In Compensation

Air Canada’s AI chatbot misrepresented the company’s bereavement discount policy to a customer, and the airline has now been ordered to pay hundreds of dollars in compensation as a result.

The incident dates back to November 2022, when British Columbia resident Jake Moffatt logged onto the airline’s online portal to book a ticket to Ontario for his grandmother’s funeral.

The customer-service chatbot told him about a bereavement discount, stating that if he applied within the next 90 days, he would be refunded part of the fare for both his outbound and return tickets.

However, that information was wrong. Under Air Canada’s actual policy, a customer qualifies for a bereavement discount only by applying in advance; the airline does not issue refunds for travel that has already taken place. Citing this policy, Air Canada refused to give Moffatt the discount.

After Moffatt took the matter to the British Columbia Civil Resolution Tribunal (CRT), the tribunal ordered the airline to pay about $600 in bereavement refunds, damages, and tribunal costs, roughly half of what Moffatt had paid for the tickets. The judgment was handed down on 14 February 2024, giving Air Canada 14 days to comply.

About The Case: What Does Air Canada Have To Say About This Decision?

Air Canada was clearly not happy about the decision. According to the tribunal, the airline tried to suggest that it “cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot.”

Its representatives also argued that the airline had laid out all the accurate information about the policy on its website, and that if Moffatt had been even slightly diligent, he could easily have located it there.

However, the tribunal members said that the airline as a whole was responsible for serving all of its customers, including Moffatt, and that it was also the airline’s responsibility to train its chatbot to relay accurate information to customers.

Put simply, the judgment found that Air Canada had not taken the necessary steps to ensure its chatbot was equipped with accurate information.

Even from a plain common-sense perspective, it is easy to conclude who was in the right and who was in the wrong here.

However, Air Canada was in no mood to drop the matter and accept what can fairly be classified as a genuine mistake on its part. The airline went so far down the rabbit hole that it argued the chatbot was a “separate legal entity” and should therefore be independently accountable for its own actions.

As you may have already guessed, tribunal member Christopher Rivers rejected this plea. Consider the absurdity of Air Canada’s position for a minute: whom did it expect to pay for the error, the company that coded the chatbot for the airline?

After much discussion, and a couple of unsatisfactory answers to the tribunal, Air Canada finally agreed to comply and compensate Moffatt.

The Rise of Misinformation With AI

This case has once again sparked debate over how reliable AI chatbots really are, especially with Google recently rebranding its Bard chatbot as Gemini and Nvidia launching Chat with RTX.

Avivah Litan, a vice president analyst at Gartner, said that companies using AI chatbots absolutely need to invest in training and monitoring them. Otherwise, any profit from productivity gains will be lost in legal fees and compensation, and Air Canada serves as the perfect example of this.

Litan also said that generative AI is prone to hallucinations, with about 30% of its answers being made up. Given that, chatbots cannot be left on their own to handle customers.

While AI chatbots are fairly popular, it is important to understand that they can reliably handle only a limited range of tasks. As of now, it makes little sense for businesses to give a chatbot the same responsibility as a human agent, especially in the airline industry, where even the smallest mishap can have serious consequences.
