On February 14, 2024, British Columbia's Civil Resolution Tribunal (the Tribunal) found that Air Canada was on the hook for damages arising from its chatbot providing misinformation to a customer. While small claims decisions do not typically garner much attention, this case has been making headlines across North America as an example of technology gone awry. It confirms that a company can be found liable for negligent misrepresentations made by a chatbot on its website.

This case should be viewed as a warning to companies that deploy chatbots as their first line of customer support, and as an early indication of the implications that generative artificial intelligence (AI) may have in this context.

Facts

On November 11, 2022, the applicant Jake Moffatt visited the respondent Air Canada's website to book a flight to Ontario to attend the funeral of his grandmother, who had passed away that same day. While on Air Canada's website, Moffatt engaged with the website's chatbot, which informed him about Air Canada's bereavement fares. Specifically, the chatbot stated: "If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form".

Notably, the chatbot also provided Moffatt with a link to Air Canada's webpage titled "Bereavement travel", which states that the bereavement policy does not apply to requests for bereavement consideration after travel is completed. This, of course, was inconsistent with the information the chatbot provided to Moffatt. Moffatt purchased a one-way flight from Vancouver to Toronto departing the next day, November 12, and on November 16 purchased a one-way flight from Toronto to Vancouver departing November 18.

Moffatt submitted his first application for the bereavement fare on November 17, 2022. On February 5, 2023, Moffatt emailed Air Canada his grandmother's death certificate along with a screenshot of the chatbot's statement setting out the 90-day period to request a reduced rate. Responding on February 8, an Air Canada representative admitted that the chatbot had provided "misleading words" and informed Moffatt that the chatbot would be updated.

Discussion of the decision

The determinative issue in this case was whether the chatbot's statement, and Moffatt's reliance on it, amounted to negligent misrepresentation. Negligent misrepresentation arises when a seller fails to exercise reasonable care to ensure its representations are accurate and not misleading. Specifically, Moffatt had to show the Tribunal that:

  1. Air Canada owed him a duty of care;
  2. Air Canada's chatbot's representation was untrue, inaccurate, or misleading;
  3. Air Canada made the representation negligently;
  4. Moffatt reasonably relied on the representation; and
  5. Moffatt's reliance resulted in damages.

In its attempt to refute Moffatt's claim, Air Canada argued that it could not be held liable for information provided by its agents, servants, or representatives, which it said included its website chatbot. The Tribunal summarized Air Canada's argument as follows: "In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada's website". The Tribunal went on to hold that Air Canada is responsible for all of the information on its website, regardless of whether that information comes from a "static" webpage or a chatbot.

Additionally, the Tribunal rejected Air Canada's argument that Moffatt could have found the correct bereavement travel policy elsewhere on its website; the availability of accurate information elsewhere did not justify the chatbot providing misinformation. The Tribunal noted that Air Canada failed to explain why the "Bereavement travel" webpage should be considered more trustworthy than its chatbot, or why a customer should have to verify the chatbot's answers against Air Canada's other webpages. As the Tribunal put it succinctly, "There is no reason why Mr. Moffatt should know that one section of Air Canada's webpage is accurate, and another is not".

The Tribunal found that Moffatt had made out his claim for negligent misrepresentation and ordered Air Canada to pay him a total of $812.02 in damages, pre-judgment interest, and Tribunal fees.

Implications of the decision

This case is being held up in the media as an example of the problems that may accompany the explosion of generative AI's integration into businesses. It warns companies to be increasingly careful about how they implement client-facing automated or generative AI systems. With the rise of internet commerce, customers have been inundated with automated chatbots like the one Air Canada used in this case, and with the recent rise of generative AI, this trend is unlikely to slow down.

These automated and generative systems, of course, offer immense help to customer service departments across all industries and have become an everyday tool. But they have also moved beyond customer service and are becoming integral to many companies' broader business models. This is why organizations need to be aware of the liabilities that lurk behind the convenience of automated chatbots and generative AI.

In this case, there was no discussion of the potential liability of the developer of the chatbot software. Future cases may bring legal developments that are particularly relevant to the generative AI companies that create chatbots, including whether chatbot developers owe a duty of care to the companies that ultimately deploy their products as part of a business model.

To help organizations, governments, and regulatory bodies navigate this evolving legal landscape and avoid the situation that arose in Moffatt v. Air Canada, lawyers will need to pair their legal acumen with technological competence to effectively advise clients.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.