"It Was My Chatbot, Not Me"

Lerners LLP

Contributor

Lerners LLP is one of Southwestern Ontario's largest law firms, with offices in London, Toronto, Waterloo Region, and Strathroy. Ours is a history of over 90 years of successful client service and representation. Today we are more than 140 exceptionally skilled lawyers with abundant experience in litigation and dispute resolution (including class actions, appeals, and arbitration/mediation), corporate/commercial law, health law, insurance law, real estate, employment law, personal injury and family law.

Is a company liable when its chatbot gets it wrong? This issue recently made its way to British Columbia's Civil Resolutions Tribunal (BCCRT) in Moffatt v Air Canada, 2024 BCCRT 149. The BCCRT has jurisdiction over "small claims" of up to $5,000.

FACTS

Jake Moffatt (they/them) booked an Air Canada flight from British Columbia to Ontario following their grandmother's death. An interactive Air Canada support chatbot advised them that they could seek a retroactive "bereavement" fare. When Mr. Moffatt later attempted to secure a refund of the difference between the two fares, they were advised by a (human) agent that Air Canada did not offer retroactive bereavement refunds. The claimed difference between the regular fare and the bereavement fare was $880.36; once fees were accounted for, the difference was $650.88.

In its defence, Air Canada claimed that it could not be held liable for misinformation provided by the chatbot, that accurate information was available on its website, and that Mr. Moffatt had not followed the proper procedure to request a bereavement fare. It also cited contractual terms but did not provide the relevant portions of the contract.

ISSUES

The issue at the heart of this dispute was whether Air Canada had negligently misrepresented the bereavement fare process through its chatbot. It was agreed that, when Mr. Moffatt booked a flight, they interacted with a chatbot that gave them advice about bereavement fares, advice that included "misleading words". The chatbot's advice included the following:

Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family.

...

If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form. (emphasis in original)

The term "bereavement fares" was hyperlinked to an Air Canada webpage titled "Bereavement travel" which contained information about the bereavement travel policy. The webpage explained that the policy does not include retroactive bereavement fares. There was an inconsistency between the chatbot advice and the information at the linked website page. Mr. Moffatt did not review the hyperlinked webpage.

DECISION

The BCCRT found that Mr. Moffatt had relied on the chatbot and that they had, in effect, alleged negligent misrepresentation on the part of Air Canada:

Negligent misrepresentation can arise when a seller does not exercise reasonable care to ensure its representations are accurate and not misleading (para 24).

The BCCRT held that the test for negligent misrepresentation was satisfied:

  1. Air Canada owed Mr. Moffatt a duty of care;
  2. Air Canada's representation, through the chatbot, was untrue, inaccurate or misleading;
  3. Air Canada made the representation negligently;
  4. Mr. Moffatt relied on the representation; and
  5. Mr. Moffatt's reliance resulted in a loss.

Incredibly, Air Canada attempted to argue that it was not liable for misinformation provided by an agent, servant or representative, including a chatbot. The absurdity of Air Canada's position was not lost on the Tribunal, which commented:

In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission (para 27).

The Tribunal had no trouble finding that Air Canada had failed to take reasonable care to ensure that its chatbot was providing accurate information. It held that it was unreasonable to expect website visitors to double-check information found in one part of the website (the chatbot) against information found in another part of the website.

TAKEAWAYS

Air Canada was ordered to pay Mr. Moffatt a total of $812.02, comprising $650.88 in damages, $36.14 in pre-judgment interest, and $125 in Tribunal fees. While this might be only a small raindrop in a vast sky for Canada's largest airline, the decision is embarrassing and highlights the commercial risks of relying on artificial intelligence. It is also significant for what it is missing: any effort on Air Canada's part to explain how the chatbot worked and why it provided bad information. In the absence of an explanation, the BCCRT easily landed on a finding of negligence.

As sophisticated as software applications have become, they do get things wrong – and they will not be held liable for their "own" actions.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.