Lessons learned from AI chatbots will encourage their responsible use
By Eran Kahana  |  Apr 30, 2024
Chatbots are widespread forms of AI and, while they handle many routine tasks effectively, too often they are left to deal with problems for which they are ill-equipped, prompting scandals and bad experiences. AI and legal expert Prof Eran Kahana cites an illustrative case study.

MINNEAPOLIS - Artificial intelligence (AI)-powered chatbots frequently support customer service, training, sales, marketing, and other business functions. They are a popular and attractive business tool because they help lower operating costs, boost efficiency, and enhance customer satisfaction and thus retention. Their irresponsible use, however, leads to serious problems - Air Canada’s recent bad experience in this regard offers a useful cautionary tale.

This saga began with Air Canada’s implementation of an AI chatbot as part of its customer service program. A customer asked the chatbot about the airline’s bereavement airfare policy - a discounted fare offered in the United States and Canada to travelers whose near relation has died or is dying. The chatbot advised him: “If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application.” By all measures, this appeared to be a straightforward answer - the customer relied on it and bought his ticket.

There was one major problem, however: The information the chatbot provided was incorrect. When the customer submitted his request for reimbursement - exactly as the chatbot had instructed - Air Canada denied it. The airline’s human representative explained to the customer that the reimbursement was contrary to the airline’s policy. Understandably, the customer was upset and subsequently filed a complaint with the Civil Resolution Tribunal of British Columbia (CRTBC).
