MINNEAPOLIS - Artificial intelligence (AI)-powered chatbots now routinely handle customer service, training, sales, marketing, and other business functions. They are a popular and attractive business tool because they help lower operating costs, boost efficiency, and enhance customer satisfaction - and thus retention. Their irresponsible use, however, can lead to serious problems, and Air Canada’s recent bad experience offers a useful cautionary tale.
The saga began when Air Canada deployed an AI chatbot as part of its customer service program. A customer asked the chatbot about the airline’s bereavement airfare policy - in the United States and Canada, a bereavement fare is a discounted ticket offered to travelers whose near relation has died or is dying. The chatbot advised him: “If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application.” By all appearances this was a straightforward answer, and the customer relied on it and bought his ticket.
There was one major problem, however: The information the chatbot provided was incorrect. When the customer submitted his request for reimbursement - exactly as the chatbot had instructed - Air Canada denied it. The airline’s human representative explained to the customer that such a reimbursement was contrary to the airline’s policy. Understandably upset, the customer subsequently filed a complaint before the Civil Resolution Tribunal of British Columbia (CRTBC).
The content herein is subject to copyright by The Yuan. All rights reserved. The content of the services is owned or licensed to The Yuan. Such content from The Yuan may be shared and reprinted but must clearly identify The Yuan as its original source. Content from a third-party copyright holder identified in the copyright notice contained in such third party’s content appearing in The Yuan must likewise be clearly labeled as such.