Canadian flag carrier Air Canada recently faced a significant challenge over its chatbot service, one that culminated in a precedent-setting legal battle with a grieving passenger.
Jake Moffatt, dealing with the loss of his grandmother, asked Air Canada's chatbot about the airline's bereavement rates. The chatbot assured Moffatt that he could book a flight right away and request a bereavement refund within 90 days. In reality, the airline's policy does not allow refunds for bereavement travel once a flight has been booked.
Moffatt's efforts to obtain a refund were met with rejection, prompting him to pursue legal action against the company.
The Ruling
The tribunal ruled in favor of Moffatt, emphasizing that Air Canada is responsible for the accuracy of information provided through its chatbot, which forms part of its customer service system.
Tribunal member Christopher Rivers was critical of Air Canada's defense, noting that "Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives — including a chatbot."
Rejecting that argument, Rivers found that Air Canada "did not take reasonable care to ensure its chatbot was accurate."
He also highlighted that the airline "does not explain why customers should have to double-check information found in one part of its website on another part of its website."
The Aftermath
Following the tribunal's decision, Moffatt was awarded a partial refund and damages totaling CA$650.88 on the original fare. Air Canada was also ordered to cover interest on the airfare and the tribunal fees Moffatt had incurred.
The case drew attention to the complexities surrounding AI-driven customer service systems and their implications for consumer rights.
While Air Canada initially launched the chatbot as an AI "experiment" to enhance customer service efficiency, the incident underscored the importance of ensuring accuracy and transparency in automated systems.
Air Canada's chief information officer, Mel Crocker, previously touted the benefits of AI technology in streamlining customer service processes.
According to Crocker, the company had hoped its chatbot would "gain the ability to resolve even more complex customer service issues."
However, the tribunal's ruling highlighted the potential risks associated with over-reliance on automated systems, particularly when it comes to sensitive situations like bereavement travel.
Following the tribunal's decision, the Canadian airline announced that it would comply with the ruling, signaling the close of the legal dispute.
It also disabled its chatbot feature, suggesting a reevaluation of its AI-driven customer service strategy.
Editing by Katherine 'Makkie' Maclang