Another AI Horror Story – Must There Always be a Human in the Loop?

Oren Ezra

CMO, K2view

    With companies being held accountable for their chatbot responses, it's up to technology to make GenAI apps more responsible. GenAI data fusion is the answer.

    GenAI Growing Pains

    As the business world increasingly relies on Generative AI (GenAI) apps for customer service (and more), questions of accountability persist. Who’s responsible for the nature of the interaction and the accuracy of the information provided by customer service chatbots?

A recent incident in which a customer service chatbot was caught swearing at a user and disparaging the company that deployed it shows how GenAI can be a brand liability. Further, a Canadian court recently found that GenAI chatbots can be legal and financial liabilities, too.

The Civil Resolution Tribunal (the Canadian equivalent of the US Small Claims Court) recently ruled that Air Canada must compensate a customer due to misinformation provided by the airline's chatbot.

    The potential value of GenAI-powered customer service chatbots is clear. But obviously there’s something fundamentally flawed in how current GenAI chatbots function. The question is, can chatbots be made more responsible, more accurate, and more trustworthy without having a human in the loop?

    Air Canada Case Study

    The story begins when Mr. Jake Moffatt, faced with the sudden loss of his grandmother, needed to book an urgent flight to attend the funeral. Most airlines have what’s referred to as a “bereavement policy”, under which flights for recently bereaved relatives are heavily discounted.

    Seeking assistance to book such a bereavement flight, Moffatt turned to an Air Canada chatbot, a virtual customer service agent powered by GenAI. Unfortunately, the chatbot provided inaccurate information. It directed Moffatt to purchase full-price tickets, falsely claiming that he would be refunded in accordance with the airline's bereavement fare policy. Trusting the chatbot's guidance, Moffatt ended up paying over $1,600 for a trip that would have only cost around $760 under the actual bereavement policy.

    When Air Canada refused to refund Moffatt the difference, he decided to sue. In court, Air Canada attempted to deflect responsibility for the chatbot's misleading advice, claiming that the chatbot functioned as a separate legal entity which was not under their direct control. The presiding judge swiftly rejected that claim, pointing out that the chatbot is an integral part of the airline’s website – and that Air Canada is ultimately responsible for all information disseminated through its official channels, regardless of whether it originates from a static webpage or from an interactive chatbot.

    The court ordered Air Canada to pay Mr. Moffatt a total of $812, including damages incurred, pre-judgment interest, and court fees. This landmark case is believed to be the first in Canada where a customer successfully sued a company for misleading advice provided by a chatbot.

    The Broader Implications

    The Air Canada case is a cautionary tale for businesses that increasingly rely on AI-powered chatbots to streamline their customer service operations.

    This incident highlights the critical need for thorough testing and verification of information delivered by chatbots, particularly those utilizing advanced GenAI technology. GenAI chatbots have the capability to access internal knowledge bases and respond to customer queries without human intervention. However, not all have been configured or equipped to do so.

    Further, as the Moffatt case demonstrates, even the best data-driven chatbot can pose a significant liability if proper safeguards are not in place. The question is – which safeguards are most effective, efficient, and practical?

    Human in the Loop vs More Dependable Technology

Experts predict a rise in legal disputes related to GenAI-powered chatbots as companies continue to integrate them into their customer service strategies. Clearly, significant improvements are necessary in chatbot technology. Today, two primary approaches accomplish this: Human in the Loop and AI Data Fusion.

    Human in the Loop (HITL) is just what it sounds like: Actual people managing the data used to train the chatbot and monitoring its interactions with users. Human involvement allows for the identification and correction of errors and biases, which ultimately leads to more accurate and trustworthy chatbot responses. But HITL comes at a cost. It’s expensive to maintain and doesn’t scale well for businesses with high volumes of customer interactions.
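One common way HITL is wired into a chatbot pipeline is a confidence gate: answers the model is sure about go straight to the customer, while uncertain ones are queued for a human agent. The sketch below is purely illustrative — the confidence score, threshold, and function names are assumptions, not any specific vendor's implementation.

```python
# Illustrative human-in-the-loop gate (hypothetical names and threshold).
# Low-confidence chatbot answers are routed to a human agent for review
# instead of being sent to the customer automatically.

REVIEW_THRESHOLD = 0.8  # assumed cutoff; tuned per business risk tolerance

def dispatch(answer: str, confidence: float) -> str:
    """Send high-confidence answers directly; escalate the rest."""
    if confidence >= REVIEW_THRESHOLD:
        return f"BOT: {answer}"        # delivered to the customer as-is
    return f"ESCALATED: {answer}"      # queued for a human agent to verify

print(dispatch("Bereavement fares must be requested before travel.", 0.95))
print(dispatch("You may claim the refund retroactively.", 0.45))
```

The cost problem the paragraph above describes shows up directly in this design: every escalated answer consumes human time, so the lower the model's average confidence, the worse HITL scales.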

AI Data Fusion injects enterprise data into Large Language Models – on demand and in real time – to ground GenAI chatbots and deliver responses that users can trust. AI Data Fusion extends conventional Retrieval-Augmented Generation (RAG) by injecting relevant, context-rich data into LLM prompts. It augments LLMs with real-time, high-quality, and compliant data – dramatically strengthening customer intimacy and trust.
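The grounding step described above can be sketched in a few lines: retrieve the most relevant enterprise documents for a query, then inject them into the prompt so the LLM answers from verified data rather than its training memory. This is a minimal, self-contained sketch — the policy snippets are invented for illustration, and the keyword-overlap retriever is a stand-in for the vector search a production RAG pipeline would use; none of it reflects K2view's actual implementation.

```python
# Minimal RAG-style grounding sketch (illustrative data and names).

POLICY_DOCS = [
    "Bereavement fares: discounted tickets must be requested before travel; "
    "refunds cannot be claimed retroactively after purchase.",
    "Baggage policy: one free checked bag on international flights.",
]

def retrieve(query: str, docs: list[str], top_k: int = 1) -> list[str]:
    """Rank docs by naive keyword overlap with the query (a stand-in
    for embedding-based vector search in a real pipeline)."""
    q_terms = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(q_terms & set(d.lower().split())))
    return scored[:top_k]

def build_grounded_prompt(query: str, docs: list[str]) -> str:
    """Inject retrieved context into the prompt so the LLM is instructed
    to answer only from verified enterprise data."""
    context = "\n".join(retrieve(query, docs))
    return (
        "Answer ONLY from the context below. If the context does not "
        "cover the question, say you don't know.\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

print(build_grounded_prompt("Can I get a bereavement fare refund?", POLICY_DOCS))
```

Had Air Canada's chatbot been constrained this way, the retrieved bereavement policy — not the model's guess — would have shaped the answer, which is precisely the failure mode this approach targets.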


    The Bottom Line  

    The Air Canada case study highlights the critical need for businesses to ensure accurate and trusted chatbot responses. While human intervention can improve accuracy, it’s impractical for high volumes of customer interactions. More dependable technology, like GenAI data fusion, offers a more scalable, viable, and reliable approach to ensuring trust via RAG chatbots – particularly for customer-centric businesses.

    While GenAI offers immense potential to revolutionize customer service, proper safeguards are essential. By guaranteeing chatbot response accuracy with GenAI data fusion, businesses can leverage the power of AI responsibly, avoid legal liability, and ensure a positive and rewarding customer experience.

    Get to know K2view GenAI Data Fusion, the ultimate RAG tool.
