AI Insights: Recent Developments That Could Impact How Companies Offer AI-Based Customer Service Chatbots

Skadden, Arps, Slate, Meagher & Flom LLP

Two recent developments highlight the challenges companies may face as they explore ways to incorporate AI-based chatbots into their customer service offerings:

  • A putative class action filed in California federal district court alleges that, through the use of an artificial intelligence (AI) tool, Home Depot and Google “wiretapped” customer interactions with the Home Depot call centers in violation of the California Invasion of Privacy Act (CIPA).
  • A decision by the British Columbia Civil Resolution Tribunal found that Air Canada was required to honor information provided to a customer through its AI chatbot even though that information was incorrect.

CIPA Complaint

On February 14, 2024, Christopher Barulich filed a putative class action complaint in the U.S. District Court for the Central District of California against Home Depot and Google, alleging that the companies violated Section 631 of the CIPA through their use of an AI-enabled customer service tool.1 That section prohibits, in pertinent part, willfully reading, attempting to read, or learning the contents or meaning of any message or communication without the consent of all parties to the communication, and permits private rights of action.

Barulich alleges that Home Depot used Google’s Cloud Contact Center AI (CCAI), a technology through which customers first speak to an automated agent that “listens” to the customer service call, transcribes and analyzes the call in real time, and then suggests possible replies to a live Home Depot agent to whom the customer is then transferred. Barulich asserts that, by enabling this process, Home Depot allowed Google “to access, record, read and learn the contents of [customers’] calls” without their prior consent. Barulich alleges that he was unaware that he was ever speaking to an automated agent or that the content of his calls was being passed to a third party (here, Google) for analysis.

Barulich also alleges that Home Depot and Google have “the capability to use the contents of the communications it intercepts for purposes beyond the scope of individual customer service calls,” for example, “us[ing] information and data gleaned from customer service calls” to further train or develop its AI models.

The complaint alleges that the foregoing activity allowed Google to “eavesdrop or wiretap into live conversations between callers and Home Depot,” in violation of Section 631 of CIPA. The complaint further alleges that Home Depot violated that section of CIPA by “knowingly and willingly enabl[ing]” Google to learn the contents of those communications in real time.

Barulich seeks injunctive relief, the recovery of $5,000 per violation of CIPA (with no cap on aggregate statutory damages) on behalf of a putative California class, and attorneys’ fees and costs.

Air Canada Ruling

On February 14, 2024, the British Columbia Civil Resolution Tribunal found that Air Canada negligently misrepresented its bereavement airfare policy to Jake Moffatt via its AI chatbot, and ordered Air Canada to refund Moffatt the difference in airfare based on what the chatbot represented to Moffatt he was eligible to receive.2

In November 2022, Moffatt visited Air Canada’s website to learn about its bereavement rates. Moffatt interacted with an AI chatbot that inaccurately stated that he could request reimbursement at a reduced bereavement rate within 90 days of the date the ticket was issued. In the reply generated by the chatbot, the words “bereavement fares” were highlighted, underlined and linked to Air Canada’s actual bereavement policy, which did not allow for such retroactive reimbursement.

Relying on the information provided by the chatbot, as opposed to the actual policy, Moffatt booked his flights and subsequently submitted an application for a bereavement fare within the 90-day window specified by the chatbot. Air Canada conceded that the chatbot had provided “misleading words,” but asserted that the link provided by the chatbot was to the actual and correct policy.

Given the parties’ commercial relationship as a service provider and consumer, the Tribunal found that Air Canada owed Moffatt a duty of reasonable care to ensure its representations were accurate and not misleading. The Tribunal also found that Air Canada was responsible for all the information on its website, including information generated through an AI-powered chatbot: “In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.... While a chatbot has an interactive component, it is still just a part of Air Canada’s website.”

Given the chatbot’s misstatement of the policy, the Tribunal held that Air Canada did not take reasonable care to ensure its chatbot was providing accurate information.

The Tribunal ordered Air Canada to refund Moffatt the difference between the airfare he paid and the approximate bereavement fare that an Air Canada agent had stated each flight would cost.

Key Takeaways

The foregoing developments highlight some of the issues companies may face as they roll out AI-powered customer service tools:

  • There has been an uptick in threatened and filed litigation and arbitration related to purported CIPA violations. Companies using AI have been on the receiving end of pre-suit demand letters, pre-arbitration notices of dispute, and filed complaints and arbitration demands that invoke CIPA’s civil cause of action to challenge commercially reasonable, expected and ubiquitous technology tools such as live chats, session replay software, cookies, trackers and pixels on a company’s website. Note that violations of the act also carry criminal penalties.
  • Many companies are exploring AI tools to support call center workflows, but the novel theory asserted in the Barulich complaint (and the potential for another wave of CIPA litigation and mass arbitration efforts focused on AI) should factor into a company’s risk analysis when evaluating such tools.
  • Companies using AI vendors should also review and, if necessary, update customer disclosures and website policies to encompass the use of third-party AI technologies. In connection with calls, companies should audit their practices to ensure that the requisite disclosures are made at the beginning of call flows — whether with live agents or with interactive voice response — so that callers are given notice of any call recordings.
  • The Canadian ruling in Moffatt highlights the risk companies may face if they allow AI chatbots to articulate company rules and policies. It is not clear whether Air Canada would have prevailed had it displayed a disclaimer stating that the chatbot discussion was for general information purposes only and should not be relied upon, together with clear notice that the actual bereavement policy was determinative and should be consulted. Even so, companies may want to consider whether such disclaimers are appropriate.

_______________

1 Barulich v. Home Depot, Inc., No. 2:24-cv-01253 (C.D. Cal. Feb. 14, 2024).

2 Moffatt v. Air Canada, 2024 BCCRT 149, ¶¶ 32, 40 (Can. B.C. CRT).


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Skadden, Arps, Slate, Meagher & Flom LLP | Attorney Advertising
