CFPB Analyzes AI Chatbots in Banking | Practical Law


The CFPB published an issue spotlight that analyzes artificial intelligence chatbots in banking.

CFPB Analyzes AI Chatbots in Banking

Practical Law Legal Update w-039-7701 (Approx. 3 pages)


by Practical Law Finance
Published on 13 Jun 2023 | USA (National/Federal)
On June 6, 2023, the Consumer Financial Protection Bureau (CFPB) published an issue spotlight analyzing the use of artificial intelligence (AI) chatbots by financial institutions. Chatbots use computer programs to simulate human-like responses and have increasingly been replacing human customer service agents. In 2022, over 98 million users engaged with bank chatbots.
Basic chatbots are rule-based and use either decision tree logic or a database of keywords to trigger preset, limited responses. More complex chatbots use additional technologies to generate responses: they can use AI to simulate natural dialog, or large language models (LLMs) that analyze patterns between words in large datasets and predict what text should follow a question. Additionally, institutions have started building their own chatbots by training algorithms on real customer conversations and chat logs.
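The rule-based approach described above can be sketched as a simple keyword lookup: a message is scanned for trigger words, each mapped to a preset response. The keywords and canned replies below are hypothetical illustrations, not drawn from any institution's actual system:

```python
import re

# Hypothetical keyword database: each trigger word maps to a preset,
# limited response. A real rule-based chatbot works the same way, just
# with a much larger rule set or a decision tree.
RESPONSES = {
    "balance": "Your current balance is shown under Accounts > Summary.",
    "hours": "Branches are open 9 a.m. to 5 p.m., Monday through Friday.",
    "dispute": "To dispute a transaction, please contact a customer service agent.",
}

FALLBACK = "Sorry, I didn't understand. Please rephrase or ask for an agent."

def reply(message: str) -> str:
    """Return the preset response for the first recognized keyword."""
    words = set(re.findall(r"[a-z]+", message.lower()))
    for keyword, response in RESPONSES.items():
        if keyword in words:
            return response
    # No keyword matched: the bot cannot help further, which is the
    # limitation the CFPB highlights for complex customer problems.
    return FALLBACK
```

Because every answer is preset, anything outside the keyword list falls through to the fallback message, which illustrates why such systems struggle once a customer's problem becomes complex.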
The CFPB found that:
  • Financial institutions are increasingly using chatbots as a cost-effective alternative to human customer service.
  • While chatbots are useful for resolving basic inquiries, their effectiveness diminishes as problems become more complex.
  • Financial institutions risk violating legal obligations, eroding customer trust, and causing consumer harm when deploying chatbot technology.
Although chatbots can provide financial institutions with constant availability, immediate responses, and significant cost savings, the CFPB has compiled a list of challenges experienced by financial services customers. Customers have complained that chatbots:
  • Have a limited ability to solve complex problems, including:
    • failing to recognize and resolve customer disputes;
    • providing inaccurate, unreliable, or insufficient information; and
    • failing to provide meaningful customer assistance.
  • Hinder access to timely human intervention.
  • Have technical limitations and associated security risks including:
    • unreliable systems or downtime;
    • security risks posed by impersonation and phishing scams; and
    • additional responsibilities to keep personally identifiable information safe.
Additionally, the CFPB advises financial institutions to consider the risks that chatbots pose to their institutions, including:
  • Risk of noncompliance with federal consumer financial laws.
  • Risk of diminished customer service and trust when chatbots reduce access to individualized human support agents.
  • Risk of considerable harm to customers if chatbots provide inaccurate information or fail to recognize or resolve a dispute.
The CFPB will continue to actively monitor the market and reminds financial institutions using chatbots that they are expected to do so in a manner that complies with their legal obligations and commitments to customers. For more information on chatbots, see Use of Bots in Consumer Credit Transactions Checklist.