Technologies and processes that banks rely on, including customer service call transcription, marketing tools, credit decisioning, cybersecurity tools and fraud prevention, may incorporate AI in ways not every user or employee at the bank understands. Other products sit in a gray, "it depends" area.
The way generative AI burst onto the scene less than three years ago means "anyone with access to the internet today can get access to tools like ChatGPT or Google's Gemini, for free and with tremendous processing power they couldn't have had before," said Chris Calabia, a senior advisor to the Alliance for Innovative Regulation. "It's possible your staff is experimenting with ChatGPT to help them write reports and analyze data."
These "hidden" aspects of AI matter in part because of emerging regulation.
"Banks need to pay attention and have a definition that aligns with those regulations, or they could find themselves running afoul of them," said DeLeon.
There is also the question of maintaining customer trust and ensuring responsible usage.
When deploying AI, “there has to be a parallel process to make sure you’ve got the right guardrails, compliance, and risk governance, so you’re not developing solutions that will be toxic or infringe on personally identifiable information,” said Larry Lerner, a partner at McKinsey.
Understanding what AI is
The history of AI in banking dates back decades.
Basic AI-type systems, then called "expert systems," existed in financial services as early as the 1980s, said Calabia, helping financial planners develop plans for individual clients' needs.
“These systems were designed to mimic human decision-making processes,” he said.
As AI has evolved, so have its definitions. Even now, pinning down a common understanding of AI is hard.
“People talk about AI when they mean software or analytics,” said Zoldi.
The October 2023 White House executive order on AI offered one definition of the technology. A subsequent report on AI in financial services highlighted how little consensus there is.
The report notes that “there is no uniform agreement among the participants in the study on the meaning of ‘artificial intelligence,'” and while the White House’s definition “is broad, it may still not cover all the different concepts associated with the term ‘artificial intelligence.'”
The report also recognizes that terms are often conflated.
“Recent commentary around advancements in AI technology often uses ‘artificial intelligence’ interchangeably with ‘Generative AI,'” it notes.
Without a common lexicon, banks may struggle to assess and manage the risks associated with AI systems, comply with emerging AI-related regulations, communicate effectively with regulators and third-party vendors about AI use, and make informed decisions about AI adoption and implementation, said DeLeon.
“AI is a wave that has taken us over and now we are trying to swim our way to the top,” said DeLeon.
“Very often when people say AI in regulatory circles, they are talking about machine learning and models that learn for themselves,” said Zoldi.
Whether they are using traditional AI, generative AI or machine learning, Zoldi finds, “some banks are not in a good position to explain models to a certain level of scrutiny that would meet credit regulations that exist today.”
Although generative AI is all the rage, "it's a very small fraction of what banks use," said Zoldi. "Underneath the hood, 90 to 95% of AI in banks are models that use neural networks and stochastic gradient boosted trees." Both neural networks and tree-based models learn non-linear relationships from historical data to make predictions about the future.
How banks can find clarity
“Banks need to make sure the models they use are fair and ethical,” said Zoldi.
To start, banks can take an inventory across their business lines of which processes or operations use AI or machine learning. They should also come up with a shared internal definition of AI.
It is easier said than done.
Bankwell Bank in New Canaan, Connecticut, is experimenting with AI and generative AI for small business lending, sales and marketing, underwriting and more. The $3.2 billion-asset bank has brought up discussions of AI and generative AI at its town halls, so "everyone from branch associates up to senior management are starting to think about some of these use cases," said chief innovation officer Ryan Hildebrand. "But we haven't [said], 'here is the guidebook with definitions of AI and how it is used and how to talk about it.' We're still early."
Kim Kirk, the chief operations officer of Queensborough National Bank & Trust Company in Louisville, Georgia, has asked her check-fraud monitoring provider for data flow diagrams to understand where information is residing and how it is being manipulated. She finds that cybersecurity and fraud prevention are two areas where AI is commonly used.
“Bankers should understand the underlying architecture of solutions that they are purchasing from third party service providers, because ultimately that’s our responsibility to protect our customer information,” she said.
The July 2024 CrowdStrike outage, which knocked systems offline at companies worldwide, showed how that exposure can play out. The $2.1 billion-asset Queensborough was not a direct purchaser of CrowdStrike, but the incident demonstrated how disruption can reach a bank through its third-party providers.
The government's focus on AI also underscores the need for banks to inventory their usage of the technology.
"Bankers have to be knowledgeable about the macro environment of what is happening with AI," said Kirk. "The explosion of AI in the last several years has been enormous. Everyone is trying to get their arms around it from a governance perspective to make sure we are protecting our customer information appropriately."
A financial institution’s size does not always correlate with its sophistication around AI usage. Lerner, for example, was impressed when he recently spoke with a midsize credit union about AI.
“I was pleasantly surprised that they already have a center of excellence stood up,” he said. “They are beginning to experiment with code acceleration with generative AI, and they’ve already been talking to the risk and regulatory group to develop an initial set of guardrails.”