• Key insight: Community banks may be able to benefit from the enormous buildup of AI capacity, as the cost of AI-based fraud detection falls.
  • What’s at stake: Criminals’ use of AI-generated deepfakes to commit fraud has been growing rapidly.
  • Expert quote: “What was once the stuff of movies is now a reality,” said Federal Reserve Gov. Michael Barr.

ST. LOUIS — Federal Reserve Gov. Michael Barr argued Wednesday that artificial intelligence has the potential to help community banks fight fraud more effectively, but he also issued a warning to the same banks about the rapidly expanding use of AI by cyber criminals.

Speaking at the Community Banking Research Conference, Barr said that automated fraud detection apparently has not been cost-effective for some community banks in the past.

“But the huge buildout in AI capacity now underway, along with the explosion in the number of firms seeking to get into AI-based services, may have the potential for driving down costs enough to make AI-based fraud detection more feasible for community banks,” he added.

Barr also flagged the growing risk to banks from criminals’ use of generative-AI deepfakes to commit fraud. “Using only a brief sample of audio and access to information about individuals on the Internet, criminals employ gen AI to impersonate a close relative in a crisis, or a high-value bank client seeking to complete a transaction at their bank,” he said.

“What was once the stuff of movies is now a reality, and a lot is at stake,” Barr said, noting that the use of deepfakes in cybercrime is growing very rapidly.

Barr also touched on the potential of AI to reshape employment across the U.S. economy. That shift could help banks in certain rural communities where data centers are built, but it could hurt lenders in areas that suffer concentrated job losses.

“I tend to be an optimist about the potential for AI to make workers more productive, raise living standards and create more jobs in new industries, but I am realistic in my expectation that it could cause considerable dislocation of workers and businesses, at least in the short run,” Barr said. “Communities dependent on a small number of employers or a single industry that is significantly affected by AI could experience these dislocations.”

Barr, who was the Fed’s vice chair for supervision during much of the Biden administration, has spoken multiple times this year about the potential impacts of AI on the financial system. 

In an April speech, he argued that banks and regulators should be taking steps to make successful deepfake attacks less likely, and also to make such attacks more resource-intensive for the criminals.

“Another way to disrupt the economics of cybercrime is by increasing penalties for attempting to use gen AI to commit fraud and increasing investment in cybercrime enforcement,” Barr said in the April speech. “This includes targeting the upstream organizations that benefit from illegal action and strengthening anti-money-laundering laws to disrupt illicit fund flows and freeze assets related to cybercrime.”

Barr’s latest remarks came at a conference where community bankers have been discussing the implications of AI for their institutions, which historically have often been slower to adopt new technologies than their larger peers.

On Tuesday, Alexander Price, the president and CEO of Ouray, Colorado-based First Citizens Bank, spoke about community banks’ need to make smart use of AI while also preserving their chief asset — the relationships they have with customers.

“Community banks excel in crisis, whether that’s a really big crisis like the pandemic, or a … crisis like a death, a divorce, fraud, identity theft,” Price said. “Because that’s where you want to come in and say, ‘Hey, I have this problem.’ And it is so unique to that person that no AI system could possibly say, ‘Here’s exactly what the solution is.’”