Agency officials and technology leaders have told Vixio that regulation of the use of artificial intelligence (AI) in the US payments sector must be limited and allow for innovation.
As AI use cases in the payments space continue to grow in significance — both positive and negative — industry players fear that overzealous regulation could stunt progress and hinder security.
Leaders at the Securities and Exchange Commission (SEC) and the Federal Reserve agree that the best option is a balanced approach: one that fosters innovation while ensuring growing risks are addressed.
“It’s really important to try to build an environment in which you don't have to ask permission to innovate,” said Hester Peirce, commissioner at the SEC.
“Obviously, the financial services industry is extremely regulated, but one of the best things that you can do for the investor and also just for society is to keep barriers to entry low so that no provider of services gets too comfortable and starts taking advantage of people.”
Peirce believes one of the reasons technology has drawn so many people to the US to build companies and to try new and innovative things is that the country has historically had a willingness to let people experiment without trying to regulate every step they take.
“You want to have competitors nipping at their heels so that they keep their prices low and their products good, and part of that is really making sure that it's very easy to come in and compete,” she said.
The commissioner pointed to the rise of cryptocurrency and the disjointed way it was initially handled by regulators in the US.
“You saw people coming from the technology side with ideas about how they could improve the way the financial services industry operates, or the way people access products and services. And it turns out there are a lot of barriers that have made it very hard for those people to come in and compete. That's not, at the end of the day, a good result.”
The challenge of AI regulation
Like quantum computing, AI is something of a double-edged sword for payments.
Although large language models are enhancing customer service, for example, they are also being used by scammers to improve the techniques they use in payments fraud.
As the attempted passage of SB278, a California wire fraud bill intended to amend the existing Elder Abuse and Dependent Adult Civil Protection Act, highlighted, payments fraud has grown rapidly in sophistication and complexity in recent years, becoming harder to monitor, predict and prevent.
Yinglian Xie, CEO and co-founder of DataVisor, told Vixio that scammers are using new technologies and fraud patterns are evolving much faster than ever before.
“We all know already that deep fake and gen AI technology is available and can process immediately,” she said. “Many of us already understand that fraud is very fast, but we still underestimate how fast that can be.”
As with the quantum threat to encryption, this means the industry needs a concerted effort to ensure resources can be deployed to preventative measures unhindered.
“The best solution is not to have regulators trying to figure out how to solve problems. It's great that regulators think about these things, but the brilliant minds that are probably going to solve major problems are probably out there in private industry or in academia,” said the SEC’s Peirce.
“The best solutions are going to come from places that you don't anticipate, so your best bet is to try to make sure that when those solutions are viable, or when someone's coming up with those solutions, they can find capital to make it viable. As a regulator the best thing I can do is make it easy for capital to flow to good ideas.”
Public and private sector leadership
The rapid innovation in AI resembles previous periods of upheaval, such as the digital revolution in finance and the invention of the internet itself, and in those instances it was market competition more than government intervention that drove change.
In November 2024, as covered by Vixio, Federal Reserve Board governor Michelle Bowman called for a cautious approach to AI regulation, suggesting that the risks and benefits are unclear.
In the same month, however, fellow Fed governor Christopher Waller said that the payment system has long been one of the areas where, despite best efforts, the private sector has fallen short.
“Fast forward to today, and we are once again in an era of rapid change and innovation in money and payments,” he said.
Some of this change is about delivering services already available, but doing so with new technologies, he said, but some is about leveraging new technology to rethink existing payment, clearing and settlement structures.
“The Federal Reserve should focus on addressing issues that the private sector cannot address alone and, in doing so, promote an efficient and resilient US payment system,” he said.
Waller added that, despite the buzz, there is still some uncertainty about the business case for AI in several industries, including payments.
He said that although there are parts of the service sector where it is hard to see how AI is going to add value, the technology is clearly going to have an impact in the professional sector.
“If you were thinking about stablecoins and you wanted to look for illicit finance, AI can read a blockchain phenomenally and look for patterns,” he said.
“Generative AI would be a great tool for monitoring illicit finance and money laundering, and once things go into any kind of blockchain they could just read it 24/7 and flag anything that is suspicious. It may be wrong but this is a good thing.”
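Waller's idea of continuously reading a blockchain and flagging suspicious activity can be illustrated with a minimal sketch. The transaction data, addresses and thresholds below are hypothetical, and a simple rule (many just-under-threshold transfers from one sender, a classic "structuring" pattern) stands in for the pattern recognition an AI model would perform:

```python
from collections import defaultdict

# Hypothetical transactions read from a chain: (sender, receiver, amount_usd).
TRANSACTIONS = [
    ("0xaaa", "0xbbb", 9_900),
    ("0xaaa", "0xccc", 9_950),
    ("0xaaa", "0xddd", 9_800),
    ("0xeee", "0xfff", 120),
]

def flag_structuring(txs, threshold=10_000, min_count=3):
    """Flag senders who split value into many just-under-threshold transfers."""
    counts = defaultdict(int)
    for sender, _receiver, amount in txs:
        # Count transfers sitting suspiciously close to, but under, the threshold.
        if 0.9 * threshold <= amount < threshold:
            counts[sender] += 1
    return {sender for sender, n in counts.items() if n >= min_count}

print(flag_structuring(TRANSACTIONS))  # {'0xaaa'}
```

In practice a monitoring system would stream blocks around the clock and hand flagged clusters to investigators, with a model rather than fixed rules scoring the patterns.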
AI’s growing impact
Although the use of AI in the payments sector is in its early stages, the technology is already having an impact on financial firms in some areas.
Teresa Heitsenrether, chief data and analytics officer at JPMorgan Chase, told Vixio that “semantic models are very good at interpreting language, and it's very structured and usually coded”.
“Even small advantages in terms of how much more efficient you can be can have meaningful impacts. We bank half the households in the US — if they are calling in about their checking account, car loan, mortgage, our agents must be able to deal with all those questions. But if you can use AI to be able to interpret the question, make the answer come faster, take time off that call and give somebody better experience, that's another big advantage.”
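The call-handling gain Heitsenrether describes comes from classifying a caller's intent and routing the question to the right place before an agent picks up. A minimal sketch of that flow is below; the queue names and keyword lookup are hypothetical, with the lookup standing in for the language-model classifier a bank would actually use:

```python
# Hypothetical intent-to-queue mapping; in production an LLM would
# interpret the question, and these keywords merely stand in for it.
INTENT_KEYWORDS = {
    "checking": "deposit_accounts",
    "car loan": "auto_lending",
    "mortgage": "home_lending",
}

def route_call(question: str) -> str:
    """Return the service queue for a customer's question."""
    q = question.lower()
    for keyword, queue in INTENT_KEYWORDS.items():
        if keyword in q:
            return queue
    return "general_service"  # fall back to a general agent queue

print(route_call("I have a question about my mortgage payment"))  # home_lending
```

Even a rough classifier like this shows where the seconds are saved: the question is interpreted once, up front, instead of being triaged live on the call.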
Regulators considering rules on AI should be sensitive to company size and wary of imposing excessive burdens on small companies looking to innovate. One option is to use regulatory sandboxes to ensure ideas are nurtured.
Highly prescriptive rules about what must be done with the technology risk locking even the bigger companies into paths that may appear profitable now but might not be in the long run.
“One of the things I always encourage people to do when they're thinking about regulation is think about the actual cost or risk that they are trying to address, and how they can be targeted about that,” said Jason Kwong, chief strategy officer at OpenAI.
“Thinking that something needs to be regulated just because there's concern about risk otherwise is too general, when really it is possible to be a lot more surgical and progressive.”