WASHINGTON — Consumer Financial Protection Bureau Director Rohit Chopra said the shortcomings of the FICO credit-scoring model are becoming unacceptable and urged regulators and lenders to develop a new model based on artificial intelligence to replace it.
Speaking Thursday at an AI conference presented by FinRegLab, a nonprofit focused on technology and financial regulation, Chopra said the bureau is looking into actions that can curb the costs of credit scores to lenders, particularly in the mortgage origination market, and that government agencies should rethink longstanding policies that steer lenders toward traditional credit-scoring methods.
“In the short term, the CFPB and others are going to need to work on addressing this issue with price gouging in credit scores, especially when it comes to the harmful effect in our mortgage markets,” Chopra said. “But going forward I think we have to adjust government policies that push the market toward the use of traditional credit scores, and create the conditions for a meaningful, helpful and transparent use of AI.”
Credit scoring evolved out of an opaque and unresponsive credit-reporting market in which consumers often could not tell what was in their credit report, whether it was accurate, or whether a report from one firm was comparable to a report from a competitor, Chopra said. The scores, created by Fair Isaac Corp., emerged in the late 1980s as a solution to many of these problems, most prominently by making borrowers’ rankings comparable from one credit reporting bureau to another.
“This created standardization — and not many people talk about this — that would ultimately make consumer loan securitization accelerate quite rapidly,” Chopra said. “FICO’s standardized score was seen as revolutionary, and I think it’s an important benchmark to recall from history as we think about AI and consumer financial services: the rise of the FICO score ended up creating a new set of standardization, a new monopolistic middleman, and … enormous network effects.”
Chopra said the FICO score’s entrenchment in consumer loan securitization has created a bind for banks and other financial firms. Many firms no longer rely on the FICO score as their primary measure of creditworthiness because it excludes prospective customers with no credit history and is not a reliable predictor of whether a borrower will repay a loan. But because FICO is baked into many securitization frameworks, notably Fannie Mae’s and Freddie Mac’s requirements for bundling mortgage securities, the firms continue to rely on it, often paying a premium for a product they would rather not use.
“Lenders report to the CFPB that credit scores are really just not predictive enough anymore,” Chopra said. “To stay competitive, major lenders build their own proprietary scorecards to evaluate applications, and many would like to abandon standardized scores if they could, if not for that liquidity premium they get from it.”
Mortgage originators are especially frustrated by the reliance on FICO scores to assess creditworthiness because lenders must purchase scores for prospective borrowers at prices that have risen sharply without any commensurate benefit.
“Mortgage lenders are pretty angry about the cost of FICO scores and feel that they’re being price-gouged,” Chopra said. “If you look at some of the pricing for FICO scores for mortgage lenders — and recall, they’re generally required to purchase a FICO score, though now they can choose a competitor — that has meant that the price of FICO scores has gone up, according to some public reports, 700-800% over a span of a few years, and it’s not actually that much more predictive than it used to be.”
To remedy that, Chopra suggested regulators, lenders and other stakeholders work together to develop a new open-source model whose inner workings are well understood by all concerned.
“Regulators and market participants could work to use artificial intelligence with a specific model that could be used by lenders, investors and others to standardize a score,” Chopra said. “Ideally, this usage of AI could be open source, using a cooperative model to finance its ongoing interactions and testing to make sure that it’s not discriminatory, and perhaps more importantly, it would be transparent about what the key inputs would be so that there is a sense of fairness, there is proper governance about what types of data are to be considered.
“This could serve as an important companion to advancing the overall goals of promoting competition and inclusion in a more open banking system.”