‘Fintechs in the crosshairs’: Lenders deploy fairness testing software

Octane Lending, a New York-based online lender, has a challenge when it comes to lending decisions. The company helps people buy powersports vehicles, such as motorcycles and all-terrain vehicles. These loans tend to be reported as auto loans or secured consumer loans, not specifically motorcycle loans, so it’s hard to find comparable records.

So the company built its own AI-based credit scoring and underwriting model. It also uses the FICO Auto 9 score.

Recently, to confirm that its credit models do not inadvertently reflect biases or disparately impact disadvantaged communities, Octane, which has $1.5 billion of assets, began rolling out fairness testing software. Having worked at a major U.K. bank, chief risk officer Ray Duggins is aware of the need for fair-lending and anti-discrimination efforts, which are tightly regulated in Europe. He was previously chief risk officer at GE Capital and at Standard Chartered Bank’s consumer banking unit in Singapore.

“I never built a model where I intended to discriminate against anyone,” Duggins said. “But you always have to go back and test to make sure you’re not doing something inadvertently.”

Octane is not alone. Its fairness software provider, Los Angeles-based FairPlay, developer of what it calls “fairness as a service” for AI-based lending, says 10 financial services customers, including two major banks, use its software.

A new client and investor is Cross River Bank in Fort Lee, New Jersey, which has approximately $9 billion in assets. More than 35 fintechs issue loans through Cross River’s banking-as-a-service platform, including Affirm, Rocket Loans, Upgrade, and Upstart.

Cross River is building its own version of an app store into its banking-as-a-service platform and plans to make FairPlay “the fairness app,” alongside software for KYC, AML and other tasks ancillary to lending. Both parties have started the integration work. Cross River declined an interview request.

This will make Cross River “more of a one-stop shop for businesses looking to launch financial products on its platform,” said Kareem Saleh, CEO of FairPlay.

FairPlay raised $10 million this week in a Series A round led by Nyca Partners, with participation from Cross River Digital Ventures, Third Prime, Fin Capital, TTV, Nevcaut Ventures, Financial Venture Studio and Jonathan Weiner, a venture capital partner at Oak HC/FT. This follows FairPlay’s $4.5 million seed round in November.

Why now

When FairPlay launched in 2020, making automated lending fair was not a hot issue.

“When we started, fairness was on the agenda, especially when you were talking to risk people, but it wasn’t really a priority; it certainly wasn’t high on the list,” Saleh said. “It was seen by lenders as something to be careful about, so as not to break the law and to keep the government out of their business.”

More recently, bank regulators have expressed concern that lenders using AI could circumvent fair-lending laws. The software can find patterns in past lending data that perpetuate an existing bias, or latch onto a proxy, such as zip code, for a prohibited lending criterion that ultimately informs lending decisions. The effect could be digital redlining, which is illegal.

“I think the main concern on the part of the industry as well as regulators is that too few stakeholders fully understand how algorithmic lending works,” said Manny Alvarez, founding director of BridgeCounsel Strategies. He was previously Commissioner of the California Department of Financial Protection and Innovation, General Counsel at Affirm, and Enforcement Counsel at the CFPB.

“It’s detrimental because it inhibits productive regulation,” he said. “If you don’t know how an algorithm works, it’s hard to fairly assess the lending results of a particular model as a regulator. And by the same token, if you don’t know how your models work as a lender, it’s going to be difficult to understand if you have an unintended proxy for a prohibited basis, or to understand if and where in your portfolio you have disparate results that could be optimized.”

Saleh says he sees a perception among regulators that algorithms will discriminate and that “the fintech players that have come of age over the past few years weren’t paying enough attention to these kinds of things. So fintech is in the crosshairs, and it’s believed that algorithms left to their own devices will harm either consumers or the safety and soundness of the financial system.”

Additionally, over the past two or three quarters, some lenders have come to view fairness checks as an opportunity for competitive advantage, finding borrowers that others are overlooking, he said.

“Companies themselves recognize that they cannot have underwriting for the digital age and compliance for the Stone Age,” Saleh said.

How FairPlay Works

FairPlay’s software has two main components. The first is detecting biases in credit models, looking for signs of any algorithmic behavior that might cause an undesirable outcome. The other takes a second look at loan applicants who have been turned down, considering additional information that might show that someone with a low credit score or thin credit history is still creditworthy.

Saleh calls the second-look process “fairness through awareness.”

“I like to say that for 50 years in banking we have, rightly, tried to achieve fairness through blindness, this idea that the only color we see is green,” he said. “We just look at the neutral and objective variables from the bureaus or some other source and make our decision based on that.”

The problem is that certain populations are not well represented in credit bureau data.

FairPlay provides additional information about black applicants, female applicants, people of color and other disadvantaged groups.

For example, additional data on female borrowers could help a lender recognize that someone with irregular income may have taken a career break but is still creditworthy.

Adding data on black applicants could help lenders understand an applicant who doesn’t have a bank account.

“Many black Americans live in banking deserts and as a result do most of their banking either at check-cashing stores or using apps like Venmo or Cash App,” Saleh noted. “None of this data is reported back to the bureaus, so they are not officially considered to have deposit accounts.”

Using the second-look software, one client increased its overall approval rate by 10% and its approval rate for black applicants by 16%, he said.

“What we find is that 25% to 33% of the time, the highest-scoring minority applicants who are turned down would have performed at least as well as the riskiest people these lenders currently approve,” Saleh said.
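The article doesn’t describe FairPlay’s actual methodology, but a common fair-lending test that bias-detection tools of this kind run is the adverse impact ratio: the approval rate for a protected group divided by the approval rate for a control group, with the regulatory “four-fifths rule” (a ratio below 0.8) often treated as a red flag. The sketch below is a generic illustration with hypothetical data, not FairPlay’s software:

```python
# Minimal sketch of the adverse impact ratio (AIR), a common
# disparate-impact screen in lending. Hypothetical data; not
# FairPlay's actual methodology.

def approval_rate(decisions):
    """Fraction of applications approved; decisions is a list of booleans."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(protected_decisions, control_decisions):
    """AIR below ~0.8 (the 'four-fifths rule') is often treated as
    a red flag warranting closer disparate-impact review."""
    return approval_rate(protected_decisions) / approval_rate(control_decisions)

# Hypothetical decision logs: True = approved
protected = [True] * 40 + [False] * 60   # 40% approval rate
control   = [True] * 60 + [False] * 40   # 60% approval rate

air = adverse_impact_ratio(protected, control)
print(f"AIR = {air:.2f}")  # 0.40 / 0.60 ≈ 0.67, below the 0.8 benchmark
```

A ratio like this only flags a disparity; deciding whether it reflects unlawful bias, a proxy variable, or a legitimate credit factor is the harder analytical work the article describes.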

FairPlay’s software is “a highly technical tool that is easy for the layman to use,” Alvarez said. “And I think we need more of these solutions for the industry, as well as for regulators.”

How Octane uses it

Octane Lending has been building its own loan decision models since 2016; it is now in its third generation.

When the business started, it attracted near-prime and subprime customers. Manufacturers would pay rebates to companies that provided near-prime and subprime loans because no one else would, Duggins said.

Today, about 60% of its loans are to prime customers.

“We have to operate across the entire credit spectrum right now,” Duggins said.

Octane’s personalized credit score is AI-powered. It uses non-traditional credit bureau data, such as how people pay their phone bills and how long they’ve worked or lived in different places.

“All of this gives an indication of the stability of the individual,” Duggins said.

Octane has been using FairPlay’s software to look for biases in its models for several months “to validate and confirm that what we’re doing is correct,” Duggins said.

Duggins, who has worked in the banking industry for more than three decades, reflected on the evolution of fair lending technology.

“There wouldn’t have been a FairPlay in 1983 or 1985; nobody cared about those things then,” he said. “To see the evolution to where we are today, and the sophistication, is truly amazing.”

The double-edged sword of automated lending

Alvarez acknowledges that the claims of the many online lenders and traditional banks that use automated lending to provide credit in underserved communities should be viewed with some skepticism.

“Algorithmic lending is a tool, and it’s possible to use it incorrectly or to the detriment of certain populations,” Alvarez said. “It is also possible to use it for the benefit of certain populations. There is reason to be skeptical as well as optimistic. I would caution against a throw-the-baby-out-with-the-bathwater attitude.”

Alvarez also warned that AI-based underwriting models can drift, especially those based on machine learning and consuming ever-increasing amounts of data.

“Model drift is a real phenomenon, and you need human intervention to observe that drift and correct course if necessary,” Alvarez said.
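Alvarez doesn’t spell out how lenders watch for drift, but a standard monitoring metric in credit risk is the population stability index (PSI), which compares the distribution of model scores at development time with the current distribution; rule-of-thumb thresholds of 0.1 (watch) and 0.25 (significant drift) are widely used. A minimal sketch with hypothetical score-bucket distributions, not any specific lender’s process:

```python
# Minimal sketch of drift monitoring via the population stability
# index (PSI). Hypothetical distributions; thresholds of 0.1 and 0.25
# are common rules of thumb, not regulatory requirements.
import math

def psi(expected_fracs, actual_fracs):
    """PSI = sum over score buckets of (actual - expected) * ln(actual / expected).
    Buckets where either fraction is zero are skipped in this simple version."""
    return sum(
        (a - e) * math.log(a / e)
        for e, a in zip(expected_fracs, actual_fracs)
        if e > 0 and a > 0
    )

# Hypothetical fractions of applicants per score bucket (each sums to 1)
at_development = [0.10, 0.20, 0.40, 0.20, 0.10]
today          = [0.05, 0.15, 0.35, 0.25, 0.20]

drift = psi(at_development, today)
if drift > 0.25:
    print(f"PSI {drift:.3f}: significant drift, human review needed")
elif drift > 0.10:
    print(f"PSI {drift:.3f}: moderate shift, monitor closely")
else:
    print(f"PSI {drift:.3f}: stable")
```

A check like this flags *that* the population a model sees has shifted; as Alvarez says, it still takes human judgment to decide why it shifted and whether the model needs retraining.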

Automation is useful and inevitable, he noted.

“But human intervention is probably something that will always be needed to ensure lending decisions are made fairly and responsibly,” he said.
