Advanced Computing in the Age of AI | Tuesday, May 24, 2022

AI Can Take on Bias in Lending 

Humans invented artificial intelligence, so it is an unfortunate reality that human biases can be baked into AI. Businesses that use AI, however, do not need to replicate these historical mistakes. Today, we can deploy and scale carefully designed AI across organizations to root out bias rather than reinforce it.

This shift is happening now in consumer lending, an industry with a history of using biased systems and processes to write loans. For years, creditors have used models that misrepresent the creditworthiness of women and minorities with discriminatory credit-scoring systems and other practices. Until recently, for example, consistently paying rent did not help on mortgage applications, an exclusion that especially disadvantaged people of color.

Duke University, the New School and the Federal Reserve Bank of Boston issued a report in 2015 that starkly illustrated how unfair lending perpetuates inequality. The median net worth of white households in Boston was around $250,000 at the time; the median Black household’s net worth was just $8. Low homeownership was the main differentiator: redlining had prevented generations of Black households in Boston from receiving mortgages.

How AI Can Help

For lenders to uncover the biases that influence their decisions, they must closely examine the systems and processes that determine how they grant loans. Achieving that depth of organizational self-scrutiny depends in large part on deploying the right technologies.

Recent governance efforts aim to improve how companies develop and deploy AI, including new Federal Trade Commission guidance promising stricter enforcement of laws that prohibit businesses from using biased AI, as well as new EU rules extending Europe’s rigorous privacy protections to AI. Enterprises using the most advanced AI are positioned to take advantage of the new regulatory climate while using technology to foster fair lending.

The AI that consumer lenders should be deploying can identify preexisting biases in systems as well as those that develop over time, as inevitably occurs. By sifting through customers’ data according to demographics and lending outcomes in addition to income-to-debt ratios and other traditional criteria, the AI can help distinguish between biases that might exist and the legitimate factors that lenders should use to assess creditworthiness.
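As a minimal sketch of that kind of sifting (the data, group labels, and column names here are hypothetical, not from any real lender), one simple check compares approval rates across demographic groups while holding a traditional criterion, such as a debt-to-income band, constant:

```python
import pandas as pd

# Hypothetical loan-decision records; groups and columns are illustrative only.
loans = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "A", "B"],
    "dti_band": ["low", "low", "high", "low", "low", "high", "high", "high"],
    "approved": [1, 1, 0, 1, 0, 0, 0, 0],
})

# Approval rate per demographic group *within* the same debt-to-income band.
# A large gap inside a band suggests something other than the legitimate
# criterion (DTI) may be driving outcomes, and warrants human review.
rates = (loans.groupby(["dti_band", "group"])["approved"]
              .mean()
              .unstack("group"))
rates["gap"] = (rates["A"] - rates["B"]).abs()
print(rates)
```

The point is not that a gap proves bias, but that conditioning on a legitimate factor first is what lets reviewers separate it from suspect ones.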

For example, was someone in a certain Zip code denied a mortgage because the lender’s technology flagged that many people from that neighborhood lacked qualifying income, leading staff to devote too little time and care to the application? Or had the lender thoroughly checked the individual’s income and credit report? Or was even the most generous interpretation of the applicant’s income simply not enough for the debt they sought to incur?

These questions can be difficult to answer. They require a degree of insight into one’s organization that only AI can help provide. But consumer lenders are often leery of deploying AI to find those insights in their data because they do not know whether they can integrate the technology into their workflows, and they are aware of the downsides of failure. Well-meaning developers can unwittingly formulate algorithms that reproduce bias. Data sets can contain unrepresentative portraits of markets, loan applicants and other factors, skewing results. Even when lenders follow the letter of anti-discrimination laws, bias can creep in.

New York State regulators, for example, recently concluded an investigation into a high-profile case of a man receiving a credit card limit that was 20 times higher than his wife’s, even though they filed taxes jointly. The review did not find intentional bias in the bank’s financial technology but noted that implicit, structural sexism played a role in the unfair treatment.

AI Use Case

The Center for Creative Leadership, or CCL, has demonstrated how properly applied AI can pinpoint implicit structural bias. A premier executive development organization, the CCL conducts surveys to produce models of different leadership styles. Survey participants can write free-form responses to a question asking them to list the top three challenges they face in their offices.

Using AI techniques such as natural language processing and topological data analysis to collect and analyze the responses, the center’s mentors could draw correlations between the challenges participants listed and their demographic profiles, positions, companies, and industries.

The CCL could remove age, race, and gender from its AI-powered analyses when those factors were irrelevant to the leadership models. It could also add them back, however, to identify potential biases in the models themselves. Comparing variations of profiles with and without demographic data helped ensure that the models were not inadvertently biased against certain demographic groups. Subject matter experts in diversity helped the CCL interpret the findings.
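The with-and-without comparison can be sketched in a few lines. This is not the CCL’s actual pipeline; it is a toy illustration on synthetic data, where one model uses only a legitimate feature and a second adds a demographic flag, and the shift in predictions signals how much demographics explain the outcome:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic survey-style data: one legitimate feature and one demographic
# feature (names and structure are illustrative, not the CCL's schema).
n = 500
legit = rng.normal(size=(n, 1))          # e.g. tenure or workload score
demo = rng.integers(0, 2, size=(n, 1))   # e.g. a binary demographic flag
y = (legit[:, 0] + 0.8 * demo[:, 0]
     + rng.normal(scale=0.5, size=n) > 0).astype(int)

# Model A: legitimate feature only. Model B: legitimate + demographic.
model_a = LogisticRegression().fit(legit, y)
model_b = LogisticRegression().fit(np.hstack([legit, demo]), y)

# If predictions shift sharply once the demographic feature is added,
# outcomes were partly explained by demographics -- a signal worth handing
# to subject-matter experts for review rather than a verdict on its own.
p_a = model_a.predict_proba(legit)[:, 1]
p_b = model_b.predict_proba(np.hstack([legit, demo]))[:, 1]
mean_shift = np.abs(p_a - p_b).mean()
print(f"mean probability shift when demographics added: {mean_shift:.3f}")
```

A near-zero shift suggests the model’s decisions do not hinge on the demographic variable; a large shift flags the model for the kind of expert interpretation the CCL relied on.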

Similarly, AI can identify biases in consumer lending by clustering subpopulations of customers from a bank’s global data (including demographic information), cross-referencing them with each other, and flagging results that humans can judge as biased or not. The analysis reveals how individuals in specific groups fare when seeking specific lending products.

If there is an overconcentration of lending in one group, AI can serve as a check that can help people see what made that group different. Lenders can overlay those findings on their current operations and deploy AI to reveal biases in new contexts in their lending. Or they can exclude criteria like race or gender to gain perspectives on their business without those filters at all.
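One simple overconcentration check, sketched here with hypothetical application-level data (groups and thresholds are illustrative assumptions), compares each group’s share of granted loans to its share of applications:

```python
import pandas as pd

# Hypothetical application records; groups and cutoffs are illustrative.
apps = pd.DataFrame({
    "group":   ["A"] * 6 + ["B"] * 4,
    "granted": [1, 1, 1, 1, 1, 0, 1, 0, 0, 0],
})

share_of_apps = apps["group"].value_counts(normalize=True)
share_of_loans = (apps[apps["granted"] == 1]["group"]
                  .value_counts(normalize=True))

# Concentration ratio: >1 means a group receives a larger share of loans
# than of applications. Deviations beyond chosen bounds are flagged for
# human review, not automatically judged as bias.
ratio = (share_of_loans / share_of_apps).fillna(0)
flagged = ratio[(ratio > 1.25) | (ratio < 0.8)]
print(ratio)
print("flagged for review:", sorted(flagged.index))
```

The flag is only a starting point: it tells reviewers which group looked different, after which they can examine what legitimately distinguishes it, exactly the human-in-the-loop judgment the article describes.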

These AI solutions do not necessarily eliminate biases. But they create multiple ways of identifying them, giving lenders actionable intelligence about their enterprises. It is the many interpretive filters, AI models based on one data set applied to monitor other AI models, that keep biases in consumer loans from escaping notice.

AI can support efforts to recognize the biases that have afflicted consumer lending in the past. If companies choose, they can use AI to make those biases history, helping deserving consumers obtain the credit and the chance they need to change their lives for the better.

About the Author

Eric Murray is vice president of customer success for enterprise AI software vendor SymphonyAI, which serves vertical sectors including retail, consumer packaged goods, financial services, manufacturing and media. 


EnterpriseAI