More AI Bias? Seems Possible

September 10, 2021

Freddie Mac and Fannie Mae are stuck in the past—the mid-1990s, to be specific, when the Classic FICO loan-approval software was developed. Since those two quasi-government groups basically set the rules for the mortgage industry, their reluctance to change is bad news for many would-be home buyers and their families. The Markup examines “The Secret Bias Hidden in Mortgage-Approval Algorithms.” Reporters Emmanuel Martinez and Lauren Kirchner reveal what their organization’s research has uncovered:

“An investigation by The Markup has found that lenders in 2019 were more likely to deny home loans to people of color than to white people with similar financial characteristics — even when we controlled for newly available financial factors the mortgage industry for years has said would explain racial disparities in lending. Holding 17 different factors steady in a complex statistical analysis of more than two million conventional mortgage applications for home purchases, we found that lenders were 40 percent more likely to turn down Latino applicants for loans, 50 percent more likely to deny Asian/Pacific Islander applicants, and 70 percent more likely to deny Native American applicants than similar White applicants. Lenders were 80 percent more likely to reject Black applicants than similar White applicants. These are national rates. In every case, the prospective borrowers of color looked almost exactly the same on paper as the White applicants, except for their race.”
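The phrase "holding 17 different factors steady" describes regression-style analysis: fit a model of loan denial on a group indicator plus financial controls, then read the group coefficient as an odds multiplier with those controls fixed. The Markup has not published its code here, so the following is only an illustrative sketch on synthetic data, with made-up variable names (`income`, `debt_ratio`, `group`) and just two controls instead of seventeen:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical synthetic applications: two financial controls and a group flag
income = rng.normal(0, 1, n)
debt_ratio = rng.normal(0, 1, n)
group = rng.integers(0, 2, n)  # 1 = applicant in the group of interest (illustrative)

# Simulate denials with a disparity deliberately built in (+0.6 log-odds for the group)
true_logits = -1.0 - 0.8 * income + 0.5 * debt_ratio + 0.6 * group
denied = (rng.random(n) < 1 / (1 + np.exp(-true_logits))).astype(float)

# Fit a logistic regression by Newton-Raphson; the controls are "held steady"
# because they enter the model alongside the group indicator
X = np.column_stack([np.ones(n), income, debt_ratio, group])
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (denied - p)                     # gradient of the log-likelihood
    hess = (X * (p * (1 - p))[:, None]).T @ X     # observed information matrix
    beta += np.linalg.solve(hess, grad)

# exp(coefficient) = multiplier on denial odds for the group, controls fixed
odds_ratio = np.exp(beta[3])
```

An `odds_ratio` of, say, 1.8 would correspond to the kind of "80 percent more likely to be denied" finding the article reports, after accounting for the financial controls in the model.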

Algorithmic bias is a known and devastating problem in several crucial arenas, but recent years have seen efforts to mitigate it with better data sets and tweaked machine-learning processes. Advocates as well as professionals in the mortgage and housing industries have been entreating Fannie and Freddie to update their algorithm since 2014. Several viable alternatives have been developed, but the Federal Housing Finance Agency, which oversees those entities, continues to drag its heels. No big deal, insists the mortgage industry—any bias is just an illusion caused by incomplete data, its representatives wheedle. The Markup’s research indicates otherwise. We learn:

“The industry had criticized previous similar analyses for not including financial factors they said would explain disparities in lending rates but were not public at the time: debts as a percentage of income, how much of the property’s assessed worth the person is asking to borrow, and the applicant’s credit score. The first two are now public in the Home Mortgage Disclosure Act data. Including these financial data points in our analysis not only failed to eliminate racial disparities in loan denials, it highlighted new, devastating ones.”

For example, researchers found that high-earning Black applicants with less debt were rejected more often than white applicants with similar incomes but more debt. See the article for more industry excuses and the authors’ responses, as well as some specifics on the mechanisms of systemic racism and how location affects results. There are laws on the books that should make such discrimination a thing of the past, but they are difficult to enforce. An outdated algorithm shrouded in secrecy makes enforcement even harder. The Federal Housing Finance Agency has been studying its AI’s bias and considering alternatives for five years now. When will it finally make a change? Families are waiting.

Cynthia Murrell, September 10, 2021
