Regulatory Arbitrage or Random Errors? Implications of Race Prediction Algorithms in Fair Lending Analysis

78 Pages. Posted: 25 Apr 2023

Daniel Greenwald

New York University (NYU) - Leonard N. Stern School of Business

Sabrina T Howell

New York University (NYU) - Leonard N. Stern School of Business; National Bureau of Economic Research (NBER)

Cangyuan Li

New York University (NYU) - Leonard N. Stern School of Business

Emmanuel Yimfor

University of Michigan, Stephen M. Ross School of Business

Date Written: April 12, 2023

Abstract

Proxies for race are commonly used in settings where race cannot be observed directly. In the context of small business lending, we examine the standard race prediction algorithm (BISG), which regulators use to assess compliance with fair lending laws. The algorithm relies on an individual’s name and geographic location. If these features are correlated with socioeconomic characteristics, BISG errors could bias fair lending assessments and incentivize lenders to manipulate whom they serve, specifically by lending to non-Black borrowers whom BISG falsely predicts to be Black. We explore these issues using two datasets: proprietary loan application data from an online small business loan marketplace and loan data from the Paycheck Protection Program. We develop a measure of perceived race using images, which we show correlates better with self-identified race than BISG does. BISG poorly predicts whether an individual is Black, generating more false classifications than correct ones, and these errors are systematically related to measures of socioeconomic advantage. For example, BISG has especially high false positive rates when classifying Black applicants in areas with high racial animus, where fair lending evaluation may be most critical. In a horse race, image-based race predicts loan approval, while BISG-based race does not, showing that BISG fails to capture important characteristics linked to race that are observable to lenders. Lenders vary widely in the rate at which they lend to individuals whom BISG assigns to the wrong racial group, leading them to appear more or less compliant with fair lending rules than they would under image-based race. Overall, our study documents the systematic biases in race proxies that rely on name and geography and highlights their implications for racial disparities in lending.
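
For readers unfamiliar with the mechanics, the sketch below illustrates one common formulation of the BISG posterior calculation (Elliott et al., 2009): assuming surname and geography are conditionally independent given race, P(race | surname, geography) is proportional to P(race | surname) × P(race | geography) / P(race). This is a minimal illustrative sketch, not the paper's implementation; the probability tables, tract names, and function name are hypothetical stand-ins for the Census surname list and block-group demographic data that real BISG implementations use.

# Minimal sketch of a BISG posterior, assuming conditional independence
# of surname and geography given race. All numbers below are hypothetical
# illustrations, not Census data.

SURNAME_PRIORS = {  # P(race | surname), e.g., from a Census surname list
    "WASHINGTON": {"black": 0.87, "white": 0.05, "other": 0.08},
    "SMITH": {"black": 0.23, "white": 0.70, "other": 0.07},
}

GEO_COMPOSITION = {  # P(race | geography), e.g., tract-level demographics
    "tract_A": {"black": 0.10, "white": 0.80, "other": 0.10},
    "tract_B": {"black": 0.60, "white": 0.30, "other": 0.10},
}

POPULATION_SHARES = {"black": 0.13, "white": 0.62, "other": 0.25}  # P(race)

def bisg_posterior(surname, geography):
    """Return P(race | surname, geography) via Bayes' rule:
    P(r | s, g) proportional to P(r | s) * P(r | g) / P(r)."""
    prior = SURNAME_PRIORS[surname]
    geo = GEO_COMPOSITION[geography]
    unnormalized = {r: prior[r] * geo[r] / POPULATION_SHARES[r] for r in prior}
    total = sum(unnormalized.values())
    return {r: p / total for r, p in unnormalized.items()}

# A "SMITH" in the majority-Black tract_B gets posterior P(black) of about
# 0.74, far above the 0.23 surname prior alone: geography dominates, which
# is exactly the channel through which location-correlated errors can arise.
print(bisg_posterior("SMITH", "tract_B"))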

Keywords: race prediction algorithms, racial disparities, small business lending

JEL Classification: G21, G23, G28, J15, C81

Suggested Citation

Greenwald, Daniel and Howell, Sabrina T and Li, Cangyuan and Yimfor, Emmanuel, Regulatory Arbitrage or Random Errors? Implications of Race Prediction Algorithms in Fair Lending Analysis (April 12, 2023). Available at SSRN: https://ssrn.com/abstract=4417513 or http://dx.doi.org/10.2139/ssrn.4417513

Daniel Greenwald

New York University (NYU) - Leonard N. Stern School of Business

Sabrina T Howell (Contact Author)

New York University (NYU) - Leonard N. Stern School of Business

44 West 4th Street
Suite 9-160
New York, NY 10012
United States
212-998-0913 (Phone)

HOME PAGE: http://www.sabrina-howell.com

National Bureau of Economic Research (NBER)

1050 Massachusetts Avenue
Cambridge, MA 02138
United States

Cangyuan Li

New York University (NYU) - Leonard N. Stern School of Business

44 West 4th Street
Suite 9-160
New York, NY 10012
United States

Emmanuel Yimfor

University of Michigan, Stephen M. Ross School of Business

701 Tappan Street
Ann Arbor, MI 48109
United States
