The Fairness of Credit Scoring Models
46 Pages · Posted: 18 Feb 2021 · Last revised: 23 May 2022
Date Written: May 19, 2022
Abstract
In credit markets, screening algorithms aim to discriminate between good-type and bad-type
borrowers. However, in doing so, they often also discriminate between individuals sharing
a protected attribute (e.g., gender, age, racial origin) and the rest of the population. In this
paper, we show how (1) to test whether there exists a statistically significant difference between
the protected and unprotected groups, which we call a lack of fairness, and (2) to identify the
variables that cause the lack of fairness. We then use these variables to optimize the fairness-performance
trade-off. Our framework provides guidance on how algorithmic fairness can be monitored by
lenders, controlled by their regulators, and improved for the benefit of protected groups.
Keywords: Fairness; Credit scoring models; Discrimination; Machine Learning; Artificial Intelligence
JEL Classification: G21, G29, C10, C38, C55
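The fairness test the abstract describes, checking for a statistically significant difference in outcomes between a protected group and the rest of the population, can be illustrated with a simple two-proportion z-test on acceptance rates. This is a hypothetical sketch, not the paper's actual testing procedure; the group sizes and acceptance counts below are invented for illustration.

```python
# Hypothetical sketch of a fairness test: compare credit-acceptance rates
# between a protected group and the rest of the population using a
# two-proportion z-test. This is NOT the paper's method, only an
# illustration of testing for a statistically significant group difference.
from math import sqrt, erfc

def fairness_z_test(accepted_prot, n_prot, accepted_rest, n_rest):
    """Two-sided two-proportion z-test on acceptance rates."""
    p1 = accepted_prot / n_prot          # acceptance rate, protected group
    p2 = accepted_rest / n_rest          # acceptance rate, rest of population
    p = (accepted_prot + accepted_rest) / (n_prot + n_rest)  # pooled rate
    se = sqrt(p * (1 - p) * (1 / n_prot + 1 / n_rest))       # standard error
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))     # two-sided normal p-value
    return z, p_value

# Invented numbers: 30% acceptance for the protected group vs. 42% overall.
z, p_value = fairness_z_test(120, 400, 420, 1000)
print(f"z = {z:.2f}, p = {p_value:.4f}")
```

A small p-value here would indicate a statistically significant difference in treatment between the two groups, i.e., a lack of fairness in the abstract's terminology; it does not by itself identify which input variables cause it.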