Auditing Algorithms for Discrimination
166 U. Pa. L. Rev. Online 189 (2017)
Washington University in St. Louis Legal Studies Research Paper No. 17-12-03
Date Written: December 17, 2017
Abstract
This Essay responds to the argument of Joshua Kroll et al. in Accountable Algorithms, 165 U. Pa. L. Rev. 633 (2017), that technical tools can be more effective than transparency in ensuring the fairness of algorithms. When it comes to combating discrimination, technical tools alone cannot prevent discriminatory outcomes. Because the causes of bias often lie not in the code but in broader social processes, techniques like randomization or predefined constraints on the decision process cannot guarantee the absence of bias. Even the most carefully designed systems may inadvertently encode preexisting prejudices or reflect structural bias. For this reason, avoiding discrimination requires not only attention to fairness in design but also scrutiny of how these systems operate in practice. Only by observing actual outcomes is it possible to determine whether discriminatory effects exist, and auditing must therefore remain an important part of the strategy. Fortunately, the law permits the use of auditing to detect and correct for discriminatory bias. To the extent that Kroll et al. suggest otherwise, their conclusion rests on a misreading of the Supreme Court’s decision in Ricci v. DeStefano. That case narrowly addressed a situation in which an employer took an adverse action against identifiable individuals based on race; the law still permits revising algorithms prospectively to remove bias. Such an approach is entirely consistent with the law’s clear preference for voluntary efforts to comply with nondiscrimination goals.