The Auditing Imperative for Automated Hiring
Posted: 19 Aug 2019 | Last revised: 17 Aug 2020
Date Written: March 15, 2019
Even as laws have been enacted to grant equal opportunity to job applicants, new socio-technical developments have ushered in novel mechanisms for discrimination. The high bar of proof to demonstrate a disparate impact cause of action under Title VII of the Civil Rights Act, coupled with the “black box” nature of many automated hiring systems, renders the detection and redress of bias in such algorithmic systems difficult. This Article, with contributions at the intersection of administrative law, employment & labor law, and law & technology, makes the central claim that the automation of hiring both facilitates and obfuscates employment discrimination. That phenomenon and the deployment of intellectual property law as a shield against the scrutiny of automated systems combine to form an insurmountable obstacle for disparate impact claimants.
The first contribution of this Article, then, is an eye-opening, in-depth examination of how bias is introduced, replicated, and also hidden by automated hiring systems. The second contribution is a hybrid approach to remedies that moves beyond the litigation-based paradigm in employment law to include redress mechanisms from administrative and labor law. To guard against the identified “bias in, bias out” phenomenon associated with automated decision-making, I argue that the goal of equal opportunity in employment creates an “auditing imperative” for algorithmic hiring systems. This auditing imperative mandates both internal and external audits of automated hiring systems, as well as record-keeping initiatives for job applications. Such audit requirements have precedent in other areas of law: they are not dissimilar to the Occupational Safety and Health Administration (OSHA) audits in labor law or the Sarbanes-Oxley Act audit requirements in securities law. In conjunction, I propose that employers that subject their automated hiring platforms to external audits could receive a certification mark, “the Fair Automated Hiring Mark,” which would serve to positively distinguish them in the labor market. I also discuss how labor law mechanisms such as collective bargaining could be an effective approach to combating bias in automated hiring by establishing criteria for the data deployed in automated employment decision-making and creating standards for the protection and portability of said data. The Article concludes by noting that automated hiring, which captures a vast array of applicant data, merits greater legal oversight given the potential for “algorithmic blackballing,” a phenomenon that could continue to thwart an applicant’s future job bids.