Automated Employment Discrimination

56 Pages · Posted: 19 Aug 2019 · Last revised: 28 Sep 2019

Ifeoma Ajunwa

Cornell University ILR School/Law School; Harvard University - Berkman Klein Center for Internet & Society

Date Written: March 15, 2019

Abstract

Employment discrimination may be likened to a many-headed hydra: even as laws have been enacted to grant equal opportunity to job applicants, new socio-technical developments have ushered in novel mechanisms for discrimination. The high bar of proof for a disparate impact cause of action under Title VII of the Civil Rights Act, coupled with the “black box” nature of many automated hiring systems, renders the detection and redress of bias in such algorithmic systems difficult. This Article, with contributions at the intersection of administrative law, labor law, and law & technology, makes the central claim that the automation of hiring both facilitates and obfuscates employment discrimination. That phenomenon and the deployment of intellectual property law as a shield against scrutiny of automated systems combine to form an insurmountable obstacle for disparate impact claimants.

The first contribution of this Article, then, is the adoption of a hybrid approach that moves beyond the litigation-based paradigm in employment law to include redress mechanisms from administrative and labor law. To guard against the identified “bias in, bias out” phenomenon associated with automated decision-making, I argue that the goal of equal opportunity in employment creates an “auditing imperative” for algorithmic hiring systems. This auditing imperative mandates both internal and external audits of automated hiring systems, as well as record-keeping initiatives for job applications. Such audit requirements have precedent in other areas of law: they are not dissimilar to Occupational Safety and Health Administration (OSHA) audits in labor law or the Sarbanes-Oxley Act audit requirements in securities law. Conjointly, I propose that employers that subject their automated hiring platforms to external audits could receive a certification mark, “the Fair Automated Hiring Mark,” which would serve to positively distinguish them in the labor market. I also discuss how labor law mechanisms such as collective bargaining could be an effective approach to combating bias in automated hiring by establishing criteria for the data deployed in automated employment decision-making and creating standards for the protection and portability of said data. The Article concludes by noting that automated hiring, which captures a vast array of applicant data, merits greater legal oversight given the potential for “algorithmic blackballing” that could continue to thwart an applicant’s future job bids.

Suggested Citation

Ajunwa, Ifeoma, Automated Employment Discrimination (March 15, 2019). Available at SSRN: https://ssrn.com/abstract=3437631 or http://dx.doi.org/10.2139/ssrn.3437631

Ifeoma Ajunwa (Contact Author)

Cornell University ILR School/Law School ( email )

Ithaca, NY 14853-3901
United States

Harvard University - Berkman Klein Center for Internet & Society ( email )

Harvard Law School
23 Everett Street, 2nd Floor
Cambridge, MA 02138
United States

Paper statistics

Downloads: 169
Abstract Views: 2,359
Rank: 180,578