First Come, First Hired? ChatGPT's Bias for The First Resume It Sees and the Cost for Candidates to Overcome Bias in AI Hiring Tools

MIT Computational Law Report, Generative AI for Law - Special Release Part 3

13 Pages


Alexander Puutio

Office of the Assistant Secretary General

Patrick K. Lin

Harvard University - Carr Center for Human Rights Policy; Surveillance Technology Oversight Project

Date Written: February 02, 2025

Abstract

Artificial intelligence (AI) technologies have made significant inroads into the hiring landscape, promising to streamline everything from candidate screening to final selection. From dedicated “interviewer bots” to AI-enhanced resume screening processes, companies are increasingly relying on advanced language models to manage the overwhelming number of applications they receive. This shift greatly increases the procedural efficiency of hiring. At the same time, it raises difficult questions about the impact of AI on procedural fairness, given that numerous authors have found that AI systems both inherit pre-existing biases lurking in historical data and reinforce those introduced by user prompts. As a result, these AI-based hiring systems can produce outcomes that neither the employer nor the candidates desire.


To better understand how AI might reshape hiring processes, and to what extent it may replicate or amplify existing inequalities, we conducted an experimental study using ChatGPT, the most commonly used large language model in both private and enterprise contexts, and recruited its services as a resume screener for an entry-level finance job in its browser, API, and GPT-agent forms. Our approach was designed to simulate real-world hiring scenarios as closely as possible while testing for the impact of varying key terms in the resumes submitted to ChatGPT. Specifically, we asked ChatGPT to select one candidate from a set of ten entry-level hopefuls who varied in race and gender, as well as in the perceived cost of university attendance and extracurricular activities.

Over the course of more than 2,000 observations, we validated well-documented biases and discovered novel patterns of automated discrimination.
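The positional-bias test implied by the paper's title can be sketched as a simple audit harness. The sketch below is illustrative only: `screen()` is a stub standing in for a real ChatGPT call (e.g. a prompt containing all ten resumes), and it is hard-coded to always pick the first resume, mimicking the primacy bias the paper reports. Candidate names, trial counts, and function names are assumptions, not the authors' actual code.

```python
import collections
import random

# Hypothetical audit harness for position (primacy) bias in an LLM
# resume screener. All identifiers here are illustrative.

CANDIDATES = [f"Candidate {i}" for i in range(10)]

def screen(resumes):
    """Stub screener: returns the chosen candidate.

    In a real audit this would call an LLM (e.g. ChatGPT via the API)
    with all ten resumes in one prompt. Here it always picks the first
    resume, standing in for the primacy bias under study.
    """
    return resumes[0]

def audit(trials=2000, seed=0):
    """Shuffle resume order each trial and count wins by *position*.

    A fair screener yields a roughly uniform count across positions;
    a heavy skew toward position 0 indicates first-resume bias.
    """
    rng = random.Random(seed)
    wins_by_position = collections.Counter()
    for _ in range(trials):
        order = CANDIDATES[:]
        rng.shuffle(order)       # randomize which candidate sits where
        choice = screen(order)
        wins_by_position[order.index(choice)] += 1
    return wins_by_position

counts = audit()
print(counts)  # with the stub, position 0 wins every trial
```

Randomizing presentation order across trials decouples candidate identity from slot position, so any skew in `wins_by_position` can be attributed to order effects rather than resume content.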

Keywords: artificial intelligence, ai, hiring, employment, resume, bias, ai hiring tools, chatgpt, openai, algorithmic bias, labor, ai model, large language model, llm, gender, race, protected class

Suggested Citation

Puutio, Alexander and Lin, Patrick K., First Come, First Hired? ChatGPT's Bias for The First Resume It Sees and the Cost for Candidates to Overcome Bias in AI Hiring Tools (February 02, 2025). MIT Computational Law Report, Generative AI for Law - Special Release Part 3, Available at SSRN: https://ssrn.com/abstract=

Alexander Puutio

Office of the Assistant Secretary General ( email )

405 East 42nd St
New York, NY 10017
United States

Patrick K. Lin (Contact Author)

Harvard University - Carr Center for Human Rights Policy ( email )

Littauer-G-11G
Cambridge, MA 02138
United States

Surveillance Technology Oversight Project ( email )

40 Rector Street
9th Floor
New York, NY 10006

