Big Proctor: Online Proctoring Problems and How FERPA Can Promote Student Data Due Process
Notre Dame Journal on Emerging Technologies, Volume 3, Issue 1, 2022
67 Pages · Posted: 25 Jan 2023 · Last revised: 6 Mar 2023
Date Written: January 1, 2023
When the pandemic forced schools to shift to remote education, school administrators worried that unsupervised exams would lead to widespread cheating. Many turned to online proctoring technologies that use facial recognition, algorithmic profiling, and invasive surveillance to detect and deter academic misconduct. It was an “epic fail.”
Intrusive and unproven remote proctoring systems turned out to be inaccurate, unfair—and often ineffectual. The software did not account for foreseeable student diversity, leading to misidentification and false flags that disadvantaged test-takers from marginalized communities. Educators implemented proctoring software without sufficient transparency, training, and oversight. As a result, students suffered privacy, academic, reputational, pedagogical, and psychological harms.
Online proctoring problems prompted significant public backlash but no systemic reform. Students have little recourse under existing legal frameworks, including current biometric privacy, consumer protection, and antidiscrimination laws. Student privacy laws like the Family Educational Rights and Privacy Act (FERPA) likewise offer minimal protection against schools’ use of education technology. However, FERPA’s overlooked rights of review, explanation, and contestation offer a stop-gap solution to promote algorithmic accountability and due process.
Keywords: artificial intelligence, AI, proctoring, algorithms, education technology, algorithmic bias, academic integrity, surveillance, student privacy, FERPA, Family Educational Rights and Privacy Act, BIPA, biometric data, consumer protection, deceptive trade, discrimination, Title VI, due process