Algorithms As Legal Decisions: Gender Gaps and Canadian Employment Law in the 21st Century
20 Pages · Posted: 18 Nov 2020
Date Written: July 31, 2020
Should judges and arbitrators in Canada use algorithms to assist with their decision making? Many commentators have argued that such algorithms are riddled with biases that make them unsuitable for use by the judiciary. In this paper, I explore gender bias in one particular type of predictive algorithm: one that predicts the "most likely" outcome if a case were to go to court, based on existing case law data.
Here, I explore whether the existing case law data on reasonable notice decisions in Canadian employment law exhibits gender differences. I examine reasonable notice awards given to plaintiffs in 1,728 legal decisions. In contrast with some other studies, I find no statistically significant difference between the notice periods awarded to male plaintiffs and those awarded to female plaintiffs once all other factors are held constant. This is not the end of the story, though. In practice, all other factors are rarely held constant. For example, the length of the reasonable notice period awarded is correlated with the type of job and the level of compensation, and both of these factors are in turn correlated with gender.
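The distinction between a raw gender gap and a gap conditional on other factors can be sketched in miniature. The snippet below uses a tiny synthetic dataset (entirely hypothetical, not the paper's 1,728 decisions) in which average notice awards differ by gender in aggregate, yet are identical once job type is held constant; the aggregate gap arises only because gender is correlated with job type.

```python
# Illustrative sketch with synthetic data (not the paper's dataset):
# a raw gender gap in notice awards can disappear once a correlated
# factor (here, job type) is held constant.
from statistics import mean

# (gender, job_type, notice_months) -- hypothetical values
awards = [
    ("M", "manager", 18), ("M", "manager", 20), ("M", "clerical", 9),
    ("F", "manager", 19), ("F", "clerical", 9), ("F", "clerical", 9),
]

def avg_notice(rows):
    return mean(n for _, _, n in rows)

# Raw comparison: men appear to receive longer notice on average.
raw_m = avg_notice([r for r in awards if r[0] == "M"])  # ~15.7 months
raw_f = avg_notice([r for r in awards if r[0] == "F"])  # ~12.3 months

# Conditional comparison: within each job type, the averages are equal.
# The raw gap is driven by women being over-represented in the
# lower-notice job category, not by gender itself.
for job in ("manager", "clerical"):
    m = avg_notice([r for r in awards if r[0] == "M" and r[1] == job])
    f = avg_notice([r for r in awards if r[0] == "F" and r[1] == job])
    print(job, m, f)
```

In a real analysis this conditioning would be done with a regression that includes job type, compensation, and the other factors as covariates, but the confounding mechanism is the same.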
I then explore the potential for judges to use algorithmic predictions in the decision-making process, arguing that concerns about bias may arise depending on how the algorithm is implemented. In short, great care must be taken to ensure that the algorithm does not reinforce hidden biases in the law. I also explore the possibility of alternative types of algorithms that do not rely on judicial decisions as data.