AI and 'Equality by Design'
in Florian Martin-Bariteau & Teresa Scassa, eds., Artificial Intelligence and the Law in Canada (Toronto: LexisNexis Canada, 2021)
2 pages. Posted: 4 Dec 2020
Date Written: November 20, 2020
Artificial intelligence (AI) is increasingly promoted by its proponents as a tool for overcoming human biases, errors in judgment, and irregularities in decision-making, promising a greater measure of predictability and consistency in decisions and, therefore, in the rule of law. For its critics, generating consistency in decision-making is a means of entrenching and amplifying discriminatory practices in a way that reproduces privilege and inequality in human relations through the use of pattern discrimination and predictive analytics, while simultaneously rendering these practices less transparent, explainable, and contestable. In both cases, technology is catapulted to the forefront of law as constitutive of lived realities and the governance of human relations, whether in the private or public sphere, rendering law itself perhaps nothing more than a technology, one soon to become obsolete. Algorithms feed neural networks with neo-liberalism’s relations of power and inequality, and thus have the capacity to “bake in inequality,” to reproduce bias, and effectively to pass off the descriptive output of “code” as law, and such “law” as the normative aspirations of justice. More than ex post remedial mechanisms in law are needed to address such harms: if the rule of law is to retain its comparative advantage, an “Equality by Design” enforceable default for computer programming, consistent with the aim of a “human-centred approach” to AI governance, is required. Borrowing from well-established “Privacy by Design” principles, this chapter argues for similarly strengthened implementation in domestic law and policy of equality as the default, value-based, inclusive design principle for machine-learning algorithms and predictive analytics.
Keywords: AI; equality; Canada; bias; inequality; powers