Gender as Emotive AI and the Case of ‘Nadia’: Regulatory and Ethical Implications
20 Pages. Posted: 3 Jun 2021
Date Written: June 2, 2021
This article unpacks the regulatory and ethical problematics of the artificial intelligence (AI) powered virtual assistant, ‘Nadia’, developed for use in the Australian Government’s National Disability Insurance Agency (NDIA).
We explore how Nadia is gendered female; utilises high-risk AI technologies, including emotion-inducing AI and machine learning, to monitor highly sensitive health and biometric data; and was developed for use by a group deemed vulnerable to further human rights violations under international law.
Drawing from the human rights frameworks of the EU and the European Convention on Human Rights, particularly the rights to privacy and the protection of personal data, we explore how a system like Nadia poses interferences with, and potential violations of, the fundamental human rights of vulnerable groups, and discuss what regulatory provisions and frameworks could have been put in place to safeguard a system like Nadia.
Keywords: AI, Australia National Disability Insurance Agency (NDIA), biometrics, dignity, emotion monitoring, ethics, facial recognition, governance, regulation, gender, impact assessments, privacy, protection of personal data, virtual personal assistants, vulnerable groups
JEL Classification: K10, C80, D80