Mapping the Use of Facial Recognition in Public Spaces in Europe – Part 3: Facial Recognition for Authorisation Purposes
62 Pages Posted: 25 May 2022
Date Written: May 22, 2022
This is the first detailed analysis of the most widespread way in which facial recognition is used in public (and private) spaces: for authorisation purposes. This 3rd Report in our #MAPFRE series should be of great interest to lawyers interested in data protection, privacy and human rights; AI ethics specialists; the private sector; data controllers; DPAs and the EDPB; policymakers; and European citizens, who will find here an accessible way to understand these issues.
Part 1 of our “MAPping the use of Facial Recognition in public spaces in Europe” (MAPFRE) project reports explained in detail what “facial recognition” means, addressed the issues surrounding definitions, presented the political landscape and set out the exact material and geographical scope of the study. Part 2 of our Reports presented, in the most accessible way possible, how facial recognition works and produced a “Classification Table” with illustrations, explanations and examples, detailing the uses of facial recognition/analysis in public spaces, in order to help avoid conflating the diverse ways in which facial recognition is used and to bring nuance and precision to the public debate.
This 3rd Report focuses on what is, undoubtedly, the most widespread way in which Facial Recognition Technologies (FRT) are used in public (and private) spaces: facial recognition for authorisation purposes.
Facial recognition is often used to authorise access to a space (e.g. access control) or to a service (e.g. to make a payment). Depending on the situation, both verification and identification functionalities (terms that are explained in our 2nd Report) can be used. Millions of people use FRT to unlock their phones every day. Private entities (such as banks) and public authorities (such as the French government with the now-abandoned ALICEM project) increasingly envisage using FRT as a means of providing strong authentication in order to control access to private or public online services, such as e-banking or administrative websites that concern income, health or other personal matters. FRT is also increasingly being considered as a means of improving security when controlling and managing access to private areas (building entrances, goods warehouses, etc.).
In public spaces, FRT is being used as an authentication tool for automated international border controls (for example at airports) or to manage access to places as diverse as airports, stadiums or schools. Before COVID-19, there were many projects envisaging the future use of FRT to “accelerate people flows”, “improve the customer experience”, “speed up operations” and “reduce queuing time” for users of different services (e.g. passengers boarding a plane, or shoppers), but the advent of the COVID-19 pandemic has further boosted calls for investment in FRTs in order to provide contactless services and reduce the risk of contamination.
Supermarkets such as Carrefour, which was involved in a pilot project in Romania, and transport operators in “smart cities”, such as the EMT bus network in Madrid, which teamed up with Mastercard on a pilot project enabling users to pay on EMT buses using FRT, have implemented facial recognition payment systems that permit consumers to complete transactions simply by having their faces scanned. In Europe, similar pilot projects are currently being tested to manage payments in restaurants, cafés and shops.
Despite this widespread existing or projected use of FRT for authorisation purposes, we are not aware of any detailed study focusing on this specific issue. We hope that the present analytical study will help fill this gap by focusing on the use of FRT for authorisation purposes in public spaces in Europe.
We have examined in detail seven “emblematic” cases of FRT being used for authorisation purposes in public spaces in Europe. We have reviewed the documents disseminated by data controllers concerning all of these cases (and several others). We have sought out the reactions of civil society and other actors. We have delved into EU and Member State laws. We have analysed a number of Data Protection Authority (DPA) opinions. We have identified Court decisions of relevance to this matter.
Our panoramic analysis enables the identification of convergences among EU Member States, but also the risks of divergence with regard to certain specific, important ways in which FRTs are used. It also permits an assessment of whether the GDPR, as interpreted by DPAs and Courts around Europe, is a sufficient means of regulating the use of FRT for authorisation purposes in public spaces in Europe, or whether new rules are needed.
What are the main issues in practice in terms of the legal basis invoked by data controllers? What is the difference between “consent” and “voluntary” in relation to the ways in which FRT is used? Are the “alternative (non-biometric) solutions” proposed satisfactory? What are the positions of DPAs and Courts around Europe on the important issues of necessity and proportionality, including the key “less intrusive means” criterion? What are the divergences among DPAs on these issues? Is harmonisation needed, and if so, how is it to be achieved? What are the lessons learned concerning DPIAs and evaluations? These are some of the questions examined in this report.
Our study ends with a series of specific recommendations addressed to data controllers, the EDPB, and stakeholders making proposals for new FRT rules.
We make three recommendations vis-à-vis those data controllers wishing to use facial recognition applications for authorisation purposes:
1) Data controllers should understand that they bear the burden of proof in terms of meeting all of the GDPR requirements, including understanding exactly how the necessity and proportionality principles, as well as the principles relating to the processing of personal data, should be applied in this field.
2) Data controllers should understand the limits of the “cooperative” use of facial recognition for authorisation purposes. Deployments of FR systems for authorisation purposes in public spaces in Europe have almost always been based on consent or have been used in a “voluntary” way. However, this does not mean that consent is all-powerful. First, there are situations (such as the various failed attempts to introduce FRT in schools in Europe) where consent could not be justified as “freely given” because of an imbalance of power between users and data controllers. Second, consensual and other “voluntary” uses of FRT imply the existence of alternative solutions, which must be as available and as effective as those involving the use of FRT.
3) Data controllers should conduct DPIAs and evaluation reports and publish them to the extent possible and compatible with trade secrets and property rights. Our study found a serious lack of publicly available information on DPIAs and on evaluations of the effectiveness of FRT systems. As we explain, this is regrettable for several reasons.
We make two recommendations in relation to the EDPB:
1) The EDPB should ensure harmonisation on issues such as the use of centralised databases and the principles relating to the processing of personal data. Diverging interpretations of the GDPR on issues such as the implementation of IATA’s “One ID” concept for air travel or “pay by face” applications in Europe could create legal tension and operational difficulties.
2) The EDPB could also produce guidance on the approach that should be followed both for DPIAs and evaluation reports where FRT authorisation applications are concerned.
Finally, a recommendation regarding policymakers and other stakeholders formulating new legislative proposals: there is often a great deal of confusion about the different proposals that concern the regulation of facial recognition. It is therefore important for all stakeholders to distinguish the numerous ways in which FRT is used for authorisation purposes from other use cases, and to target their proposals accordingly. For instance, proposals calling for a broad ban on “biometric recognition in public spaces” are likely to result in all of the ways in which FRT is used for authorisation purposes being prohibited. Policymakers should take this into consideration, and make sure that this is their intention, before making such proposals.
Keywords: Facial Recognition, Artificial Intelligence, Biometric Data, Biometrics, European Law, Data Protection, Privacy, GDPR, Law Enforcement Directive, Human Rights Law, AI Regulation, AI Ethics