This article continues our series on facial recognition. Having defined facial recognition among the biometric processes (see here) and reviewed the techniques used to implement it (see here), let's now look at what current law says about it.
Facial recognition in the face of the law
The fields of application
Facial recognition techniques (FRTs) are used in more and more circumstances, but in Europe they are used in particular:
- In national e-government services, as a means of authentication.
- For border controls, for example in the Schengen area (control of illegal immigration, cross-border crime, fraudulent identities, or detection of persons threatening national security).
- During criminal investigations.
- For remote identification (e.g. eKYC, electronic Know Your Customer: the electronic, remote version of KYC, the process by which a company verifies the identity of its customers).
- To facilitate and secure access to restricted areas.
Some FRTs have biases based on the ethnicity (especially skin color) and/or gender of the person being identified, which leads to discrimination. For example, in the United States, the false positive rate is much higher for people of color. In France and in Europe, such a bias is contrary to the presumption of innocence and violates Article 21 of the Charter of Fundamental Rights of the European Union, which prohibits, among other things, any discrimination based on sex, race, color, ethnic or social origin, or genetic characteristics.
Recognition bias also carries a risk of moral harm to individuals affected by FRT failures. Consider a system that is supposed to speed up a process but only works reliably for part of the population, thereby increasing the time the rest need to achieve the same result: this amounts to adding an extra administrative barrier for an ethnic minority, for example.
The risk of a drift towards mass surveillance is real. The European Union's current legislative proposals do not exclude the use of FRTs with video surveillance cameras, which worries the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS), which call for clarification of the EU's position on the subject and for a ban on this type of use of FRTs.
There is also the risk that an FRT system and its associated databases will later be used for purposes different from, or additional to, those initially authorized.
Widespread surveillance also implies an infringement of the freedom to come and go anonymously in the public space, as well as a risk of imposing conformity on the individual at the expense of free will. 
It also impacts, among other things:
- Children’s rights (FRTs produce more false negatives on faces that are still changing)
- The freedom of opinion and expression (threat linked to identification)
- Freedom of assembly and association (identification threat)
- Freedom of religion 
The main concerns are therefore a combination of the risk of inaccuracy, bias of the technology, and a drift towards an abuse of control after the initial implementations.
Current legal framework
United States
There are currently no federal regulations governing the use of facial recognition by private companies in support of law enforcement activities. In anticipation of such laws, some leading companies in facial recognition AI, such as Amazon and Microsoft, have placed an indefinite moratorium on the commercialization of their facial recognition solutions, pending clear legislation.
Rest of the world
There is a global rise in facial recognition surveillance. As of 2021, in addition to China, which makes extensive use of it, at least 63 other countries were reportedly already using FRT systems. In Russia, its use against political dissidents is increasing.
In other countries, such as India, FRTs are currently used in authentication systems, notably in the Indian social security system, but the legal framework remains largely imprecise.
Regulation at the international level has been addressed in two international forums.
The United Nations Human Rights Council passed a resolution in 2020 condemning the use of facial recognition in non-violent public demonstrations to identify and control crowds. 
In early 2021, the Council of Europe adopted guidelines on facial recognition that regulate both private and public use of FRTs in order to avoid human rights violations.
European Union
As for the legal framework, the European Union has been debating the exact regulation for about a year. As a result, the only rules currently applicable are very partial and generally come from laws that touch on the use of FRTs without having been designed specifically for them. The main laws that apply to the use of FRTs are therefore those on data protection and privacy rights.
The initial visual recording, its storage, and its comparison against databases are all governed by these texts. These operations are therefore subject to the same obligations of information, authorization of the persons concerned, and controls (in particular under the GDPR) as ordinary personal data. To this day, some companies such as Clearview AI continue to ignore the orders of European data protection authorities to comply with the GDPR.
The processing of facial images must therefore, among other things:
- Be lawful (meet the requirements of a specific European legal basis),
- Be fair (the data must not be used in a way that is prejudicial, discriminatory or misleading to the data subject),
- Be transparent (the data subjects must be aware of the storage of the data, the framework of its use, etc.),
- Serve a specific, explicit and legitimate purpose (the data cannot be used for purposes other than those explicitly described at the outset, and these purposes must be lawful),
- Comply with security requirements (protection against unlawful access to or use of the data, loss, etc.),
- Comply with minimization requirements (only data useful for the purpose originally described may be stored),
- Comply with data retention limits (different articles govern the maximum retention periods depending on the use).
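As a purely illustrative sketch (not a legal tool: the field names, allowed legal bases, and retention limit below are assumptions, not requirements drawn from the GDPR text), the principles above could be modeled as a pre-processing checklist that flags a record before any facial image is processed:

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative sketch only: field names, allowed bases, and limits are
# assumptions for the example, not actual GDPR rules.

ALLOWED_BASES = {"consent", "legal_obligation", "public_interest"}  # hypothetical
NECESSARY_FIELDS = {"face_template", "capture_date"}                # hypothetical

@dataclass
class FacialDataRecord:
    legal_basis: str            # lawfulness: basis invoked for processing
    subject_informed: bool      # transparency: was the data subject informed?
    declared_purpose: str       # specific, explicit, legitimate purpose
    intended_use: str           # must match the declared purpose
    encrypted_at_rest: bool     # security requirement
    fields_stored: list = field(default_factory=list)  # minimization
    collected_on: date = date(2024, 1, 1)
    retention_days: int = 365   # hypothetical retention limit

def compliance_issues(rec: FacialDataRecord, today: date) -> list:
    """Return a list of violated principles for this record (empty = OK)."""
    issues = []
    if rec.legal_basis not in ALLOWED_BASES:
        issues.append("lawfulness: no recognised legal basis")
    if not rec.subject_informed:
        issues.append("transparency: data subject not informed")
    if rec.intended_use != rec.declared_purpose:
        issues.append("purpose limitation: use differs from declared purpose")
    if not rec.encrypted_at_rest:
        issues.append("security: data not protected at rest")
    extra = set(rec.fields_stored) - NECESSARY_FIELDS
    if extra:
        issues.append(f"minimization: unnecessary fields stored: {sorted(extra)}")
    if (today - rec.collected_on).days > rec.retention_days:
        issues.append("retention: storage period exceeded")
    return issues
```

Such a gate would refuse to process any record that returns a non-empty list of issues; in practice, of course, compliance is assessed legally, not mechanically.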
Sources
European Union Agency for Fundamental Rights, 2020, Facial recognition technology: fundamental rights considerations in the context of law enforcement: https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper-1_en.pdf
European Data Protection Board, 2021, EDPB & EDPS call for ban on use of AI for automated recognition of human features in publicly accessible spaces: https://edpb.europa.eu/news/news/2021/edpb-edps-call-ban-use-ai-automated-recognition-human-features-publicly-accessible_en
European Commission, 2021, Impact assessment accompanying the Proposal for a Regulation of the European Parliament and of the Council, p. 18: https://data.consilium.europa.eu/doc/document/ST-8115-2021-ADD-2/en/pdf
ROWE Elizabeth A., 2021, Regulating Facial Recognition Technology in the Private Sector: https://www-cdn.law.stanford.edu/wp-content/uploads/2020/12/Rowe-FINAL-Facial-Recognition.pdf
FELDSTEIN Steven, 2019, The Global Expansion of AI Surveillance: https://carnegieendowment.org/files/WP-Feldstein-AISurveillance_final1.pdf
TAMBIAMA Madiega and MILDEBRATH Hendrik, 2021, Regulation of facial recognition in the European Union: https://www.europarl.europa.eu/RegData/etudes/IDAN/2021/698021/EPRS_IDA(2021)698021_EN.pdf
United Nations Human Rights Council, 2020, Resolution on the promotion and protection of human rights in the context of peaceful demonstrations, A/HRC/44/L.11.
CNIL, 2021, Decision n° MED 2021-134 of 1st November 2021: https://www.cnil.fr/sites/default/files/atoms/files/decision_ndeg_med_2021-134.pdf
CUSHING Tim, 2022, Clearview’s World Tour Continues With A $21 Million Fine From The Italian Government: https://www.techdirt.com/2022/03/18/clearviews-world-tour-continues-with-a-21-million-fine-from-the-italian-government/