
Face recognition – Use of the technology in private sector

In today’s digital age, face recognition technology is becoming increasingly accessible and sophisticated. While camera systems are already a common part of operations, new advanced cameras offer not only higher resolution and faster transmission but also automatic face recognition capabilities. Although suppliers may promote these technologies as “GDPR compliant,” their implementation and use present several legal challenges. How is the private sector coping with these new technologies, and what are the conditions for their operation?

Camera systems are primarily used to monitor security and to protect property or individuals. The processing of personal data, such as capturing or storing recordings of individuals, typically takes place on the basis of legitimate interest. As the data controller, the camera system operator is required to prepare a so-called balancing test assessing the necessity and proportionality of implementing the camera system. In doing so, it can draw inspiration from the Methodology of the Office for Personal Data Protection (the “Office”), about which we previously informed here. If the balancing test has a positive outcome and the controller fulfils its other obligations, such as properly placing informational signs and setting camera angles appropriately so as not to conflict with Act No. 262/2006 Sb., the Labour Code, and other rules, the procedure will comply with legal regulations. This will also ensure that the personal data of all individuals captured by the cameras (employees, customers, visitors) are protected and processed in compliance with GDPR.

With technological advancements, more sophisticated high-resolution cameras, including those with automatic facial recognition capabilities, are emerging on the market. This technology has primarily been used by security forces and other public authorities to prevent, investigate and detect crime, or to ensure public safety and order. However, thanks to its growing availability, the technology is gradually penetrating the private sector. Suppliers of this technology may promote their products as “GDPR compliant,” but this claim is often highly inaccurate. It is important to realise that when using such systems, the controller processes so-called biometric data (e.g., facial contours, fingerprints, iris structure, or voice) for the unique identification of individuals. By doing so, the controller enters the realm of the more strictly protected special categories of personal data under Article 9 of GDPR, whose processing is generally prohibited and for which legitimate interest alone does not serve as a legal basis.

How does face recognition work?

Photographs or video recordings, even if they contain distinctive features of a specific person, do not per se meet the criteria for biometric data. Biometric data are generated only through additional technical operations (conversion into templates and their comparison within the face recognition technology), resulting in identification with a certain degree of accuracy. To compare a newly captured photograph, a database of templates is required. For example, this could involve employees who have access to areas protected by a face recognition camera system. The employer creates a database of its employees’ likenesses, which the system converts into templates. When a person enters the monitored area, for example through a turnstile, the camera takes a second photograph and compares it with the templates in the database. If it determines with sufficient accuracy that the person is an authorised employee, the turnstile automatically allows entry.

Even if the person is not in the database and the system does not find a match, biometric data processing still occurs, as the system attempted to identify the person.
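The template-matching step described above can be sketched in code. This is a purely illustrative simplification: real face recognition systems derive high-dimensional templates from images using trained models, whereas here the templates, the employee IDs, the `verify` function, and the similarity threshold are all hypothetical assumptions introduced for the example.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two template vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(probe_template, enrolled_templates, threshold=0.9):
    """Return the ID of the closest enrolled template above the threshold,
    or None if no template is sufficiently similar (no match found)."""
    best_id, best_score = None, threshold
    for emp_id, template in enrolled_templates.items():
        score = cosine_similarity(probe_template, template)
        if score >= best_score:
            best_id, best_score = emp_id, score
    return best_id

# Hypothetical employee database: ID -> template (real templates are
# high-dimensional vectors computed from enrolment photographs)
enrolled = {"emp-001": [0.9, 0.1, 0.4], "emp-002": [0.1, 0.8, 0.5]}

print(verify([0.88, 0.12, 0.41], enrolled))  # probe close to emp-001's template
print(verify([0.0, 0.0, 1.0], enrolled))     # probe matching no enrolled template
```

Note that even in the no-match case the probe template is still computed and compared against the database, which mirrors the legal point above: biometric processing occurs regardless of whether an identification succeeds.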

Not all biometric identification is the same

It is essential to distinguish between identity verification/confirmation through, for example, the aforementioned turnstile, and remote biometric identification as defined in the new Artificial Intelligence Act.[1]

Identity verification, which in practice leads to gaining access to premises (turnstile) or unlocking devices (computer), can usually be set up in a manner compliant with legislation. Since legitimate interest alone is not sufficient for processing biometric data, an exception under Article 9(2) of GDPR must be found. This exception could be the explicit consent of the data subject. However, this consent must be obtained under conditions that ensure it is freely given and informed, which is problematic, for example, in employer-employee relationships. An important factor for the validity of consent will be whether the data subject has viable alternatives available, such as the option to use an access card instead of the turnstile. Historically, the Office has approved the use of turnstiles with automatic face recognition for access to restricted areas not based on consent, but on Article 9(2)(b) GDPR (fulfilment of obligations in the field of labour law) in conjunction with Article 6(1) GDPR (compliance with a legal obligation).[2] According to the Office, this was a specific case where the monitored entity was able to demonstrate that the processing of biometric data was necessary to fulfil the controller’s specific obligations (ensuring safety at the workplace), and where previous, less invasive measures had proved ineffective.

The use of remote biometric identification, i.e. identifying or attempting to identify multiple individuals in a monitored area without their active involvement, will likely be significantly constrained in the private sector. From a privacy perspective, we draw similar conclusions for emotion recognition systems, which the Artificial Intelligence Act prohibits from being used in the workplace and in educational institutions.

Always verify the legality of implementing new tools

While the use of face recognition technology in the private sector is not excluded, the very strict legal regulation makes it essential to conduct a comprehensive assessment of its specific use and to set appropriate measures prior to its implementation. As explained, the processing of biometric data requires meeting the stringent conditions applicable to special categories of data.

Although suppliers of technical solutions may promote – and we know that some are promoting – face recognition systems as meeting all GDPR requirements and specifications, it is crucial to carefully examine and consider all legal aspects, with the aim of ensuring that the processing of biometric data complies with applicable regulations and respects the rights and freedoms of individuals. Even when implementing standard camera systems, we recommend consulting experts due to the complexity of the issue and the numerous legal requirements that apply. When implementing automatic face recognition solutions, we consider a compliance assessment to be a necessity.
