Facial recognition
You know those movie scenes where the police compare security-camera footage with the faces of criminals on file to identify the culprit? That technique, known as "facial recognition", is now used in Italy not only in the field of national security, but also in airports, hotels, and on our smartphones.
This tool has several benefits. Above all, it shortens the time needed to reach a given goal: identifying a wanted person, clearing queues at passenger check-in or boarding, or accessing our electronic devices. All by simply showing our face.
But as in any situation, there is a compromise to be made. The use of "smart" video devices capable of identifying a person by cross-referencing biometric data inevitably entails a reduction in our privacy.
Biometric data
But let's proceed in order. What is biometric data? The GDPR defines it as: "personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data".
Such data are collected by increasingly advanced technological tools, in two distinct phases. In the first, a dedicated reader physically captures the biometric sample.
In the second phase, a software component compares the collected data with data already on record, to verify the match with a specific person. In order to be used, however, these data must be collected and stored, or, in a single word: processed. And the regulation on this point is very clear.
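The matching phase described above can be sketched as a comparison between a freshly captured biometric template and the templates already on record. This is a minimal illustration only: the function name, the use of cosine similarity, and the threshold value are assumptions for the example, not part of any specific product or of the regulation.

```python
import numpy as np

def match_template(captured, enrolled, threshold=0.8):
    """Return the ID of the enrolled person whose stored template is most
    similar to the captured one, or None if no similarity clears the threshold."""
    best_id, best_score = None, threshold
    for person_id, template in enrolled.items():
        # Cosine similarity between the two template vectors.
        score = float(np.dot(captured, template) /
                      (np.linalg.norm(captured) * np.linalg.norm(template)))
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```

Real systems derive such template vectors from face images with a trained model; the legal point is that this comparison step is exactly where "unique identification" of a natural person occurs.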
Art. 9 of the GDPR, in its first paragraph, expressly prohibits the processing of certain categories of data, including biometric data, in order to guarantee maximum protection of the rights and freedoms of the natural person.
GDPR and exceptions
The protection devised by the EU legislator therefore prohibits the processing of data that are particularly sensitive and have the greatest impact on the personal sphere of the data subject: biometric data, indeed, as well as data concerning health, sexual orientation, or religious and political opinions.
For such data to be processed lawfully, the processing must be grounded in one of the exceptions listed in paragraph 2 of Art. 9 of the Regulation. Where processing is possible, the key is to "counterbalance" the intrusion by respecting the principles of lawfulness, proportionality, transparency and data minimisation, while always guaranteeing a very high level of security.
Facial recognition and video surveillance
The thorniest issue concerns an increasingly widespread tool: the systematic, automated surveillance of a specific space by optical or audiovisual means, mostly with the aim of protecting property or of protecting people's life and health.
This activity involves collecting and storing graphic or audiovisual information on everyone who enters the monitored space, identifiable by their appearance or other specific elements. The identity of such persons can then be established from the information collected, through so-called facial recognition.
The risk in these cases is the misuse of such data, a risk that grows with the size of the monitored space, the number of people filmed and the type of data processed.
For this reason, the use of video surveillance tools with a biometric recognition function, installed by private parties for their own purposes, requires as a further condition the explicit consent of the data subjects.
Guidelines of the European Data Protection Board
The European Data Protection Board, the body that contributes to the consistent application of data protection rules, has recently intervened to try to untie this knot: how to allow the use of smart video devices that capture biometric data while, at the same time, protecting sensitive information in compliance with the GDPR.
In its guidelines, the Board established that, in order to determine whether special categories of personal data are being processed, three elements must be analysed:
the nature of the data (relating to physical, physiological or behavioural characteristics); the means and methods of processing; and the purposes of the processing, aimed at uniquely identifying a person. Once it is established that special categories of data are involved, video surveillance systems equipped with artificial intelligence must be subjected to stringent safeguards.
The Board has identified these safeguards in the guidelines, reserving the right to update them over time.
For example, data controllers must guarantee: the segregation of data during transmission and storage; the storage of biometric templates and of raw or identity data in separate databases; the encryption of biometric data, with a defined encryption policy for key management; the provision of a fraud detection system; the association of an integrity code with the data; and the prohibition of any external access to such data.
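Two of these safeguards can be illustrated in a minimal sketch: keeping biometric templates and identity records in separate stores linked only by a pseudonymous key, and associating an integrity code (here an HMAC) with the biometric data so that tampering is detectable. All names below are hypothetical, and a real deployment would add actual encryption and a proper key-management system.

```python
import hmac
import hashlib
import secrets

# In practice this key would live in a key-management system, not in code.
INTEGRITY_KEY = secrets.token_bytes(32)

# Separate stores: biometric templates and identity data never share a database.
template_store = {}   # pseudonym -> (template_bytes, integrity_tag)
identity_store = {}   # pseudonym -> identity record

def enroll(pseudonym, template_bytes, identity_record):
    # Associate an integrity code (HMAC-SHA256) with the biometric data.
    tag = hmac.new(INTEGRITY_KEY, template_bytes, hashlib.sha256).digest()
    template_store[pseudonym] = (template_bytes, tag)
    identity_store[pseudonym] = identity_record

def verify_integrity(pseudonym):
    # Recompute the integrity code and compare in constant time.
    template_bytes, tag = template_store[pseudonym]
    expected = hmac.new(INTEGRITY_KEY, template_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)
```

The separation means that a breach of one store alone does not directly link a face template to a named person, which is the intent behind the Board's requirement of separate databases.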
Conclusions
With the recently adopted guidelines, the European Union has tried to draw a clearer picture of what is and is not permitted with regard to biometric data and audiovisual tools.
Starting from the identification of the type of data processed, passing through the methods of processing, and arriving at its purposes, while safeguarding the availability, integrity and confidentiality of the data.
The measures to be adopted will undoubtedly be updated continuously, in step with technological progress. We will see what further obstacles the European Board will have to face in reconciling the advance of new artificial intelligence with the need to protect the personal data involved.