Facial recognition in school leads to GDPR fine

Use of facial recognition is a privacy violation, according to a new decision by the Swedish privacy regulator
Use of facial recognition in school subject to GDPR fine. Photo by Arvin Febry

A public school in Sweden filmed its students to register class attendance. The school board was fined EUR 20,000. The use of facial recognition violated the GDPR because it was too intrusive and lacked a valid legal basis.


The decision in short

On 20 August 2019, the Swedish supervisory authority Integritetsskyddsmyndigheten (IMY) fined a school board for its use of facial recognition. Using facial recognition software to take class attendance is a violation of privacy. Facial recognition software collects biometric data, which is sensitive data under the GDPR, and many restrictions apply to its use. There are less privacy-intrusive ways to take class attendance. A school cannot use consent from its students as the legal basis, since the students are in a position of dependence on the school. In addition, the school's risk assessment did not fulfil the GDPR's requirements.

The school board was fined SEK 200,000 (approximately EUR 20,000). The maximum fine a Swedish public authority can receive is SEK 10 million (approximately EUR 1 million).

Facial recognition software used in a trial project

The technology was used in a three-week trial project with 22 participating students. The purpose of the trial was to register class attendance more efficiently. Cameras filmed the students as they entered the classroom, and facial recognition software matched their faces against reference pictures of identified students.



Pictures and names of identified students were saved on a local computer. This computer did not have an internet connection and was stored in a locked cabinet. The public school relied on consent as the legal basis for the use of the novel technology.

The Swedish DPA carried out a limited review. Information security and the information provided to students were not part of the review.

Biometric data is sensitive personal data

Using biometric data such as facial images is heavily restricted under the GDPR (Article 9). Use of facial recognition may be justified if it is required by law, necessary for medical reasons, or based on valid consent. Valid consent must be freely given, and consent cannot be used as a lawful basis when there is no balance between the parties. There is a clear inequality in the relationship between the school and the students. Moreover, the registration of attendance is a one-sided control measure by the school. Consent was therefore not an appropriate legal basis. Nor was the processing justified as necessary for reasons of substantial public interest.

Too much data used to take class attendance

The main point of privacy legislation is that personal data must be used in a fair, transparent and lawful way. Use personal data with care and choose alternatives that achieve the aim with less data.

In the case at hand, the Swedish DPA found that less invasive ways to take the class register were available. For example, a key card system to register the students' presence in class could fulfil the aim. Hence, the trial project's use of data was disproportionate.

Safeguarding the privacy of the students is an important task. Use of facial recognition software in combination with video surveillance was too intrusive. Data collected for one purpose should not be used for purposes beyond the original aim. Data must also be adequate, relevant and limited to what is necessary in relation to the main purpose.

Facial recognition can be legal under certain circumstances. In most of the EU, it may be lawful if rigorous privacy and security safeguards are in place. It is not lawful to use facial recognition in schools or around children when less intrusive means are available.

The incoming European Commission is looking into regulating the automatic identification of individuals. In June 2019, the EU's high-level expert group on Artificial Intelligence presented recommendations and ethical principles for the use of AI. In early 2020, these guidelines are expected to be presented in a revised version and take effect.

What is the purpose of facial recognition?

Facial recognition is biometric software that can identify a person by comparing patterns based on facial contours and features. The technology is mostly used for security purposes (authentication), which is a legally accepted use.
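
To illustrate how such a matching step works, here is a minimal sketch in Python using the open-source face_recognition library; the file names, the student label and the tolerance value are assumptions made for the example and do not describe the software used in the Swedish trial.

    # Illustrative only: match a face detected in a camera frame against a
    # reference photo of an enrolled student (hypothetical file names).
    import face_recognition

    # Encode the reference photo of an identified student.
    reference_image = face_recognition.load_image_file("student_reference.jpg")
    reference_encoding = face_recognition.face_encodings(reference_image)[0]

    # Encode all faces found in a frame captured at the classroom door.
    frame = face_recognition.load_image_file("classroom_frame.jpg")
    frame_encodings = face_recognition.face_encodings(frame)

    # Register attendance if any detected face matches the reference.
    for encoding in frame_encodings:
        if face_recognition.compare_faces([reference_encoding], encoding, tolerance=0.6)[0]:
            print("Attendance registered for the matched student")

Every such comparison processes a biometric template of the person in the frame, which is what brings the processing under Article 9 of the GDPR.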

There is growing interest in using the technology in marketing to better target customers in real life. Using facial recognition in this way is difficult to justify, as there may be alternatives that use less data.

Interpreting the decision (impact assessment and prior consultation are requirements)

An interesting point of the decision is that impact assessments are a must under the GDPR; an informal risk assessment will not do. The decision supports a strict interpretation of the GDPR's rules on DPIAs in Article 35.

Perform a DPIA before using new technology. Technologies such as facial recognition, machine learning, sentiment analysis, or emotion tracking are delicate. Acknowledge that you will likely need to consult your supervisory authority; this is the standard approach and not indicative of any wrongdoing.

The decision is currently under appeal. We will update this analysis as the case evolves.
