Swedish regulator fines Skellefteå school board SEK 200,000 for facial recognition trial
The Skellefteå school authority in Sweden conducted a trial using video camera surveillance and facial recognition technology to monitor the class attendance of high school students. The trial lasted around three weeks and involved 22 students. Consent was apparently obtained from the parents or guardians.
The Swedish supervisory authority did not receive any complaint; rather, it became aware of the trial through media coverage. Prompted by that coverage, the supervisory authority investigated and published its decision on 21 August 2019.
The investigation was limited in scope to what might be described as the low-hanging fruit. Several additional potential failures appear not to have been considered, including (but not limited to) whether compliant Article 13/14 notifications were provided at first processing and whether appropriate Article 30 records had been put in place. There is also no mention of any third-party involvement, such as joint controllers or processors. It is entirely possible that the school board did the work itself, or contracted out the system's design and implementation while operating and managing it in-house; the decision does not say. In general, it would seem prudent for any regulatory investigation to cover these points in addition to those addressed here.
Supervisory Authority Findings
The school authority appeared to rely on consent as its lawful basis; however, the regulator found that the consent was not valid (irrespective of the data subjects' ages) because the school authority was in a position of power over the data subjects. This raises an interesting nuance regarding consent, especially for minors: consent may be claimed to be freely given, yet some leverage exists that places the data controller in a position of power over the data subject.
The supervisory authority also considered the Camera Surveillance Act, Swedish national legislation covering camera surveillance. As part of its consideration, it made the (arguably challengeable) determination that a classroom is not a place to which the public has access and that a permit under the Camera Surveillance Act is therefore not required. It is not clear (nor especially relevant to this article) whether the supervisory authority is in a position to make such a determination.
In particular, the collection of biometric data (special category data) was regarded as serious. The school authority had carried out a risk assessment, but it erroneously concluded that no special risk assessment was required for sensitive personal data; no effective DPIA had been carried out, a clear breach of Article 35. No Article 36 prior consultation had been sought with the supervisory authority regarding the deployment, and the supervisory authority also found that the GDPR contains no provision relaxing the law for trial deployments.
The supervisory authority found that the school authority was in violation of Articles 5, 9, 35 and 36 of the GDPR.
As to penalties, the supervisory authority considered that a warning alone was not appropriate in this case; a warning could only accompany a monetary penalty. The amount of the penalty took into account that no complaint had been made and that the surveillance ran for a short period with a small number of data subjects. The warning was issued because the school authority had indicated its intention to continue deploying the facial recognition surveillance system.
Conclusions of significance
- Consent cannot be relied upon where the party seeking consent is in a position of influence over the party giving it. This part of the decision relies on Recital 43 of the GDPR, which provides that: "In order to ensure that consent is freely given, consent should not provide a valid legal ground for the processing of personal data in a specific case where there is a clear imbalance between the data subject and the controller, in particular where the controller is a public authority and it is therefore unlikely that consent was freely given in all the circumstances of that specific situation." This element of the decision, it is submitted, follows an entirely predictable approach based on a straightforward reading of the Regulation's text.
- Use of facial recognition technology to monitor classroom attendance was disproportionate to the task at hand, breaching the GDPR principle of data minimisation and therefore Article 5. The decision refers to Recital 39, which provides that personal data may only be processed if the purpose of the processing cannot be satisfactorily achieved by other methods. While the school board had considered this point, its assessment differed from that of the Data Inspectorate, and it is submitted that the supervisory authority took an entirely predictable view of the school board's approach. Indeed, a cynic might observe that school attendance was always accurately assessed by reference to the class register; employing sophisticated but arguably invasive technology did not significantly improve on that simpler approach, and came at the cost of greater risk to the privacy of the data subjects.
- The starting point is that the processing of sensitive personal data (here, biometric data) is prohibited unless a valid exception under Article 9(2) applies. While an exception does exist under Swedish national law (via the Member State derogation in Article 9(2)(g) GDPR), it was not met by the use the school authority made of the data gathered or by the underlying purpose. Moreover, the gathering of such data constituted a form of search, which is also prohibited under national implementing legislation (the Swedish Data Protection Act). The inference from this part of the decision is that, in Sweden at least, there has to be a very good reason for employing facial recognition technology. The school board had no such reason.
In short:
1. Facial recognition technology gathers biometric data, which is special category data;
2. A DPIA is required BEFORE commencing any trial, let alone full deployment, of such a system;
3. Consultation with the supervisory authority should be sought prior to trial or deployment of such a system;
4. Penalties will be assessed in proportion to the harm and risk.
My conclusion from this case is that, while the school authority clearly made a number of mistakes, the most significant part of the decision is the supervisory authority's determination regarding consent. Consent has been at the centre of a great deal of concern about the GDPR, much of it misplaced, but if this ruling withstands a court challenge, obtaining consent in certain circumstances is likely to become significantly more difficult. Whether there will be a court challenge is not clear at this stage. The conclusion that facial recognition technology does not pass the necessary national-law test in these circumstances should, however, put users of such technology on notice that its indiscriminate use requires significant care and thought. Failing to carry out a DPIA or to consult the supervisory authority was an invitation to a monetary penalty, even though this was a limited 'trial'; no special exception exists for trials, limited or not.