UK ICO Approves Unconsented Facial Recognition At Security Conferences

Published Feb 05, 2020, 13:53
PUBLIC - This article does not require an IPVM subscription. Feel free to share.

The UK's data protection agency has rejected IPVM's GDPR complaint against Dahua for using face recognition without consent at IFSEC last year, explaining that Dahua's processing was "acceptable" given the "setting it was used in" and was used "for demonstration purposes" only.

The denial effectively greenlights conference face rec demos, as long as they are not used to identify individuals, the data is quickly deleted, and specific signage is included. In this note, we examine the decision and its broader meaning, including:

  • Complaint Summary
  • ICO Response
  • Main Takeaways
  • Remaining Questions / Loophole Risk
  • Conclusion

Complaint Summary

IPVM's GDPR complaint was based on the following factors:

  • Facial recognition requires a GDPR Article 9 justification. At IFSEC, "explicit consent" was the only conceivable justification, yet Dahua obtained consent from no one.
  • Dahua was clearly identifying natural persons (a condition for the GDPR to apply), as its demo labeled some people "stranger", indicating it was comparing everyone's face to an existing database of booth staff, as often takes place at security shows (see the sketch below).
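
To illustrate why a "stranger" label implies identification, here is a minimal, hypothetical sketch of how such a booth demo typically works: each captured face is converted to an embedding and compared against a small enrolled gallery of staff, with non-matches displayed as "stranger". The embedding size, threshold, and names below are illustrative assumptions, not details of Dahua's actual system.

```python
# Hypothetical sketch of a booth-style face rec demo (illustrative only, not Dahua's code).
# A face recognition model (abstracted here) turns each face into an embedding vector;
# every passerby's embedding is then compared against an enrolled gallery of booth staff.
import numpy as np

EMBEDDING_DIM = 128            # assumed embedding size
MATCH_THRESHOLD = 0.6          # assumed similarity cutoff

# Placeholder enrollment vectors standing in for real staff embeddings.
ENROLLED_STAFF = {
    "staff_alice": np.random.randn(EMBEDDING_DIM),
    "staff_bob": np.random.randn(EMBEDDING_DIM),
}

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def label_face(probe_embedding):
    """Return an enrolled staff name if the probe matches, otherwise 'stranger'.

    Note that labeling someone 'stranger' still requires comparing their
    biometric template against the gallery, i.e. processing their data.
    """
    best_name, best_score = "stranger", MATCH_THRESHOLD
    for name, enrolled in ENROLLED_STAFF.items():
        score = cosine_similarity(probe_embedding, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# A random, unenrolled "visitor" embedding will almost certainly score below
# the threshold and be shown on the demo screen as "stranger".
print(label_face(np.random.randn(EMBEDDING_DIM)))
```

The key point for the GDPR analysis is that even people who are never matched by name still have their biometric data captured and compared, which is why the complaint argued Article 9 applied to everyone passing the booth.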

ICO Response: Dahua Face Rec OK "Due to the Setting it Was Used In"

After 6 months of deliberation, the UK Information Commissioner's Office (ICO) denied our complaint, stating they considered Dahua's demo unproblematic as it was "for demonstrational purposes and not for the purpose of identifying a particular person".

Importantly, the ICO also said Dahua's processing was "acceptable" "due to the setting it was used in", effectively greenlighting facial recognition at show demos. Below is the ICO response in full:

It is understood that Dahua were processing biometric data during their exhibition, however it was for demonstrational purposes and not for the purpose of identifying a particular person. All of the data captured during the exhibition was deleted. Dahua had erected signs to state that facial recognition demonstrations were being displayed and that facial images may be captured. IFSEC International also displayed signs to remind delegates that they were entering an area where facial recognition and biometric technology could be in active use. We consider Dahua’s processing of biometric data acceptable on this occasion due to the setting it was used in. It was used only for demonstration purposes in an arena where a facial recognition demonstration would reasonably be expected to take place and personal data from the demonstrations was not retained. We will, however, take this opportunity to remind Dahua of their data protection obligations when processing special category data and to ensure signage relating to the use of facial recognition technology is adequately displayed.

Main Takeaway: Context Matters

The chief takeaway from the ICO's response is that the context of sensitive processing matters. The ICO clearly determined that it would not apply strict GDPR principles given the setting of a security conference where processing was only "for demonstrational purposes" without ID'ing specific passersby.

Deleting Data

The second main takeaway is the importance of deleting data, with the ICO emphasizing that "all of the data captured during the exhibition was deleted".

Using Appropriate Signage

The final takeaway is the importance of signage. This was the only point on which the ICO rebuked Dahua, stating that it should have used signage specifically disclosing that facial recognition was being used. The Dahua privacy notice did not disclose facial recognition.

However, this was clearly considered a minor oversight by the ICO, since it did not formally penalize Dahua in any way, only issuing a reminder.

Remaining Questions Unanswered

After the ICO's denial, two questions remained from IPVM's perspective:

  • There is no exception in the GDPR or the UK Data Protection Act allowing non-consensual biometric processing if it is done for demonstrational purposes only. It is not clear to us what specific legal justification the ICO is using.
  • The ICO determined Dahua's face rec did not require consent as it was "not for the purpose of identifying a particular person". But it was clear that booth employees were being recognized. The European Data Protection Board has specifically stated that a hotel identifying VIPs with facial recognition has to get consent from everyone, not just the VIPs.

IPVM followed up with the ICO on these two points, but it declined to elaborate, simply telling us:

After making enquiries with Dahua, we do not have concerns over their facial recognition demonstration at IFSEC International.

Loophole Risk

One potential loophole this ICO decision creates is that exhibitors running face rec demos could keep people's face images while falsely claiming to have deleted them. This would be very difficult for the ICO to detect, since it is not going to audit every face rec demo at a security show.

Conclusion

The ICO is the UK's data protection regulator, with the authority to interpret the GDPR and national privacy regulations as it sees fit. From this case, it is clear the ICO gives significant weight to the context and purpose of the processing, rather than penalizing violations on a strictly technical basis.

The ICO's decision conforms to a trend IPVM has previously identified: despite fears that the GDPR would unleash an avalanche of eye-watering fines for minor mistakes and technical violations, this has not taken place.
