GDPR / ICO Complaint Filed Against IFSEC Show Facial Recognition
By IPVM Team, Published Jun 20, 2018, 04:19pm EDT

IPVM has filed a complaint against IFSEC's parent company UBM, based on our concern that its London conference violates core GDPR principles on biometrics.
The complaint was filed with the Information Commissioner’s Office (ICO), the UK’s data supervisory authority which monitors GDPR compliance. Any IFSEC attendee can make such a complaint here.
In this note, we explain what is being done at IFSEC, what the GDPR requires in such cases, and why the complaint was made.
Biometrics Processing Based on Informed Consent
According to the GDPR, which went into effect on May 25 and which the UK is party to, biometrics processing like facial recognition is considered a "special category of personal data" and is generally prohibited with important exceptions.
One of those exceptions – the one which would apply to IFSEC – is informed consent with specified purposes. Article 9, section 2(a) of the GDPR states that biometrics are allowed if:
the data subject has given explicit consent to the processing of those personal data for one or more specified purposes [emphasis added]
Article 7 also states consent notices must be written:
in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language [emphasis added]
IFSEC Booths' Facial Recognition Illegal?
At least a dozen IFSEC booths included demonstrations in which cameras filmed attendees and analyzed their faces, even making age, gender, and behavior estimates. These included Chinese government-owned Hikvision, widely feared Chinese government-controlled Huawei, and Chinese mega-manufacturer Dahua, plus a variety of small companies from around the world. The images below show a sample of the sheer number of companies using biometrics at IFSEC 2018:
IPVM contacted IFSEC about this and was given the documents attendees agreed to in order to register.
IFSEC Notices
However, none of the notices conference attendees agreed to makes any mention whatsoever of biometric/special category processing taking place.
UBM told IPVM the data notice on the back of attendees’ badges provided consent. However, this notice makes no mention of biometric processing nor anything like it. Read it here and see it below:
Neither IFSEC's privacy policy nor its terms and conditions makes any mention of biometrics/facial recognition/special category processing.
Vendors' Explanation: "We Delete It Anyway"
A common explanation from the facial recognition vendors we spoke to was that they will delete the data after the show anyway. Whether they actually do, we cannot verify. However, the GDPR does not allow non-consented, indiscriminate biometric processing simply because a company claims it will delete the data later. Moreover, vendors were consistently unclear about the requirement for specified purposes, beyond their desire to sell these systems.
Case Filed
IPVM filed the case today, June 20th. Receiving a response may take time. IPVM will update our readers on further developments as they happen.
Update August 2018: The investigation has officially started and the UK ICO is contacting UBM / IFSEC:
Why We Filed
Reason 1: Despite heavy manufacturer GDPR marketing, actual practices and products are overwhelmingly unchanged. We are hoping that by drawing attention to this issue, that manufacturers will think more carefully about their use of facial recognition.
Reason 2: There is no 'case law' on how facial recognition can be used under GDPR. Can a generic privacy policy cover biometrics? Can the 'purpose' of using biometrics be no greater than simply selling security systems? We are hoping that a response from the ICO can help clarify when, where and how facial recognition can be used.
Update December 2018
UK ICO Denies IPVM GDPR Complaint Against IFSEC, Decides Each Exhibitor Responsible
Comments (37)
(pssst...edit your story title to say "complaint" rather than "compliant")...A Friend.
Have you had a chance to look at some EU/Western based facial recognition companies and quiz them re: GDPR, for example Herta (Spanish company)?
This is going to get interesting, one could argue that the very act of recording faces and the possibility of a human looking at them to determine or process individual data as defined in art.9 GDPR also contravenes GDPR.
This would effectively mean most if not all CCTV installations are not GDPR compliant.
Using the same argument you can also say that no more data is recorded in the act of facial recognition than a standard CCTV recording, and no more processing is carried out automatically than would be done by a human looking at the recording.
I am sure the intended purpose of GDPR is not to effectively outlaw CCTV, so the second scenario would seem logical.
However I think that once you get in to recording and comparing captured images against stored images in a database, the question of EXPLICIT permission really does become a real question that needs looking at.
I'm just going to stir the pot here a little, but do the photos that have been taken covering the show and uploaded here also violate the GDPR ?
Should all faces be blurred before uploading ?
Genuinely curious to see where the line gets drawn...
Good luck.
How did you identify who to file the complaint with?
Actually, that data collection/handling was done by third-party exhibitors, not by UBM. UBM only handles the bar codes, as shown on the copy of the badge.
You should file complaints against the manufacturers who used those technologies.
It is much the same as if IPVM held an event, I started collecting data there, and somebody made a complaint against you.
Hi John,
I'm curious if you will also be filing with other trade shows. There have been several since the GDPR that I've attended where facial/biometric data has been shown/captured. I imagine with ASIS (GSX) coming up in a few months, everyone will be showing/capturing data there as well.
Reading this:
biometric data for the purpose of uniquely identifying a natural person
I am taking this to mean facial recognition, not just detection. The images you posted have gender/age/expression type of information, not "This person is John H." Also, they are not identifying anything like religion or political affiliation or anything else in the "prohibited" list. So is this in violation?
Good to see that people are taking action against cases that break privacy law. The combination of the enormous increase in the number of installed security cameras and the recent gains in image recognition from deep learning makes privacy a serious issue. "Big Brother" is becoming a serious threat nowadays. With the new GDPR law, Europe is trying to protect its people by limiting how much privacy is invaded by mass surveillance. To apply this rule, every citizen can file a complaint against an offender of the law. That is a great way to uphold the law. Kudos to IPVM for taking action in this case. Let's take privacy very seriously!

Interesting case, and wondering where it will go.
On the same day GDPR went into effect, Belgium issued a new CCTV law effectively forbidding ANY automated processing of personal data based on CCTV footage. The only exception is ANPR. All others, such as facial recognition, are forbidden in Belgium at the moment.
Would be interested to know if there are any other countries, EU or non-EU, that go this far.
To be clear, it concerns CCTV images that are compared to personal data stored in a database or other file. Hence face detection is allowed, recognition is not
Interesting case, let's see what happens.
When it comes to using facial detection, the guidelines (published in Dutch) are not conclusive and also refer to the previously applicable laws.
That set of laws gave the following info, it could be allowed if:
• a. the research serves a general interest,
• b. the processing is necessary for the relevant investigation or statistics,
• c. asking for explicit consent proves impossible or requires a disproportionate effort, and
• d. the processing is performed in such a way that the privacy of the person concerned is not disproportionately harmed.
Next to this, the GDPR demands data minimization.
For a case where a face recognition camera is used at a mall, you could argue that point c is applicable: 'requires a disproportionate effort'. You can't ask everyone entering the mall to sign a waiver.
When you combine that with data minimization you could build it in such a way that only shoplifters are registered and blacklisted to alert the security team when they enter.
If the face camera only gives a notice when the blacklist is triggered, and if the process of who views the video and how they do so is logged... it should be no problem.
No faces would be registered, only a trigger when an unwanted guest appears.
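The data-minimization design described in this comment can be sketched in code. Note that `embed`, `MinimalFaceAlert`, and the hash-based stand-in for a real face-embedding model are all hypothetical illustrations for this discussion, not any vendor's actual API:

```python
import hashlib
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Set

def embed(image: bytes) -> str:
    # Hypothetical stand-in for a real face-embedding model: hashing
    # the raw image bytes lets this sketch run without any ML library.
    return hashlib.sha256(image).hexdigest()

@dataclass
class MinimalFaceAlert:
    blacklist: Set[str]                      # signatures of known shoplifters only
    access_log: List[str] = field(default_factory=list)

    def process_frame(self, image: bytes) -> bool:
        """Return True (raise an alert) only on a blacklist match.

        The signature of every non-matching face is discarded
        immediately: nothing about ordinary visitors is ever stored,
        which is the data-minimization point made above.
        """
        signature = embed(image)
        if signature in self.blacklist:
            # Log the alert so that later viewing of the footage is auditable.
            self.access_log.append(
                f"{datetime.now(timezone.utc).isoformat()} alert raised")
            return True
        # No match: the transient signature goes out of scope here.
        return False

system = MinimalFaceAlert(blacklist={embed(b"known-shoplifter")})
print(system.process_frame(b"ordinary-visitor"))   # no alert, nothing stored
print(system.process_frame(b"known-shoplifter"))   # alert raised and logged
```

Whether such a scheme would actually satisfy the GDPR's consent and proportionality tests is, of course, exactly the legal question this complaint raises.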
Update: We have not yet received a response from the ICO. However, we did send a copy to the UK Surveillance Camera Commissioner (Tony Porter's office), and they provided the feedback below, emphasizing that this falls under the ICO's remit:
The Surveillance Camera Commissioner regulates the overt use of surveillance camera systems by relevant authorities in England and Wales pursuant of the Protection of Freedoms Act 2012. Relevant authorities are defined in the legislation and include the police, local authorities and parish councils. The Surveillance Camera Commissioner does not have authority to regulate the use of surveillance camera systems operated by other organisations, nor does he have powers which enable him to inspect or audit CCTV systems, enforce laws or otherwise impose a financial or other sanction.
The Commissioner does not have access to legal advice and the question that you raise relates to matters which are outside of his scope, namely compliance with the General Data Protection Regulation (GDPR) by a non-relevant authority at a recent IFSEC event.
The EU’s GDPR is supplemented by the UK’s Data Protection Act 2018, which is separately regulated by the Information Commissioner’s Office (ICO). I note that you have already submitted a complaint to the ICO and they are the appropriate regulatory body to liaise with in respect of any enquiries arising from that legislation.
Update: A response from the ICO acknowledging our complaint and confirming that it is in queue:
Thank you for your email regarding your data protection concern about UBM Plc.
Your case is currently in our work queue waiting to be allocated to a case officer. We deal with a large number of concerns and aim to deal with them in date order.
Once your case has been allocated, the assigned case officer will contact you to advise you of the next steps.
In the meantime, if you have any additional information which you would like us to consider, please forward it on to the case quoting the above reference number.
Thank you for your patience in this matter and we shall be in touch shortly.
We will update as we get more feedback though ICO gives no sense of how long the queue is nor how long we might have to wait.
"Your case is currently in our work queue waiting to be allocated to a case officer. We deal with a large number of concerns and aim to deal with them in date order. "
two things:
1. Replying to cases in chronological order - without any prioritization effort - means that nobody even looks at each complaint as it comes in. So egregious cases with obvious violations take no precedence over neighbor squabbles about cameras pointed at each other's yards.
This indicates that the ICO doesn't really care about what it is purportedly enforcing, and has instead already become just another governmental agency that can be expected to fight for more tax dollars each year so it can hire more case officers.
2. Even unintentionally, the volume of complaints seems to be more than the ICO can handle effectively. Any deliberate effort to overload the ICO with complaints could render GDPR enforcement even less effective.
Just curious on IPVM's view of the GDPR. Whilst taking advantage of the right to challenge under the act, is the intention to "test" the legislation, or to challenge the exhibitors over a genuine concern about the abuse of the handling of personal data? If there is genuine support for the GDPR, why is IPVM not pushing for a similar standard in the US, which has a pretty appalling record on personal data and a total disregard for CCTV data use and retention?
Could IPVM not push to clean up the domestic situation as well as challenging the efforts in place within the EU? To me, that would be a worthwhile campaign for IPVM to hang its hat on and a real benefit to the industry.