UK Camera Commissioner Calls for Regulating Facial Recognition

Published Apr 15, 2019 11:41 AM

IPVM interviewed Tony Porter, the UK’s Surveillance Camera Commissioner, after he recently called for regulations on facial recognition in the UK.


There is currently no clear legal framework for the technology despite the passage of the GDPR last year, and important questions remain, such as:

  • What would future regulations on facial recognition actually look like?
  • Is a retail store using facial recognition to catch shoplifters legal under the GDPR?
  • Should private facial recognition use be banned outright, as Belgium has done?

In this note, we ask Porter the above questions, explain the current state of facial recognition law in the UK, and examine where it’s headed.

Who ** *** ************ ****** ************?

*** ******** *** ******* ** **** to ******** ******** ****** *** ** the ** *** ** *********** * part ** ******* ******– *** ******** *** *********** ********.

** ****** ** **********, *** ************ ****** ************ *** no ****** *********** ** ********** ******. Its ***** ******* ** ********* ***** ** *********** ***** ************ *** ** ***** authorities (*** *** ******* ************* ** households.)

Tony ******: ******* ***** “******* *** ********”

** ***** ************* ** ***** **, ****** ****** for “* ****** ********** ** *********** regime ******* ******” ****** ***********. ****** did **** *******, ** ** **** IPVM,

*** ********** ** ****** **** *** mainstream - **’* **** ** ********** and **** ** * *******. [***] the *********** ** ********** ********* **** oversees ** ** ******* *** ******** and * ***’* ***** **** **’* helpful

****** **** **** **** ******* *************, such ** *******, ***** **** ******, ************* ************, *** ********* ********* ***************** ******** ****** *********** ***. ** turn, ***** ******** **** ** * patchwork ** *********** ***** ***** ** direct ******* ** ****** ***********, ******* the********** ** ******** *** ** ****, ******* **** ********** ***, *** *******.

What ****** ** **** *******

****** **** **** ** ******** *** specific ******** ** ***** ** *** legal **** ****:

  1. ******** *************** ****** **** ** ********-**** ** ******* *** ****** ** finished **** *** **** ** ******. But ****** ** ******* *** *** outcome ***** ****-******* ************ *** ***** ****** ************************** ****** ** ***** *** ***** facial *********** ** ******. *** **** has ** ***** ** *********** *** breaking **; *******, *** **** ** admissible ** ***** *********** (*** ******* 1.12). ***** *** **** **** ******* to ********** ******, ******* ****** ******** can ** ***** ** *** ***** (Section *.*), ** ******** ** ***** also **** ** ****** ** ******* sector ********* ** ****.
  2. ******* ******** *********** ****** ** ********** about *** ********, *** ******** ******** “a ****** ****” **** ******** ****** recognition. *******, **** ***** ** ***** away, ************ ** *** ** ** engulfed ** ****** ************.

Core ******: **** ***** *********** ******, ******* *** ******* ***********

*******’* ******* ******* **** ********** ********** ** ****** except *** *** ***** ******** ** video ************: ***** *** ******** *******, which ******* ***** *** ** *** case *** **** ****** *********** ************ outside ** ****** *******, ** “*********** public ********.”

*******, *** **** **** *** ******* what “*********** ****** ********” ******** *****. Porter **** *** **** ** “************ loose” **** ** *** ** *********** in **** ****, *********** ******** * legal **** ****:

* ***** ** **** *****, '*** jury's ***** ***' ** * **** kind **************

** *** ******* ******, ****** **** IPVM ****** *********** *** ** ******* unregulated:

********** *** ****** *** ********* **** in *** ** *** **** ****** very ******* **** ******* [** ****** recognition]…yet ********, **’* *** ********** ****** which ** ***** *** **********, ***** it **** **** **** [******] ***, and ****’* * *** **** *********.

Minimum ************ *** ****** ***********

** **** ********* ***************** ** ** **********, ****** ****** that *** *** ***, ** * minimum, ******:

  • ********* ******* **’* ********* ** ****** security *******;
  • ********* ******* *** ***** ** ************;
  • ******* ** ********** ********* ********* *****.

Facial *********** ** ***** ***********

***** **** ******** ** *** ********** illegal ** *** **, ***** ***** so **** ******* ** *** ***** that **** ********** **** *** ****'* "substantial ****** ********" ********* - **** though ***** *** ** *********** ********** what **** ********* ******** **.

(*** *******, ** **************** ******** ****** *** ******* ******** *** ***-********** facial *********** ******** ***** *** "*********** Public ********" **** *** ** ********* GDPR-compliant.)

****** ********* **** **** ***** *** indeed ***** ****** *********** *** **** purpose ** *** **, *** **** he *** "**** ********". ****** **** IPVM ** ********* *********** ****** ** clear **** ****** ****** *********** *** by ***** **** ** ***** *********** is ****** *** *******:

***** ****** ** * **** ****** statement **** ****’* *** **********.

** ** ** ***** ** ** used, *** ****** ***** ** ** informed ************* **** **'* ***** ****, and **** **** *** *********** ** reject. [**** *** ** ********* ****] but **'* *** ***** ** **** a ******* **** ****** ***** ********* information ********.

For **** ******* ********* - '***************' ** ***

**** **** ** ***** ** * shop ***** ****** *********** ** *********** with ****** ** ***** ******-***** ********* or **** ******* ******, ****** *** substantial ************. ** **** **** "********** standards ** ***************" *** *** ** this ********.

******************** **** **** * ******** **** in ********** ******** **** ****** *********** to ******* ******** ** ******** ******* its ******* *** *** ******; *** goal *** ** **** ******* ** a ****** ********* ** ** ******** and ******* ******. ****** ***** *** processing ** ** **************** ***** ** his ************** ** "******* ** ****** law".

*** ****** **************** ** *** ********, ****** ****:

** *** * *********** ****** *** caused *** ********* ** *** *******.

Should ******* ****** *********** ** ****** **** ** *******?

** ******* ** ****, ******* **** **** ******** ****** the *** ** ****** *********** ** security ******* *** ******* ***, ******** it ******** ***** *** *** ***********.

****** **** ** ****’* ******* ** radical * ********, ****** *** ******** opportunities ******* ** ********** ****** *********** (such **********):

*** ***** **** ***** * ******* and *** ************* *** **** **** of **********. * ***** ** ****** explore **** **** ********. **** *****’* mean ** ****** ********** ***** *******, nor **** ** **** ** ***, like *******, ** **** ** *** completely

**********

******’* ***** *** ********** **** ***** given *** **** ** ****** *********** use. ****’** * **** ******** **** the **********'* ****** ******* ***** ** legal * **** **** ** **** of ******, ** **** ******* ***** to **** **** "*********** ****** ********" means *** ***** *** *******.

***** **** *** **** *** ********** between ******* *** ********** **** ***** the ********** ** ****** **** *** right ** ** **, ******** ******** should *** ***** *********.


Comments (10)
CM
Clive Mason
Apr 17, 2019

Facial recognition, when used appropriately, can offer protection to both the public and law enforcement. As technology in this area improves, here at Webeye we would at some point consider adding a whitelist, and potentially even a blacklist, to our cloud alarm platform: one, to reduce false alarms, the bane of most central stations, and two, to potentially identify a real intruder. Of course, this would be consensual for the legitimate person entering the location but not for the real intruder. In such instances the latter would definitely need to be regulated, but it could be immensely useful, time-saving, and efficient, particularly for law enforcement.

(1)
JH
John Honovich
Apr 17, 2019
IPVM

Of course this would be consensual for the legitimate person entering the location but not for the real intruder.

Clive, how does one make it 'consensual for the legitimate person' 'but not for the real intruder'? I am struggling to understand it in general. For example, if I go to a store how do I 'opt out' of the system? I can tell them don't put me on a blacklist but I could still be mistakenly matched to someone on a watchlist, no?

CM
Clive Mason
Apr 24, 2019

Yes absolutely, it’s going to be difficult to regulate in environments completely open to the public, but not so in controlled environments such as a home, office, or yard secured by fencing. The technology is here: with Facebook, for example, you tag a photo with the names of the people in it, and for any other photo added with the same people, the software works it out. Now we can get to the same thing with alarm systems going through to a CMS. An alarm operator sees thousands of videos a day; he sees a human but can’t tell the difference between homeowner and intruder, whereas software could. A whitelist would in this case stop the alarm operator calling the police, because it would ID the guy as a good guy. You are right, however, that blacklists are more problematic, and serious consideration would be needed on how to allow them; each and every circumstance would need to be assessed, laws taken into consideration, etc. This would make them difficult to regulate, but probably not impossible.

da
doru asmarandei
Apr 24, 2019
Azitrend Distribution SRL

Face recognition analysis takes pictures already saved in a video archive, analyzes the image, and if it detects human faces it compares them to a database. If the face of the person appearing in the image is not found in the database, then no action is taken, and thus no harm is done to the person who happened to be captured by a system with this type of analysis. In the private sector, I think the most important thing is to regulate who can enter a person into the database, and for what purpose. All other faces are already recorded in the system’s video archive anyway.
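The compare-against-a-database flow described above can be sketched in a few lines. Everything here is illustrative: the embedding vectors, the cosine-similarity metric, the `0.8` threshold, and the identity names are assumptions for the sketch, not any real product's API.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two face feature vectors, as produced by some model.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def match_face(probe, database, threshold=0.8):
    """Return the best-matching identity, or None if nothing clears the threshold."""
    best_id, best_score = None, threshold
    for identity, reference in database.items():
        score = cosine_similarity(probe, reference)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id  # None means "not in database" -> no action taken

db = {"person_a": [1.0, 0.0, 0.0], "person_b": [0.0, 1.0, 0.0]}
print(match_face([0.9, 0.1, 0.0], db))  # confident match -> person_a
print(match_face([0.5, 0.5, 0.7], db))  # below threshold -> None
```

The "no action if not found" behavior corresponds to the `None` return: everything hinges on who controls the entries in `db` and where the threshold is set.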

JH
John Honovich
Apr 24, 2019
IPVM

If the face of the person appearing in the image is not found in the database then no action is taken and thus no damage is caused to the person who was surprised at a time by a system that has this type of analysis.

Doru, the concern is when the system makes a mistake. Let's say you and I walk into a store and the system incorrectly matches us against the picture / identity of a child abductor, and then we get detained / arrested, etc.

These systems will make mistakes, even the 'best' of them. The concern becomes how do you handle those mistakes without damaging the lives of innocent people. What do you think about that?
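The scale of the mistake problem can be made concrete with a back-of-the-envelope calculation. All figures below are illustrative assumptions (not vendor accuracy claims): a busy store, a tiny watchlist, and a seemingly excellent 0.1% false-positive rate.

```python
# Base-rate arithmetic: when almost everyone scanned is innocent,
# even a very accurate system generates mostly false alerts.
daily_visitors = 5000          # faces scanned per day (assumption)
false_positive_rate = 0.001    # 0.1% of innocent faces wrongly match (assumption)
watchlist_prevalence = 1e-5    # fraction of visitors actually on the watchlist
true_positive_rate = 0.95      # chance a watchlisted person is caught (assumption)

false_alerts = daily_visitors * (1 - watchlist_prevalence) * false_positive_rate
true_alerts = daily_visitors * watchlist_prevalence * true_positive_rate

print(f"{false_alerts:.1f} false alerts/day vs {true_alerts:.3f} true alerts/day")
```

Under these assumptions, false alerts outnumber true ones by roughly two orders of magnitude, which is why the handling of mistakes (human review before any detention) dominates the policy question.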

da
doru asmarandei
Apr 24, 2019
Azitrend Distribution SRL

Hi, very good point!

Still, I think this is only about regulation and limiting the actions of police or public/private guards when an identification occurs.

I think, when we talk about drug dealers or child abductors, we should use everything that technology can offer us, even if you or I have to spend two, three, or five minutes of our lives talking with an authority. Of course, all actions should be limited and should not disturb our lives more than necessary.

No one should be detained or arrested solely because of such a system! It should trigger a more detailed investigation by the authorities. Of course, the system would be a great help to them in the investigation process!

Avatar
Morten Tor Nielsen
Apr 24, 2019
prescienta.com

The way I see it, there's a difference between constant real-time identification and offline. Furthermore, there's an issue of creating large repositories of data based on this - still - unreliable tech.

I don't mind being recorded when I shop - others do - but I don't. If there's an incident not involving me and the staff sees me on the footage, no problem with that either.

If the police are looking for a kidnapper, have a couple of pictures, and run an automated search which leverages facial recognition, I would expect a number of false positives. Those would (hopefully) be quickly dismissed by the operator. Some false positives may lead to the police showing up, and I wouldn't have a problem with that either. This is no different than a person seeing an APB on the news and calling the cops saying that I look like the fugitive they just saw. But I don't want the store to cache the search results, or make some sort of database where details about me are stored ("Male, 84 years old, Excited, No Hat").

If I really push the envelope, I might even accept my face popping up on a screen based on constant real-time monitoring. IF the data is immediately discarded, and not stored for ages in some random, unsecured AWS bucket.

The problem with gathering and hoarding data is that some people start to abuse it. They start stalking ex-girlfriends, politicians, and celebrities; they see a hot chick one day, and they start rummaging through the archives to see if she has a boyfriend, kids perhaps, and so on.

The other problem is that you may have some heinous crime, and like John points out, the facial recognition system erroneously points to you. If that "hit" is published on social media, the consequences could be dramatic. Even if you're totally cleared, there's going to be people who will think that you probably did something. Unfortunately, the presumption of innocence is not something the general population subscribes to - and almost certainly not if the crime involves children. So your health is in jeopardy too (naturally, a human accuser will create the same problems, but that doesn't justify increasing the likelihood of it happening via FR).

In this case, I am all for regulation, but I have little faith in useful and meaningful regulation being made.

(1)
da
doru asmarandei
Apr 24, 2019
Azitrend Distribution SRL

We think the same way.

Using an ordinary face recognition camera or software on an NVR, all your concerns are real. But using a proper VMS with good user-rights administration and with blurred-face export limits for regular operators, you can enforce the internal rules. You can limit access to the faces database to the admin only. This way, no one can use this feature to see when the boss is coming to the office, or for other personal interests.
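The access-control idea above can be sketched roughly. The role names, permission strings, and functions below are hypothetical illustrations, not the API of any real VMS product.

```python
# Role-based access: only the admin role may query the face database;
# regular operators are limited to blurred exports.
ROLE_PERMISSIONS = {
    "admin": {"query_faces", "export_clear"},
    "operator": {"export_blurred"},
}

def can(role, action):
    # True if the role's permission set includes the requested action.
    return action in ROLE_PERMISSIONS.get(role, set())

def query_face_db(role, face_id):
    if not can(role, "query_faces"):
        raise PermissionError("face database access is admin-only")
    return f"record for {face_id}"  # placeholder for the actual lookup

print(query_face_db("admin", "person_a"))   # allowed
print(can("operator", "query_faces"))       # operators are denied
```

The point of the sketch is that misuse prevention here is a policy encoded in software; the open question in the thread is whether the law should require such controls rather than leave them to each operator.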

JH
John Honovich
Apr 24, 2019
IPVM

Doru, I agree that restricting access is prudent, but it still raises the question: what if other facial recognition users do not? Do you regulate it? Do you require it by law?

And then you have the question of what if the boss wants to surveil employees or customers s/he finds attractive?

To be clear, this can be done manually, with effort, using conventional video surveillance systems, but with facial recognition you are automating the process.

And then take gender / age identification tools, e.g., from this Hikvision marketing video:

What's the risk of misusing this type of technology?

da
doru asmarandei
Apr 24, 2019
Azitrend Distribution SRL

GDPR requires internal procedures. In order to implement and respect internal rules, you need the correct tools. Using Hik/Dahua, you cannot control those rules; anything can happen.

But if, without using FR analytics, one operator shares a video export with you on YouTube/Facebook, what is the difference? Should we ban all surveillance systems? So your concerns are not about FR specifically but about surveillance systems generally.


Regarding age/gender: if they are used only for statistics, not for searching for persons (e.g., female, 10-20 yrs), but only for statistical reports like people counting, I think I do not mind.

I repeat, Hik/Dahua/TVT do not provide this kind of protection!! But just because those solutions cannot, it doesn't mean we have to ban all the others that can give you such protection.


I always ask: why do we still accept LPR at malls opening the barrier, and not be concerned about personal data there? Only because it is more popular?