Latest London Police Facial Recognition Suffers Serious Issues

By Kayleigh Long, Sean Patton, and Charles Rollet, Published Feb 24, 2020, 06:27am EST

On February 20, IPVM visited another live face rec deployment by London police, but this time the system was thwarted by technical problems and poor weather conditions, and the operation was even called off early. New cameras were also used, although the model chosen was not well suited to the task.

IPVM Image

With no arrests and poor implementation, privacy rights group Big Brother Watch described this deployment as a "complete police failure". In this post, we examine the system's issues, new specs, and more, including:

  • Background
  • Video Walkthrough
  • New Cameras With Lower Resolution Used
  • Weather-related Challenges: Umbrellas, Hoodies
  • Deployment Called Off Due to Software Glitch
  • Passerby Reaction
  • Activists Call Out "Complete Police Failure"
  • Final Analysis: Long List of Problems
  • Met Police Response
  • Conclusion

**********

** **** *******, ******'* ************ ****** (the ***)************* *****, *** *** ***** ****, begin *********** *** ** **** **** recognition (***) ** ***** ** ***** down ** "******* *****".

**** ********** ******* * ******** ** deployment** **** ******. ******* ********** **** good, ******** ** ***** **** ***** have ****** ********. ******* ********* ***** was ****** ******** *** ********* **** or ******** ***** ***** ** **** walked *******.

Latest Deployment In Oxford Circus

** ******** **, *** *** ******* its *** *** ** ********* ****** ************** *** ***** ** ******. **** is *** ** *** **** *********/******************** ** *** ** ****** *** a ***** ****** ****.

IPVM Image

**** *** * ***** *********** ** the **** ** ****:

New Cameras

*** *********** ********** ********** *** **** an ******** ********* ****** ***** *** used. ** *** ******** ********** ** East ******, **** ******** ******* *** ******,*** **** ****, ********* *****-********* **** ****:

IPVM Image

**** ** ** *** ****** ******** to *** *** ** ** ** lower ********** - ***** ** *** - *** *** ******* ********* ** the ***** ** *** **** **** with ****-***** **, ***** ***** *** be ****** ******* * *******-****** ****** at ********** ***** *****.

*** *****-** ***** ** ******** *** oil ****, *********, *** ***** ******* environments *** ** *** ***-**** *******. It ** ********* *** ****** *********, **** ** *** ****. *******, Met ****** ******** ** *** ***** told **** **** **** *** ** "upgrade" ******** ** *** ******* ******** last ****.

Rain A Problem

** *** ******* ****** *** ********** and **** *********** *** ********* ** hoodies, ********* *** ****** *********** ***** in **** ***** ********'* ***** **** entirely ******** ** *** *******, ** shown *****:

IPVM Image

****** **** **** **** ** *** long ***, ** *** ****** *** to ** *********** ******* *** **** people **** ***** *********, ****’* ******** suspend **. **** ** * **** problem *** ******, ***** **** ** common.

Deployment Called Off Due To Software Issue

*** ********* ******* ** **** *** ran *** ***** *.* ***** *** was ****** *** ***********. (*** ***** deployment ****** * *****). ****** *************** told **** **** **** ************ ********* issues***** ******** ****** *** ************ ******* the ******** *** *** ******* **** carried (*********** ***********) ***** ****** **** of **** *******.

***** ** * ***** ** ****** packing ** *** ******** *** ******* from *** ***:

IPVM Image

Passerby Reaction Mixed

******** **** ********* ** *** ****, the ******** ** *** *** ** LFR ********** *** *****.*** ***** *** **** ** ****’* aware **** *** ****** ****, **** he *** ********** ** ****** **** would ******** ** ******* *** ******** people’s ********* ******* **** *** ******, and **** ** *** ******** ***** cases ** ******** ********:

********’* *** * ************, ***’* ****?

******* *** *** **** ********:

*** **** ** *** ** ***** it, ** *****?

*** *** **** ****, *******, ****** made ***** *** ******* *** *** area ******* *** **** *******/**********. ***** seemed ** ** **** ******* ********* about ** **** ** *** **** London**********, ******** *** ** *** ******* foot ******* *** ***** *******.

Deployment Results: No Matches/Arrests

** ******* ** ******* **** **** during *** **********, ****** *********. *******, police ** *** ***** ********* *** stop-and-searches. *** *** ** * ***** man *** ****** ** ****** ********* the *** ****** *** *** *** hands ** *** *******. *** ****** was ******* ***** *** *** **** witnessed ***** ******* ******* *** *** zone *** ** ******* ******. **** men **** ******** ******* ******, ******** the ***** *** ********* ****** *** placed ** * ****** ***, ********* to *** ******* *****.

**** ****** ******** **** ****** ** the ***** (*** **********) ***** ***** incidents *** *** ******** ** ********.

Big Brother Watch Statement: "Complete Police Failure"

* ******** ******** ** *** ***********, privacy ****** ******** ******* *********** ** ** ******* *** *********. IPVM ***** **** *** ********,****** *****, ***** *** *** ** ***, and ***** ******** ***** *** ************ on ***** *********:

Below is a transcript [emphasis ours]:

This morning, the Metropolitan Police announced that they were going to use live facial recognition and they have done for the past few hours in torrential rain. Instantly, the technology is broken, they're packing up and going home which is like natural justice. But for us this is exhausting. This whole technology, this whole operation has been broken from the start really. There's no way that a mass surveillance technology should be being used in London, a democratic city. This isn't something anyone's voted for, this isn't something anyone wants, and it doesn't even work. Yet again we've just seen a complete police failure. We've seen two young black boys being stopped and made late on their way to work, searched, cuffed, put in the back of a car for no reason before being let go again. This is atrocious and Parliament urgently needs to take action to ban it.

Final Analysis: Long List Of Issues

**** **** *** ********** ********* * number ** ******, *******:

  • ******* **********: ***** *** **** ****** can ***** ******* **** ** ****, such ******* ***** *********** **** *** umbrellas *** **** *******/***** **** ******* their *****. **** ** ********* **** no **** ********* ******** ***** **** moving *** ******* ** ****** ********.
  • ********* ********: **** *** ********** *** called *** ***** *** ** * software *****. *******, ******* ***** ***********/**********, such ******** ****** * ******* ** effectiveness.
  • ****** *****: **** ** ********* **** the *****-*** ***** ******** ** *** police *** **** ********** *** ***********. Lower ********** ***** **** ** **** errors ***** *** *****'* **** ****-***** function **** *** **** ** ******* use **** *** **** ********.
  • **** *******, ************** ********: **** ********** resulted ** ** ******* ** *******. However, *** ****** ******** *** ** the ****-***-****** ** *** ***** *** who **** ******* ** *** **********, which ****** **************/****** ********.

Met Police Response

IPVM Image

*** *** ****** **** **** **** statement **** ** ***** ***** ****** surrounding *** **********:

*** ***’* ********** ** ****** *********** technology ** ** ******** *** ******* on ******** ***** ****** *** ******* and ******* ********, **** ** **** pose * **** ** *** ******. The ****-***** ******** ** **** ******** meant **** ** *** *********** ** flag *** ***** *** **** ****** Recognition ** **** *** ***** *** corporate ******* ******** *** *** *** website **** * *** **** ****** period. **** ***** **** ** *** opportunity ** ****** ****** ****** ****** our ****** *********** *** *** ****** signage ****** ******* ** *** ********** area ******* ****** *** **** ***** of *** **** ****** *********** ********** before ******** ** **** ******* **.***** ** ****** ******** ** ***** governing *** ***’* *** ** **** Facial ***********, ********* ** *** ***’* website. ** ******. **.** ****** *** operation, ***** *** **** ****** *********** system *** ******* *********, ***** *** a ************ ***** ******* *** **** database *** *** **** **** ******* given ** ********. *** ******** ****** which ******* *** ***** ***** ** required ** *** ******* ** **** a ******** ** ******* ** *** to ****** **** ** **********. ***** this, *** ******** *** ***** ** stop *** **** ****** *********** **** of *** *********.

**** *** ******** ** ***** *** other ****** (****** ******, ****, ***) and **** ****** ***********.

Conclusion

***** *** *** ******** *********** ***** and *** ******* ******** **** *********, IPVM ** ********* ** *** ******* of *** ****** ***'* *** ******. Without **** *******/****** ***********, *** *** case *** *** ** ******* *******.

Comments (21)

Very odd choice on the Axis Q6215-LE. One would think that when testing new technology, you'd want to remove any variables unrelated to the performance of the facial recognition technology by ensuring the specs on your hardware are up to the task.

The Axis Q6215-LE excels at things like port, perimeter, airport, and military installation surveillance, not so much in scenes like this. Something 4K+ would have made a lot more sense for forensic detail.
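As a rough illustration of why the resolution choice matters: pixel density (commonly measured in pixels per foot in surveillance) falls off with scene width, so at the same lens angle a 1080p camera delivers half the horizontal detail of a 4K one. A minimal sketch using a simple pinhole-camera model; the distance and field-of-view figures below are hypothetical, not the Met's actual setup:

```python
import math

def pixels_per_foot(horizontal_pixels, hfov_deg, distance_ft):
    """Approximate horizontal pixel density at a given distance,
    using a simple pinhole-camera model."""
    # Width of the scene the camera covers at that distance
    scene_width_ft = 2 * distance_ft * math.tan(math.radians(hfov_deg / 2))
    return horizontal_pixels / scene_width_ft

# Hypothetical numbers: a 60-degree lens viewing subjects 30 ft away
ppf_1080p = pixels_per_foot(1920, 60, 30)  # ~55 ppf
ppf_4k = pixels_per_foot(3840, 60, 30)     # ~111 ppf, double the detail
```

Face rec generally wants far more pixels on the face than general surveillance does, which is why a higher-resolution sensor at the same lens angle leaves much more margin for error.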

I know they're concerned about image and whatnot, but if it were me, that whole operation wouldn't have been nearly as overt. Use the test to evaluate the technology as well as the procedures for rapid deployment without drawing attention to yourselves. That way people wouldn't be prepared to avoid it. Blend into the surroundings...

Ok, Monday morning quarterbacking rant over. I hope everyone had a great weekend.

Agree: 3
Disagree
Informative: 1
Unhelpful
Funny

This reminded me of the article: Testing Camera Height vs Image Quality

Wouldn't it be better to mount the cameras lower, perhaps on the side of the van just above average height (to see over others' heads)? I get that the camera "upgrade" choice was confusing, but it seems like their deployment couldn't have been very successful. Plus, if you need to advertise what you are doing, then I don't see how it could ever be effective. If you are looking down wearing a hat, you won't see the sign and you also won't be recognized by the camera. Just don't look suspicious, because they may pull you to the side anyway.

Menards' newer Sony IP camera deployment (upgrade from the Bosch analog cameras they had) took camera height into consideration. They lowered the main choke point camera to around 5-6'. They also added cameras near the doors at around 7'. Home Depot has done a similar design with low mounted Axis cameras near the exit doors. However, when entering a Home Depot, I haven't seen low mounted cameras like I do at Menards.

Agree: 2
Disagree
Informative: 1
Unhelpful
Funny

Got to love how governments waste money on feel-good technology we all know doesn't work that well. Yes, in perfect situations facial recognition works. In the real world, not so much. Not right now, at least.

Agree: 3
Disagree
Informative: 1
Unhelpful
Funny

If "Big Brother Watch" are worried, why care that the system doesn't work? Surely a system which doesn't capture their faces is a good thing? Governments will always burn money on stuff to keep budgets; besides, hindsight is a great thing. If we didn't try new technology, we would have square wheels.

Funny thing is, it doesn't matter when a copper stands there and identifies 10 gang members using the data in their head. But when a camera does it, BBW get triggered..

Agree: 1
Disagree
Informative: 1
Unhelpful
Funny

If approaching from the front of the van, by the time you read the signs (plus the small font and QR codes), it appears you are already being scanned without consent. If you stop, do a 180 and quickly leave, wouldn't that create some sort of probable cause to be tackled?

The Met can't be serious if they say posting to Twitter and a website 2 hours prior constitutes any sort of consent either... The public had the option of not leaving their homes to go to their jobs, and they should be tracking what we do hour by hour 24/365... LOL!

If they want to pass the smell test on consent, the Met should make two lines on the sidewalk with big signs: a line for those who explicitly give their consent by signing a long form (like a software license agreement 'I agree') before they are scanned, and those who just want to pass through without consent (and we'll see how many dangerous criminals they arrest). There's a reason the US police ask the person being arrested if they understand their (Miranda) rights. It seems the only way facial recognition will work is without consent, like under an authoritarian government.

Agree: 5
Disagree
Informative: 4
Unhelpful
Funny

Do we know if their system was actually logging and storing facial data? Or if they were just taking faces and comparing to a database of known "serious criminals"?

If the first scenario is true, then I understand the privacy concerns. However, I am curious as to how they would be enrolling new facial data into a database. Without any other details besides a face, wouldn't the system only be able to log a record as:

FACE #: 3625 [FORENSIC DETAIL = X]

IMAGE: 3625.jpg

LOCATION: VAN

TIME: 12:38PM

If the second scenario is true, then I could essentially walk back and forth past that van smiling at the camera, and each time the system would take my face, compare it to a database of "serious criminals", return no match, and move on to the next face. The only data logged would be: an instance of the system seeing a face, a negative return when compared, and a time associated with the 'match = false' moment. At the end of the day, all they would have would be a log of 1500 faces seen, and 2 returned possible matches of potential "serious criminals". No actual facial details would be logged, and the system wouldn't be keeping tabs on where people are, or where they were going.

Even if I returned as a possible match, I doubt I would get an instant knee to the back, with an officer shouting "WE FINALLY GOT YOU! You're under arrest for the [insert serious crime here]!". I assume that they would want to verify by looking at my face and seeing my ID. They would probably approach me cautiously just in case the match was legitimate. Yes, it would be annoying to be stopped for 30 to 45-ish seconds each time you walked past the van. One would hope that the tech would eventually get good enough, resulting in fewer false positives.
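The second scenario's match-and-discard flow can be sketched as below. This is a hypothetical illustration only: the cosine-similarity comparison, the threshold value, and the log fields are assumptions for the sketch, not details of the Met's actual NEC system:

```python
import math
from dataclasses import dataclass, field

MATCH_THRESHOLD = 0.8  # assumed similarity cutoff, not a real system's value

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

@dataclass
class WatchlistMatcher:
    watchlist: dict                          # name -> reference embedding
    log: list = field(default_factory=list)  # only match/no-match events

    def process_face(self, embedding, timestamp):
        # Compare the live face against every watchlist entry.
        best_name, best_score = None, 0.0
        for name, reference in self.watchlist.items():
            score = cosine_similarity(embedding, reference)
            if score > best_score:
                best_name, best_score = name, score
        matched = best_score >= MATCH_THRESHOLD
        # Log only the outcome and time; the face data itself is
        # discarded unless an alert fires.
        self.log.append({"time": timestamp, "match": matched})
        return best_name if matched else None

matcher = WatchlistMatcher(watchlist={"suspect_1": [1.0, 0.0]})
matcher.process_face([0.0, 1.0], "12:38PM")  # no match, returns None
```

Under this design, the end-of-day log holds only counts and timestamps, matching the 'match = false' bookkeeping described above.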

Full disclosure: I know nothing of UK privacy laws.

Agree
Disagree
Informative: 1
Unhelpful
Funny

Do we know if their system was actually logging and storing facial data? Or if they were just taking faces and comparing to a database of known "serious criminals"?

In the signs posted at the face rec zone, the Met says that "unless an alert is triggered, the system will immediately and automatically delete your biometric data". So what's happening is the second scenario you described.

As to your other points:

No actual facial details would be logged, the system wouldn't be keeping tabs on where people are, or where they were going.

Yes, it would be annoying to be stopped for 30 to 45-ish seconds each time you walked past the van. One would hope that the tech would eventually get good enough, resulting in less false positives.

I would say that getting stopped and searched by police is more serious than 30-45 seconds of disruption. Sometimes, police cuff people and place them in vans before releasing them, as happened during this deployment (although this was not based on a face match). And recall that last year, a man who deliberately obscured his face as he walked through an LFR system was fined 90 pounds ($116).

I think the broader criticisms from activists are the following and still stand regardless of whether unmatched face data is deleted or not:

  • LFR is anti-democratic. No one voted for this to be in place.
  • There is also no UK legislation or even clear regulations that specifically govern use of LFR.
  • Even if unmatched face data is deleted immediately, comparing someone's face to a watchlist represents an unreasonable invasion of privacy on principle. It's like being forced to stand in a police-lineup. There's no consent/warrant/etc.
  • The risk of false positives, especially for minorities, results in needless stop-and-searches - which could lead to disruptions, confrontations, etc.
Agree
Disagree
Informative: 1
Unhelpful
Funny

Do we know if their system was actually logging and storing facial data?

They say not but I’m sceptical...

Another UK police force has recently reduced its ANPR image retention from 2 years to 1 year over privacy concerns!!!

ANPR images retained are a plate crop plus front/rear of vehicle crop and an overview.

With the employment of Tony Porter as the UK's camera commissioner coming to an end shortly, one can only surmise that live FR will likely become more prevalent (TP has stopped previous uses of FR)

Agree
Disagree
Informative
Unhelpful
Funny

Given the overt nature of the operation and the questionable setup that almost seems designed not to work I can't help but think this is simply a public opinion campaign.

They are not doing this to catch criminals, as others have pointed out they are doing this to gauge public interest and concern.

I would be willing to bet money the London government will not measure the success of these operations based on criminals caught; success will be when the citizens of London view facial recognition as normal and do not pay any more attention to it than they do to the cameras already lining their streets. At that point London can make facial recognition standard across all camera deployments.

Agree: 2
Disagree
Informative: 1
Unhelpful
Funny

I would be willing to bet money the London government will not measure the success of these operations based on criminals caught; success will be when the citizens of London view facial recognition as normal and do not pay any more attention to it than they do to the cameras already lining their streets. At that point London can make facial recognition standard across all camera deployments.

I agree... it’s like the whole speed camera van thing from 15 years ago!

we used to phone our buddies, check websites and send to Twitter, drivers would flash oncoming vehicles to warn them...

now no one bothers to even tag it on Waze!

Agree
Disagree
Informative
Unhelpful
Funny

Great review. So, facial recognition... covered under data protection guidance under GDPR, DPA2012, and PECR. There are facial recognition companies that boast they are GDPR compliant, but what about DPA and PECR? Now the UK is in the transition period of Brexit - now known as EUE (European Union Exit) - what changes are there? Paul Anderson ♛ on LinkedIn: #Brexit and the future of #dataprivacy

We talked about this at the professional security event TWENTY20 in Birmingham last week, and at PrivSec a couple of weeks before that.

FPR is not just about data, it is about human rights and the legality of use. What is the process for capturing data? What is the process for falsely stopping someone detected? If I had been stopped, I would probably make a scene, claim I had been affected by the false statement, issue them with a DSAR, and slap them with an Article 82 for good measure!

I am all for using technology but feel it can often be installed, without good process and controls, leaving it open to failure.

Also to mention - PECR 2019...

Agree: 1
Disagree
Informative: 1
Unhelpful
Funny

Thanks for the compliment, glad you liked this post!

Just to be clear, the London Met's face rec deployment is not covered by the GDPR or PECR or DPA2012. It is covered by the DPA of 2018 and the Law Enforcement Directive. All EU police face rec is governed by the LED, not the GDPR. And the LED gives substantially more leeway for biometrics processing than GDPR.

There are facial Recognition companies that boast they are GDPR compliant, but what about DPA and PECR?

For non-law enforcement use of face rec, GDPR and DPA 2018 do apply. However I don't see how PECR is relevant to face rec. PECR is chiefly about regulating direct marketing, cookies, etc. However, even the "GDPR compliant" claim can be dubious given that the UK government hasn't officially interpreted what face rec means in GDPR terms. For more, see our post UK Facewatch GDPR Compliance Questioned.

FPR is not just about data, it is about human rights and the legality of use. What is the process for capturing data? What is the process for falsely stopping someone detected? If I had been stopped, I would probably make a scene, claim I had been affected by the false statement, issue them with a DSAR, and slap them with an Article 82 for good measure!

Article 82 (Right to Compensation) and DSARs (Data Subject Access Requests) are GDPR obligations. But as mentioned, the London police face rec - and all EU police face rec - is governed by the Law Enforcement Directive, which is separate from the GDPR and doesn't have these mandates. So invoking these wouldn't do you much good if you passed through a London Met LFR zone.

Agree
Disagree
Informative: 2
Unhelpful
Funny
Agree
Disagree
Informative: 1
Unhelpful
Funny

Update: the top official for the Met, Commissioner Cressida Dick, just gave a lengthy speech in which she strongly defended LFR use. Anyone interested in this issue should check out her comments on LFR, which start at 30:50.

Dick's defense rested on 5 main points:

Let me bust some current and apparently pervasive myths about the Met’s use of LFR:
- The tech we are using does not, repeat, not, store your biometric data.
- Human officers will always make the final decisions on whether or not to intervene – Not the machine.
- Surveys tell us it is for serious crime.
- The tech we are deploying is proven not to have an ethnic bias.
- We’ve been completely open and transparent about it - check out our website. Its use is very clearly signposted.

The Commissioner said law-abiding people shouldn't be worried, especially in the age of social media:

speaking as a member of public, I will be frank. In an age of Twitter and Instagram and Facebook, concern about my image and that of my fellow law-abiding citizens passing through LFR and not being stored, feels much, much smaller than my and the public’s vital expectation to be kept safe from a knife through the chest.

The Commissioner doubled down on the claim that the Met's LFR (which is NEC's NeoFace) has no ethnic bias:

We know there are some cheap technologies that do have bias, but as I have said, ours doesn't. Currently, the only bias in it is that it shows it is slightly harder to identify a wanted woman than a wanted man.

Agree
Disagree
Informative: 1
Unhelpful
Funny

Very interesting that Cressida Dick felt the urge to comment that the technology had no "ethnic" bias, yet it required a large police van, rigged with three huge (and highly visible) cameras - in situ for 5 hours - to stop and search two black men.

The Met do not need the use of all this tech (and cost) to achieve such a result as they have form when it comes to ethnic bias on stop and search.

As for the tech - it only proves to me that it's another case of someone over-selling product to a client whose knowledge is not as strong as it should be.

Agree
Disagree
Informative
Unhelpful
Funny

Below is a transcript [emphasis ours]:

This morning, the Metropolitan Police announced that they were going to use live facial recognition and they have done for the past few hours in torrential rain. Instantly, the technology is broken, they're packing up and going home which is like natural justice. But for us this is exhausting. This whole technology, this whole operation has been broken from the start really. There's no way that a mass surveillance technology should be being used in London, a democratic city. This isn't something anyone's voted for, this isn't something anyone wants, and it doesn't even work. Yet again we've just seen a complete police failure. We've seen two young black boys being stopped and made late on their way to work, searched, cuffed, put in the back of a car for no reason before being let go again. This is atrocious and Parliament urgently needs to take action to ban it.

Update: police at the sc

This seems to have been cut off below the transcript. What was the update?

Agree
Disagree
Informative
Unhelpful
Funny

Sorry, that was an error/typo leftover from the draft so I removed the text - there's no update.

Agree
Disagree
Informative
Unhelpful
Funny

All good. Thanks!

Agree
Disagree
Informative
Unhelpful
Funny

There's an update from the Met Police on their LFR deployments, showing quite poor results:

So the LFR deployment we covered earlier in Stratford (East London) had exactly zero alerts with 4,600 faces seen. The Oxford Circus deployment this article is about was a total bust, i.e. no results due to the technical issues. The Met's repeat Oxford Circus LFR deployment on Feb 27 (which we did not cover) got 7 "incorrect" alerts and 1 "positive", resulting in 1 arrest.

In sum, that's three separate high-profile, resource-intensive police operations all for one arrest.
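As a quick sanity check on those figures, the alert precision for the one deployment that produced any alerts works out to one in eight:

```python
# Figures reported by the Met, as summarized above:
incorrect_alerts = 7   # Feb 27 Oxford Circus repeat deployment
positive_alerts = 1
total_alerts = incorrect_alerts + positive_alerts

# Share of alerts that were actually correct
alert_precision = positive_alerts / total_alerts  # 1/8 = 12.5%

# Stratford: 4,600 faces seen, zero alerts; Feb 20 Oxford Circus: no results
faces_seen_stratford = 4600
```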

Agree
Disagree
Informative
Unhelpful
Funny

Good update.

So does this imply NEC's facial recognition (what the Met is using) is really bad, or that China's claims about facial recognition are really, really exaggerated? I assume the latter but am genuinely curious.

Agree
Disagree
Informative
Unhelpful
Funny

It's hard to know exactly, but I do think that doing face rec out of a giant visible van with signs all around must make capturing people's faces more difficult. In China, LFR is much more ubiquitous and covert, so they probably get somewhat better results. China's police also reportedly integrate face rec with cellphone tracking and set up cameras in residential buildings.

At the same time, no one is independently checking PRC face rec results either, and the PRC government has a habit of throwing money at high-tech domestic security solutions.

Agree
Disagree
Informative
Unhelpful
Funny