Very interesting, thanks for the report. I am also skeptical it will be useful in this type of deployment, but only because of all the signs that basically say "Don't walk through here if you are a criminal"!
I agree, and read it similarly to the way you did. They are using it as a way of opening the door, making people who do not care too much think that those in opposition are overreacting, thinking, much like grandma, "Just walk around if you don't want your face to be seen." While they mitigate the backlash this time with an easily avoidable set-up, it will help to keep the general public from revolting when they begin actual deployments, as they will have "seen this before and it wasn't a big deal."
While it is true that this particular facial recognition deployment is unlikely to be a success, it may be, as others have noted, a proving ground for the underlying technologies. Although, to an extent, it does almost seem to be set up for failure. One would have thought, given the limitations of the technology and best deployment practices, that an indoor location with consistent lighting would have been a better bet. Not to mention the likely quality of the images being used in this particular "blacklist".
This should not blind us, however, to the potential of facial recognition in more controlled environments, where it has proven to be very much more effective.
Very important post! Thanks for this. I work closely with the Swedish National Police, and they are looking into the promises of facial recognition. The Data Protection Authority has limited live use, but has allowed the Swedish National Police Forensic Center to use facial recognition. I am curious about the forensic use - I was asked by a major Swedish newspaper about it, and my guess was that it would be used to increase the efficiency of scanning large submitted video sequences to identify possible persons of interest. Do we know anything about this specific use case?
I don't know about Europe, but forensic facial recognition has long been used by various US police departments, though generally limited to specific databases rather than extracting social media pictures / links.
In the UK, police definitely do "forensic use" or "facial matching". For example, Scottish police have two programs called AFR Identify and AFR Locate. AFR Identify involves "retrospective identification of individuals from older (that is, not live) CCTV footage or from still images" whereas AFR Locate involves vehicles that go around scanning people's faces live, an ICO report stated.
I recommend reading this detailed academic report on Scottish police's use of AFR Identify (which also used NEC) from 2017-2018. One interesting finding was that a major issue was poor image quality of officers' photos:
It rapidly became apparent, however, that a significant proportion of the images being submitted by officers were not of sufficient quality. The team were finding that many probe images submitted to them from across the force, were stills from CCTV displayed on screens and then photographed using the officer’s mobile phone.
The report indicated decent results: the software's latest algorithm returned 675 "possible matches" (i.e. suspects), leading to 350 "charging decisions" (i.e. criminal charges).
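For readers unfamiliar with how "possible matches" against a watchlist are typically produced, the core step is comparing a face embedding extracted from the probe image against stored embeddings and keeping those above a similarity threshold. The sketch below is a simplified illustration of that general idea, not NEC's actual pipeline; the embedding vectors, names, and threshold are all made-up placeholders:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def possible_matches(probe_embedding, watchlist, threshold=0.6):
    """Return watchlist entries whose similarity to the probe
    exceeds the threshold, best match first."""
    scored = [
        (name, cosine_similarity(probe_embedding, emb))
        for name, emb in watchlist.items()
    ]
    return sorted(
        [(name, s) for name, s in scored if s >= threshold],
        key=lambda t: t[1],
        reverse=True,
    )

# Illustrative embeddings. A real system derives these from a face
# recognition model; a low-quality probe photo (e.g. a phone picture
# of a CCTV screen, as the report describes) yields a noisy vector
# and hence unreliable similarity scores.
watchlist = {
    "suspect_a": [0.9, 0.1, 0.4],
    "suspect_b": [0.1, 0.8, 0.2],
}
probe = [0.85, 0.15, 0.35]
print(possible_matches(probe, watchlist))
```

The threshold is the key operational lever: set it low and you get more "possible matches" for officers to review (and more false positives); set it high and poor-quality probe images simply return nothing.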
One other takeaway. While both live face rec and forensic use involve biometrics under the Law Enforcement Directive (the GDPR for EU police), it's clear that live FRT provokes more controversy. If it's done in plain sight, it becomes a very visible sign of police surveillance, and evokes "Black Mirror" style concerns. If it's done covertly, the moment the public finds out, there's probably going to be an outcry, as happened for a private development last year. In the UK, most of the scrutiny on face rec is on its live deployment rather than forensic use, e.g. Big Brother Watch has focused largely on live FRT.
An interesting report but it doesn't really capture the general public sentiment in the UK.
As many will know, the UK has perhaps the most permissive view on public space CCTV in the world, and in general conversation you will rarely (if ever) hear a single voice against it. Indeed, the biggest stories are about town centres "switching off" large open space installations.
When a journo asks what you think, with a police van parked on a pavement and two big cameras on the roof staring right at you, it is a different story, as the whole thing is made to feel deliberately uncomfortable. Indeed, if truth be known, the whole set-up of the vans and signage is only there to make a very visual point that facial recognition is widely used in London (though rarely rammed down your throat in this fashion).
Big Brother Watch may be the go-to people for comment and sound bites, but they are considered an irrelevance and an annoyance by most people. They gain column inches because it makes a story, but no one takes a blind bit of notice of anything they say.
London is not Britain, in the same way that NY does not represent the US (thank god for that). In the real UK, we have had decades of public space CCTV and have embraced it at every level, with facial recognition being the logical progression from ANPR etc. Indeed, the comment you nearly always hear (although I accept it is colloquial) is "if you've nothing to hide, why worry about it". In any report the author often seeks out the extremity of an opinion to add colour when the story is actually pretty bland. That said, I'm pretty sure that in almost every other country there would be more push-back and negativity - just not in the UK. In the same shopping area where this was placed you'll find opinion in favour of ISIS, anti-Semitism, Nazis and legalising Class A drugs - together with Big Brother Watch and pedestrians shielding their faces. Such is a democratic society.
While it is certainly true that "public space" CCTV is widely deployed in the UK, it is, to my mind, far from clear how effective it has been in its primary purpose. Where it has undoubtedly been effective is, as you say, in desensitizing the public to the idea that they are "under surveillance". The problem with modern state-of-the-art facial recognition is that it is not particularly effective when deployed from the sorts of locations currently dominated by public space CCTV. High-angle views of wide areas are good at spotting the movements of crowds and may, in the future, see practical deployments of different AI-based metrics. But for now, for face recognition, they are universally poor. Similarly, as was mentioned, the quality of the initial facial images is extremely important when it comes to decent positive matches. Public space style CCTV typically does not generate images of sufficient quality.
However, for private sector deployments at controlled, limited entry or exit points, the technology can be very effective indeed.