UK Facewatch GDPR Compliance Questioned
By IPVM Team, Published Aug 27, 2019, 05:08am EDT
Even as the GDPR strictly regulates biometrics, a UK company called Facewatch is selling anti-shoplifter facial recognition systems to hundreds of retailers. Facewatch even keeps a database of suspects, only deleting profiles after two years "if you’re never seen stealing again".
While Facewatch says it "fully complies" with the GDPR and meets its critical "Substantial Public Interest" standard, we find some issues with this, chiefly:
- Facewatch's main GDPR compliance claim comes from a private sector lawyer it hired, rather than a government or independent GDPR authority
- Experts IPVM spoke with questioned whether catching shoplifters actually constitutes a Substantial Public Interest and criticized the private lawyer for stating Facewatch's system could potentially prevent "terrorism"
- The Substantial Public Interest standard, contrary to the claims of Facewatch's CEO, is very vaguely defined in the law
- While Facewatch says it is so GDPR compliant it can be used outside the UK, its retail solution would be illegal in at least two GDPR countries
- The EU has recommended much shorter storage periods for retail
Facewatch does say it closely follows existing guidance from the ICO, the UK's data protection agency and GDPR enforcer, to which it submits GDPR compliance documentation (such as Data Protection Impact Assessments).
But the UK currently has no facial recognition legislation, and a national outcry over the technology is sparking demand for clear regulations, potentially leading to a tougher interpretation of the GDPR and harming Facewatch's business model. In this post, we take an in-depth look at Facewatch and the GDPR, including:
- Facewatch Basics: How It Works, Pricing, Scope, Etc
- Facewatch GDPR Compliance Claims
- "Businesses in other Countries Can Use Our Solution With Confidence"
- What the GDPR Says About Facial Recognition
- "Substantial Public Interest" Justification Questioned
- EU Recommends Far Shorter Retail Storage
- Potential Regulatory Risk Amid Unprecedented Scrutiny
- Facewatch Response
How Facewatch Works
Facewatch sells systems that allow retailers to scan the face of every customer entering a store and instantly compare it to their own watchlist:
Facewatch scans the face of each customer as they enter the premises. Each face is converted into an algorithm and sent to the Facewatch watchlist where algorithmic templates are compared. If there is no match, the data is deleted. Where there is a match, an alert is generated. Alerts can be sent to multiple users enabling appropriate action to be taken.
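The match-then-delete flow Facewatch describes can be sketched in a few lines. This is an illustrative sketch only, not Facewatch's actual code: the `similarity` function, the threshold value, and all names here are hypothetical stand-ins for whatever matching algorithm the real system uses.

```python
# Illustrative sketch (not Facewatch's implementation) of the described flow:
# capture a template, compare it to the watchlist, alert on a match,
# and otherwise discard the data.
from dataclasses import dataclass

SIMILARITY_THRESHOLD = 0.6  # hypothetical cutoff for declaring a match

@dataclass
class WatchlistEntry:
    subject_id: str
    template: list  # stored biometric template (toy representation)

def similarity(a, b):
    """Toy similarity score; a real system would compare face embeddings."""
    return sum(1 for x, y in zip(a, b) if x == y) / max(len(a), 1)

def process_face(template, watchlist):
    """Compare a freshly captured template against the watchlist.

    Returns a list of matched subject IDs (alerts). If the list is empty,
    the incoming template is simply dropped, per the "if there is no
    match, the data is deleted" claim - nothing is persisted here.
    """
    return [e.subject_id for e in watchlist
            if similarity(template, e.template) >= SIMILARITY_THRESHOLD]

watchlist = [WatchlistEntry("suspect-001", [1, 0, 1, 1])]
print(process_face([1, 0, 1, 1], watchlist))  # -> ['suspect-001']
print(process_face([0, 1, 0, 0], watchlist))  # -> []
```

Note that "each face is converted into an algorithm" in Facewatch's own wording presumably means a biometric template (an embedding), which is what the sketch models.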
The company claims on its website its solution can reduce theft "by at least 35% in the first year". Their promotional video, embedded below, overviews their offering:
Watchlist Kept for Two Years
Facewatch says faces on its watchlist are kept for two years, after which they are deleted "if you’re never seen stealing again". (Unmatched faces are not stored).
Facewatch members can add someone to the list as long as they claim that person has been involved in shoplifting or other misbehavior. There is no need for a legal court order, charge, or conviction.
Facewatch, not the stores themselves, maintains the watchlist, making the company a "Controller" under the GDPR, i.e., it is directly responsible for the data. Because its member stores span the UK, Facewatch speaks of its "unique position as a national data controller", but this is not a GDPR/legal term - it appears nowhere in the law, and "national data controller" has only been used by the EU to describe national-level data authorities like the ICO, not private firms.
Price "A Lot More Affordable Than You Think!"
The price of the system is not public; however, Facewatch is not aiming at the high-end enterprise market, stating on its website that its solution is "a lot more affordable than you think!"
Facewatch also sells facial recognition to UK police; a pricing document from 2019 shows it charges about $2,700 per camera annually for a minimum 3-year license (around $8,100 total):
Facewatch's website states that its solution is "simple to deploy", using a "standard HD CCTV camera" and an Intel NUC mini-PC. While Facewatch does not identify camera brands, it lists itself as an "authorized partner" of Axis on its homepage:
Facewatch Claims "Fully GDPR Compliant"
Facewatch claims it is "Fully GDPR Compliant":
Facewatch does comply with the GDPR in some ways, stating for example:
Facewatch also told IPVM it submits Data Protection Impact Assessments (Article 35) to the ICO for each retail outlet and for its national suspect database. The ICO did not respond to our request to confirm this, but it is highly unlikely that Facewatch would skip DPIAs, since failing to conduct one would be an obvious GDPR violation - and is something a Swedish school was recently fined for.
"Businesses in other Countries Can Use Our Solution With Confidence"
Facewatch claims its GDPR compliance is so good, you can use its system pretty much anywhere:
The solution meets General Data Protection Regulation (GDPR) compliance, protecting businesses from being held liable for violating privacy laws. “In setting up Facewatch, we worked with multiple government agencies to confirm the rules of using our watchlist,” said [Facewatch CEO Nick] Fisher. “The laws in the UK on meeting the Substantial Public Interest test are the toughest in the world. That means businesses in other countries can use our solution with confidence.”
This is false. Belgium bans private sector facial recognition explicitly based on its interpretation of the GDPR. IPVM also reached out to France's Data Protection Agency, the CNIL, and they told us private facial recognition for retail "is outlawed in France" based on both the GDPR and prior French law.
GDPR Issue: "Substantial Public Interest" Justification Questioned
The GDPR does not mention facial recognition but bans biometrics processing in Article 9 with a few key exceptions. Facewatch's core legal justification rests on one of those exceptions - "substantial public interest":
Facewatch says it meets this standard, even stating that its technology has the "potential to prevent" terrorism:
because Facewatch is processing data on a national level and is demonstrated to reduce/prevent crime in subscriber properties with the further potential to prevent and detect crime at all levels including terrorism, it is in the Substantial Public Interest. [emphasis added]
Expert: Facial Recognition for Shoplifting Disproportionate
However, whether Facewatch meets the Substantial Public Interest standard is not so clear-cut. Its chief justification is the quote above, a legal opinion from Dean Armstrong, a London lawyer Facewatch hired, rather than from a government official or agency.
IPVM found experts who questioned Armstrong's opinion, particularly given Facewatch's focus on shoplifting. Subhajit Basu, an associate IT law professor at the University of Leeds, said facial recognition for shoplifting was disproportionate:
I have not seen any evidence that the technology is necessary for the retail sector. I am yet to be convinced that it is proportionate and effective considering how invasive the technology is.
Basu said that while "public interest" has an "air of democratic propriety", it could be interpreted in many ways, especially given the lack of clear guidance:
It is not that straightforward, so they are using the concept of public interest, which is a nebulous concept
Terrorism Claim "Artificially Inflating the Public Interest"
Eerke Boiten, director of De Montfort University's Cyber Technology Institute, said Dean Armstrong's opinion about Facewatch having the capability to stop terrorism "sounds like artificially inflating the public interest":
As terrorism, unlike shoplifting, isn’t restricted to taking place in shops, accepting this would mean that there’s a valid justification for any facial recognition taking place anywhere.
Boiten also critiqued Dean Armstrong's opinion that "it is necessary to provide alerts to business subscribers to prevent or detect unlawful acts" [emphasis not added]:
It certainly isn’t necessary in the logical meaning of the word – there are other things that prevent crime besides facial recognition
Key Issue: No Detailed Definition of Substantial Public Interest
The GDPR itself has no explanation of what meets this standard. Neither do the latest EU GDPR Guidelines on Video Surveillance. The UK's 2018 Data Protection Act (DPA), which implements the GDPR, also does not define this standard, even though Facewatch's CEO has claimed UK laws on Substantial Public Interest "are the toughest in the world".
The closest the 2018 DPA gets to adding clarity is stating that biometric processing is allowed for "preventing or detecting unlawful acts" when "necessary for reasons of substantial public interest", a phrase that is itself not defined:
(This is the same clause interpreted by Dean Armstrong in Facewatch's favor).
Other Issue: Long Length of Storage
Facewatch keeps suspect data for two years "if you’re never seen stealing again". This is a very long time, and EU authorities typically recommend far shorter storage periods, even though the GDPR has no set storage limits. The EU's provisional GDPR for Video Surveillance Guidelines state that for fighting vandalism in everyday retail, "a regular storage period of 24 hours is sufficient":
The EU adds:
In general, legitimate purposes for video surveillance are often property protection or preservation of evidence. Usually damages that occurred can be recognized within one or two days.
While the EU does not mention shoplifting per se, "property protection" for retail is essentially the same.
IPVM asked the UK's data regulator and GDPR enforcer, the ICO, whether it thought Facewatch met the "substantial public interest" standard, and about other potential GDPR issues with the firm's business model. However, we received a generic response:
Organisations wishing to automatically capture and use images of individuals in public spaces need to provide clear evidence to demonstrate it is strictly necessary and proportionate for the circumstances and that there is a legal basis for that use.
Facewatch Response: Public Interest "Matter of Judgement"
Facewatch told IPVM over email and in spoken interviews that it was confident it met the substantial public interest, regardless of the experts' criticisms:
Clearly substantial public interest is a matter of judgement and in our judgement and that of our QC [Queen's Counsel, i.e. the lawyer they hired] and DPO [Data Protection Officer] we are comfortable that we can meet this test. There are views on both sides - if like us you spoke to the shopkeepers and shop staff being put out of business, being assaulted and losing £X per week you would get a very different view.
Every single thing you buy has been increased in price by 1% if not more because of people stealing, the extra cost of guards, cameras, cabinets, extra staff.
What’s important to remember is that Facewatch is preventing and deterring crime: it’s not just shoplifting. One of the biggest problems is retail violence and abuse of staff and everything else
Regarding the EU's recommended storage period for retail, Facewatch said IPVM was "comparing apples and pears!":
The rules for CCTV retention are quite different from FR data. Facewatch does not keep/store or hold FR data at all unless the data has matched a watch list of known thieves from an individual store
If anything, though, facial recognition data (as a "special category of personal data" under the GDPR) is even more sensitive than CCTV footage, although the GDPR gives no explicit storage limits in either case.
Regarding the fact that Facewatch's retail solution is banned in at least two GDPR countries, Facewatch responded:
It’s often accepted that the UK tends to set the standard when it comes to establishing legal position and [CEO Nick Fisher's] comment earlier this year was supporting that. Facewatch confidently believe that facial recognition and the communication of its use in a particular area acts as a deterrent to crime.
Unprecedented UK Face Rec Scrutiny
Facial recognition is under heavy scrutiny in the UK after the Financial Times recently revealed that a prominent London development at Kings Cross is covertly using the technology. This sparked an ongoing ICO investigation which is also looking into "whether the current [regulatory] framework has kept pace with emerging technologies", while the UK Biometrics Commissioner has called for laws tackling both private and public use.
Facewatch itself is facing rising scrutiny from the media and activists, with major UK newspaper The Observer publishing a detailed profile of Facewatch and similar companies, stating they raise "concerns about civil liberties" and quoting the prominent privacy organization Big Brother Watch calling for a total "moratorium" on facial recognition:
On August 19, Big Brother Watch announced it had lodged a complaint with the ICO "about the epidemic of private facial recognition surveillance in the UK" after the organization released a report about private facial recognition use (which did not mention Facewatch).
However, Big Brother Watch's director has published a blog post directly criticizing Facewatch and its business model:
The watchlists used by retailers are privately compiled, often populated with 'undesirable’ people not guilty of any crime whatsoever.
The link goes to this Big Brother Watch tweet criticizing a Facewatch ad:
Potential Regulatory Risk for Facewatch
The risk for Facewatch is that, amid all the (mostly negative) attention on UK facial recognition, new and much clearer laws or regulations could interpret the GDPR far more strictly. However, Facewatch said it is not concerned and in fact embraces the prospect of legislation:
We are very keen for legislation to be introduced and are confident that our business model is a positive force for good.
Facewatch said that unlike Kings Cross, it is not being investigated by the ICO and is actually "working with them closely as part of their current FR review." The ICO did not reply specifically when we asked whether Facewatch is being investigated; instead, it sent another generic statement:
The ICO is currently investigating the use of facial recognition technology by law enforcement in public spaces and by private sector organisations, including where they are considering partnering with police forces. This is a priority area of work for the ICO.
The lack of clear facial recognition rules in the UK has led to varied interpretations of the GDPR, but this looks likely to change.
How that impacts Facewatch remains to be seen. It could end up being a boon for the company, but if it does not, it will clearly highlight the risk of touting GDPR compliance claims prior to the actual rules being fully detailed.
As Facewatch themselves told us in one of our interviews:
There aren’t really any regulations at all and that’s part of the problem.