Eagle Eye CEO Falsely Tries To Redefine Facial Recognition

Published Feb 02, 2024 13:09

Eagle Eye competitor Verkada denied that it offered face recognition before admitting the truth. Now, Eagle Eye's CEO is making similar claims.

IPVM Image

In this report, we examine what Eagle Eye is doing, why it is misleading, and how face recognition is defined and regulated.

**********

***** *** ******* ************ ** * ****-**************** **, ****, ******* ****** *********** ***** *** ***** ***** ** "video ************ **** **********": "**** **********, face ********, *** **** ***********."

IPVM Image

********* ** *****, **** **********:

********** ** ***** ** * **** in * ***** ***** *** ********** where ** *** ***** *** **** is *******

** ********, ***** ******* **** ******** as:

**** ******** **** ********** ** **** faces **** *** *******...

There ** ** ****, ** ******, ****** ****, *******, ** ********* *********** ********* ** *** **** *****. The ******’* ******** ** *** *****. [emphasis added]

***** ******** **** ******** **:

******** ****** ***** ** ************, ***********, **items **** *** ********, but uses the person’s face instead. [emphasis added]

**** ** ** *** ****** ***** gait ** **** ******** ** * biometric (*** *** **** ** ******** used, *** * ********* ***********, ***, e.g.,**** *********** ********).

****, ** ******* **** *********** ** a ****** ********** ***:

**** *********** ******** *** ***** ** a ******’* **** **** ******** *********** by ******** ***** ****** ***** ****information **** ** ***** ********, ****** **** *******, **********-****** **, ******** *******, ****** *******, ** ***** ******* [emphasis added]

**, ********* ** *****, *** *** difference ******* **** ******** *** **** recognition ** **** *** ****** *** connect * ******'* **** ** ***** identifying *************. ** **** *** *** that *** ********** ********** (*.*., *** analytics **** ** ********/*** *****) *******.

***** **** ** ** **** **** face *********** *** * ******* ** privacy *** ********** ******** *** **** not *** **** **** ******** ****** to **** ********.

*****, ** ******* *** *****'* *********** between *** *** **** *** **** up.

Verkada ****** **** *********** ****** ******** ******

*****'* ********** ****** **** **** ** Verkada, *** ******* ** **** ***** facial ********* "****** ***********" ***** ******** 2023.

**** *****, ******************* **** **** ***** ************** ******* ********* "* **** ***** feature," ******* ** *** *** ******* faces ** ** "******** ********" **** references "***** ********** ************ ***********," **** where *** ****** *******. *** ********, Verkada ******* *************** ********** ** ********* ******* ******** ********** (***)******* ** ****** *, ****, ****:

****** ***********, *** **********, ** *** as **** **** ** *** **, what **'* ******* ** **, ** course, ******** * ***** **** *** like * ******* ***** **, *** know, *something **** **** **** ** ******** ********, right, that lists a number of faces and profiles and identifies this person as a specific individual where they live, what their name is, maybe other personally identifiable information. Verkada **** *** ***** ****. ** ** *** ** ****** ***********. We ***** ****'* **** ** **** * **** ***** *******. [emphasis added]

******* *** *** ****** **** ***** December ****, **** ********* ******** **** *** ********** *** face ***********.

Face ******** ** **** ***********

******* ***** **** **** ******** ********** do *** **** *** **** *********** between **** ******** *** **** *********** or *** **** * ********** **** connect ** "******** *********" ** ** a **** ** ********* **********. **** like*****'*(***** ***** *** ** *****),********'*, *************'***** ***** **********, ***** **** **** matching *** **** *********** ***** **** under *** ******* ******** ** ************ that ******* "**** ********" ** "*********[*]." Those **** ** *** *** **** face ********* ********** ** ****** **** because ** **** *** ****** *******'* government ** ** ******* *******.

**** ********* ******* ******* ******* ** 2021 ***** *******'* *********** ******* **** matching *** **** ***********, *** **** agreed **** ** *** **********.

***** *********, *** ********* ******** ** ***** Alliance, **** ** **** **** ******** and **** *********** *** "**** *** parcel ** *** **** *******":

**** ******** ** *********** *** ***** step ** * ****** *********** *******. The ***** **** ** ****** *********** is ** ******** * ****** **** print **** * ***** ** * set ** ***** ******.

*** ****'**** *************, ******* **** ******** * "****** of '**** ***********'":

*'** ***** ***** **** ********** ******* to ***** ***** **** *********** ******. I'd *** **** ** *** ******** of "**** ***, *** **."

When ***** ******** ****** ** *** ***** **** *** **** "**** ***********" **** **** ***-**-**** *** *:* ******** *** ********** ** *******. And it certainly doesn't matter whether the database matched against is first or third party.

1:1 **** *********** *** **** ** ****** "**** ********" ** *** ****, *** ** * ********** ** *** *** ***** *** ******** **** ** *** ********, **** ** * ****** ** "**** ***********."

** *** ** ********** ** ******* policy ******** ** *********** ******* ***-**-**** searches *** *:* ************, *** ****'* not **** ****'** ***** ****.
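To illustrate what both operations involve, here is a minimal sketch (Python/NumPy with made-up embedding vectors, not any vendor's pipeline): 1:1 verification and one-to-many search compare the same kind of face templates, differing mainly in the size of the gallery and the decision rule.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (higher = more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 128-dimensional embeddings standing in for some face model's output.
rng = np.random.default_rng(0)
probe = rng.normal(size=128)              # face seen in live or recorded video
claimed = rng.normal(size=128)            # enrolled template for one person
gallery = rng.normal(size=(1000, 128))    # many enrolled templates (1:N)

THRESHOLD = 0.6  # arbitrary illustrative value, not a vendor setting

# 1:1 verification: "is this the same person as the claimed template?"
is_match = cosine_similarity(probe, claimed) >= THRESHOLD

# 1:N search: "which enrolled template, if any, looks most like this face?"
scores = gallery @ probe / (np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe))
best = int(np.argmax(scores))
is_hit = scores[best] >= THRESHOLD

print(f"1:1 match: {is_match}, 1:N best index: {best}, hit: {is_hit}")
```

Either way, the system measures biometric similarity between faces; whether the gallery holds one template or many, and whether it is a first- or third-party database, does not change what is being computed.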

Eagle *** *******

***** ********* ** ** ******* **** IPVM, ****** **** *** ******* *** not * ******* ** "*** ***** interpretation" ** **** ***********:

***** *** **** *** ********* **** any ****** ******** ** ****** *********** technology ** *** ******.

** ******* **************, ** *** *** ********** ** make *** ******* ***** *** ***** interpretation ** ****** ***********. *** ****** will **** **** *************. ** **** follow **** *** ****** *** *** law *******. ** **** ****** ***** by *** ***.

***** ********* **** ***** *** ***** for ******-**** *********** *** ****** ***********:

** *********** **** ****** *********** ** a ****** ******* *****. *** **** is ** ****agreed-upon *********** of the different aspects of the technology so we can have meaningful and constructive conversations. [emphasis added]

**, ******* ******* *** ******** ********** regulations ******** **** *****.

Eagle *** *******

** ***** **** ** *** ********* to ****, ***** *** "**** ************[*****] *** ****** ******** ** ****** recognition **********" [******** *****].

**** ******** **** *** ****** ** the **** ******. ** ***** ** odd *** ***** ** ******** ******** for * **** ******* ************** ** face *********** **** ******** **** ******** if ***** *** **** *** ******** to **** *** * ******* **********. IPVM **** **** ***** ** *** update ** *** *** ************.

**** *******, ***** *** ** ******* marketing ***** ********* / ** ******** following*** **** *********** ** ******* ******.

Regulatory **************

** ***** *** **** **** ** roll *** * "**** ********" *******, Drako's ******** ***** ** ** ***** attempt ** **** *** ******* **** around ***********. ** ***** *****, ******* states -********* *****, ***** ***** *** ** ***** - **** ********* ******* **** **** require ********* ***** **** *********** *** similar ************ ** ****** ******* ****** capturing ********* ***********.

****** *** **** ***** ** "**** matching," *********** ** ** ********* ******** from **** *********** ***** ** * way ** ******** ****** ** ****** government ***********. ** ***** ** ************ helpful ** ****** **** *****, ***** the ***** ******** ******* ** *** only **** **** ** *** ********* (unlike ** ********,***** ******* ******** *** ** **). *******, ***********, *** ***** **** **** ******, has **** **** ****** ************* *****, ********, *** * ** cities.

** ********* ***** ** ********** *** approach.

Comments (13)
Undisclosed Integrator #1
Feb 02, 2024

Well, I happen to see the difference. I agree there is a difference between maintaining an identity database and not. That’s the “hair” being split.

Let’s say a person is seen doing something and you can run a search across many cameras, like with the Boston Bomber, and find the person appears multiple times across many areas, leading you to identify them as the suspect.

Is that the same as taking that image and now running it through a comparative analysis of DMV images and matching it to a specific person?

Yes, the underlying technology is the same, but the use and results are different.

IMHO

John Honovich
Feb 02, 2024
IPVM

Well, I happen to see the difference

Thanks for responding. The point is that both of these types, albeit different, are facial recognition; that is, "face matching" is indeed "facial recognition" (as in your Boston Bomber example) even if there is no ID analysis, etc., being used.

If Eagle Eye thinks otherwise, we encourage them to deploy "face matching" in Illinois and watch as they are inevitably sued.

Also, see Motorola Warns Customers To "Consult Your Legal Advisor" Over Avigilon Appearance Search

Undisclosed #2
Feb 02, 2024

Biometric Update had an article on the Gardaí (Irish Police) using this distinction in their use of facial recognition:

The police would not match individuals against a database with a 1:N algorithm, but would instead use face biometrics to assist in finding each instance of an individual engaging in criminal activity contained in huge volumes of footage.

Their spokesperson did specifically claim they are looking for "facial identification" rather than facial recognition, although it seems like what they are doing is similar to the "Boston Bomber" example above:

“Facial recognition technology is not actually what we’re seeking, we’re seeking facial identification,” he said.

John Honovich
Feb 02, 2024
IPVM

article on the Gardaí (Irish Police) using this distinction

Thanks for sharing!

The GDPR has some exceptions to its biometrics / facial identification regulations for "substantial public interest" / police; from our GDPR guide:

the GDPR recognizes a large number of exceptions to these Article 9 prohibitions. "Reasons of substantial public interest", which is not further clarified by the GDPR, is the main one cited for video surveillance, and typically used for crime and law enforcement-related purposes.

Undisclosed Integrator #1
Feb 02, 2024

John, I watched many an episode as a child of the original Perry Mason show and later in life Matlock. So, I think I know something about the law!

I don’t believe the Illinois law specifically and only references the use of “facial recognition,” so the subtleties wouldn’t apply there, in my very educated opinion.

John Honovich
Feb 02, 2024
IPVM

I don’t believe the Illinois law specifically and only references the use of “facial recognition” so the subtleties wouldn’t apply there

The Illinois law is broad, excerpts:

"Biometric identifier" means a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry...

"Biometric information" means any information, regardless of how it is captured, converted, stored, or shared, based on an individual's biometric identifier used to identify an individual.

This has resulted in large corporations typically disabling cloud face features. For example, Google Nest has such functionality, which it calls "Familiar face detection" (not admitting by name that it is facial recognition), but it still disables it, citing Illinois law:

Nest’s familiar face detection feature is not available on Nest cameras used in Illinois. Certain state legislation may affect Illinois customers’ use of the feature, so we disabled it as a precaution. If the home where you’re using your camera is in Illinois, you shouldn’t be able to turn on familiar face detection in the Nest app.

Companies and people can come up with various excuses or interpretations (e.g., the Illinois law says "face geometry," but modern deep learning-based systems are arguably not based on geometry, unlike the older tech); however, the same privacy concerns apply.
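To make that concrete, here is a minimal sketch (NumPy only, with made-up numbers rather than any real product's output) of the two styles: an older system derives explicit measurements between facial landmarks, while a modern system produces an opaque learned embedding, but both end up as numeric templates derived from a person's face.

```python
import numpy as np

# Hypothetical (x, y) coordinates for a few facial landmarks, as an older
# "face geometry" style system might measure them.
landmarks = np.array([[120, 140],   # left eye
                      [180, 141],   # right eye
                      [150, 185],   # nose tip
                      [150, 225]])  # mouth center

# Geometry-style features: explicit distances/ratios between landmarks.
eye_distance = np.linalg.norm(landmarks[0] - landmarks[1])
eye_to_mouth = np.linalg.norm(landmarks[:2].mean(axis=0) - landmarks[3])
geometry_features = np.array([eye_distance, eye_to_mouth / eye_distance])

# Modern deep-learning style: an opaque learned embedding vector for the
# same face (values are made up here; a real model would produce them).
embedding = np.random.default_rng(1).normal(size=128)

print(geometry_features.shape, embedding.shape)
```

Either way, the system stores and compares a numeric template derived from a face, which is where the privacy concern lies.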

Undisclosed Integrator #1
Feb 03, 2024

I’m pretty sure Illinois is general enough to encompass all subtleties.

Personally, I think Pandora left the box, the smoke left the processor, and we will succumb to a registry of full biometric cataloging, if not by our country, then by another.

Undisclosed Manufacturer #3
Feb 06, 2024

Thanks for publishing on this topic, which helps clarify the nuances. I always heard the term being used loosely as well. However, I still draw a distinction between face recognition (matching) and face identification. By the first term, I mean matching a face against images captured at different times and/or locations, with no information about the person other than what can be gleaned from the image, the camera location, and the timestamp. The latter, face identification, usually requires a connection to external databases such as DMV, NCIC, etc., which match the captured face to determine the actual identity of an individual.
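For illustration, the two workflows can be sketched side by side (a minimal Python/NumPy sketch with made-up embeddings and names, not any specific product's implementation): the comparison step is identical, and the difference is whether the gallery rows are tied to identity records.

```python
import numpy as np

def most_similar(probe: np.ndarray, gallery: np.ndarray) -> tuple[int, float]:
    """Return index and cosine similarity of the gallery entry closest to probe."""
    sims = gallery @ probe / (np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe))
    idx = int(np.argmax(sims))
    return idx, float(sims[idx])

rng = np.random.default_rng(2)

# Face "matching": embeddings of face crops from different cameras/times,
# with nothing attached but a camera ID and timestamp.
sightings = rng.normal(size=(50, 128))
probe = sightings[0]
idx, score = most_similar(probe, sightings[1:])
print(f"Most similar sighting: #{idx + 1}, similarity {score:.2f}")

# Face "identification": the same nearest-neighbor search, but the gallery
# rows are tied to identity records (names, DMV-style data), so a hit
# returns who the person is, not just where they appeared.
identity_records = ["Alice Example", "Bob Example", "Carol Example"]  # made-up
enrolled = rng.normal(size=(3, 128))
idx, score = most_similar(probe, enrolled)
print(f"Closest identity: {identity_records[idx]}, similarity {score:.2f}")
```

In both cases the same face template is being computed and compared; what changes is what the system returns when it finds a match.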

Undisclosed Integrator #4
Feb 07, 2024

From a technical standpoint, Drako's pointing out a pretty clear and fairly obvious distinction in functionality: whether the system is attempting to provide an identity, versus searching only for similar appearances.

The difference between identity and similarity is not difficult to understand.

Several people may look like Tracy Chapman, which means they would all look alike to a greater or lesser degree.

But only one, at most, could BE Tracy Chapman.

If I say, "Hey, you look like Tracy Chapman!" I'm not saying "Hey, you ARE Tracy Chapman!"

One is a description; the other is an identification.

The concept of identity is the root of any concept of privacy.

William Howard-Waddingham
Feb 07, 2024

Thanks for your comment! Two responses: (1) under relevant regulations, the distinction between “provid[ing] an identity” versus “searching for… similar appearances” is not applicable, and (2) face matching against live video carries significant privacy risks itself.

First, from a regulatory perspective (including in Texas, where Eagle Eye operates), what matters is whether a technology can analyze “face geometry.” It does not matter whether the analysis is being done to “searc[h] for… similar appearances” or to “provid[e] an identity.” One focus of Drako’s article is pointing out that face recognition might face regulatory roadblocks, with the implication being that face matching does not have the same obstacles:

News stories around the world document the public’s concern about face recognition uses that invade personal privacy, are discriminatory, insecure, or that are intended for nefarious or unethical purposes.

Legislative bodies have responded with a variety of comprehensive and far-reaching proposals. [emphasis added]

It is not true that face matching does not face regulatory obstacles, as we explain in the report. Since face matching can analyze "face geometry," it gets the same regulatory treatment as face recognition.

One reason the regulatory treatment is the same is that, to use Drako's language, face matching can also "invade personal privacy." Drako mentions in his article that "Face matching can be used to search historic video, multiple cameras, or live video." There are major privacy concerns about face matching against live video - for instance, someone can be wrongly booked for a crime if their face resembles a wanted person’s. In one recent instance, a woman named Porcha Woodruff was arrested by Detroit police in August 2023 after face recognition wrongly identified her as a carjacking suspect whose face had been captured on surveillance footage.
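As a rough back-of-the-envelope illustration of why that risk grows with live video (all numbers below are assumptions for illustration, not measured accuracy of any product):

```python
# Illustrative only: made-up numbers, not any vendor's benchmarks.
false_match_rate = 1e-4        # assume 1 false match per 10,000 comparisons
watchlist_size = 500           # assume 500 enrolled faces on a watchlist
faces_seen_per_day = 2_000     # assume cameras see 2,000 distinct faces daily

comparisons_per_day = watchlist_size * faces_seen_per_day
expected_false_matches = comparisons_per_day * false_match_rate
print(f"Expected false matches per day: {expected_false_matches:.0f}")  # ~100
```

Even a seemingly low false match rate, multiplied across a watchlist and a steady stream of faces, produces regular false hits, each of which is a potential wrongful identification.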

Undisclosed Integrator #4
Feb 07, 2024

For that to really be pertinent, it would be necessary to prove that any particular system was using "face geometry" as opposed to some other method of comparison & analysis of visual data, which would then devolve into a rabbit-hole discussion of the precise technical and legal definition of that term... and a short bout of web-searching does not turn up a comprehensive legal definition of "face geometry"; technical definitions of it pivot around mathematical aspects of single polygons, so are hardly applicable.

It is possible to compare images or faces without strictly using "geometry," so the term and the definition are important... which was possibly the point Drako was trying to make. Can't really discuss much of anything without agreed referents for significant terms.

But given that Drako already replied that 1) he wasn't speaking from a regulatory or legal aspect, only a terminological one, 2) his company currently doesn't offer this function, and 3) they intend to abide by the law (Texas or otherwise) in any case for anything they may develop in the future... I'm not really sure how well supported the speculation that he/his company may be doing 'battlefield prep' to "work around regulations" really is.

I'm not a Drako apologist, but it seems like at least three levels of abstraction are at work here, even if "working around regulations" were not a normal & time-honored human function going back millennia.

If "face matching against live video" truly becomes illegal, every security guard in a SOC just lost their job the moment they claim to recognize the same 'person' from two different frames of video.

William Howard-Waddingham
Feb 07, 2024

It is possible to compare images or faces without strictly using "geometry;" so the term and the definition are important

Companies offering facial recognition are certainly free to challenge regulations like Illinois's and Texas's if they believe their offerings do not use face geometry and shouldn't be covered. But when the regulations were introduced ~15 years ago, face geometry was a common method of conducting face recognition and the intent was to regulate face recognition, broadly. Companies have understood the regulations in that way - Verkada, for instance, shut off its face recognition systems in Illinois, Texas, and several other places between 2022 and 2023.

If a company did challenge the laws on the basis that its product offerings do not use face geometry, we expect privacy advocates would strongly oppose the challenge. If such a challenge were successful, we expect lawmakers would revise the laws to more expansively cover face recognition. The concern is with facial recognition, broadly.

To the original point, when Drako distinguishes between face matching and face recognition, he is not saying that one uses face geometry while the other does not. We mentioned face geometry in our reply just to illustrate that the distinction that Drako does make - that face recognition accesses caches of other sensitive identifying information like government ID - is not applicable under the relevant regulations, which do not mention such databases.

If "face matching against live video" truly becomes illegal, every security guard in a SOC just lost their job the moment they claim to recognize the same 'person' from two different frames of video.

Neither we nor regulators are suggesting that humans might be violating biometric privacy laws by using their eyes to match faces against live video. The laws are clearly about computer systems.

Undisclosed Integrator #4
Feb 07, 2024

Of course, there was a time when privacy advocates and regulators were concerned with officials doing exactly that - using live video feeds to "be there without being seen being there."

Go back far enough, and there were challenges to the use of photographic evidence as being an invasion of the right to anonymity - that is, to privacy.

At least as far back as 1890: Brandeis and Warren. And they referenced much older concepts.
