
Verkada Shuts Off Facial Recognition In Four Jurisdictions

Published Jan 26, 2023 13:28

In 2021, Verkada blatantly and falsely declared that its facial recognition was not facial recognition, in a failed attempt to circumvent local bans and restrictions on the technology. Now, the company has quietly shut off facial recognition functions in four US jurisdictions.


In this report, IPVM examines the changes Verkada made, the legal issues in those specific jurisdictions that make facial recognition an especially tough sell, and the growing controversy surrounding the technology.

2021 Position: No Facial Recognition

Eighteen months ago, Verkada was adamant that the same features it has now shut off in four jurisdictions were not facial recognition. For example, a Verkada engineer argued that what Verkada offered was "not facial recognition, because we're not tying anybody to that third party database."

Verkada defended the statement at the time, saying that it "speaks for itself and should be taken in full." The company was advocating, unsuccessfully, for its products not to be rejected under a city prohibition on "acquiring facial recognition technology."

"Person Detection" Availability

Sometime in the past six months, Verkada has quietly added a "cameras availability" webpage showing that its "Person Detection" functions are not available in Texas; Illinois; Baltimore, Maryland; or Portland, Oregon.


It describes "face detection" as cameras keeping "an ongoing log of faces that appear on recorded footage."

"Gender appearance" is described as making a prediction of gender "based on a variety of datapoints," and allowing footage to be searched by predicted gender.

"Person of interest" notifications are described as "proactively set alerts for when a person is detected who has a face that matches a preselected profile."

Representatives for Verkada did not reply to IPVM's request for comment on the decision not to sell facial recognition in those jurisdictions.

Legal Issues

Although many states and municipalities either have rules or are considering rules aimed at limiting the use of facial recognition by law enforcement, only a handful of jurisdictions restrict or ban the use of the technology by private companies, according to Nate Wessler, deputy project director for the ACLU's Speech, Privacy and Technology Project.

Those include all four of the jurisdictions where Verkada is not offering the technology, Wessler told IPVM. Some additional jurisdictions not on Verkada's list, such as New York City, allow private companies to use facial recognition and other biometric identifier information if they notify patrons using conspicuous signage.

Washington state, also not on Verkada's list, has a law that requires companies to obtain consent when gathering or selling biometric data for a "commercial purpose," but there is a broad exception for "security."


Illinois

Passed by the state legislature in 2008, the Illinois Biometric Information Privacy Act is the only state law that not only bans the collection of biometric data by private companies without consent but also allows individuals to sue companies for violations. The ACLU of Illinois led the initiative.

The ACLU and other public interest groups sued Clearview AI in May 2020 under the Illinois law, alleging that its "face surveillance activities" violated the act and represented "an unprecedented threat to our security and safety."

Clearview AI reached a settlement with the groups in May 2022 in which it agreed not to sell services to any private company in the U.S. The company also agreed not to sell services to any government entity in Illinois for five years.

The ACLU wants to see other states implement measures similar to Illinois'. "Illinois has the best in the nation law, and it is way overdue for other states to catch up," Wessler told IPVM.

Texas

Texas passed a similar biometric privacy law in 2009, requiring consent for gathering data for a "commercial purpose." Only the state attorney general can sue under the law to recover penalties of up to $25,000 per violation.

Long considered "dormant," the law was finally used in 2022 by the attorney general to sue Facebook parent Meta and Google over alleged violations resulting in billions of dollars in penalties, according to the Wall Street Journal and Bloomberg.

During a recent webinar, Verkada Senior Product Marketing Manager Liam Sosinsky fielded a question about why the company was not offering "people analytics" and "people identifying software" in Texas.

That restriction of people analytics for Texas is a statewide requirement. So it's not specific to Verkada. It's specific to anyone who provides technology similar to ours. And the reason we turn it off by default is to protect you as a customer.

He continued:

Really, it's up to Texas whether they want to change the regulations around the use of people analytics and people identifying software. We do not see that changing anywhere in the near-term, nor is that under our control, unfortunately.
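To illustrate the "off by default" approach Sosinsky describes, a vendor could gate face-based analytics behind a list of restricted jurisdictions. The sketch below is an assumption about how such gating might be structured; the jurisdiction list mirrors Verkada's availability page, but the feature names and gating logic are hypothetical, not a description of Verkada's software.

# Hypothetical sketch of jurisdiction-based "off by default" feature gating.
# Jurisdiction list mirrors the availability page discussed above; feature
# names and the gating logic itself are assumptions.

RESTRICTED_JURISDICTIONS = {
    "TX",             # Texas biometric privacy law (consent required)
    "IL",             # Illinois BIPA
    "Baltimore, MD",  # city ban on private face surveillance
    "Portland, OR",   # city ban on private use in places of public accommodation
}

RESTRICTED_FEATURES = {"face_detection", "person_of_interest_alerts", "gender_appearance"}

def feature_enabled(feature: str, jurisdiction: str) -> bool:
    """'Person Detection' functions default to off in restricted jurisdictions;
    other analytics remain available."""
    return not (feature in RESTRICTED_FEATURES and jurisdiction in RESTRICTED_JURISDICTIONS)

# Example: person-of-interest alerts are disabled for a Texas deployment.
assert feature_enabled("person_of_interest_alerts", "TX") is False
assert feature_enabled("face_detection", "CO") is True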

Baltimore

In 2021, the city council passed a sweeping bill banning private sector use of facial recognition technology and barring the city from contracting with businesses to obtain the technology.

The bill prohibits the city government from "purchasing or obtaining certain face surveillance technology" and also prohibits "any person in Baltimore city" from "obtaining, retaining, accessing, or using certain face surveillance technology or any information obtained from certain face surveillance technology."

Portland, Oregon

Portland instituted a facial recognition ban in September 2020. The city adopted two ordinances. One banned "the use and acquisition of facial recognition technologies by City bureaus" and went into effect immediately. The other took effect Jan. 1, 2021, and banned private entities "from using facial recognition technology in places of public accommodation" within Portland.

Growing Controversy

Although the use of facial recognition by law enforcement and private companies has long concerned privacy and civil liberties advocates, high-profile incidents involving private companies have attracted significant public outcry over the past few months.

Specifically, New York's MSG Entertainment has used facial recognition to deny entry to attorneys working for law firms involved in litigation against the company. One lawyer denied entry to Radio City Music Hall in December was trying to attend a Christmas show with her daughter.

The company has also refused entry at Madison Square Garden to lawyers who were holders of New York Knicks season tickets.

New York Attorney General Letitia James sent a letter to the company in January 2023 requesting more information about its use of facial recognition technology to prohibit entry to valid ticket holders.


James said the actions "could violate local, state, and federal human rights laws, including laws prohibiting retaliation."
