Dahua Provides "Uyghur Warnings" To China Police

Published Feb 09, 2021 14:06
PUBLIC - This article does not require an IPVM subscription. Feel free to share.

Dahua provides "real-time Uyghur warnings" to China police, as Dahua's own technical documents, discovered by IPVM, reveal. The video below explains what Dahua is doing:

This proves Dahua lied when it claimed it "does not sell products that feature [an] ethnicity-focused recognition function" after it was revealed that Dahua had Uyghur tracking in its code.

IPVM collaborated with the LA Times on this report; see the LA Times article: Race detection and ‘real-time Uighur warnings’ — inside Dahua’s facial recognition technology.

Dahua China Support Documents

Dahua's China product support portal, via a link on Dahua's own website, documented Dahua police video surveillance projects including numerous explicit references to Uyghurs.

In China, police security camera networks integrate 'Uyghur alert' AI software that notifies officers when a Uyghur-looking face appears in video surveillance footage.

Dahua Heart of City

IPVM Image

Heart of City (HOC) is Dahua's smart police solution. Dahua describes HOC:

"Improves the ability of police intelligence" by creating "a new 'online police' model that enhances prediction, early-warning and prevention capabilities of the police" [emphasis added]

"Real-Time Uyghur Warnings" For "Smart Police"

"Real-time Uyghur warnings" (维族人员实时预警) are listed by Dahua in a December 2019 document for Dahua's Smart Police Heart Of City program:

Real-time Uyghur Warnings

  1. web operating terminal, human face system- research and analysis, configure/deploy Uyghur person model

  2. Client, facial recognition-data model-model search, choose Uyghur person model, search for corresponding data

  3. prerequisites require the front end to support reporting Uyghur [face] attributes [emphasis added]

IPVM Image

This means that if a Dahua security camera sees what it believes is a Uyghur, it immediately/automatically reports that to police. The reference to the "front end" supporting "reporting Uyghur attributes" means security cameras must be technically advanced/well-placed enough to support automated detection of Uyghur faces. (This is common - Hikvision, Dahua's rival, once promoted a Uyghur-detecting AI camera.)

Big Face Platform Supports Uyghur Warnings

Another Dahua HOC support document listing "standard features", dated March 2020, shows "real time Uyghur warnings" are "supported" by V1.1 of the C9505 Big Face Platform:

IPVM Image

Dahua says the C9505 Big Face platform processes a "massive [amount of] face information" for things like "suspect tracking":

provides capability of network-based access, analysis, processing and visualized display of massive face information, which effectively addresses such issues as difficult query and suspect tracking [emphasis added]

"Real-Time Warning Mode For Non-Local Uyghurs"

A Dahua test report of its own face analytics from March 2019 mentions "real time Uyghur warnings" thirteen separate times. The document targets "non-local" Uyghurs specifically, i.e., Uyghurs who do not already live within the local police jurisdiction:

Adding real-time warning mode for non-local Uyghurs

  1. Log in to the web administrator's face system to study and analyze, click 'Add'
  2. Select the real-time warning mode for non-local Uyghurs, enter the task name, enter the task remarks, and click 'Next'
  3. Select the monitoring range, select the corresponding channel, and the settings appear... [emphasis added]

IPVM Image

As can be seen above, Dahua marked this as "Pass", meaning this feature was included and functional. The guide does not explain why "non-local" Uyghurs are targeted; however, in practice, they would likely draw extra scrutiny compared to those already known to local authorities.

Tracking "Uyghurs With Hidden Terrorist Inclinations"

Another Dahua document is a "big data" guide from December 2020 for Dahua's "Sharp Eyes" projects, the name given to the PRC's wide-ranging police video surveillance program.

The guide mentions a number of categories of people Dahua tracks under "Police Data", including "Uyghurs with hidden terrorist inclinations" (隐性涉恐维族人员), given code "3185":

IPVM Image

Other categories listed next to "Uyghurs with hidden terrorist inclinations" include drug addicts, thieves, prostitutes, fugitives, gangsters, and fraudsters:

IPVM Image

The PRC government considers many mundane things to be signs of "hidden terrorist inclinations", such as having a full beard, owning multiple knives, or going to a mosque. Uyghurs suspected of such "inclinations" are typically rounded up and sent to Xinjiang's brutal 're-education' camps.

Background: Dahua Under Fire For Racist Software

In November 2020, an engineer exposed in a viral tweet that Dahua's SDK included lines of code specifically for identifying Uyghurs. This was covered by several outlets, including The South China Morning Post, which quoted Dahua denying it sells "ethnicity-focused recognition":

Dahua Technology does not sell products that feature [an] ethnicity-focused recognition function

Clearly, the evidence in Dahua's own technical documents shows the contrary.

No Response from Dahua

IPVM contacted Dahua more than a day and a half prior to publishing, detailing the evidence discovered, but Dahua did not respond. Dahua also did not respond to the LA Times' multiple requests for comment.

Dahua Boasted About Human Rights Sanctions

Dahua boasted that human rights sanctions showcase its "strong technological capability" after it was sanctioned by the US government for being "implicated in the implementation of China’s campaign of repression, mass arbitrary detention, and high-technology surveillance" against Uyghurs. Prior to that, IPVM found that Dahua won almost $1 billion in massive police surveillance projects in Xinjiang.

Conclusion

This is clear proof that Dahua is involved in the repression of an ethnic minority. Video surveillance is increasingly at risk of being used to abuse human rights, as Dahua does here, which poses a risk both to public acceptance of video surveillance as a whole and to the people whom governments surveil.

UPDATE April 2021

Since Dahua has made a number of public comments, including denying that it offers Uyghur warnings, we are adding full screencaps of the Dahua webpages where we obtained this information to show, in context, how Dahua designed, tested, and documented its Uyghur warning solution:

IPVM Image

IPVM Image

IPVM Image

Comments (22)
Undisclosed Manufacturer #1
Feb 09, 2021

IPVM Image

Tobias Steiger
Feb 09, 2021

He who pays gives the orders. So the client, the state, is responsible, not the order recipient.

John Honovich
Feb 09, 2021
IPVM

Dahua can choose to sell or not sell. They choose to sell.

Even Huawei, when caught with much less extensive evidence than Dahua, responded:

Huawei opposes discrimination of all types, including the use of technology to carry out ethnic discrimination

Dahua's position is to refuse to say anything and keep making their money, daring any government or organization to do something.

Undisclosed Integrator #2
Feb 09, 2021

What happens to Dahua if it refuses? Will it die?

John Honovich
Feb 09, 2021
IPVM

Refuses what? All evidence points to Dahua being a willing participant in this. Dahua is being paid handsomely (at least $900 million) to do these projects.

MICHAEL PETROV
Feb 09, 2021

So what? Never mind that the tech does not work that well for sorting out Uyghurs; it's not the tech company but the END USER who puts it in action! There was a market need, and it got fulfilled. It is the end user who should be held responsible, which in this case is the PRC gov. How is it different from companies selling the PRC government hard drives and computers for video surveillance storage? AI is a ubiquitous enabling tech that correlates whatever is fed into it.

We are just a couple of years ahead (if that!) of the Chinese in terms of AI corporate social responsibility, and up until last year Amazon, Microsoft, IBM and others were selling face rec for U.S. police surveillance, which is largely not regulated in the USA and morally questionable (though much less effective than the internet user tracking via browser ad ID that we are all subjected to). We also fund research on topics like correlating facial appearance with sexual orientation (Stanford U), with criminal intent (Harrisburg U), with IQ, terrorism and pedophilia (Faception), or reconstructing facial appearance from speech (MIT). We are one of the few developed countries in the world (though not my New Jersey!) that let the police operate the clearview.ai face search engine, which contains billions of photos illegally stolen from different social media sites.

In other words, we may have issues similar to the one disclosed above closer to home, issues where we can actually comprehend the business process and the END USER intent, and hold the END USER responsible.

John Honovich
Feb 09, 2021
IPVM

it's not the tech company but the END USER who puts it in action!

No, the manufacturer developed something specifically to persecute an ethnic minority; this is not some general-purpose product that might be misused by an end user.

Take, for example, an American company like yours: imagine you developed a solution called 'Mexican warnings' that scanned video looking for people who looked Mexican and sold it to Customs and Border Protection. Are you saying that your company would not understand or recognize what would be unethical about this?

To be clear, your company is not doing this and there is no evidence that Customs and Border Protection would do this, but can anyone seriously deny that if a manufacturer developed and sold such a 'Mexican warning' to the border patrol, it would be both unethical and the responsibility of the manufacturer and the user?

MICHAEL PETROV
Feb 09, 2021

If CBP puts out an RFP asking for ID of Mexicans, there will be a dozen US companies replying to it, no doubt! Obviously, CBP would not do that, as it would not pass their internal legal assessment. But case in point: DHS's procurement last year of a security system for protecting TSA management. It called for an investigative system of facial recognition across all images posted on Facebook, Instagram, LinkedIn, etc. - they listed a dozen specific sites! I.e., they implicitly called for the procurement of clearview.ai.

Undisclosed Integrator #4
Feb 09, 2021
John Honovich
Feb 09, 2021
IPVM

DHS procurement last year of a security system for protecting TSA management. It called for an investigative system of facial recognition

Facial recognition is fundamentally different from what Dahua developed here - Uyghur warnings.

Facial recognition can be used in many different ways - to try to find people who disappeared or recognize a VIP or to spot a political adversary, some ways are ethical, some are not.

Uyghur warnings by China police, like hypothetical Mexican warnings by Customs, are developed for a specific unethical task.

Do you see that difference?

MICHAEL PETROV
Feb 09, 2021

John, absolutely agree with you that police tracking Uyghurs is immoral and unethical. But by analyzing documentation leftovers you don't know for certain what the business application is: whether the police apprehend individuals, ignore alarms altogether, or, for example, whether Uyghurs get a 10% bus ticket discount since they are expected to be local to the town (either is unethical and immoral, from where I stand!). And you squarely put the blame on the tech vendor, which is always questionable. When we say that Macy's use of facial recognition is probably not right (as in tracking shoppers who did not opt in), we don't point fingers and expect the vendor to be sued! It's Macy's fault! In fact, in none of the BIPA violation cases was the tech vendor named a defendant, I believe (well, there was one, but it got immediately dismissed).

John Honovich
Feb 09, 2021
IPVM

you don't know for certain what the business application is, if the police comprehends individuals, ignores alarms altogether, or, for example, Uyghurs get a 10% bus ticket discount since they are expected to be local to the town

Michael, Dahua was free to explain this to the LA Times. Maybe in your imaginary world, the PRC police were receiving Uyghur warnings to run out and give Uyghurs red envelopes and hugs.

That said, keep in mind, in Dahua's documentation, Uyghurs are listed alongside drug abusers, gangsters, prostitutes, and thieves:

IPVM Image

MICHAEL PETROV
Feb 09, 2021

Good point John! Evil company, and rightfully sanctioned in the USA. I also like their "People who damage social stability"! That's sooo Chinese - we will never understand. Just an idea for you, closer to home - dig into how U.S. retail uses facial recognition. I have not seen a single good article.

Charles Walker
Feb 11, 2021
IPVMU Certified

I typed up a huge rant on this before reading the other replies, and I changed my comment. Everyone else has clearly made their point, and it's extremely tiresome beating a dead horse.

Undisclosed Manufacturer #3
Feb 09, 2021

To those who say it is the end user who is responsible: that would be the case if the user was merely looking for a certain type of person, etc. However, here the manufacturer specifically wrote code and developed products, firmware, etc. for the end user for this use case. It would be different if it were a generic product and the end user modified it or wrote their own code. But that is not the case here.

Of course they always deny it, point to their documentation of glorious esteem, and claim they do the right thing. And slammers will continue to use low-end junk regardless.

John Honovich
Feb 10, 2021
IPVM

Update, the US Security Industry Association has provided comment:

The Security Industry Association (SIA) believes that the application of technology, such as advanced video analytics, to target a specific population for suppression based on ethnicity or race runs counter to our industry’s goal of creating safety and security for all people. At SIA, we are developing programs that strive to cultivate diversity and inclusion, so any application of technology intended to suppress or exclude a unique population is contrary to the vision and mission which SIA has defined for itself.

Ross Vander Klok
Feb 10, 2021
IPVMU Certified

Was there also a sentence (I know there wasn't, I am being facetious) that said "So we at SIA are cutting all ties with Dahua globally until they denounce and end this reprehensible practice!"? I'll bet you didn't miss that sentence, seeing as they did not even mention the company by name in their statement. The statement itself, I guess, is better than nothing, but not by much.

Put a little more teeth into it SIA!

Undisclosed Distributor #6
Feb 11, 2021

The Security Industry Association (SIA) believes that the application of technology, such as advanced video analytics, to target a specific population for suppression based on ethnicity or race runs counter to our industry’s goal of creating safety and security for all people. At SIA, we are developing programs that strive to cultivate diversity and inclusion, so any application of technology intended to suppress or exclude a unique population is contrary to the vision and mission which SIA has defined for itself, until it affects our income or junkets, or may upset our valuable (give us more cash) partners...

Undisclosed End User #5
Feb 10, 2021

I find the argument that the tech developer and seller is not responsible particularly silly. By that logic, Cyberdyne Systems is off the hook for Skynet. After all, they were just producing the tech their end users wanted...

John Honovich
Feb 12, 2021
IPVM

Update, Dahua has provided a new response, copied in full:

We here reconfirm:

In the regional markets reported by the LA times, i.e., Xinjiang, PRC, Dahua never provided products or services for ethnicity detection.

For the countries and regions outside of China, Dahua never provided products or services for ethnicity detection.

For the regions other than Xinjiang in China, according to days’ investigation, we haven’t found any products or services for ethnicity detection.

That Dahua did not provide Uyghur detection to Canada or the UK, etc., is totally believable, since no Canadians or British desire to get warnings about Uyghurs.

The issue remains for Uyghur warnings inside China. Our response to Dahua:

Can you clarify what you mean by 'any products or services'? The documentation we found shows products/services for ethnicity detection. Or, asked another way, when Dahua specifies 'real time Uyghur warnings', what does Dahua mean by that, if not ethnicity detection?

John Honovich
Feb 25, 2021
IPVM

Dahua has not responded to this question despite our asking and following up three times in the past two weeks.

John Honovich
Apr 08, 2021
IPVM

UPDATE April 2021

Since Dahua has made a number of public comments, including denying that it offers Uyghur warnings, we are adding full screencaps of the Dahua webpages where we obtained this information to show, in context, how Dahua designed, tested, and documented its Uyghur warning solution:

IPVM Image

IPVM Image

IPVM Image