Dahua Offers Race Recognition - "Black", "White", "Yellow"
By IPVM Team, Published Feb 09, 2021, 11:00am EST

Dahua offers race recognition, verified by Dahua's own software client, SDK, technical documentation, and public bids, IPVM has found.
Dahua says it can recognize, in its own words, "white" vs "yellow" vs "black". The company touted a "skin color"-detecting AI camera in China, and now its latest end-user software includes "race" recognition.
This directly contradicts Dahua telling media that it "does not sell products that feature [an] ethnicity-focused recognition function".
IPVM collaborated with the LA Times on this report, see: LA Times Race detection and ‘real-time Uighur warnings’ — inside Dahua’s facial recognition technology.
SmartPSS Includes "Race"
Dahua's latest client/end-user-facing software, SmartPSS, includes "race" as a "face recognition" category:
The software connects to Dahua devices offering race recognition.
Dahua SDK - Black, White, Yellow
Dahua's own SDK, from its 'downloads' webpage, includes three 'races': "black", "white", and "yellow":
This was the same code that went viral on Twitter for including code specifically targeting Uyghurs, leading Dahua to delete the SDK and claim to The South China Morning Post that it "does not sell products that feature [an] ethnicity-focused recognition function".
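To illustrate how such a category typically surfaces to developers, below is a minimal, hypothetical sketch of a face-attribute enum and result struct as they might appear in a C SDK header. The identifier names are invented for illustration and are not copied from Dahua's actual SDK; the point is that a "race" attribute of this kind is exposed to integrators as an ordinary field on a detection result, making it trivial to consume downstream.

```c
/* Hypothetical illustration only - the identifiers below are invented,
 * not taken from Dahua's SDK. It shows how a "race" attribute is
 * typically exposed in a face-analytics SDK: as a plain enum field on
 * the per-face detection result returned to the integrator's code. */

typedef enum {
    FACE_RACE_UNKNOWN = 0,
    FACE_RACE_BLACK,    /* "black"  */
    FACE_RACE_WHITE,    /* "white"  */
    FACE_RACE_YELLOW    /* "yellow" */
} face_race_t;

typedef struct {
    int          track_id;    /* ID assigned to the detected face       */
    face_race_t  race;        /* classifier output, one of the above    */
    unsigned int confidence;  /* classifier confidence, 0-100           */
} face_attributes_t;
```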
Dahua Camera Includes "Skin Color"
Dahua touted "skin color" recognition on its own China product page for the DH-SD-8A1240WC-HNF, which Dahua says supports "Ethnic Skin Color" (人种肤色) recognition:
Notably, Dahua's crosstown rival Hikvision touted an Uyghur-detecting AI camera in 2019, IPVM revealed.
Bid Mentions Dahua Race Rec
A bid request from China's Central South University in Hunan Province for a different Dahua camera mentioned the same term, "Ethnic Skin Color" (人种肤色), in the specs:
IPVM could not find more publicly available details about this specific Dahua camera model (DH-SDT-5X409-4F-06ZH-WA).
Integrator Drops Dahua Over "Race" Analytics
Daniel Lewkovitz, CEO of Sydney security provider Calamity Monitoring, who first notified IPVM about SmartPSS offering the "race" category, said he was dropping Dahua in response:
"We have now moved away from Dahua because of this and other security providers now have an ethical decision, not simply one of price. This decision may hurt us commercially in the short term but our values are everything and I will not support this type of behaviour."
Risk Of Race Rec Spreading
Race analytics usage is almost unheard of among video surveillance integrators outside China (although some Western face recognition firms, like Facewatch, do provide it). As these controversial analytics are integrated into Dahua's global (non-PRC) solutions, the risk of race detection software spreading worldwide grows.
Risk For Dahua
This also presents a long-term risk for Dahua as race analytics are much more controversial outside China and more firms may drop Dahua for ethical reasons. Then again, Dahua is a company that boasts of human rights sanctions showcasing its "strong technological capability".
No Response from Dahua
IPVM contacted Dahua more than a day and a half prior to publishing, detailing the evidence discovered, but Dahua did not respond.
Comments (51)
Undeniable proof.
Unfortunately, without a large public outcry, most integrators using Dahua now will ignore this and continue to use them.

Geez.... it's almost too crazy to think this is real.
Congratulations to Daniel of Calamity for showing some leadership in this.
Now we just need more to follow his lead & hit this scumbag company where it really hurts - in the hip pocket.
The best function would be a Jew detection. But this will be difficult since they are East European in fact ;-)
It is good that this got out, unfortunate that this is happening. However, it has been well-known for long that Dahua and Hikvision supply the methods PRC uses to monitor its people - if the PRC wants the technology the companies will deliver. It's business. What other stuff haven't we found yet? Only time will tell.
I'm wondering when the 'Gender' and 'Age' options will become a 'problem'... Should they not already be a 'problem' if 'ethnicity' is a 'problem'? Then maybe, I wonder if having 'color' imagery available is also a 'problem'. Maybe simple low-resolution B&W imagery is the way to go. That way the playing field will be entirely level. I'm also wondering if there are other currently acceptable analytics that may offend that should also be removed. Hmmm... let's see.....
JOHN HANOVICH... TAKE ACTION IMMEDIATELY.