"Automated Racism": Chinese Tech Companies Profiling Ethnic Minority

Published Apr 26, 2019 12:54 PM

Scrutiny of Chinese facial recognition providers, including Hikvision, has sharply increased following a New York Times report that they developed analytics for the government to track members of China's Uyghur ethnic minority, "potentially ushering in a new era of automated racism."


Uyghurs are a mostly Muslim people in China's western Xinjiang region who are subject to a harsh Chinese government campaign of mass detentions and intrusive surveillance, a crackdown that's been lucrative for firms like Hikvision and Dahua.

The NYT article indicates a growing ethical/geopolitical divide between China and the rest of the world on video surveillance and facial recognition, threatening cooperation with China on these topics. It is also likely to raise human rights questions for Western investors and tech firms that continue to work with the Chinese providers named by the NYT.

In this post, we examine the report and its broader impact, including:

  • NYT Summary
  • Hikvision Mentioned
  • Comparison to West
  • Accuracy Concerns
  • Face Rec Industry Impact
  • Impact Within China

NYT Report Summary

*** ******* ************ ******* ********* **** ********* ******** analytics "********** **** *****’* ******* ********* networks ** ************ *******" ***** "***** exclusively *** ******* ***** ** ***** appearance *** ***** ******* ** ***** comings *** ****** *** ****** *** review."

*** ******* ***** *** ********* ********, citing * ********* ******* *** ******/********* documents:

"****** *** *****" ****** *********** ****** China ****** ** *** **** *********, with *** ********** **** * ***** camera ****** "****** ******* ****** *********** to ******** ******/***-****** **********,” *** ******* system ************* ********* ****** ** ******** Uyghurs **** "*** **** ****** ** the **** ***"

* **** ****** ******** **** **** like "**********" ********** *,*** *** ** 500,000 **** **** ****** **** *** course ** * *****.


********* ********* ********* **** *** **** ***** "immediately **** ******" ** ****** "** originally *** ****** ***** ** * neighborhood, *** ****** ** **** *** Uighurs ******"

****** ********* **** ****** ******* "********** ethnic ********* ********," **** ****** ** has ****** **** ******.

Hikvision *********** 

****** *** ********* ******, *** ******** that ********* "******* * ******** *********** ********, but ***** ******* ** *** ** 2018." **** ****, **** ***** *** story ***** *********'* ******** *********, ***** **** ************* ********* ** * conference ** *****.

***** ***** ************ ****** ** ****'* report, ********* ******* **** **, ******* **-******* the ***** *** ******** ****** *******.

Companies ********: **** ******* ** ********

***** **** *********, *** ********* ****** out ** *** *** ******* **** of *****'* ******-******** '********':

Comparison to Racial Analytics in the West

**** ********* ****** ** *** ** as **** *** *** ****. **** of *** ***** ***** ************ ********* offer **. *******, **** **, * small (***** ** ********* ********* ** LinkedIn) *** *********-***** ****** *********** ******* with ***** $*** ** ******* *******, ******** *** *********** ** ****** "********* *********" ************:

[Ever AI screenshot]

*******, ****** *** ** *** ********** in *** **** ** **** **** limited ** ** ** *********** ** lawsuits/bias ***********/***** ********. ***** *** ** documented ******* ** ** ****** *********** using ****** ********* ** ****** *** suspects ** *********. *** ******* **** is **** *** ********************* *** **** **** ***** ************ footage ** ******* **** *** ******** algorithms **** **** ****** ****** ******** by **** - *******, *** **** denied **** ***** *** **** **** feature ** ***** ** ***** ****** profiling ***********.

Accuracy Concerns

*** *** ********* (*********) **** **** systems *** "*********" *** ****** ** a ***** ****** ** **** ******** conditions *** ***** ************* *******, ******* the ***** ** ***** ********* ******* people ** ***** *******.

Impact on Face Rec Industry

*** ******* ***** *** *** ******* Xinjiang ***** ****** ****** ** ********** criticism ** *****'* **** *********** *******. This *** * ****** ** *******, chiefly:

Effect Within China

**** ******* **** ******* *** ** some ******* ** ***** - *********, for *******, *** **** ****** ******** *********** **** ******** ************ firm **** **********. *******, ***** **** SenseTime, ***** ** ** **** *** of ******* ***** ******* **** **** the ********* ******** ******** ******; **** of **** **** **** ********* ** questions ***** ***** **** ******** ***** rights ****** *****.

Conclusion

***** *** *** ******** ************, ********* Western ********* *********** ***** ******* ****** recognition ********* ********** ** ****** ** Xinjiang ** ********** *** **** ** minority ********* **** **** ** ***** carefully **** *** **.

Vote / Poll: Should ethnic/racial analytics be legal?

Comments (10)
Undisclosed Integrator #1
Apr 26, 2019

Obviously there are business cases for racial identification in analytics. I voted legal. It's how they are used that should be managed, IMHO.

But then, I support people having the right to own guns, and how they are used determines what is legal or not.

Do I support using any AI functions to discriminate in a way you would prosecute a human? No. 

Maybe we should ban scales, mine tells me I am fat!

Undisclosed Integrator #2
Apr 26, 2019

I spoke with one of the major Chinese manufacturers recently about this. He said their government is just trying to protect them from harm. So many of China's people are brainwashed by the government. Of course, they have no choice either, because to speak up against the government is to risk immediate imprisonment or even death.

Sadly, racism is RAMPANT in China. Racism will never go away, as it is still rampant in the USA, especially the South. As long as small minds exist, fear and prejudice will.

I wouldn't be surprised if some business in the South starts using analytics to weed out potential criminals based on skin color soon.

Undisclosed #3
Apr 26, 2019

"I wouldn't be surprised some business in the South starts using analytics to weed out potential criminals based on skin color soon."

this is patently ridiculous - and you embarrass yourself by stating such things.  you clearly get your knowledge of the U.S. south from old Mark Twain stories.

I am not from the south but I have lived here for decades.

sure there are old-timers who wear the reb hats and spew hate until they finally die and go away... but the average southerner (today) is far less racist than those I know who live in enclaves of racial homogeneity in the north.

EDIT:  I was not clear above regarding my primary argument - since your comments inflamed the humanist passions within me.

To be more clear:  I find your comment to be patently racist itself - as you apparently see the U.S. south through a racist-tinted lens.

John Honovich
Apr 26, 2019
IPVM

I think it's safe to say that any ethnic or racial profiling using video surveillance in the US would cause a national controversy.

I am not saying it will not happen but when it does, the (justifiable) outrage to it will be severe.

Contrast that with China, where the NY Times is banned for such reports, and people commenting publicly on such matters are blocked or worse.

Undisclosed Integrator #2
Apr 26, 2019

I didn't say all in the south are racist. However, it is still quite prevalent there, and throughout the USA. So yea, I did not embarrass myself whatsoever by making the statement that it could happen.

I wonder, are you a minority?

Undisclosed #3
Apr 26, 2019

"I didn't say all in the south are racist."

please.  this is hardly a defense of your original statement.  your position was quite clear.

"I did not embarrass myself whatsoever by making the statement it could happen."

seriously?  monkeys might fly out of my butt - it could happen.  but I do not preface my public statements with that qualifier, because it is the weakest defense of any position that I can imagine.

Undisclosed Distributor #4
Apr 29, 2019

I worry that eventually doing business with China will be like the Americans who did business with Hitler...

The place has literally and figuratively been becoming more and more toxic.

I remember YEARS ago, early 2000s, a Korean MFG was showing us their new auto-tracking (long before Dahua's), and when asked "what about when there are two people?" the answer was "It follows the darker person."

I can't condone this kind of thing.

Undisclosed Integrator #5
Apr 29, 2019

To your question - should ethnic/racial analytics be legal? I say yes. Why? If there is a lookout for a pale-skinned redhead, the fact that I refer to the person as pale-skinned is racial.

It is not the fact that there is an algorithm that can analyze flesh tone, it is how the analytic is used. I know this sounds like "Guns don't kill people, people kill people." Algorithms aren't biased; the algorithm could be used in a biased manner. Algorithms need to be developed to be as accurate as possible based on all the physical characteristics of a person.

If skin tone can't be used as a characteristic, what can be used? Well, whatever your answer, somebody is going to say you are biased. Height, weight, gender, skin tone, hair color, hair length are all perceptions. You shouldn't outlaw the use of our perception.

You should outlaw the use of an algorithm in a biased manner. In many cases, a preemptive use of such an algorithm would be biased: "I want to know when a certain type of person enters the neighborhood." Forensic use of the same algorithm with the same parameters would be reasonable: "a 6' tall, heavy-set, middle-aged white male committed the crime."
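To make the forensic-vs-preemptive distinction concrete, here is a minimal Python sketch. The Detection schema and forensic_search function are hypothetical illustrations, not any real VMS API: the same attribute filter is run against recorded footage in response to a specific incident, rather than as a standing alert on a demographic group.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    """One person detection with estimated attributes (hypothetical schema)."""
    timestamp: str
    camera_id: str
    est_height_cm: float
    est_build: str      # e.g. "slim", "medium", "heavy"
    est_age_band: str   # e.g. "20-35", "35-55"
    est_skin_tone: str  # coarse tone bucket, not an ethnicity label

def forensic_search(archive: List[Detection], *, min_height_cm: float,
                    build: str, age_band: str, skin_tone: str) -> List[Detection]:
    """Forensic use: filter RECORDED detections against a witness description
    of a specific suspect, after a specific incident."""
    return [d for d in archive
            if d.est_height_cm >= min_height_cm
            and d.est_build == build
            and d.est_age_band == age_band
            and d.est_skin_tone == skin_tone]

# The comment's example: a 6' (183 cm) tall, heavy-set, middle-aged white male.
# matches = forensic_search(archive, min_height_cm=183, build="heavy",
#                           age_band="35-55", skin_tone="light")
#
# The preemptive use the comment objects to would run the same predicate on
# LIVE detections and alert whenever anyone matching a demographic profile
# appears, with no underlying incident - same algorithm, biased application.
```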

Undisclosed Manufacturer #6
Apr 30, 2019

"Algorithms aren't biased; the algorithm could be used in a biased manner."

Tech has tremendous problems with biased algorithms, so much so that "algorithmic bias" is a core term and a problem data scientists are actively trying to solve.

https://arstechnica.com/tech-policy/2019/01/yes-algorithms-can-be-biased-heres-why/
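As a toy illustration of how that happens, here is a minimal, self-contained Python simulation (all numbers are synthetic assumptions, not measurements of any real product): a single match threshold is tuned for accuracy on training data dominated by one group, and that same threshold then produces a higher false-positive rate for the under-represented group, even though the decision rule never mentions the group.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_scores(n, pos_mean, neg_mean):
    """Synthetic similarity scores: genuine matches vs. non-matches."""
    return rng.normal(pos_mean, 1.0, n), rng.normal(neg_mean, 1.0, n)

# Group A dominates the data (95%); group B's score distributions are
# shifted, standing in for an encoder that saw few examples of group B.
a_pos, a_neg = sample_scores(9500, pos_mean=2.0, neg_mean=0.0)
b_pos, b_neg = sample_scores(500,  pos_mean=1.5, neg_mean=0.5)

# "Training": pick the one threshold that maximizes accuracy on the
# pooled, mostly-group-A data.
scores = np.concatenate([a_pos, b_pos, a_neg, b_neg])
labels = np.concatenate([np.ones(10_000), np.zeros(10_000)])
candidates = np.linspace(scores.min(), scores.max(), 200)
threshold = candidates[int(np.argmax(
    [((scores >= t) == labels).mean() for t in candidates]))]

# Evaluate that same threshold per group on fresh samples.
for name, (pm, nm) in {"A": (2.0, 0.0), "B": (1.5, 0.5)}.items():
    pos, neg = sample_scores(5000, pm, nm)
    fpr = (neg >= threshold).mean()  # innocent people flagged as matches
    fnr = (pos < threshold).mean()   # genuine matches missed
    print(f"group {name}: FPR {fpr:.1%}, FNR {fnr:.1%}")

# Typical output: group B's false-positive rate is roughly double group A's,
# with no explicit group variable anywhere in the decision rule.
```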

Undisclosed Integrator #1
Apr 30, 2019

Algorithms would be perfect if people didn’t create them.