"Automated Racism": Chinese Tech Companies Profiling an Ethnic Minority

By IPVM Team, Published Apr 26, 2019, 08:54am EDT

Scrutiny of Chinese facial recognition providers, including Hikvision, has sharply increased following a New York Times report that they developed analytics for the government to track members of China's Uyghur ethnic minority, "potentially ushering in a new era of automated racism."


Uyghurs are a mostly Muslim people in China's western Xinjiang region who are subject to a harsh Chinese government campaign of mass detentions and intrusive surveillance, a crackdown that's been lucrative for firms like Hikvision and Dahua.

The NYT article indicates a growing ethical/geopolitical divide between China and the rest of the world on video surveillance and facial recognition, threatening cooperation with China on these topics. It will likely lead to human rights questions surrounding Western investors and tech firms who continue to work with the Chinese providers named by the NYT.

In this post, we examine the report and its broader impact, including:

  • NYT Summary
  • Hikvision Mentioned
  • Comparison to West
  • Accuracy Concerns
  • Face Rec Industry Impact
  • Impact Within China

NYT Report Summary 

*** ******* ************ ******* ********* **** developed ******** ********* "********** into *****’* ******* ********* networks ** ************ *******" which "***** *********** *** Uighurs ***** ** ***** appearance *** ***** ******* of ***** ******* *** goings *** ****** *** review."

*** ******* ***** *** following ********, ****** * anonymous ******* *** ******/********* documents:

"****** *** *****" ****** departments ****** ***** ****** to *** **** *********, with *** ********** **** a ***** ****** ****** "should ******* ****** *********** to ******** ******/***-****** **********,” and ******* ****** ************* notifying ****** ** ******** Uyghurs **** "*** **** flight ** *** **** day"

* **** ****** ******** with **** **** "**********" determined *,*** *** ** 500,000 **** **** ****** over *** ****** ** a *****.


********* ********* ********* **** *** tech ***** "*********** **** alarms" ** ****** "** originally *** ****** ***** in * ************, *** within ** **** *** Uighurs ******"

****** ********* **** ****** against "********** ****** ********* publicly," **** ****** ** has ****** **** ******.

Hikvision *********** 

****** *** ********* ******, NYT ******** **** ********* "offered * ******** *********** ********, but ***** ******* ** out ** ****." **** year, **** ***** *** story ***** *********'* ******** *********,***** **** ************* ********* at * ********** ** China.

***** ***** ************ ****** up ****'* ******,********* ******* **** **, quietly **-******* *** ***** and ******** ****** *******

Companies ********: **** ******* ** ********

***** **** *********, *** companies ****** *** ** the *** ******* **** of *****'*******-******** '********':

Comparison to Racial Analytics in the West

**** ********* ****** ** the ** ** **** but *** ****. **** of *** ***** ***** surveillance ********* ***** **. However, **** **, * small (***** ** ********* according ** ********) *** Francisco-based ****** *********** ******* with *****$*** ** ******* *******, ******** *** *********** ** ****** "********* detection" ************:


*******, ****** *** ** the ********** ** *** West ** **** **** limited ** ** ** constrained ** ********/**** ***********/***** scrutiny. ***** *** ** documented ******* ** ** police *********** ***** ****** analytics ** ****** *** suspects ** *********. *** closest **** ** **** *** Intercept************ *** **** **** video ************ ******* ** develop **** *** ******** algorithms **** **** ****** search ******** ** **** - *******, *** **** denied **** ***** *** skin **** ******* ** order ** ***** ****** profiling ***********.

Accuracy Concerns

*** *** ********* (*********) that **** ******* *** "imperfect" *** ****** ** a ***** ****** ** good ******** ********** *** other ************* *******, ******* the ***** ** ***** positives ******* ****** ** legal *******.

Impact on Face Rec Industry 

*** ******* ***** *** the ******* ******** ***** rights ****** ** ********** criticism ** *****'* **** recognition *******. **** *** a ****** ** *******, chiefly:

Effect Within China

**** ******* **** ******* led ** **** ******* in ***** - *********, for *******,*** **** ****** ******** *********** **** Xinjiang ************ **** **** Technology. *******, ***** **** SenseTime, ***** ** ** sign *** ** ******* firms ******* **** **** the ********* ******** ******** market; **** ** **** have **** ********* ** questions ***** ***** **** abetting ***** ****** ****** there.

Conclusion

***** *** *** ******** implications, ********* ******* ********* considering ***** ******* ****** recognition ********* ********** ** active ** ******** ** developing *** **** ** minority ********* **** **** to ***** ********* **** now **.


Comments (10)

Obviously there are business case uses for racial identification in analytics.  I voted legal.  It’s how they are used that should be managed IMHO.

But then, I support people having the right to own guns and how they are used determines what is legal or not. 

Do I support using any AI functions to discriminate in a way you would prosecute a human? No. 

Maybe we should ban scales, mine tells me I am fat!


I spoke with one of the major China manufacturers recently about this.  He said their government is just trying to protect them from harm.  So many of China's people are brainwashed by the government.  Of course, they have no choice either because to speak up against the government is to risk immediate imprisonment or even death.

Sadly, racism is RAMPANT in China.  Racism will never go away as it is still rampant in the USA especially the south.  As long as small minds exist, fear and prejudice will.

I wouldn't be surprised some business in the South starts using analytics to weed out potential criminals based on skin color soon. 


"I wouldn't be surprised some business in the South starts using analytics to weed out potential criminals based on skin color soon."

this is patently ridiculous - and you embarrass yourself by stating such things.  you clearly get your knowledge of the U.S. south from old Mark Twain stories.

I am not from the south but I have lived here for decades.

sure there are old-timers who wear the reb hats and spew hate until they finally die and go away... but the average southerner (today) is far less racist than those I know who live in enclaves of racial homogeneity in the north.

EDIT:  I was not clear above regarding my primary argument - since your comments inflamed the humanist passions within me.

To be more clear:  I find your comment to be patently racist itself - as you apparently see the U.S. south through a racist-tinted lens.


I think it's safe to say that any ethnic or racial profiling using video surveillance in the US would cause a national controversy.

I am not saying it will not happen but when it does, the (justifiable) outrage to it will be severe.

Contrast to China, where the NY Times is banned for such reports and people commenting publicly on such matters are blocked or worse.


I didn't say all in the south are racist.  However, it is still quite prevalent there, and throughout the USA.  So yea, I did not embarrass myself whatsoever by making the statement it could happen.

I wonder, are you a minority?


"I didn't say all in the south are racist."

please.  this is hardly a defense of your original statement.  your position was quite clear.

"I did not embarrass myself whatsoever by making the statement it could happen."

seriously?  monkeys might fly out of my butt - it could happen.  but I do not predicate my public statements with this qualifier because it is the weakest defense of any position that I might utter that I can imagine.


I worry that eventually doing business with China will be like the Americans that did business with Hitler...

The place has literally and figuratively been becoming more and more toxic.

I remember YEARS ago, early 2000s, a Korean MFG was showing us their new auto-tracking (Long before Dahua's) and when asked "what about when there are two people?" the answer was "It follows the darker person."

I can't condone this kind of thing.


To your question - Should ethnic/racial analytics be legal? I say yes. Why - If there is a lookout for a pale-skinned redhead, the fact that I refer to the person as pale-skinned is racial.

It is not the fact that there is an algorithm that can analyze flesh tone, it is how the analytic is used. I know this sounds like "Guns don't kill people, people kill people." Algorithms aren't biased, the algorithm could be used in a biased manner. Algorithms need to be developed to be as accurate as possible based on all the physical characteristics of a person. 

If skin tone can't be used as a characteristic, what can be used? Well, whatever your answer, somebody is going to say you are biased. Height, weight, gender, skin tone, hair color, hair length are all perceptions. You shouldn't outlaw the use of our perception.

You should outlaw the use of an algorithm in a biased manner. In many cases, a preemptive use of such an algorithm would be biased: "I want to know when a certain type of person enters the neighborhood." Forensic use of the same algorithm with the same parameters would be reasonable: "a 6' tall, heavy-set, middle-aged white male committed the crime." 


"Algorithms aren't biased, the algorithm could be used in a biased manner."

Tech has tremendous problems with algorithms being biased, so much that "algorithmic bias" is a core term and problem for data scientists to solve.

https://arstechnica.com/tech-policy/2019/01/yes-algorithms-can-be-biased-heres-why/
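A toy sketch of the mechanism (all numbers and names here are hypothetical): if a match threshold is tuned on validation data dominated by one group, the underrepresented group can see a far higher false-positive rate even though the algorithm contains no explicit rule about race at all:

```python
import random

random.seed(0)

# Toy score model: non-match scores for the underrepresented group "B"
# skew higher because the (hypothetical) model saw little training data
# for it, so it separates B's faces less cleanly.
def match_score(is_match, group):
    base = 0.8 if is_match else (0.2 if group == "A" else 0.45)
    return min(1.0, max(0.0, base + random.gauss(0, 0.1)))

THRESHOLD = 0.5  # tuned on group-A-heavy validation data

def false_positive_rate(group, trials=10_000):
    # Fraction of non-matching faces incorrectly flagged as matches
    flagged = sum(match_score(False, group) >= THRESHOLD
                  for _ in range(trials))
    return flagged / trials

print(f"group A FPR: {false_positive_rate('A'):.3f}")
print(f"group B FPR: {false_positive_rate('B'):.3f}")
```

Real systems are far more complex, but this calibration effect - skewed training data producing unequal error rates - is a standard textbook example of algorithmic bias.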


Algorithms would be perfect if people didn’t create them. 
