NYT ****** *******
*** ******* ************ ******* ********* **** ********* ******** analytics "********** **** *****’* ******* ********* networks ** ************ *******" ***** "***** exclusively *** ******* ***** ** ***** appearance *** ***** ******* ** ***** comings *** ****** *** ****** *** review."

*** ******* ***** *** ********* ********, citing * ********* ******* *** ******/********* documents:
"****** *** *****" ****** *********** ****** China ****** ** *** **** *********, with *** ********** **** * ***** camera ****** "****** ******* ****** *********** to ******** ******/***-****** **********,” *** ******* system ************* ********* ****** ** ******** Uyghurs **** "*** **** ****** ** the **** ***"
* **** ****** ******** **** **** like "**********" ********** *,*** *** ** 500,000 **** **** ****** **** *** course ** * *****.

********* ********* ********* **** *** **** ***** "immediately **** ******" ** ****** "** originally *** ****** ***** ** * neighborhood, *** ****** ** **** *** Uighurs ******"

****** ********* **** ****** ******* "********** ethnic ********* ********," **** ****** ** has ****** **** ******.
Hikvision ***********
****** *** ********* ******, *** ******** that ********* "******* * ******** *********** ********, but ***** ******* ** *** ** 2018." **** ****, **** ***** *** story ***** *********'* ******** *********,***** **** ************* ********* ** * conference ** *****.

***** ***** ************ ****** ** ****'* report,********* ******* **** **, ******* **-******* the ***** *** ******** ****** *******.
Companies ********: **** ******* ** ********
***** **** *********, *** ********* ****** out ** *** *** ******* **** of *****'*******-******** '********':
Comparison ** ****** ********* ** *** ****
**** ********* ****** ** *** ** as **** *** *** ****. **** of *** ***** ***** ************ ********* offer **. *******, **** **, * small (***** ** ********* ********* ** LinkedIn) *** *********-***** ****** *********** ******* with *****$*** ** ******* *******, ******** *** *********** ** ****** "********* *********" ************:

*******, ****** *** ** *** ********** in *** **** ** **** **** limited ** ** ** *********** ** lawsuits/bias ***********/***** ********. ***** *** ** documented ******* ** ** ****** *********** using ****** ********* ** ****** *** suspects ** *********. *** ******* **** is **** *** ********************* *** **** **** ***** ************ footage ** ******* **** *** ******** algorithms **** **** ****** ****** ******** by **** - *******, *** **** denied **** ***** *** **** **** feature ** ***** ** ***** ****** profiling ***********.
Accuracy ********
*** *** ********* (*********) **** **** systems *** "*********" *** ****** ** a ***** ****** ** **** ******** conditions *** ***** ************* *******, ******* the ***** ** ***** ********* ******* people ** ***** *******.
Impact ** **** *** ********
*** ******* ***** *** *** ******* Xinjiang ***** ****** ****** ** ********** criticism ** *****'* **** *********** *******. This *** * ****** ** *******, chiefly:
Effect ****** *****
**** ******* **** ******* *** ** some ******* ** ***** - *********, for *******,*** **** ****** ******** *********** **** ******** ************ firm **** **********. *******, ***** **** SenseTime, ***** ** ** **** *** of ******* ***** ******* **** **** the ********* ******** ******** ******; **** of **** **** **** ********* ** questions ***** ***** **** ******** ***** rights ****** *****.
**********
***** *** *** ******** ************, ********* Western ********* *********** ***** ******* ****** recognition ********* ********** ** ****** ** Xinjiang ** ********** *** **** ** minority ********* **** **** ** ***** carefully **** *** **.
Vote / ****

Comments (10)
Undisclosed Manufacturer #1
Obviously there are business case uses for racial identification in analytics. I voted legal. It’s how they are used that should be managed IMHO.
But then, I support people having the right to own guns; how they are used determines what is legal or not.
Do I support using any AI functions to discriminate in a way you would prosecute a human? No.
Maybe we should ban scales, mine tells me I am fat!
Undisclosed Integrator #2
I spoke with one of the major China manufacturers recently about this. He said their government is just trying to protect them from harm. So many of China's people are brainwashed by the government. Of course, they have no choice either, because to speak up against the government is to risk immediate imprisonment or even death.
Sadly, racism is RAMPANT in China. Racism will never go away, as it is still rampant in the USA, especially the South. As long as small minds exist, fear and prejudice will.
I wouldn't be surprised if some business in the South starts using analytics to weed out potential criminals based on skin color soon.
Undisclosed Distributor #4
I worry that eventually doing business with China will be like the Americans who did business with Hitler...
The place has literally and figuratively been becoming more and more toxic.
I remember YEARS ago, early 2000s, a Korean MFG was showing us their new auto-tracking (Long before Dahua's) and when asked "what about when there are two people?" the answer was "It follows the darker person."
I can't condone this kind of thing.
Undisclosed Integrator #5
To your question - Should ethnic/racial analytics be legal? I say yes. Why? If there is a lookout for a pale-skinned redhead, the fact that I refer to the person as pale-skinned is racial.
It is not the fact that there is an algorithm that can analyze flesh tone, it is how the analytic is used. I know this sounds like "Guns don't kill people, people kill people." Algorithms aren't biased; an algorithm could be used in a biased manner. Algorithms need to be developed to be as accurate as possible based on all the physical characteristics of a person.
If skin tone can't be used as a characteristic, what can be used? Well, whatever your answer, somebody is going to say you are biased. Height, weight, gender, skin tone, hair color, hair length are all perceptions. You shouldn't outlaw the use of our perception.
You should outlaw the use of an algorithm in a biased manner. In many cases, a preemptive use of such an algorithm would be biased: I want to know when a certain type of person enters the neighborhood. While forensic use of the same algorithm with the same parameters would be reasonable: a 6' tall, heavy-set, middle-aged white male committed the crime.
Undisclosed Manufacturer #1
Algorithms would be perfect if people didn't create them.