Alibaba Admits Developing Racist Uyghur Recognition

By IPVM Team, Published Dec 22, 2020, 07:31am EST

Alibaba admitted its Cloud division developed racist AI software, saying it is "dismayed" while claiming it "never intended" to target "specific ethnic groups" and that the tech was only used "within a testing environment".

Developing such software requires complex steps that deliberately target Uyghurs, contradicting Alibaba's claim that it "never intended" to target them. Alibaba has also refused to provide any proof this was just a 'test' or 'trial'; its own website showed this was a live feature.

Finally, Alibaba's statement was not published in Chinese, despite the racist tech being only available on Alibaba Cloud's China website, allowing Alibaba to appease an international audience while avoiding the risk of upsetting the PRC government.

Background

On December 16, 2020, IPVM and The New York Times reported that Alibaba's cloud division openly offered Uyghur detection as part of a content moderation solution in an API guide:

IPVM Image

The news was picked up by major media outlets around the world, including CNN, the BBC, Reuters, France's leading newspaper Le Figaro, Turkey's leading newspaper Hurriyet, and Hong Kong's SCMP.

Alibaba's New Statement Says "Dismayed"

Before the investigation was published, Alibaba issued only a curt statement that this software was used "within a testing environment". However, as the story spread internationally, Alibaba issued a new statement that it was "dismayed" its Cloud division had developed this software, claiming Alibaba "never intended" to target "specific ethnic groups" while still maintaining this was only a "trial":

IPVM Image

By Definition, Uyghur Detection Targets "Specific Ethnic Groups"

Alibaba claims "We never intended our technology to be used for and will not permit it to be used for targeting specific ethnic groups".

However, developing Uyghur detection, by definition, requires "targeting specific ethnic groups". Computer vision depends on large training sets with labeled images of Uyghurs and non-Uyghurs in order to train the AI to pick out Uyghurs. This is not a process that happens accidentally or unintentionally.
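To make concrete why such a capability cannot arise unintentionally, below is a minimal, generic sketch of how any supervised image classifier is built (shown here in PyTorch; the dataset paths and class names are hypothetical placeholders, not Alibaba's actual code or data):

```python
# Minimal, generic sketch of supervised image-classifier training in PyTorch.
# The directory layout and class names below are illustrative placeholders,
# NOT Alibaba's actual code or data. The point: a classifier can only detect
# categories its developers deliberately defined and labeled.

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Step 1: humans choose the label taxonomy. ImageFolder derives class
# labels from directory names the developers created and populated, e.g.:
#   training_data/category_a/..., training_data/category_b/...
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("training_data/", transform=transform)

# Step 2: a model with one output per deliberately chosen class.
model = models.resnet18(num_classes=len(train_set.classes))

# Step 3: fit the model to the labeled examples, batch by batch.
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```

None of these steps can happen by accident: someone must decide the categories, collect and label large numbers of images for each one, and then train and evaluate the model against them.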

'Trial'/'Testing' Claims: Unproven, Technically Dubious

Alibaba also claims this was a "trial technology", with Alibaba Cloud earlier stating that Uyghur recognition was only deployed "within a testing environment".

However, the API guide showing Uyghur recognition made no mention of testing anywhere. API guides exist to help customers use existing, functioning software, not experimental features.

Alibaba China Keeps Silent

IPVM verified that Alibaba has not responded to this issue on its China platform or social media channels, even though Uyghur detection was only offered on Alibaba Cloud's China website (not its international one) and the vast majority of Uyghurs live in the PRC.

Publishing a statement in China saying it is "dismayed" at Uyghur analytics would risk upsetting the PRC government, which has required Uyghur analytics for police usage:

IPVM Image

No Response From Alibaba

IPVM raised all of these points in a request for comment to Alibaba; however, Alibaba has not responded. If it does, we will update this report.

Conclusion

Clearly, Alibaba is hoping that this issue will blow over, with investors and media taking its misleading statement as a sign the situation has somehow been resolved.

Regardless, the evidence is clear that Alibaba specifically targeted Uyghurs, not in a harmless trial but by directly offering this racist software to Cloud clients. Shameful corporate spin cannot detract from this.

Comments (2)

that is honorable work done by IPVM on this one, for the sake of our planet.

more like this please!

congrats

So, it was just testing, but we're going to ignore the reasons why they felt it was necessary to test such a thing in the first place? I can't think of a single positive reason to do it. I suppose they're "dismayed" they were forced to do a bit of self-recognition as well.
