BBC Panorama Documentary on AI Features IPVM
By Conor Healy and Charles Rollet, Published May 28, 2021, 04:54am EDT
The world's longest-running news television program, BBC Panorama, prominently featured research from IPVM in its investigation into AI, including critical examinations of Dahua, Hikvision, and Huawei.
Watch the 3-minute clip featuring IPVM including the BBC visiting IPVM's USA testing facility:
IPVM Quoted Alongside Microsoft, Human Rights Watch
The BBC Panorama episode covered how artificial intelligence is changing the modern world, dividing the US and China. IPVM’s Government Director Conor Healy was interviewed for the documentary, joining Microsoft President Brad Smith and Human Rights Watch’s China Director Sophie Richardson, among others.
BBC Banned In China After Xinjiang Reporting
The BBC has published several high-profile investigations into PRC government human rights abuses in Xinjiang, including reports on rape and sexual abuse against Uyghurs, a look inside Xinjiang's 're-education camps', and Uyghur children being "systematically separated" from their parents; the BBC's reporting on COVID in China was also considered especially sensitive.
In response, this February the PRC government banned the BBC in mainland China and Hong Kong and a month later ejected its Beijing correspondent John Sudworth. PRC state media said this showed the government's "zero tolerance for fake news". (The PRC is ranked as one of the worst countries for press freedom.)
BBC Quotes IPVM's Hikvision, Dahua, Huawei Uyghur Alerts
Panorama said IPVM "uncovered new evidence" of high-tech Uyghur repression in the form of 'Uyghur alerts' which automatically detect Uyghur faces for PRC police. IPVM finding a Huawei patent which included Uyghur detection was used as an example:
IPVM's Conor Healy told the BBC the patent shows Huawei "worked directly with the Chinese government" as it was co-authored with the government-run China Academy of Sciences. The BBC also brought up IPVM's article on Hikvision touting a Uyghur-detecting AI camera:
Along with IPVM finding Uyghur-tracking code in Dahua's public SDK:
In response, Huawei told the BBC it "does not condone the use of technology to discriminate or oppress against members of any community". Dahua claimed, without evidence, that it mentioned Uyghurs in the context of all the PRC's 56 officially recognized minorities. Hikvision said its mention of Uyghur detection was "uploaded online without appropriate review", denying its tech had any "minority recognition function".
China Government Falsely Claims No Uyghur Analytics
The PRC Embassy in London told the BBC that "there is no so-called facial recognition technology featuring Uyghur analytics whatsoever":
This statement is false: IPVM and others, such as The New York Times, have found numerous original PRC government and surveillance company documents explicitly mandating facial recognition tech with "Uyghur alarms", "real-time Uyghur warnings", "Uyghur/Non-Uyghur" detection, etc.
The BBC also quoted the chair of China's "Expert Committee on AI Governance" Dr. Lan Xue, who also denied the existence of Uyghur analytics but said that the use of technology against "terrorists" in Xinjiang is "quite understandable":
[Dr. Lan Xue:] the media outside China - a lot of those sort of charge, many are not accurate and not true, but one thing that we do have to recognize [is] that in Xinjiang there was a separatist movement that generated a lot of terrorists. I think the Xinjiang local government had the responsibility to really protect the Xinjiang people. So I think if the technology is used in those contexts that's quite understandable [emphasis added]
"Huge Implications For Society": 'One Person, One File' Examined
IPVM also contributed previously unreported documents illustrating China's AI-based surveillance ambitions, centered on a system called 'One Person, One File', which IPVM found mentioned in a police tender from Nanqiao District (Anhui province):
The BBC said "One Person One File could have huge implications for society". The system builds comprehensive profiles of PRC citizens to analytically assess and even predict individuals' behavior, tracking things like "relationships", "peer analysis", and political activities, IPVM's Healy explained:
For each person the government would store their personal information, their political activities, relationships... anything that might give you insight into how that person would behave and what kind of a threat they might pose
IPVM found a new Huawei patent suggesting it is also developing a 'One Person, One File' system integrated with facial recognition data:
Huawei "didn't directly address Panorama's questions about their involvement in One Person One File", the BBC reported, although Huawei stressed that it is "independent of government wherever it operates".
AI Technology Compared To Orwell
The BBC quoted Healy saying that Orwell himself could not have "ever imagined" a system like 'One Person, One File':
If you have any desire to protest, what they're building will cut off your ability to do that before you even start. It makes any kind of dissidency potentially impossible and creates true predictability for the government in the behavior of their citizens. I don't think that Orwell would've ever imagined that a government could be capable of this kind of analysis [emphasis added]
Microsoft President Brad Smith echoed this sentiment, commenting (generally) that he was "constantly reminded of George Orwell's lessons" about "a government who could see everything":
I’m constantly reminded of George Orwell’s lessons in his book 1984. You know the fundamental story…was about a government who could see everything that everyone did and hear everything that everyone said all the time. [...] Well, that didn’t come to pass in 1984, but if we’re not careful that could come to pass in 2024.
BBC Panorama's Other Findings
The Panorama documentary contained a number of other findings unrelated to IPVM, notably PRC police testing emotion-detection software on detained Uyghurs:
The BBC quoted an anonymous software engineer who said he had installed such systems for PRC police, with AI being used to "indicate a person's state of mind, with red suggesting a negative or anxious state of mind". Human Rights Watch's China Director Sophie Richardson was quoted calling out this "shocking material":
It is shocking material. It's not just that people are being reduced to a pie chart, it's people who are in highly coercive circumstances, under enormous pressure, being understandably nervous and that's taken as an indication of guilt, and I think, that's deeply problematic.
PRC AI-Based Video Surveillance Raising Alarms Globally
The BBC's extensive quoting of IPVM's findings shows that PRC video surveillance and AI practices are raising unprecedented alarm worldwide. Unethical usage of AI-based video surveillance technology, such as 'Uyghur alerts', threatens core human rights and the video surveillance industry's broader reputation.