Latin American NGOs Denounce 'Perverse' Facial Recognition

By IPVM, Published Feb 09, 2022, 08:34am EST

A consortium of 11 Latin American NGOs, AlSur, released a report on what it labels "perverse" facial recognition implementations by nine Latin American governments, including criticism of major manufacturers such as Dahua and Hikvision.


The NGOs' report is a sign of civil society's increasing scrutiny and awareness of government-implemented facial recognition technology in Brazil and elsewhere in Latin America.

Inside this report, IPVM examines:

  • International industry companies named in the report
  • Most implementations lack a legal basis
  • Weak local regulation
  • No public consultation, no impact assessments, no external auditing
  • Dahua, Hikvision among human rights violators
  • Alleged racialized Dahua algorithms in Mexico


IPVM Image

The 25-page Facial recognition in Latin America: Trends in the implementation of a perverse technology report (cover pictured) was published by Latin American nonprofits on November 8, 2021.

The report documents the growth of public sector facial recognition implementations throughout Latin America (including the circumstances that facilitated this growth, such as advances in storage capacity). It also outlines the technology's present deployment in a variety of use cases, including border control, public transit, and access to social benefits/services.


The report is a qualitative analysis aggregating data on 38 public sector facial recognition implementations in nine Latin American countries (see map):

IPVM Image

The data on these implementations were gathered in April and May 2021 by 10 regional organizations (Ipandetec, ADC, Internet Lab, Hiperderecho, Coding Rights, Fundación Karisma, IDEC, TEDIC, Derechos Digitales, and R3D) and document facial recognition initiatives implemented since 2018. The report is limited to government-mandated facial recognition implementations and does not consider the many private-sector-financed local deployments involving facial recognition:

Systems deployed in private spaces such as stores, shopping malls or private banks were not included, nor were systems implemented in areas such as electronic commerce or access to digital devices or applications where they do not condition access to a public service.

Unlike in the EU, where the GDPR would restrict such deployments (see Irish City Fined $120,000+ for 400+ Camera System Violating GDPR), Latin American data protection legislation, such as Brazil's LGPD, contains ambiguous public safety carveouts, if any.

Using information access requests, open-source research, and interviews with public officials and industry employees, the researchers compiled the names of manufacturers and integrators involved in the deployments and answers to questions such as:

  • Is there any legal or regulatory basis for implementation?
  • Was there a public participation process prior to implementation?
  • Did the implementation process consider conducting an impact study on privacy and/or human rights?
  • Is there a planned external audit of the implementation?
  • Are there any records of security incidents, discriminatory use, or other types of abuse related to the initiative since its implementation?

Companies Named in AlSur Report

Among the companies named in the report are Dahua, Hanwha (misspelled therein as "Hanwa"), Hikvision, and ISS (cited by its full name, "Intelligent Security Systems"), all of which appear in Mexican public sector facial recognition deployments.

The non-Latin American companies found by the authors to be involved in government-funded facial recognition projects were:

IPVM Image

Report: 'No Specific Legal Basis' for Implementation

The AlSur study identified numerous problems with existing public sector facial recognition deployments in Latin America. For example, AlSur said the majority of documented implementations lack a legal basis:

[I]t is important to mention that in more than 60% of the cases there is no specific legal basis to support the implementation. Only 14 of the 38 documented cases indicate the existence of regulations that would support the deployment of this type of technology. [emphasis added]
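
The headline percentage follows directly from the counts the report provides; a quick sketch (using only the figures quoted above) confirms that "more than 60%" is consistent with 14 of 38 cases citing a legal basis:

```python
# Check of the AlSur report's figures: 14 of the 38 documented
# implementations indicate a supporting regulation.
total_cases = 38
with_legal_basis = 14

without_legal_basis = total_cases - with_legal_basis
share_without = without_legal_basis / total_cases

print(f"{without_legal_basis} of {total_cases} cases "
      f"({share_without:.0%}) lack a specific legal basis")
# → 24 of 38 cases (63%) lack a specific legal basis
```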

Of the cases AlSur considers to have a legal basis, the NGOs noted that the basis rests on a generous interpretation of prevailing laws:

However, it is noteworthy that in most of the cases the regulations cited are not specifically for the use of facial recognition or biometric data, but are a broad interpretation of regulations referring to the use of other types of technologies (for example, the operation of video surveillance cameras), that are analogized to facial recognition with dubious arguments or specific powers that could be fulfilled through the use of facial recognition ("to oversee compliance with provisions on evasion in public transport", "immigration verification functions, foreigners and immigration control", etc.) [emphasis added]

Specific Rules 'Scarce' for Biometrics

AlSur added that prevailing legislation is not specific:

Even at the level of personal data protection, in those countries where general rules exist, references to the specific regulation of the use of biometric data are scarce. In the opinion of the various local experts, none of the regulations used to justify the implementation of facial recognition systems offers adequate treatment from a human rights point of view. [emphasis added]

AlSur's comments are in line with Brazil-specific observations shared with IPVM by Brazilian university professors and computer science experts.

No Public Consultation

The NGO consortium additionally reported that, with one exception, regional facial recognition systems are deployed by governments without any public consultation:

In the vast majority of cases, the facial recognition systems identified within the framework of this study were implemented without any kind of public consultation or participation, with the exception of a surveillance system in Chile[.]

US-based NGO Access Now published substantially similar findings in an August 2021 report, noting that "[g]overnments make deals with little or no public debate or oversight[.]"

No Impact Assessments

AlSur also found no evidence of local governments performing any impact studies/assessments concerning deployed technology:

There were not either any privacy or human rights impact studies[.]

Relatedly, one of Ireland's largest cities is now conducting Data Protection Impact Assessments (DPIAs) on its 400+ camera CCTV system after receiving one of Europe's largest GDPR fines.

No External Auditing

Lastly, citing "only three" exceptions, AlSur found scant evidence of external audits of public sector facial recognition deployments:

Generally, the development of external audits of the functioning of implemented systems is not a common practice either.

Human Rights Violations

The report names Dahua, Hikvision, France's IDEMIA, and the UK's FaceWatch for their "alleged involvement in human rights violations" outside of Latin America:

It is important to note the presence of companies questioned internationally for their alleged involvement in human rights violations. Particularly illustrative is the case of Chinese companies Dahua and Hikvision, banned from operating in the United States, while having contracts worth millions in Mexico. France's IDEMIA (ex-Morpho Safran), present in at least four countries of the region, and UK's FaceWatch, whose software is used in one of the initiatives identified in Brazil, have also been the subject of concerns by international organizations. [emphasis added]

Note that Dahua and Hikvision have not actually been "banned from operating in the United States" but have instead been blocked from US Government installs and face a federal ban on FCC authorizations.

Dahua, in particular, is named for its contracts in the PRC's Xinjiang region and Uyghur recognition software:

The company [Dahua], one of the most important in the global video surveillance market, was sanctioned by the United States in 2019 for human rights violations due to its participation in harassment campaigns against an Islamic minority. The company has contracts for surveillance projects in Xinjiang, where the Chinese government is accused of genocide against the Uighur population. According to information published by the U.S. press in early 2021, its facial recognition software was able to identify people of such ethnicity.

Hikvision is later cited for allowing its technology to "be used by the Chinese government to crack down on Muslim minorities":

Hikvision, one of those awarded in the "Proyecto de Videovigilancia Urbana Integral con Tecnología Analítica" [Comprehensive Urban Video Surveillance Project with Analytical Technology], was sanctioned by the United States in 2019 and by Norway in 2020 for human rights abuses. Their surveillance technologies would also be used by the Chinese government to crack down on Muslim minorities.

France's IDEMIA is also named for controversial activities in both the PRC and Kenya:

The company has been criticized by Amnesty International for exporting digital surveillance technology to China, due to the human rights risk involved. IDEMIA was blamed for problems with the general election in Kenya in 2017, which resulted in the National Assembly canceling its existing contracts and banning it from entering into new ones. That decision was later overturned by the country's Supreme Court.

FaceWatch is similarly criticized for its involvement in potentially unethical deployments in places of commerce in the UK, such as that profiled by IPVM last month.

Dahua 'Tropicalized' Algorithms to Identify 'Mexican Phenotype'

Aside from human rights violations, Dahua is also criticized in the AlSur report over its role in a deployment in the northern Mexican border state of Coahuila that was mentioned in The New York Times in August 2021:

Regarding the Coahuila Video Intelligence System, Dahua has stated to the press that the algorithms provided have been "tropicalized" to identify the "Mexican phenotype," suggesting that the company had some access to data on Mexican individuals to train their facial recognition systems. The initiative was presented to the public shortly after a visit by the governor of Coahuila to China, where he met with the company. Its CEO, Zhijie Li, was present at the launch of the system. [emphasis added]

Dahua adapting its algorithms to a supposed "Mexican phenotype" was previously reported in 2020 by Mexican news outlets Aristegui Noticias and El Economista. If true, this would not be the first instance of Dahua racializing its software.

Dahua maintains a close relationship with public officials in Coahuila, with Miguel Ángel Riquelme Solís, the sitting state governor, attending a 2020 Dahua fever camera donation ceremony, as IPVM previously reported:

IPVM Image

AlSur did not respond to IPVM's questions about the Dahua Coahuila deployment.

Public Deployments 'Very Serious Risk'

AlSur positions itself against public sector facial recognition deployments, concluding that the technology is a risk for democratic societies:

The possibility of being under constant surveillance, moreover, encourages silencing and self-censorship, and represents a very serious risk for democratic societies.

The report further claims that public sector facial recognition deployments pose specific threats to women, people of color, and transgender individuals:

The right to non-discrimination, recognized in Articles 1 and 24 of the American Convention on Human Rights, is also directly threatened by the high rates of false positives that facial recognition systems produce, a problem that increases exponentially when the people under surveillance belong to historically vulnerable groups such as women, dark-skinned people, or transgender people. The systems implemented in the region have already registered errors that resulted in serious consequences for the people affected.


With this report, AlSur now joins a growing list of organizations that oppose government-deployed facial recognition in Latin America, including US-based Access Now and Brazil-based LAPIN.

It is incumbent upon manufacturers to be transparent in their dealings with governments and to be honest about the limitations of their technologies (such as facial recognition's drastic performance decrease in dark scenes). Government officials and the public-at-large may then set reasonable expectations and limits on surveillance technologies.

Comments (2)


General public space CCTV based facial recognition is, of course, potentially problematical (for technical reasons as much as any others), and it can certainly be easily understood how it could have a "chilling effect" on public space discourse. For this reason I'd certainly be against it, when private location FR holds far fewer fears, for me at least.

Of course, were one to be cynical, it would be easy to imagine that there would be political support for some of these effects, depending upon who was in power and who was doing the demonstrating. For example, FR deployment in Ottawa to identify "Freedom Convoy" demonstrators, and why that might be considered beneficial, to some.

But this very possibility of non-disinterested deployments makes it all the more vital that proper scrutiny is maintained and, in general public spaces, the bias should always be toward no deployment and a higher bar for proportionality and reasoning to justify it.


Interesting points, Malcolm.

But this very possibility of non-disinterested deployments makes it all the more vital that proper scrutiny is maintained

What I see as an even greater concern here, aside from non-disinterested deployments, is data privacy. There is no public discussion delimiting who has access to the data, whether it can be shared and with whom, how long it will be stored, and whether it will ever be deleted.

private location FR holds far fewer fears

We've already seen that private location FR entails very real fears of abuse by creeps. Large-scale government deployments that bring more people into the mix, especially when weakly regulated, only amplify the potential for abuse.
