- New Amnesty investigation highlights why EU export rules for surveillance technology fail.
European tech companies risk fuelling widespread human rights abuses by selling digital surveillance technology to China’s public security agencies, a new Amnesty International investigation reveals. The findings are published ahead of a crucial meeting in Brussels on 22 September where the European Parliament and EU member states will decide whether to strengthen lax surveillance export rules.
Amnesty International found that three companies based in France, Sweden and the Netherlands sold digital surveillance systems, such as facial recognition technology and network cameras, to key players in the Chinese mass surveillance apparatus. In some cases, the exports were made directly for use in China’s indiscriminate mass surveillance programmes, with the risk of being used against Uyghurs and other predominantly Muslim ethnic groups throughout the country.
Most EU governments, including France and Sweden, are resisting calls to strengthen export rules to include strong human rights safeguards in biometric surveillance technology, an area that European companies dominate. Germany, which has held the EU presidency since 1 July, and the Netherlands have both expressed the need for stronger human rights safeguards in the past but have so far failed to address this successfully at EU level.
“Europe’s biometric surveillance industry is out of control. Our revelations of sales to Chinese security agencies and research institutions that support them are just the tip of the iceberg of a multi-billion Euro industry that is flourishing by selling its wares to human rights abusers, with few safeguards against end-use abuses,” said Merel Koning, Senior Policy Officer, Technology and Human Rights at Amnesty International.
Across China, mass surveillance projects such as “Skynet” and “Sharp Eyes” are being rolled out to keep people under constant observation. China’s public security agencies are key players in developing this unprecedented expansion of surveillance. Biometric surveillance is ubiquitous in northwest China’s Xinjiang Uyghur Autonomous Region, where up to an estimated one million Uyghurs and members of other ethnic groups have been arbitrarily detained in so-called “re-education camps”.
“EU governments’ condemnation of the systematic repression in Xinjiang rings hollow if they continue to allow companies to sell the very technology that could be enabling these abuses. The current EU export regulation system is broken and needs fixing fast,” said Merel Koning.
Biometric surveillance tools, including facial recognition software, are among the most invasive digital surveillance technologies that enable governments to identify and track individuals in public spaces or single them out based on their physiological or behavioural characteristics. These technologies pose a clear threat to the rights to privacy, freedom of assembly, speech, religion and non-discrimination.
Amnesty’s investigation identified sales of three different types of digital surveillance technology to Chinese state security agencies, to entities that enforce laws that violate human rights, and to entities in Xinjiang.
Morpho, which is now part of Idemia, a French multinational, was awarded a contract to supply facial recognition equipment directly to the Shanghai Public Security Bureau in 2015. The company specializes in security and identity systems, including facial recognition systems and other biometric identification products. Amnesty International calls for a ban on the use, development, production, sale and export of facial recognition technology for identification purposes by both state agencies and private-sector actors.
Amnesty’s research found that Axis Communications, a Swedish company, even boasts on its website of its involvement in expanding the Chinese surveillance state. Axis develops and markets network cameras specialized for security surveillance and remote monitoring. The company has supplied its technology to China’s public security apparatus and is repeatedly listed as a “recommended brand” in Chinese state surveillance tender documents dating from 2012 to 2019.
The company’s website states that it expanded the network of security cameras from 8,000 to 30,000 in Guilin, a city in the south of China with a population of approximately 5 million people, as part of an upgrade of the city’s Skynet surveillance programme. The cameras in the network offer a 360-degree field of view and a range of 300 to 400 metres, making it possible to track targets from all directions.
“Chinese public security agencies are using products sold by European companies to build up their abusive surveillance capacity. These companies are profiting from the sale of digital surveillance technologies that are linked to horrific human rights violations. The companies should have known full well that sales to China’s authorities carried significant risks, but apparently took no steps to prevent their products from being used and studied by human rights abusers. In so doing, they totally failed in their human rights responsibilities. This is why the EU legislature needs to act to stop similar abusive trade,” said Merel Koning.
A Dutch company, Noldus Information Technology, sold emotion recognition systems to public security and law enforcement–related institutions in China. The company’s “FaceReader” software is used for automated analysis of facial expressions that convey anger, happiness, sadness, surprise and disgust. FaceReader was found to be used by Chinese universities with links to the public security apparatus and the police, as well as by the Ministry of Public Security. China’s legal system falls short of international standards in numerous respects and is often misused by the authorities to restrict human rights.
Amnesty International also found that Noldus sold its digital surveillance technology to at least two universities in Xinjiang between 2012 and 2018. This included supplying its “The Observer XT” software to Shihezi University in 2012. The university falls under the administration of the Xinjiang Production and Construction Corps (XPCC), which fulfils a special role “in safeguarding the country’s unification and Xinjiang’s social stability and in cracking down on violent terrorist crimes”.
In 2012, it was already known that the Chinese government routinely conflates Uyghur cultural and religious practice with terrorism. In the years that followed, the technological advancement of the suppression of minorities in Xinjiang became apparent, with emotion and behavioural analysis systems of particular interest to the Chinese authorities.
The investigated exports by the EU companies pose a significant risk to human rights. None of the companies provided Amnesty International with clear answers as to what due diligence was carried out before completing these sales. This is one of the reasons why the EU must take action now.
Amnesty’s report illustrates the major shortcomings in the current export regulation framework of the EU, the Dual Use Regulation. Amnesty is calling on the EU legislature to include all digital surveillance technology under its export framework, strengthen human rights safeguards in export decisions and ensure all companies conduct a human rights impact assessment.
“In response to Amnesty International, Axis Communications said that it does not require a licence for the export of cameras for use in Chinese mass surveillance programmes. This is exactly the problem with the current EU export regulation framework. EU governments need to face up to their responsibilities and rein in this unchecked industry,” said Merel Koning.
“Until the EU does, they have serious questions to answer about their potential role in human rights violations perpetrated by the Chinese government.”