The Use of Big Data and Surveillance Technology in the Context of China’s Repression of the Uyghur Minority in the Xinjiang Region
Arbitrary mass surveillance and detention are Orwellian political tools; China should abandon use of them and release all those held in political education centers immediately.
Maya Wang, China Senior Researcher (Human Rights Watch)
I. Introduction
The indigenous ethnic Uyghur minority faces severe repression by Chinese authorities in the Xinjiang region. According to estimates by the US think tank Council on Foreign Relations, more than one million Uyghurs have been imprisoned in so-called detention centers, while the rest of the Uyghur population faces intense surveillance, severe restrictions in everyday life, and constant discrimination and marginalization. Recently, reports about a predictive policing program based on big data and about a face-scanning system including an ‘Uighur alarm’ employed by Chinese government authorities have drawn widespread international criticism and even comparisons to George Orwell’s novel ‘1984’ (see quote). How does the use of big data and surveillance technology contribute to the discrimination and marginalization of the Uyghurs? In the following, I will briefly examine China’s high-tech methods of governing and suppressing the Uyghur population in Xinjiang.
II. The Uyghur-Chinese conflict
The Uyghur population amounts to about eleven million people in total, living in the Xinjiang Uyghur Autonomous Region (XUAR) in the northwest of China (Maizland 2020). The Uyghur ethnic minority is mostly Muslim and Turkic-speaking (Maizland 2020). When the Chinese Communist Party under Mao Zedong established the People’s Republic of China in 1949, the XUAR was brought under the control of the Chinese authorities, with only very limited (religious) autonomy granted to the Uyghur population. While ethnoreligious tensions between the Uyghur minority and the majority Han Chinese population date back decades, the Urumqi riots of 2009 mark a starting point for increased repression of the Uyghur population. Following accusations that Uyghur workers at a toy factory had sexually assaulted a Han Chinese woman, riots between Uyghurs and Han Chinese people (incl. police forces) in Urumqi, the capital of the XUAR, on July 5, 2009, left many people killed and injured on both sides (Handley 2019). This decisive event increased tensions in the region: Following terrorist acts by Uyghur separatists in 2012 and by Muslim extremists in 2014, Chinese state authorities launched a “Strike Hard Campaign against violent activities and terrorism” (Human Rights Watch 2018) in May 2014, leading to increased suppression of the Uyghur population (Handley 2019). Justified by the intention of preventing extremist and separatist ideas and “eliminating threats to China’s territorial integrity, government and population” (Maizland 2020), between eight hundred thousand and two million Uyghurs have been detained in re-education camps since April 2017, according to the Council on Foreign Relations (Maizland 2020). Critics particularly condemn the seemingly arbitrary detention of people without charge or trial, based on reasons that range from being in contact with people from ‘sensitive’ countries (e.g. Turkey, Afghanistan) to quoting the Quran in text messages (Maizland 2020).
III. Surveillance Technology and Big Data in the Context of Repression of the Uyghurs
When fighting the “‘three [evil] forces’, that is, ‘separatism, terrorism and extremism’” (Human Rights Watch 2018) in the Xinjiang region, China relies to a growing extent on new technological developments in the areas of surveillance and big data analysis. According to Human Rights Watch (2018), mass surveillance in the Xinjiang region has increased markedly in recent years. Using two recent examples, a predictive policing program and a face-scanning system, I will demonstrate how Chinese authorities reinforce the already existing marginalization and suppression of the Uyghur ethnic minority in the Xinjiang region, and how they use data as a source of knowledge to govern and exert power over the Uyghurs.
a) The Predictive Policing Program “Integrated Joint Operations Platform”
As part of the Strike Hard Campaign, a predictive policing program called the “Integrated Joint Operations Platform” (IJOP) was introduced by the Xinjiang Bureau of Public Security in August 2016 (Human Rights Watch 2018). The IJOP collects data about the Uyghur population from various sources such as CCTV cameras, “wifi sniffers”, security checkpoints, and the visitors’ management systems of access-controlled communities (Human Rights Watch 2018). This digitally harvested information is combined with existing records as well as with information gathered during home visits carried out by local officials, in which religious practices and family relations are examined (Human Rights Watch 2018). Government authorities thus exploit multiple data sources at once.
The Uyghurs, as objects of surveillance, are in many cases unaware that they are under surveillance, and those “tasked with data collection do not appear to explain the reasons for such data collection, nor give residents a choice to decline to provide the data” (Human Rights Watch 2018). Based on the collected data sets, the IJOP categorizes people as “trustworthy” or “non-trustworthy” and flags them accordingly, passing the potentially threatening “non-trustworthy” candidates on to the police or government authorities for further investigation (Human Rights Watch 2018). Reasons for non-trustworthiness range from a supposedly wrong political or ideological mindset and the active practice of religion to simple things such as possessing too many books without being in a teaching position (Human Rights Watch 2018). The government officials then decide how to proceed, which frequently results in the flagged person being held in a detention center and selected for political re-education (Human Rights Watch 2018).
Surveillance data thus systematically leads to the discrimination and marginalization of the Uyghur population because of their ethnicity or religious beliefs. Beyond the many Uyghurs held in re-education camps, the rest of the Uyghur population lives under constant surveillance and suffers severely from China’s crackdown. The categorization of the population into different groups that are treated differently, here the Han and the Uyghur ethnicities, is what Lyon calls “social sorting” (Lyon 2019). According to Lyon (2019), already existing marginalization is reinforced through big data, as can be observed in the case of the Uyghurs. By exploiting different sources of data through the IJOP, private information on the Uyghur population is collected on an extensive scale, thereby creating knowledge. This knowledge serves the Chinese authorities as a means to govern and exercise power over the Uyghur population, turning the Uyghurs into objects of power while simultaneously generating ever new knowledge about them. This mutual reinforcement of power and knowledge is what Foucault (cited in Sheridan 1980) fittingly termed “power-knowledge”. According to this concept, the exercise of power relies heavily on knowledge, while conversely the accumulation and formation of knowledge needs to be embedded in a system of power (Foucault, cited in Sheridan 1980). Through a triangle of power-knowledge, consisting of surveillance, the use of force, and constant intimidation and humiliation, the Uyghur minority is to be deprived of its religious beliefs and cultural traditions and re-educated along pro-Chinese lines in re-education centers.
b) Face-Scanning Technology with an Integrated “Uighur Alarm”
A second, more recent example of how technological advancements are used to repress the Uyghur minority in the Xinjiang region are tests of a face-scanning system by the Chinese tech giant Huawei. It includes AI software capable of recognizing and identifying a person as belonging to the Uyghur ethnicity by their facial features, and of subsequently triggering an “Uighur alarm” (Harwell and Dou 2020). As the Washington Post revealed on December 8, 2020, Huawei cooperated in 2018 with the facial recognition start-up Megvii to “test an artificial-intelligence camera system that could scan faces in a crowd and estimate each person’s age, sex and ethnicity” (Harwell and Dou 2020). The report on this test also mentions the system’s capability to trigger an “Uighur alarm” whenever a face of this ethnic group is recognized by the surveillance technology (Harwell and Dou 2020). Once the alarm is triggered, the relevant information is passed on to police authorities (Harwell and Dou 2020). Huawei and Megvii both claim that the “Uighur alarm” is not used in real-world settings and deny that the purpose of this technical capability is to label ethnic groups. However, this example demonstrates how discriminatory technology could be (and perhaps already is) used to suppress the Uyghur population. As Clare Garvie, a senior associate with the Center on Privacy & Technology at Georgetown Law, fittingly put it, the “software represents a dangerous step toward automating ethnic discrimination at a devastating scale” (Harwell and Dou 2020). Equally striking is the intertwining and technological cooperation of public authorities and private companies like Huawei: By developing such surveillance technology, Huawei knowingly (and willingly) contributes to discriminatory governmental practices. Government authorities, on the other hand, secure long-term access to cutting-edge technology for their purposes.
IV. Conclusion
China makes heavy use of new technological developments in the areas of surveillance and big data to effectively exert power and control over the Uyghur minority in the Xinjiang region. Chinese authorities have established a system of state surveillance, fed by multiple sources, that reaches far into the private spheres of the Uyghur population and enables the authorities to crack down on the Uyghur ethnic minority. In cooperation with private companies like Huawei, discrimination on the basis of ethnicity or religious beliefs is being automated, with severe consequences for the victims, who must fear being held in detention camps and forced into political re-education.
Handley, Erin 2019: How China’s mass detention of Uyghur Muslims stemmed from the 2009 Urumqi riots. https://www.abc.net.au/news/2019-07-05/china-xinjiang-urumqi-riots-10th-anniversary-uyghur-muslims/11270320 (accessed December 19, 2020).
Harwell, Drew & Dou, Eva 2020: Huawei tested AI software that could recognize Uighur minorities and alert police, report says. https://www.washingtonpost.com/technology/2020/12/08/huawei-tested-ai-software-that-could-recognize-uighur-minorities-alert-police-report-says/ (accessed December 19, 2020).
Human Rights Watch 2018: China: Big Data Fuels Crackdown in Minority Region. https://www.hrw.org/news/2018/02/26/china-big-data-fuels-crackdown-minority-region (accessed December 19, 2020).
Lyon, David 2019: Surveillance Capitalism, Surveillance Culture and Data Politics, in: Bigo, Didier; Isin, Engin; Ruppert, Evelyn (eds.), Data Politics: Worlds, Subjects, Rights, London: Routledge, pages 64-78.
Maizland, Lindsay (Council on Foreign Relations) 2020: China’s Repression of Uighurs in Xinjiang. https://www.cfr.org/backgrounder/chinas-repression-uighurs-xinjiang (accessed December 19, 2020).
Sheridan, Alan 1980: Michel Foucault: The Will to Truth, London: Routledge.
Michel Kuttenkeuler is currently completing his B.A. in International Relations and Economics at the University of Erfurt. Besides having interned at Digital Rail for Germany (DB AG) and the Deutsche Gesellschaft e.V., he is a research assistant at the Chair for Strategic and International Management at his university. Michel’s research interests lie mostly at the intersection of technology, society, and the economy.