
Annotated READING LIST

(continually expanding...)

collective authorship: Gaurika Chaturvedi, Simon Egbert, Tereza Ø. Kuldova, Shivangi Narayan

Ajana, Btihaj. 2012. “Biometric citizenship.” Citizenship Studies 16 (7): 851–870. doi:10.1080/13621025.2012.669962.

Ajana answers the question, “What is biometric citizenship?” from a UK-centric perspective. They identify and examine two aspects of biometric citizenship: ‘neoliberal citizenship’ and ‘biological citizenship’. Using the examples of the Iris Recognition Immigration System scheme, identity cards, and citizenship reform in the UK, Ajana argues that the historical discrimination that citizenship is based upon is obfuscated by steering discourse about biometric citizenship towards technology and identification. From a neoliberal perspective, this is done by reframing citizenship as a method of procuring assets for the economy rather than as a way of being part of a collective. The biological perspective demonstrates that the shift to abstract language, such as ‘immigrant’ and ‘high-risk’, allows for the concealment of discrimination based on race, nationality, ethnicity, and religion. Furthermore, this discrimination is now automated through biometric technology rather than overcome.

Amoore, Louise, and Marieke de Goede. 2005. “Governance, Risk and Dataveillance in the War on Terror.” Crime, Law & Social Change 43: 149–173.

 

This article examines the nexus between data and the ‘risk’ profiling of people, which is more often than not used to exclude them from mainstream governance networks. Though written in the context of the US ‘War on Terror’, it is nevertheless useful for anyone interested in how governments use data to fit people into already existing exclusion narratives.

 

Benjamin, Ruha. 2019. Race After Technology: Abolitionist Tools for the New Jim Code. John Wiley & Sons.

 

This book is absolutely necessary reading for understanding how race is performed at the carceral level. Tech securitization tools disproportionately target racial minorities in the US – not by accident but by design – and the outcomes of this targeting are then used as evidence to project certain races as more criminal.

 

Bowker, Geoffrey C., and Susan Leigh Star. 1999. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.

 

This book is absolutely essential reading for understanding categorisation – the core of any algorithmic system – and how it is entwined with the social and political. It is also extremely important for breaking the myth that categories are ‘natural’ when they are in fact a product of social and political imaginaries.

 

Brayne, Sarah. 2020. Predict and Surveil: Data, Discretion, and the Future of Policing. Oxford University Press.

 

A brilliant treatise on current predictive policing systems in the US and where they are heading.

 

Breckenridge, Keith. 2014. Biometric State: The Global Politics of Identification and Surveillance in South Africa, 1850 to the Present. Cambridge: Cambridge University Press.

Given the global drive towards centralised biometric identification, Breckenridge looks at where these practices first originated and subsequently developed – South Africa. They provide a historical account of the contributions of Francis Galton, Edward Henry, and M. K. Gandhi to the institutionalisation of biometric registration through fingerprinting, identity cards, reference books, and associated bureaucracy from the 1850s onwards. They claim that biometric government is antithetical to documentary government, as people have little agency over how their identity may be perceived on the basis of biometric data. They argue that biometric registration was introduced in South Africa as a method to identify, govern, and police imperial subjects along racial lines. The unchecked ambitions of imperial progressivism, supported by a racist distribution of power, formed the basis for the state’s attempts at centralised fingerprint registration. This sustained effort imbued the South African state with a path dependency towards institutions of biometric identification. Breckenridge attributes the adoption of coercive systems of identification as the solution to poverty in former colonies both to this path dependency and to the reduced value placed on privacy, owing to a historically invasive relationship with the state. However, they warn that the global adoption of systems originally implemented in South Africa runs the risk of unwittingly adopting the politics of those systems as well. Thus, they argue that there needs to be a body of law to regulate biometric registration in postcolonial states and across the globe.

Couldry, Nick, and Jun Yu. 2018. “Deconstructing datafication’s brave new world.” New Media & Society 20 (12): 4473–4491. doi:10.1177/1461444818775968.

With data being hailed as “the new oil” and regulations like the General Data Protection Regulation (GDPR) focusing primarily on the uses of data, Couldry and Yu take a step back and look at data collection itself. Beginning with the premise that a state of dataveillance is incompatible with the values of democracy, they examine how data collection escapes ethical critique. They conduct a discursive deconstruction of general data discourses (by economic institutions and consultancy firms like McKinsey and Deloitte) and specific data discourses (by companies and research institutes in the health and education sectors). They contend that both general and specific discourses naturalise data and data-mining infrastructure to obscure the process of data collection. Furthermore, in the specific discourses, giving up data was framed as being ethical as it would contribute to the “common good”, while regulating information flow was understood to hinder the betterment of health and education. They argue that these perspectives overshadow the more pressing concern of how the state of constant data collection affects an individual’s freedom and autonomy, for example, the freedom to express and control their identity.

Dornan, Paul, and John Hudson. 2003. “Welfare Governance in the Surveillance Society: A Positive-Realistic Cybercriticalist View.” Social Policy & Administration 37 (5): 468–482. doi:10.1111/1467-9515.00352.

Hudson and Dornan critically review Fitzpatrick’s argument about the necessity for virtual rights as a category separate from existing social, legal, and political rights. Virtual rights, in Fitzpatrick’s cybercriticalist view, would deal with the gathering, storing, and analysing of personal data in a way that reverses the current trend towards a panopticon. Hudson and Dornan consider this view to be dystopian. They examine three cases - social security fraud, private sector financial services, and benefit take-up in the UK - from a realistic perspective. They argue that reversing technological change is unlikely and that the subsequent social change will not be drastic. Thus, they posit that embracing the inevitable increased surveillance of people’s lives will allow a welfare state to better help its citizens, whereas virtual rights would inhibit the ability to counter exclusionary surveillance with inclusionary surveillance.

Egbert, Simon, and Susanne Krasmann. 2019. “Predictive Policing: not yet, but soon preemptive?” Policing & Society, online first, 2 May 2019. https://doi.org/10.1080/10439463.2019.1611821

 

Using the example of crime prediction software that is used in German-speaking countries, the article shows how current forms of predictive policing echo classical modes of risk calculation: usually employed in connection with domestic burglary, they help police to identify potential high-risk areas by extrapolating past crime patterns into the future. However, preemptive elements also emerge, to the extent that the software fosters ‘possibilistic’ thinking in police operations. In addition, following a general trend of data-driven government, crime prediction software will likely be integrated into assemblages of predictive technologies where criminal events are indeed foreclosed before they can unfold and emerge, implying preemptive police action.

Eubanks, Virginia. 2017. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin’s Press.

 

The title of this book is self-explanatory. It is another important book for understanding how securitization tools are used to discriminate against the marginalised: these tools do not operate in any neutral, objective, or unbiased environment but are instruments in the hands of the state used to target very specific people.

 

Kaufmann, Mareile, Simon Egbert, and Matthias Leese. 2019. “Predictive Policing and the Politics of Patterns.” The British Journal of Criminology 59 (3): 674–692. https://doi.org/10.1093/bjc/azy060

 

In this text we analyse predictive policing along an epistemological dimension by studying the key role of patterns in the different crime prediction tools currently used around the world. Patterns, we argue, are the epistemological core of predictive policing - where there is no pattern, no prediction is possible. These patterns are both performative and socially constructed, limiting and pre-forming the range of predictable offences.

 

Egbert, Simon. 2019. “Predictive Policing and the Platformization of Police Work.” Surveillance & Society 17 (1/2): 83–88. https://doi.org/10.24908/ss.v17i1/2.12920

 

In this paper I argue that predictive policing software, by enabling crime data analysis in general, will be an important incubator for datafied police work: it has made police authorities aware that the massive amounts of crime data they possess are quite valuable and can now be easily analyzed. Significant transformative effects are to be expected for policing, especially in relation to data collection practices and surveillance imperatives.

 

Egbert, Simon, and Bettina Paul. 2019. “Preemptive ‘Screening for Malintent’: The Future Attribute Screening Technology (FAST) as Double Future Device.” Futures 109: 108–116. https://doi.org/10.1016/j.futures.2018.04.003

 

Modern technologies and approaches to lie detection are becoming more and more common. In this paper, drawing on the example of the Future Attribute Screening Technology (FAST), we analyse this trend with reference to the underlying processes of technological development. In doing so, FAST is conceptualized as a ‘double future device’: it aims to detect people’s intentions to cause harm, and it is simultaneously still largely an anticipated technology in an experimental state. As a result, we highlight the importance of the sociotechnical status and discursive role of technologies in socio-technical futures.

 

Egbert, Simon. 2018. “On Security Discourses and Techno-Fixes – The Political Framing and Implementation of Predictive Policing in Germany.” European Journal for Security Research 3 (2): 95–114. https://doi.org/10.1007/s41125-017-0027-3

 

In addition to technical and economic factors, the rise of predictive policing in Germany is above all a political phenomenon, involving the discursive shaping of domestic burglary as a security problem. The new prediction tools also offer rhetorical resources for politicians and police authorities to legitimise their ambitions to fight crime and enhance public security by presenting their methods as innovative and effective, making these technologies important components of the corresponding security discourses.

Ferguson, Andrew Guthrie. 2017. The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement. New York: New York University Press.

 

Another important book on the current state of data-led policing in the US and how its use pushes already marginalised populations further into the margins.

Garfinkel, Harold. 2008. Toward a Sociological Theory of Information. Edited by Anne Warfield Rawls. Boulder, CO: Paradigm Publishers.

 

This book shows how any piece of text or video – indeed any piece of data – only becomes ‘information’ through its interactions with the social.

 

Egbert, Simon, and Bettina Paul. 2016. “Devices of lie detection as diegetic technologies in the ‘war on terror’.” Bulletin of Science, Technology & Society 35 (3-4): 84–92. https://doi.org/10.1177/0270467616634162

 

Although lie detection procedures have been fundamentally criticized since their inception at the beginning of the 20th century, they are still in use around the world. The links between science and fiction in this field are quite tight and by no means arbitrary: in the progressive narrative of lie detection devices there is a promise of changing society for the better, which is entangled in a fictional narrative provided by many cinematic and literary examples. In this paper, we highlight the role of science-fictional narratives in the historical development and current application of lie detection procedures. We show that the fictional engagement with possible new lie detection practices is in itself creating a legitimating ground for future security technologies.

 

Gitelman, Lisa (ed.). 2013. “Raw Data” Is an Oxymoron. Cambridge, MA: MIT Press.

 

This edited volume contains a number of articles that explain the making of that ubiquitous thing – data. The introductory piece, for example, explains the social nature of data, breaks the myth of data as ‘given’, and introduces the idea of data as ‘made’.

Goldstein, Daniel M., and Carolina Alonso-Bejarano. 2017. “E-Terrify: Securitized Immigration and Biometric Surveillance in the Workplace.” Human Organization 76 (1): 1–14. doi:10.17730/0018-7259.76.1.1.

As part of the United States’ increased efforts to deport illegal immigrants, the E-Verify platform allows employers to determine whether employees are eligible for work. It uses data from the federal I-9 employment eligibility form and the person’s photograph. It is a ‘soft’ form of immigrant regulation that works in tandem with ‘hard’ regulation such as border control and ICE. Goldstein and Alonso-Bejarano examine its impact on migrant communities in New Jersey through a four-year-long collaborative ethnography. Against the backdrop of the socio-legal framework affecting migrant workers, they argue that E-Verify signals to citizens that the government is serious about deportation while doing little to reduce illegal immigrant populations. With no enforcement of consequences for employers, employers can hold the threat of economic instability and deportation over their workers. This leads to the exploitation of immigrants, which satisfies the United States’ need for human capital while workers are unable to demand their rights.

Hacking, Ian. 1990. The Taming of Chance. Cambridge: Cambridge University Press.

 

This absolutely wonderful book explains how the world changed once it became possible to calculate the probabilities of certain events occurring.

Hull, Matthew S. 2012. Government of Paper: The Materiality of Bureaucracy in Urban Pakistan. University of California Press.

 

This is a seminal book for understanding bureaucracy, especially how it works in the global south. Bureaucracy is important to understand because tech, especially security tech, works closely with a bureaucratic set-up, which has its own impact on how the tech is deployed and used.

Jacobsen, Elida K. U. 2012. “Unique Identification: Inclusion and surveillance in the Indian biometric assemblage.” Security Dialogue 43 (5): 457–474. doi:10.1177/0967010612458336.

Considering the colonial history of biometrics, Jacobsen looks at their manifestation in a postcolonial context, specifically the Unique Identification (UID) initiative, Aadhar, in India. Jacobsen draws on observational and interview data from Unique Identification Authority of India (UIDAI) employees, NGO workers, researchers, and enrolees, collected over five months in 2011-2012. They also look at primary sources like UIDAI reports, government and NGO websites, and news reports. They argue that some of the biopolitical colonial practices of citizenship and of reducing deviant behaviour continue to exist in the postcolonial state; however, these vary in their implementation from context to context. They similarly conclude that the meaning-production process for biometric technology conceptualises it as the answer to a host of governance issues and as the norm. However, the rationalities used to arrive at this conceptualisation differ across the contexts observed - the Indian Home Ministry, the UIDAI, and the homeless people of Delhi.

Kuldova, Tereza. 2020. “Imposter Paranoia in the Age of Intelligent Surveillance: Policing Outlaws, Borders and Undercover Agents.” Journal of Extreme Anthropology 4 (1): 45–73. https://doi.org/10.5617/jea.7813

This article explores the possible effects of algorithmic governance on society through a critical analysis of the figure of the imposter in the age of intelligent surveillance. It links a critical analysis of new technologies of surveillance, policing, and border control to the extreme ethnographic example of paranoia within outlaw motorcycle clubs – organizations that are heavily targeted by new and old modes of policing and surveillance, while increasingly embracing the very same logic and technologies themselves, with profound consequences. The article shows how, in the quest for power, order, profit, and control, we are sacrificing critical reason and, as a society, risk becoming not unlike the paranoid criminal organizations.

Lee, Claire Seungeun. 2019. “Datafication, dataveillance, and the social credit system as China’s new normal.” Online Information Review 43 (6): 952–970. doi:10.1108/oir-08-2018-0231.

China’s social credit system links financial, social, and behavioural scores with social networks. Lee examines this by analysing official government documents and media reports in tandem with a scenario-based, story-completion method involving 22 participants. They argue that the social credit system is a unique implementation of datafication and dataveillance as legitimate methods for monitoring and controlling citizens, commerce, and society in an authoritarian state. Citizens are coming to terms with this new system; however, those inexperienced with technology, for example the poor and the elderly, are likely to be disadvantaged by it. Furthermore, the system risks reinforcing prior biases and social hierarchies while creating new issues of privacy.

Lupton, Deborah. 2014. “Quantified sex: a critical analysis of sexual and reproductive self-tracking using apps.” Culture, Health & Sexuality 17, no. 4 (November): 440–453. doi:10.1080/13691058.2014.920528.

With the increased presence of self-tracking technologies in healthcare, Lupton focuses on the specific field of sex and fertility tracking applications available on the Apple App Store and the Google Play Store. Approaching this from a sociomaterial perspective, Lupton examines the content of these apps and discusses their potential social, political, ethical, and cultural implications. They argue that these apps propagate hegemonic heteronormative stereotypes about sexuality and reproduction. For example, apps targeted towards men primarily focus on sexual performance while apps targeted towards women focus on fertility. They further caution about the ethical and privacy issues raised by the use of these apps and the data they collect.

Lupton, Deborah, and Ben Williamson. 2017. “The datafied child: The dataveillance of children and implications for their rights.” New Media & Society 19 (5): 780–794. doi:10.1177/1461444816686328.

In the current media landscape of surveillance technologies monitoring children from birth to death, Lupton and Williamson examine what this means for children’s rights. Although these technologies are presented as a useful way for guardians to monitor their children and provide personalised care, they raise concerns about children’s privacy and autonomy. Lupton and Williamson argue that the detailed but flawed “data assemblages” created by the multitude of dataveillance technologies can be used to make decisions about children’s lives without their input or their guardians’ input. These “data assemblages” can be used for automated decision-making even when the child has reached adulthood, for example by limiting their future educational or employment prospects. They argue that, as of yet, there are not enough measures in place to protect children’s rights in ways that account for their perspectives, privacy, and practices.

Markó, Ferenc David. 2016. “‘We Are Not a Failed State, We Make the Best Passports’: South Sudan and Biometric Modernity.” African Studies Review 59 (2): 113–132. doi:10.1017/asr.2016.39.

Markó examines South Sudan’s introduction of biometric identity through a year-long ethnographic study of the Directorate of Nationality, Passports and Immigration (DNPI) in 2013. They argue that the state’s identity management system was intended to impress the world and convey the image of a stable state. In practice, the successful procurement of ID documents provides only a transitory claim to citizenship. When an individual’s citizenship is disputed, which happens often, the database of IDs is not consulted and the individual must undertake the expensive and lengthy process of earning an ID again. Markó argues that, in this way, the South Sudanese state creates internationally admissible travel documents while leaving the power to decide who deserves citizenship with the military elite.

Nair, Vijayanka. 2018. “An eye for an I: recording biometrics and reconsidering identity in postcolonial India.” Contemporary South Asia 26, no. 2 (May): 143–156. doi:10.1080/09584935.2017.1410102.

Nair illustrates how Aadhar, India’s biometric identification system, implemented by the Unique Identification Authority of India (UIDAI), affects the conception of identity and citizenship held by its enrolees and implementors. They conducted an 18-month long ethnography in Aadhar enrolment centres, offices of the UIDAI, and the Supreme Court litigation hearings against Aadhar in New Delhi. They demonstrate that Aadhar was associated with three kinds of criminality. In enrolment centres, the association of taking biometrics with catching criminals through forensics often came up in conversations. In the UIDAI offices, the main purpose of documenting all residents was to remove fraudulent identities used to exploit the welfare system. In the Supreme Court, Aadhar was seen as a method to provide citizenship to illegal immigrants through its “Introducer” route which was designed for citizens without appropriate documents. They conclude that postcolonial conceptions of the body in India expressed ideas of kinship and nationalism. Thus, Aadhar revived discussions about identity, illegal immigration, and citizenship in India.

Paul, Bettina, and Simon Egbert. 2016. “Drug Testing for Evidence? A Sociotechnical Practice.” In Evidence in Social Drug Research and Drug Policy, edited by Aileen O’Gorman, Gary Potter, and Jane Fountain, 99–112. Lengerich: Pabst.

 

Throughout Europe, drug tests are deployed in ever more contexts of everyday life to examine individuals and check whether they have used illegal substances. Drawing on ideas from science and technology studies, in this paper we challenge the idea that on-site drug tests, as technical devices, are per se able to generate evidentiary results. Rather, we argue that such devices are genuinely sociotechnical instruments. Hence, we conclude that on-site drug tests do not produce objective data as such, that they are not evidentiary instruments per se, and that their results have to be dealt with by taking account of context-sensitive and case-related material at the least.

Pötzsch, Holger. 2015. “The Emergence of iBorder: Bordering Bodies, Networks, and Machines.” Environment and Planning D: Society and Space 33 (1): 101–118. doi:10.1068/d14050p.

Pötzsch looks at the use of biometrics, dataveillance, robotics, and predictive analysis in the contemporary enforcement of borders, coining the term iBorder. They argue that these work in tandem to increase individuation, the authentication and confirmation of various unique identities; and massification, the use of abstracted behavioural data for predictive analysis. In this way, the border is not limited by territory but is a revokable state of being. Furthermore, the division between citizen and migrant is not inherent but is created through the cultural techniques of bordering. Thus, iBorders do not just identify contingent subjectivities but also contribute to their production. However, establishing identity requires more data, which in turn requires automated, and often erroneous, processing of data-doubles. These decisions have material effects, such as the incorrect denial of entry to travellers and the informing of policy based on flawed statistics.

Leurs, Koen, and Tamara Shepherd. 2018. “Datafication & Discrimination.” In The Datafied Society: Studying Culture through Data, edited by Mirko Tobias Schäfer and Karin van Es, 211–231. Amsterdam: Amsterdam University Press.

Leurs and Shepherd approach data, specifically its use in European border control, from a people-centric perspective. Regarding big data’s shift from causal to correlative data analysis, they ask who this assists and who this disadvantages. They argue that biases creep in at all steps of data analysis, beginning at the organisation of knowledge itself. They attribute this to the fact that people are making decisions in all these steps. Furthermore, EURODAC’s social sorting system and Frontex’s data visualisations dehumanise and depersonalise the migration process while reifying the separation of insiders and outsiders. They conclude that, given big data’s problematic history and origin in a Western military-industrial context, it inherently marginalises already marginalised groups by reproducing the inequalities that oppress them.

Williamson, Ben. 2014. “Algorithmic skin: health-tracking technologies, personal analytics and the biopedagogies of digitized health and physical education.” Sport, Education and Society 20 (1): 133–151. doi:10.1080/13573322.2014.962494.

Williamson looks at how digitised health and physical education work with algorithms to reframe students’ bodies. They contextualise an initial survey of health-tracking products for schools like Sqord and Fitnessgram with empirical and theoretical literature, and the trend towards ‘smart schools’. They look at health tracking as pleasurable self-surveillance and as a method of teaching bodily optimisation. They conceptualise the body as having ‘algorithmic skin’ which partially guides the body’s athletics, motion, and health. This reframes specific subjective bodily experiences in the objective terms of quantification. This quantified representation of the body is produced by algorithms which enforce an ideal based on commercial, governmental, and medical desires. A student’s body is thus conceptualised as optimisable and governable to the point of making predictions and potentially pre-emptive interventions based on these predictions.
