
Call for Papers: Algorithmic Governance and the Futures of Social Control

Special Issue of the Journal of Extreme Anthropology

Edited by: Tereza Kuldova, Christin Thea Wathne, Heidi Enehaug & Bitten Nordrik


Whoever fights monsters should see to it that in the process he does not become a monster.

― Nietzsche


Social bots, deep fakes, document fraud, fake news, propaganda, spam, lies, hoaxes, disinformation and deception. In the digital era of surveillance capitalism (Zuboff 2019), the sources and means of deception and make-believe proliferate, as do the means of social sorting, targeting, and profiling (Bauman & Lyon 2013). They threaten to create a world where distinguishing between the fake and the real, truth and lie, machine and human, becomes increasingly difficult, if not impossible. A world where making these distinctions becomes an obsessive preoccupation – a preoccupation that replaces critical thinking with fact-checking, Truth-O-Meters, and audit; openness with borders and gates; trust with transparency and control; and the politics of citizenship with identity management (Muller 2009). Paradoxically, the very sources of our confusion – the massive flow of data and new technologies – are also touted as its solution. Data is marketed as neutral, as pure – as the ‘truth’. The results of data-driven predictions, of algorithmic decisions, of opaque artificial intelligence systems are labelled ‘evidence’ and ‘intelligence’ – as something ‘solid’ and hard in ‘liquid times’ (Bauman 2000; Bauman & Lyon 2013), as something on which to reliably build our lives, our organizations, and our governance. And yet, we know well that raw and pure data is an oxymoron (Gitelman 2013); context is erased as ‘data’ is de-contextualized and then re-contextualized in the statistical models and algorithms we are expected to rely on.


This obsession with data-driven ‘truth’ and governance, and the monetization of this ‘new oil’ of personal information and individuals’ digital footprints, generates new forms of harm and injustice. It sacrifices privacy, rights, liberties, the presumption of innocence, and due process on the altar of security (Benjamin 2019; O’Neil 2016) – and with them, more often than not, security itself. It threatens to create a world where omnipresent corporate and governmental surveillance (the two increasingly blurred) and the continual manufacturing and mediatization of new threats, risks, and fears feed societal paranoia, generalized suspicion, and mistrust (Frosh 2016; Breton). Everyone is a potential fraud or fake; nobody can be trusted. Distrust is institutionalized (Whelan 2013). This, paradoxically, again lends legitimacy to the very system that manufactures both the risks and threats and the solutions to them.


Artificial intelligence, deep learning, and big data analytics are viewed as the technologies of the future, capable of delivering expert decisions, risk assessments, and predictions within milliseconds. Accurate or not, algorithms are transforming our societies, with profound consequences. Corporations have been investing in AI and harvesting enormous amounts of data to increase their profits while cutting costs – perfecting predictive consumerism, monitoring their employees, targeting ads, and rating creditworthiness. Governments have been equally eager to collaborate with the private sector, invest in new AI technologies for predictive policing and military intelligence, and enter ever new cross-sector partnerships in the name of efficiency, cost-effectiveness, data-driven decision making and forecasting, and the streamlining of workflows in public administration and services; in this sense, the technologies embody, more than anything, the visions of New Public Management. Jeremy Bentham’s ideas on governance, as much as his panopticon, find new expression in algorithmic governance (Bowrey & Smark 2010), which can be defined as follows:


‘Algorithmic governance has many faces: it is seen as ordering, regulation and behaviour modification, as a form of management, of optimisation and of participation. Depending on the research area it is characterised by inscrutability, the inscription of values and interests, by efficiency and effectiveness, by power asymmetry, by social inclusiveness, new exclusions, competition, responsiveness, participation, co-creation and overload. For most observers, governance becomes more powerful, intrusive and pervasive with algorithmization and datafication’ (Katzenbach & Ulbricht 2019: 11).


While the risks associated with AI are typically downplayed or reduced to ‘effects on the labour market’, we are already seeing the contours of the societal effects of AI models and automated decision making with inherent bias – for instance in the welfare systems of the UK and US (Eubanks 2018), or in predictive policing across a growing number of countries (Kaufmann, Egbert, & Leese 2018). Such models do not merely reproduce existing societal and systemic inequities but magnify them in the manner of a self-fulfilling prophecy and embed them fully into the system, resulting in ‘technological redlining’ while sweeping human discretion out of the way (Benjamin 2019). Despite these challenges, and driven by tech-optimism and the fear of missing out or of falling behind on the various rankings that measure levels of digitization, governments have been eager to collaborate with the private sector and invest in AI systems for improved and more efficient public administration, (predictive) policing, and military intelligence.


The special issue Algorithmic Governance and the Futures of Social Control invites anthropological and interdisciplinary contributions that investigate the multiple effects as well as unintended and harmful consequences of algorithmic governance and new forms of social sorting, control, and surveillance. In particular, we invite papers dealing with:


* transformation of expert knowledge, professionalism, professional judgement, and discretion

* algorithmic governance, surveillance and policing as a societal and institutional logic

* risk-based governance and the manufacturing of threat and insecurity, as well as the visions of the Other as fundamentally threatening

* organization of work, workplace surveillance, and the transformation of organizations, especially in relation to questions of accountability, responsibility, and transparency

* larger societal consequences of algorithmic governance for workers and citizens

* questions of legitimacy and societal trust, in light of the insight that AI models are, more than anything, an instantiation of policy


Interested contributors are encouraged to submit an abstract of 300 words and a short bio by the 15th of April 2020 to the journal’s Editor-in-Chief, Tereza Kuldova, at tkuld@oslomet.no. The deadline for full submissions is the 1st of October 2020.


The Journal of Extreme Anthropology is an international, peer-reviewed, interdisciplinary and indexed journal that publishes articles in the fields of anthropology, the social sciences, and the humanities, specializing in extreme subjects, practices, and theory.

For submission guidelines, and other details, please visit: https://journals.uio.no/JEA


Bauman, Z. 2000. Liquid Modernity. Cambridge: Polity Press.

Bauman, Z., and D. Lyon. 2013. Liquid Surveillance: A Conversation. Cambridge: Polity Press.

Benjamin, R. 2019. Race After Technology: Abolitionist Tools for the New Jim Code. Medford, MA: Polity Press.

Bowrey, G. D., and C. J. Smark. 2010. ‘The Influence of Jeremy Bentham on Recent Public Sector Financial Reforms.’ Journal of New Business Ideas and Trends 8 (1): 1-10.

Breton, H. O. ‘Coping with a Crisis of Meaning: Televised Paranoia.’ In Media and the Inner World: Psycho-cultural Approaches to Emotion, Media and Popular Culture, edited by Caroline Bainbridge and Candida Yates, 113-134. New York: Palgrave Macmillan.

Eubanks, V. 2018. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin’s Press.

Frosh, S. 2016. ‘Relationality in a Time of Surveillance: Narcissism, Melancholia, Paranoia.’ Subjectivity 9 (1): 1-16.

Gitelman, L., ed. 2013. “Raw Data” Is an Oxymoron. Cambridge, Massachusetts: MIT Press.

Katzenbach, C., and L. Ulbricht. 2019. ‘Algorithmic Governance.’ Internet Policy Review 8 (4): 1-18.

Kaufmann, M., S. Egbert, and M. Leese. 2018. ‘Predictive Policing and the Politics of Patterns.’ The British Journal of Criminology: 1-19. https://doi.org/10.1093/bjc/azy060.

Muller, B. J. 2009. ‘(Dis)qualified Bodies: Securitization, Citizenship and “Identity Management”.’ In Securitizations of Citizenship, edited by P. Nyers, 77-93. London: Routledge.

O’Neil, C. 2016. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Crown.

Whelan, A. 2013. ‘First as Tragedy, Then as Corpse.’ In Zombies in the Academy: Living Death in Higher Education, edited by Andrew Whelan, Ruth Walker and Christopher Moore, 11-26. Bristol: Intellect.

Zuboff, S. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books.
