LIKE EVERYTHING

Emancipation from algorithmically prescribed subjectivity in our divided age of boundless Connectivity

Elliot Bourne: MA Architecture

Word Count: 7214


Abstract

Keywords: Algorithm, Control, Autonomy, Internet, Public

As COVID-19 has pushed more of our interactions into digital environments than ever before, it is imperative that we understand the infrastructure of virtual public spaces, concealed behind clean interfaces and personalised information feeds. This discussion argues that these spaces are problematic planes, in which connectivity is engineered and sociality distorted in the name of 'optimised user engagement'. This is done by way of the algorithm, a technology which will be discussed and critiqued both as an operational technology and as a mechanism of control, embedded in existing hegemonies and biases. As this apparatus is deemed detrimental to the autonomy of individuals and societies, potential modes of resistance and emancipation will be explored.


CONTENTS

List of Illustrations
Part 1: Introduction // Start
Part 2: Sociality // Connectivity
Part 3: Apparatus // Architecture
Part 4: Control // Autonomy
Part 5: Resistance // Emancipation
Part 6: Conclusion // End
Bibliography
End


LIST OF ILLUSTRATIONS

Figure 1 Screenshot taken from: Concordia, Cambridge Analytica - The Power of Big Data and Psychographics, 2016 <https://www.youtube.com/watch?v=n8Dd5aVXLCc&t=232s> [accessed 18 April 2021]

Figure 2 Own graphic composed of words from: Cisene, 'LDNOOBW/List-of-Dirty-Naughty-Obscene-and-Otherwise-Bad-Words', GitHub <https://github.com/LDNOOBW/List-of-Dirty-Naughty-Obscene-and-Otherwise-Bad-Words> [accessed 26 January 2021]

Figure 3 Own graphic of: Fiverr, 'Buy SEO Services - Hire an SEO Freelancer Online', Fiverr.Com <https://www.fiverr.com/categories/online-marketing/seo-services> [accessed 18 April 2021]

Figure 4 Adam Simpson, The Panopticon, 2013, Digital <https://www.adsimpson.com/work/the-panopticon> [accessed 18 April 2021]

Figure 5 Own graphic of: 'Google Ads Data and Privacy – Google Safety Centre' <https://safety.google/privacy/ads-and-data/> [accessed 18 April 2021]

Figure 6 Own graphic of web page accessed through: 'Ad Settings' <https://adssettings.google.com/authenticated?hl=en> [accessed 18 April 2021]

Figure 7 Own graphic of webpage accessed through: 'Facebook Home Page' <https://www.facebook.com/> [accessed 18 April 2021]

Figure 8 Own graphic generated from information accessed through: 'Ad Settings' <https://adssettings.google.com/authenticated?hl=en> [accessed 18 April 2021]

Figure 9 Screenshot taken from: 'Project Overview ‹ Social Mirror', MIT Media Lab <https://www.media.mit.edu/projects/social-media-mirror/overview/> [accessed 18 April 2021]


Part 1: Introduction // Start

“algorithms are generating the bounded conditions of what a democracy, a border crossing, a social movement, an election, or a public protest could be in the world.” - Louise Amoore1

I am writing this on Tuesday the 12th of January 2021. Donald Trump has been indefinitely suspended from Facebook, Instagram, Twitter, Twitch, Snapchat and (with some delay) YouTube, for inciting violence after an insurrection took place at the Washington D.C. Capitol building. There are two main stances on the event, neither contained by political opinion or party affiliation. Some are heralding his removal (after numerous violations of the sites' terms of service) as a victory for accountability and against dogmatism.2 Others consider this an undemocratic exercise of power by 'Big Tech' organisations: an overt display of the governance these global private companies possess over what media we are able to consume.3 In arguing this point something implicit is said: the free-to-access sites of the internet, where we interact, communicate and share ideas (otherwise known as Web 2.0 or social media), are becoming something far larger than the services of private corporations. They are becoming an ever more integral part of discourse for many people internationally, pushed further into relevancy by the ongoing pandemic and the limitations to physical interaction.4

Louise Amoore, Cloud Ethics: Algorithms and the Attributes of Ourselves and Others (Duke University Press, 2020) p.4. 1

Elizabeth Dwoskin and Nitasha Tiku, ‘How Twitter, on the Front Lines of History, Finally Decided to Ban Trump’, Washington Post <https://www.washingtonpost.com/technology/2021/01/16/howtwitter-banned-trump/> [accessed 3 February 2021]. 2

James Clayton, 'Twitter Boss: Trump Ban Is "Right" but "Dangerous"', BBC News, 14 January 2021, section Technology <https://www.bbc.com/news/technology-55657417> [accessed 3 February 2021]. 3

Minh Hao Nguyen and others, ‘Changes in Digital Communication During the COVID-19 Global Pandemic: Implications for Digital Inequality and Future Research’, Social Media + Society, 6.3 (2020), 2056305120948255 <https://doi.org/10.1177/2056305120948255>. 4



This underlying negotiation of what 'social media' actually is in reference to existing syntax is a common element of discussion on the topic. Of course, every user of social media knows what it is and the function it has in their life and interactions, however there are many more entangled [and often hidden] actors at play than in a normative physical social interaction. In short, there are the users, the digital platform, the created 'content', and various legal and economic actors, all multifaceted and complex individually, compiled and interwoven in every interaction.5 In a way no previous technology could facilitate, agents of different natures and scales are creating spaces within which information is exchanged and people connect.

This complex assemblage makes it difficult to discern how social media exists coherently as a technology within existing economic, social and political frameworks. The technology is developing in complexity at an accelerating rate, which makes it challenging to contain as an object of study within the technological field, let alone cultural and social studies, where papers are sometimes out of date before they are even published.6 Economically, outlets of social media are considered private companies acquiring profit. In social studies, they are seen as platforms of exchange, with much discussion on how existing and dated theory, including Jürgen Habermas' Public Sphere, may be translated, as well as some bemusement at its efficacy in political activism.7 8 Politically, its agency is a hot topic of scrutiny, as it is seen to have played a substantial role in facilitating coercion and manipulation during the 2016 U.S. presidential election.9 Russian intelligence used the service to spread propaganda while political consulting firm Cambridge Analytica employed heavily personalised advertisements, using data illicitly acquired from over 50 million Facebook users [see Figure 1].10

Clay Shirky, ‘The Political Power of Social Media: Technology, the Public Sphere, and Political Change’, Foreign Affairs, 90.1 (2011), 28–41. 5

Taina Bucher, ‘Want to Be on the Top? Algorithmic Power and the Threat of Invisibility on Facebook’, New Media & Society, 14.7 (2012), 1164–80 <https://doi.org/10.1177/1461444812440159>. 6

Dounia Mahlouly, ‘Rethinking the Public Sphere in a Digital Environment: Similarities between the Eighteenth and the Twenty-First Centuries’, ESharp, 20.6 (2013). 7

Anita Breuer, The Role of Social Media in Mobilizing Political Protest: Evidence from the Tunisian Revolution (Rochester, NY: Social Science Research Network, 10 October 2012) <https://doi.org/10.2139/ssrn.2179030>. 8

A.S.B., ‘Why Is Mark Zuckerberg Testifying in Congress?’, The Economist, 9 April 2018 <https://www.economist.com/the-economist-explains/2018/04/09/why-is-mark-zuckerberg-testifyingin-congress> [accessed 3 February 2021]. 9

Adam Satariano, ‘Facebook Identifies Russia-Linked Misinformation Campaign’, The New York Times, 17 January 2019, section Business <https://www.nytimes.com/2019/01/17/business/facebookmisinformation-russia.html> [accessed 13 April 2021]. 10



Figure 1 Former CEO of Cambridge Analytica, Alexander Nix, illustrating how different messages will be shown to different individuals, based on psychological profiling. The right message is intended for those deemed fearful and the left for those who have more ‘closed’ and ‘agreeable’ profiles. Concordia, 2016.

Without being able to concisely categorise and explain this radically new phenomenon (social media), we struggle to know exactly how to navigate its problems. This 'we' does not only extend to members of the global public, or governments, but to the very CEOs of the most powerful companies.11 What we do know is that these services cannot be considered neutral entities, and if their proliferation is to continue, we need to understand and address the ingrained issues of these platforms of exchange. The most concerning problem highlighted by events such as the 2016 U.S. Presidential election was not the breaches of privacy and coercion but, as Adam Piore explains, "It's that we have all, quite voluntarily, retreated into hyperpartisan virtual corners, owing in no small part to social media and internet companies that determine what we see by monitoring what we have clicked on in the past and giving us more of the same."12

Chris Matyszczyk, ‘Facebook and Zuckerberg Still Don’t Know the Right Thing to Do’, CNET <https://www.cnet.com/news/mark-zuckerberg-still-doesnt-know-the-right-thing-to-do/> [accessed 3 February 2021]. 11

Adam Piore, 'No, Big Tech Didn't Make Us Polarized (but It Sure Helps)', MIT Technology Review, 2018, 18–21, p.19. 12


This discussion will focus on the architecture of these virtual spaces of exchange, where interactions are engineered. A precise claim will be made: digital environments, by design, exhibit a mode of control over individuals, imposing a problematic sociality from which we need to be emancipated. This claim will be argued through three sections: firstly, an explanation of how sociality is engineered in virtual public spaces; secondly, how the engineering apparatus comes to be; thirdly, how this influences autonomy and controls. Before concluding, the fourth section will discuss emancipation: current calls for change and propositions for the future.


Part 2: Sociality // Connectivity

"There is a general risk that those who flock together, on the Internet or elsewhere, will end up both confident and wrong, simply because they have not been sufficiently exposed to counterarguments. They may even think of their fellow citizens as opponents or adversaries in some kind of 'war.'" - Cass Sunstein13

The technology of Web 2.0 can be described pragmatically as a facilitator of connections via Internet Protocol (IP): the unbiased transport of data packets between two endpoints. The quality of this connection is defined by the network's ability to form these data exchanges. We can compare this exchange of data to utilities such as drinking water and electricity, as José van Dijck does.14 As water droplets contribute to water flow and charged particles are the constituent components of electrical energy, data packets are the rudimentary components of connectivity. Data is the resource upon which many Web 2.0 applications and other information technologies (telephone, television, etc.) are founded. As the infrastructure of the internet developed, these simple links needed to be generated, organised, managed, filtered, delivered and made legible. As the internet grew, the environments where data was exchanged increasingly became the products of services, such as Google (to search through the internet as a database), Facebook (to share between contacts) and, more recently, Spotify (to listen, share and find new music). The transaction which occurs in these free data services is unusual. The services are free to those who wish to use them, and in return for this convenient and accessible service the provider receives data and metadata on the user, their interactions and their shared personal information.

Cass R. Sunstein, ‘The Polarization of Extremes’, The Chronicle of Higher Education, 54.16 (2007), p.B9. 13

José van Dijck, 'Facebook and the Engineering of Connectivity: A Multi-Layered Approach to Social Media Platforms', Convergence, 19.2 (2013), 141–55 <https://doi.org/10.1177/1354856512457548> p.143. 14



The reason for this data collection is multifaceted. It is often explained that this data is actually, in economic terms, the product of the service (as opposed to the service itself being the product). User data is formatted and sold to advertisers so that they may target their intended audience more successfully. This is explicit when we look at Google and Facebook, though it is also true of other Web 2.0 sites where advertisements appear.15 16 This phenomenon has attracted many critics, some describing it as theft of users' 'attention time'.17 The service providers, however, also use this information with the intent of personalising and optimising their service to each user.18 This is done algorithmically, i.e. through a mathematical equation. These equations determine what should be made visible to users by weighting sets of values (based on acquired data) attributed to both the particular piece of media and the user, as the sketch below illustrates. On the scale of the individual this might seem a mutually beneficial intervention: users get a better service and, as a result, the service is likely to receive increased engagement. However, this results in each user of the site receiving different information, making different connections and having a different experience, which in operation has troubling effects. These sites have become problematic planes within which sociality is distorted and divisiveness facilitated, in the name of optimisation and 'increased engagement'. This may sound provocative, but it is according to Facebook's own study.19
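To make the mechanism concrete, the following is a minimal sketch of engagement-weighted ranking, written for this essay rather than drawn from any platform's actual code; every feature name, weight and value in it is invented for illustration.

```python
# A minimal sketch of engagement-weighted feed ranking. Not any platform's
# real code: all feature names, weights and values are invented.

def score(user: dict, post: dict, weights: dict) -> float:
    """Weigh attributes of a post against what has been recorded about a user."""
    return sum(w * user.get(f, 0.0) * post.get(f, 0.0) for f, w in weights.items())

weights = {"politics": 0.8, "sport": 0.5, "music": 0.3}  # tuned for engagement
user = {"politics": 0.9, "sport": 0.1, "music": 0.4}     # inferred from past clicks

posts = [
    {"id": "a", "politics": 1.0, "sport": 0.0, "music": 0.0},
    {"id": "b", "politics": 0.0, "sport": 1.0, "music": 0.0},
]

# The 'feed' is simply the posts sorted by predicted engagement: the user is
# shown more of whatever they have already been measured engaging with.
feed = sorted(posts, key=lambda p: score(user, p, weights), reverse=True)
print([p["id"] for p in feed])  # ['a', 'b']
```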

'Google Ads - Get More Customers With Easy Online Advertising' <https://www.googleadservices.com/pagead/aclk?sa=L&ai=DChcSEwjjiK-L_f3vAhWGtO0KHaB4DVUYABAGGgJkZw&ohost=www.google.com&cid=CAESQOD2YzqAqC5ffBNECAGizo3GZ7hwDHngcJW9iRsOabE32E-cOP9dSAKBKDy6krZ7tllBvkmjZPyRS4MhlikHsQg&sig=AOD64_17rlhyKe8pX77jadcBbhx20p6k1g&q=&ved=2ahUKEwixy6eL_f3vAhXhQEEAHXk8AOMQqyQoAnoECAQQFw&adurl=> [accessed 14 April 2021]. 15

'Facebook Advertising Targeting Options', Facebook for Business <https://en-gb.facebook.com/business/ads/ad-targeting> [accessed 14 April 2021]. 16

Christian Fuchs, ‘Labor in Informational Capitalism and on the Internet’, The Information Society, 26.3 (2010), 179–96 <https://doi.org/10.1080/01972241003712215>. 17

'Personalized Search for Everyone', Official Google Blog <https://googleblog.blogspot.com/2009/12/personalized-search-for-everyone.html> [accessed 14 April 2021]. 18

Jeff Horwitz and Deepa Seetharaman, 'Facebook Executives Shut Down Efforts to Make the Site Less Divisive', Wall Street Journal, 26 May 2020, section Tech <https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499> [accessed 3 February 2021]. 19


In an attempt to explain succinctly how this occurs, I will borrow an analogy made by Geoff Cox in his eerily precursive 1999 essay 'Crowd_Code'.20 In it, Cox likens a collection of internet users to a crowd or a 'multi-user behavioural space' in order to evaluate whether a dispersed and digitised embodiment of a crowd would diminish its political agency. From the outside the crowd seems unpredictable and unruly, but from the inside it is understood, with ideas spreading radially from individuals and these relations affecting the overall consensus. If we are to take this analogy further, we could see the sites of Web 2.0 as the infrastructure within which the crowd operates. These sites record the actions/interactions of individuals in the crowd. This generalised understanding of each individual is used to predict the responses of other individuals who exhibit similar actions/interactions. This attempted understanding and quantification is used to increase engagement, i.e. to show each user more of what the platform thinks they will like or engage with. In this process the crowd is re-composed, with individuals with similar action/interaction habits situated adjacent to one another, exposed mostly just to one another, as sketched below. This rationalises the crowd. The individual is naturally surrounded by commonality and agreement, rather than difference and critique. No longer is the social process of the crowd one of contestation and consensus, but agreement and division. The crowd becomes polarised by homophily. The space in which the crowd exists is designed to quantify, sort and prescribe; in doing so it generalises and divides. These practices of restructuring information and algorithmically determined encounters are not isolated to Facebook but common to much of 'Web 2.0', as argued by David Beer.21
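The re-composition of the crowd can be reduced to a toy model: measure each individual as a vector of recorded interactions, then place each person next to whoever is most similar. The data and similarity measure below are invented for illustration; real systems operate at vastly greater scale and opacity.

```python
import math

# A toy model of homophilic re-composition: each user's recorded interactions
# form a vector, and each user is placed next to whoever is most like them.
# All data is invented.

def cosine(a: list, b: list) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norms if norms else 0.0

histories = {
    "ana":  [5, 0, 1],   # interaction counts with topics A, B and C
    "ben":  [4, 1, 0],
    "cleo": [0, 5, 4],
}

for name, vec in histories.items():
    similarity, nearest = max(
        (cosine(vec, v), other) for other, v in histories.items() if other != name
    )
    print(f"{name} is placed next to {nearest}")  # like is situated next to like
```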

This phenomenon is known as 'the filter bubble', popularised by Eli Pariser's 2011 book of the same name.22 In it, Pariser describes the filter bubble as a device which indoctrinates us with our own ideas, an 'invisible autopropaganda' which amplifies our desire for things that are familiar and concretises our existing views.23 In this action, social media algorithms prescribe our subjectivity. They make claims about what should be liked, what should be read, what should be listened to and even, to an extent, who should communicate with each other. Our connections are engineered and, with this, our reality.

Geoff Cox, 'Crowd_code | Anti-Thesis' <http://www.anti-thesis.net/crowd_code-2/> [accessed 2 February 2021]. 20

David Beer, ‘The Social Power of Algorithms’, Information, Communication & Society, 20.1 (2017), 1–13 <https://doi.org/10.1080/1369118X.2016.1216147>. 21

Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (Penguin Publishing Group, 2011). 22

Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (Penguin Publishing Group, 2011) p.9. 23



John Dewey's theory of democratic inquiry seeks to explain how individuals affected by particular issues attempt to resolve them by forming groups, which he calls publics.24 These publics are formed of various individuals who may hold different beliefs and values but all have a stake in the particular issue. Fenwick McKelvey writes about how this theory could be adapted and employed within Web 2.0 to tackle some of the problems of algorithmic sociality.25 However, the theory can also be used to explain how algorithmic sociality hinders agency. As Michel Callon, Pierre Lascoumes and Yannick Barthe suggest, the formation of a public is a collective process of composing a common world, where the 'uncertainties of groupings' define the entities.26 The composition of these new entities, while forming a new grouping, casts the identities of its members into flux. Individuals reflect and gain awareness, which is a necessary component in the formation of publics with agency, as it facilitates consensus. This mechanism is absent from the algorithmically curated associations which resemble publics.27 So if true publics are not able to form, neither can a common world to which individuals need to relate in order to achieve democratic ends.

Unfortunately, the 'filter bubble' has only become more prolific in the past decade. Since then, Facebook, Twitter and Instagram (now owned by Facebook) have removed the default option of a chronological, non-algorithmically curated feed, and a new platform, TikTok, has emerged. The most downloaded phone application of 2020, TikTok has a main 'For You' page of content curated from any user on the platform (not just those personally chosen) and a secondary 'Following' page. TikTok's entire format is the curatorial algorithm, so overt that users make content about it.28 As more interactions are undertaken online, voluntarily or due to circumstance, more interactions are pushed into the realm of the algorithm: spaces designed for optimised user engagement, not political or individual agency. In order to explain more fully how this mathematical curator controls and, in turn, how we might subvert its operation, we must first gain a more comprehensive understanding of what the algorithm actually is.

John Dewey, 'Democratic Ends Need Democratic Methods For Their Realisation', in The Later Works, ed. by Jo Ann Boydston (Carbondale, IL: SIU, 1981), pp. 367–368. 24

Fenwick McKelvey, ‘Algorithmic Media Need Algorithmic Methods: Why Publics Matter’, Canadian Journal of Communication, 39 (2014) <https://doi.org/10.22230/cjc.2014v39n4a2746>. 25

Michel Callon, Pierre Lascoumes, and Yannick Barthe, Acting in an Uncertain World: An Essay on Technical Democracy (MIT Press, 2011) p.132. 26

Tarleton Gillespie, ‘The Relevance of Algorithms’, in Media Technologies: Essays on Communication, Materiality, and Society ed. by Michel Callon, Pieree Lascoumes and Yannick Barthe (Cambridge, MA: MIT Press, 2013), pp. 167–94 <https://doi.org/10.7551/ mitpress/9780262525374.003.0009>. 27

Chris Stokel-Walker, 'We Had Experts Dissect TikTok's Algorithm, and Their Findings Reveal Why a US Buyer Will Struggle to Replicate Its Magic', Business Insider <https://www.businessinsider.com/why-tiktok-algorithm-bytedance-acquisition-trump-2020-9> [accessed 4 February 2021]. 28


Part 3: Apparatus // Architecture

“Trying to establish law and order in the information sphere, some warriors of categorization seem oblivious to the nature of transient realities and the fuzzy inflections of meaning. Cataloguers’ interests and requirements necessarily dominate over the more objective need of navigating the complexity of the world. They breed cognitive management technologies blind to cultural and subjective ambiguity and the slipperiness of context-dependent statements. Ideas of an objective ordering of abstract space are based on a religious notion of immaculate purity. They feed on dangerous ideologies of cybernetic control that imagine the manifest world to be reducible to a single viewpoint.” - Konrad Becker29

The algorithm, as discussed so far in this essay and others, is a term used to describe a computational process used to make a decision based on a vast number of factors that are not obvious or transparent. However, attributing this mechanism solely to the algorithm would be a wild oversimplification of the subject matter. The algorithm is not even a fixed set of computational instructions anymore; with the advent of machine learning, the algorithm is a fluid, ephemeral entity, constantly optimising itself and making new links between data strands.30 There are many issues with simplifying this process and ascribing agency to it, foremost that it frames algorithmic problems as something purely technical. The algorithm, however, does not exist in a vacuum; it is the product of, and embedded in, already existing frameworks, hegemonies and biases.31 As Kate Crawford argues, there is too much focus on the moment the algorithm acts.32 And as David Beer suggests, "we also need to think about the powerful ways in which notions and ideas about the algorithm circulate through the social world."33

Konrad Becker, 'The Power of Classification. Culture, Context, Command, Control, Communications, Computing', Future Non Stop <http://future-nonstop.org/c/07cabb9532d6e90e68704d4d6039d04f> [accessed 2 February 2021]. 29

‘Machine Learning in Julia’, The Alan Turing Institute <https://www.turing.ac.uk/research/research-projects/machine-learning-julia> [accessed 14 April 2021]. 30

Matthew Fuller, ‘Algorithmic Cultures and Security’, 2015 <https://www.youtube.com/watch?v=sSRyG4u1fTc> [accessed 2 February 2021]. 31

Kate Crawford, 'Can an Algorithm Be Agonistic? Ten Scenes from Life in Calculated Publics', Science, Technology, & Human Values, 41.1 (2016), 77–92 <https://doi.org/10.1177/0162243915589635>. 32

David Beer, ‘The Social Power of Algorithms’, Information, Communication & Society, 20.1 (2017), 1–13 <https://doi.org/10.1080/1369118X.2016.1216147> p.2. 33



So in order to effectively critique the algorithm, we must understand the factors which shape its reality and the worldview which employs it. This is not a new phenomenon: Bourdieu wrote about the power of technology to reproduce existing power structures in reference to photography (1965) and television (1996), before the internet.34 35 He suggested these technologies are applied in the framing of existing power structures, as they have been inherited by a specific cultural and social class. It is not surprising that radical new technologies require regulatory transformation; however, the issues with the algorithm lie in its perceived status as a rarefied, neutral and rational form of intelligence.36 37 As the algorithm is a mathematical procedure, it is assumed to be a process of pure reason and logic. However, the quantified inputs are an abstraction of a non-quantitative reality. The algorithm claims to make explicit, quantifiable knowledge from a variable, relational reality.38 For this reason, it is important that when the algorithm is referenced, it is understood not just as a mathematical equation but as a component of an apparatus through which power is enacted. An apparatus which, as discussed in the previous section, engineers interactions and prescribes subjectivity.

Pierre Bourdieu and Shaun Whiteside, Photography: A Middle-Brow Art (Stanford University Press, 1996). 34

Pierre Bourdieu, On Television (New Press, 1999). 35

Jonathan Obar and Steven Wildman, 'Social Media Definition and the Governance Challenge: An Introduction to the Special Issue', SSRN Electronic Journal, 2015 <https://doi.org/10.2139/ssrn.2637879> p.2. 36

Matthew Fuller, 'Algorithmic Cultures and Security', 2015 <https://www.youtube.com/watch?v=sSRyG4u1fTc> [accessed 2 February 2021]. 37

Luciana Parisi, ‘Algorithmic Cultures and Security’, 2015 <https://youtu.be/3vrFLcmX6HM> [accessed 2 February 2021]. 38


It follows that understanding the political and social context of the algorithm's creation is essential to understanding its implementation. As with most Big Tech and Silicon Valley companies, those who develop algorithms are overwhelmingly white and male.39 In operation this is exposed as problematic in many instances, from facial recognition software unable to identify black faces to AI-automated recruitment tools proven to discriminate against women.40 41 Although these examples are troubling and show algorithms are capable of embodying bias and ignorance, they do not illustrate the full capacity of potential issues. Even the most developed language models, used to train algorithms to interpret and classify audio and visual media, are influenced overtly by this insular practice. The OpenAI model neglected words deemed crude by the creators, which extended to nouns such as 'orgasm', 'vulva' and 'nipple' [see Figure 2]. Issues of algorithmic inclusivity are discussed frequently in the technological field, with several papers discussing topics such as 'decolonial AI' and the risk language models (similar to the one just discussed) pose to marginalised groups.42 43 Timnit Gebru, one of the authors of the cited paper on language models, and a well-respected researcher, was fired from Google after refusing to retract her contribution.44 This caused a wave of backlash and made the paper infamous.45 The algorithm is exposed in these examples as an embodiment of traditional, homogenous social categorisations and a product of existing hegemonies and bias.

'Bay Area Tech Diversity: White Men Dominate Silicon Valley', Reveal, 2018 <https://revealnews.org/article/heres-the-clearest-picture-of-silicon-valleys-diversity-yet/> [accessed 15 April 2021]. 39

‘UK Passport Photo Checker Shows Bias against Dark-Skinned Women’, BBC News, 7 October 2020, section Technology <https://www.bbc.com/news/technology-54349538> [accessed 15 April 2021]. 40

Reuters, ‘Amazon Ditched AI Recruiting Tool That Favored Men for Technical Jobs’, The Guardian, 2018 <http://www.theguardian.com/technology/2018/oct/10/amazon-hiring-ai-gender-biasrecruiting-engine> [accessed 15 April 2021]. 41

Shakir Mohamed, Marie-Therese Png, and William Isaac, ‘Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence’, Philosophy & Technology, 33.4 (2020), 659– 84 <https://doi.org/10.1007/s13347-020-00405-8>. 42

Emily M. Bender and others, 'On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?', in Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, FAccT '21 (New York, NY, USA: Association for Computing Machinery, 2021), pp. 610–23 <https://doi.org/10.1145/3442188.3445922>. 43

Julia Carrie Wong, ‘More than 1,200 Google Workers Condemn Firing of AI Scientist Timnit Gebru’, The Guardian, 2020 <http://www.theguardian.com/technology/2020/dec/04/timnit-gebrugoogle-ai-fired-diversity-ethics> [accessed 15 April 2021]. 44

Kari Paul, 'Two Google Engineers Quit over Company's Treatment of AI Researcher', The Guardian, 2021 <http://www.theguardian.com/technology/2021/feb/04/google-timnit-gebru-aiengineers-quit> [accessed 15 April 2021]. 45

Figure 2 List of words deemed obscene, used in the OpenAI Image GPT language model. Shutterstock, 2020.


Another aspect of the apparatus which is worth unpacking here is its language. Often nouns are used which are not native to the object they describe. Some terms, such as 'user' and 'content' (used to describe individuals who inhabit a virtual space and their input to it, respectively), are the commodity language of companies like YouTube and Facebook, but have seeped into common use.46 This language is abstractive in itself and suggestive of the process of algorithmic quantification. These terms and similar ones are frequently becoming the topic of popular culture conversations.47 48 49 Other elements of the language of Web 2.0 are of interest too: words such as 'platform' and 'site' are the physicalisation of what is only ever experienced as coloured pixels on a screen. These terms, presented by sites such as Facebook and YouTube (as well as myself), suggest these sites are performative stages, as well as meeting places.50 Tarleton Gillespie argues that the word 'platform' is inherently ambiguous as it links the computational and the architectural to the social.51 However, these links, as argued throughout this essay, are sure and steadfast. Gillespie also argues this definition is used to afford these companies absolution of responsibility for what users share on their sites, likening themselves to true public space. This frames the sites as autocratic, presiding over a public of their design with an apparatus of control also deemed neutral and autocratic. As Kate Crawford argues, this poses a serious problem when we wish to intervene in their process of governance.52

‘Create Great Content - YouTube’ <https://creatoracademy.youtube.com/page/course/greatcontent> [accessed 15 April 2021]. 46

Taylor Lorenz, ‘The Real Difference Between Creators and Influencers’, The Atlantic, 2019 <https://www.theatlantic.com/technology/archive/2019/05/how-creators-became-influencers/590725/> [accessed 15 April 2021]. 47

Emma Grey Ellis, 'Why Women Are Called "Influencers" and Men "Creators"', Wired <https://www.wired.com/story/influencers-creators-gender-divide/> [accessed 15 April 2021]. 48

Christopher McFadden, 'YouTube's History and Its Impact on the Internet', 2020 <https://interestingengineering.com/youtubes-history-and-its-impact-on-the-internet> [accessed 15 April 2021]. 49

José van Dijck, 'Facebook and the Engineering of Connectivity: A Multi-Layered Approach to Social Media Platforms', Convergence, 19.2 (2013), 141–55 <https://doi.org/10.1177/1354856512457548>. 50

Tarleton Gillespie, ‘The Politics of “Platforms”’, New Media & Society, 12.3 (2010), 347–64 <https://doi.org/10.1177/1461444809342738>. 51

Kate Crawford, 'Can an Algorithm Be Agonistic? Ten Scenes from Life in Calculated Publics', Science, Technology, & Human Values, 41.1 (2016), 77–92 <https://doi.org/10.1177/0162243915589635> p.86. 52



Algorithms are inevitably modelled with specific intentions, influenced by an agenda, be it commercial or otherwise. In the context of social media these algorithms are a capitalist device, with the goal of increasing profit. This means that the success of the algorithm is measured against this intent: a successful social media algorithm is a profitable one. This agenda also informs the algorithm's design: what data is collected, whom it is acquired from, how it is measured and how it is interpreted. All of these are political decisions, and can be used to advance an agenda, as Konrad Becker explains.53 Classification, by the decisions stated above, creates a reality that "in itself forms an effective case for a particular interpretation of reality".54 Categorisation and quantification, in any field, do not necessarily document any given reality but produce knowledge in a particular interpretation of perception.

We can understand the shortcomings of an algorithm with a narrow agenda by looking at recent phenomena on TikTok. As media on this platform is proliferated based purely on its engagement, the platform neglects to categorise why this engagement is happening. Frequently it is due to interest and a positive relation between the user and the media; however, this is not always true. In some instances users engage with media because it makes them feel negative emotions, which can also lead to a desire to share, comment or otherwise engage. This causes instances where individuals intend for their content to gain a particular kind of audience, who would interact with their content out of enjoyment, but consequently gain one who feels quite the opposite. This phenomenon can lead to hateful and abusive interactions on the platform. This issue and others are discussed in a recent paper by Ellen Simpson and Bryan Semaan, titled 'For You, or For "You"? Everyday LGBTQ+ Encounters with TikTok'.55 This shows the problematic limit of computational thinking (reduced to a sketch below) and a sign of potential issues which could become more severe in the future if not addressed.
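A reduced sketch of that limit, with invented weights and numbers: an engagement-only score has no term for why an interaction happened, so an admiring audience and a hostile one are indistinguishable to it.

```python
# An engagement-only score (invented weights) cannot distinguish interaction
# born of enjoyment from interaction born of outrage.

def engagement(post: dict) -> float:
    return post["likes"] + 2 * post["comments"] + 3 * post["shares"]

loved = {"likes": 900, "comments": 200, "shares": 100}  # shared out of enjoyment
hated = {"likes": 900, "comments": 200, "shares": 100}  # shared out of hostility

# Both posts are proliferated identically: the schema has no column for 'why'.
assert engagement(loved) == engagement(hated)
```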

Konrad Becker, 'The Power of Classification. Culture, Context, Command, Control, Communications, Computing', Future Non Stop <http://future-nonstop.org/c/07cabb9532d6e90e68704d4d6039d04f> [accessed 2 February 2021]. 53

Konrad Becker, 'The Power of Classification. Culture, Context, Command, Control, Communications, Computing', Future Non Stop <http://future-nonstop.org/c/07cabb9532d6e90e68704d4d6039d04f> [accessed 2 February 2021]. 54

Ellen Simpson and Bryan Semaan, 'For You, or For "You"? Everyday LGBTQ+ Encounters with TikTok', Proceedings of the ACM on Human-Computer Interaction, 4.CSCW3 (2021), 252:1-252:34 <https://doi.org/10.1145/3432951>. 55


Figure 3 Freelance service directory Fiverr's page advertising numerous 'search engine optimisation' services. Fiverr, 2021.

We can understand how algorithms produce knowledge in a particular interpretation of perception by studying 'search engine optimisation' services [see Figure 3]. These services exist to cheat the ingrained logic of search engine algorithms, such as Google's. By understanding the quantifiable qualities that search engines value, websites can enhance these attributes to manipulate the algorithm and appear more prominently in search results.56 In this instance, the algorithm of the search engine, by quantifying with a prescribed set of values, dictates what is valuable, and thus proliferates the worldview of the algorithm and its creators (a toy version of this dynamic is sketched below). No system of classification could possibly encompass such a diverse worldview as to be able to ascribe objective value to websites. As Konrad Becker claims, "Search results based on skewed but hidden mechanics of classification, lack inclusiveness, fairness and scope of representation."57
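The ranker below is a toy written for this essay, not Google's algorithm; the features and weights are invented. It shows how, once a scoring schema exists, whatever it quantifies becomes what is 'valuable', and whoever learns the schema can inflate it.

```python
# A toy search ranker: invented features and weights standing in for the
# 'prescribed set of values' a real engine keeps hidden.

WEIGHTS = {"keyword_density": 3.0, "inbound_links": 2.0, "freshness": 1.0}

def rank_score(page: dict) -> float:
    return sum(w * page.get(f, 0.0) for f, w in WEIGHTS.items())

honest = {"keyword_density": 0.2, "inbound_links": 0.9, "freshness": 0.5}
# An SEO service inflates exactly the attribute it knows is weighted heavily.
optimised = dict(honest, keyword_density=0.9)

print(rank_score(optimised) > rank_score(honest))  # True: gamed, not more useful
```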

‘Search Engine Optimization (SEO)’, Optimizely <https://www.optimizely.com/uk/optimizationglossary/search-engine-optimization/> [accessed 15 April 2021]. 56

Konrad Becker, 'The Power of Classification. Culture, Context, Command, Control, Communications, Computing', Future Non Stop <http://future-nonstop.org/c/07cabb9532d6e90e68704d4d6039d04f> [accessed 2 February 2021]. 57


The algorithm itself is also no longer just the algorithm. With the advent of machine learning, the once manual and recursive process of optimising an algorithm's design based on results is now itself automated, as the sketch following this paragraph illustrates. Although marketed as a form of artificial intelligence, machine learning does not yet exhibit reasoning to the extent that it can revise what is known, but deductive reasoning, allowing it to revise and make links between already quantified pieces of data.58 This process operates within an already fixed schema and is unconcerned with the human. This new technology makes the algorithm more challenging as an object of study, as it is now more opaque and ephemeral than ever. However, it is also, in application, more truthfully deductive and non-inferential. This limit of reason is the starting point for developing a mode of resistance. If we are able to understand the limit of computational thinking, we may be able to devise an intervention which subverts its operation. Understanding the algorithm as deeply relational and entangled, we will now look more closely at the way this apparatus affects perception and autonomy when employed in virtual public space.
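As a reduced sketch (all numbers invented), the once-manual tuning loop can be written as a few lines of automated weight adjustment: the system revises its own weights toward whatever was clicked, never stepping outside its fixed schema of quantified features.

```python
# The recursive optimisation, automated: weights drift toward whatever
# maximised clicks. Invented data; real systems are vastly more complex.

def predict(weights: list, features: list) -> float:
    return sum(w * f for w, f in zip(weights, features))

weights = [0.5, 0.5]           # the fixed schema: two quantified features
history = [([1.0, 0.0], 1.0),  # (features of item shown, whether it was clicked)
           ([0.0, 1.0], 0.0)]

for _ in range(100):           # the once-manual tuning loop, now automatic
    for features, clicked in history:
        error = clicked - predict(weights, features)
        weights = [w + 0.1 * error * f for w, f in zip(weights, features)]

print([round(w, 2) for w in weights])  # [1.0, 0.0]: revised toward the clicks
```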

Expert.ai Team, ‘What Is Machine Learning? A Definition - Expert System’, Expert.Ai, 2020 <https://www.expert.ai/blog/machine-learning-definition/> [accessed 15 April 2021]. 58


Part 4: Control // Autonomy

"It's not so much a matter of winning arguments as of being open about things. Being open is setting out the 'facts,' not only of a situation but of a problem. Making visible things that would otherwise remain hidden." - Gilles Deleuze59

The algorithm seeks to quantify how things relate to one another: it seeks to understand how subjective meaning and objective attribution interrelate. These are questions humans have asked, without definitive answers, for as long as we have been able to. The digital world was meant to liberate us, with the promise of borderless access to a bounty of knowledge and communication uncompromised by distance. These historic claims at best come with conditional clauses and at worst seem entirely false.60 Google's self-proclaimed mission is to "organize the world's information", however this is not possible without a technical model of the world.61 Personalised search is framed simply as a view onto existing reality, which in itself is problematic.62 As argued by Martin Feuz, Matthew Fuller and Felix Stalder, what is proposed is actually an 'augmented' version of reality.63 Each user's individual relationship with reality is determined by algorithm and used to determine search results. The politics of what is made visible through this algorithm is based on cultural assumptions and a perceived authority of data sets. In prescribing what should be made visible to any one individual there is a profound loss of autonomy. As Felix Stalder puts it: "we are presented with a picture of the world (at least how it appears in search results) made up of what someone else, based on proprietary knowledge, determines to be suitable to one's individual subjectivity."64

Gilles Deleuze, Negotiations, 1972-1990, trans. by Martin Joughin (Columbia University Press, 1997) p.127. 59

Thomas Frey, ‘Eight False Promises of the Internet’, Futurist Speaker, 2011 <https://futuristspeaker.com/business-trends/eight-false-promises-of-the-internet/> [accessed 16 April 2021]. 60

'How Google Search Works | Our Mission' <https://www.google.com/intl/en/search/howsearchworks/mission/> [accessed 16 April 2021]. 61

Helen Nissenbaum and Lucas D. Introna, 'Shaping the Web: Why the Politics of Search Engines Matters', The Information Society, 16.3 (2000), 169–85 <https://doi.org/10.1080/01972240050133634>. 62

Martin Feuz, Matthew Fuller, and Felix Stalder, 'Personal Web Searching in the Age of Semantic Capitalism: Diagnosing the Mechanisms of Personalisation', First Monday, 2011 <https://doi.org/10.5210/fm.v16i2.3344>. 63



Legal scholars Frank Pasquale and Oren Bracha concisely explain the issue in the following way:

Meaningful autonomy requires more than simple absence of external constraint once an individual makes a choice and sets out to act upon it. At a minimum, autonomy requires a meaningful variety of choices, information of the relevant state of the world and of these alternatives, the capacity to evaluate this information and the ability to make a choice. If A controls the window through which B sees the world—if he systematically exercises power over the relevant information about the world and available alternatives and options that reaches B—then the autonomy of B is diminished. To control one's informational flows in ways that shape and constrain her choice is to limit her autonomy, whether that person is deceived or not.65

Personalisation can then be understood to both increase and reduce individual autonomy. Information is made available which would otherwise be challenging to locate, thus increasing autonomy. However, autonomy is also diminished, because the profile used to determine subjectivity cannot adequately represent the individual it is based on. The apparatus reinforces only those aspects of the individual it was able to quantify through a process of translation, as sketched below.66
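That process of translation can be caricatured in a few lines (the categories and clicks below are invented): whatever the schema cannot name simply vanishes from the profile that comes to stand in for the person.

```python
# A sketch of 'translation': behaviour is collapsed into the schema's buckets,
# and everything the schema cannot represent disappears. Invented data.

SCHEMA = {"sport", "cars", "cooking"}
clicks = ["sport", "sport", "poetry", "local history", "cooking"]

profile = {topic: clicks.count(topic) for topic in SCHEMA if topic in clicks}
print(profile)  # {'sport': 2, 'cooking': 1} in some order

# 'poetry' and 'local history' are unrepresentable, so the profile reinforces
# only the aspects of the individual the apparatus was able to quantify.
```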

This mechanism is a harmful intrusion in the relationship between media and subjectivity. When we experience media, we internalise thoughts about it and through this form a sense of ourselves: as individuals in relation to this piece of media.67 Within virtual public space, this mode of becoming self-aware is interwoven with capitalist mechanics of storage, memory, calculation and analysis, which change the condition of the self coming to be known. The algorithm in this process becomes a problematic mechanism of knowledge production.

Frank A. Pasquale and Oren Bracha, Federal Search Commission? Access, Fairness and Accountability in the Law of Search (Rochester, NY: Social Science Research Network, 26 July 2007) <https://papers.ssrn.com/abstract=1002453> [accessed 16 April 2021]. 64

Felix Stalder and Christine Mayer, 'The Second Index. Search Engines, Personalization and Surveillance (Deep Search)', n.n. -- Notes & Nodes on Society <http://felix.openflows.com/node/113> [accessed 2 February 2021]. 65

Jonathan Obar and Steven Wildman, 'Social Media Definition and the Governance Challenge: An Introduction to the Special Issue', SSRN Electronic Journal, 2015 <https://doi.org/10.2139/ssrn.2637879> p.2. 66

Matthew Fuller, 'Algorithmic Cultures and Security', 2015 <https://www.youtube.com/watch?v=sSRyG4u1fTc> [accessed 2 February 2021]. 67


To understand more fully how the algorithm operates as an apparatus of control in society, I will discuss it through an analysis of Foucault's writing on the panopticon [see Figure 4].68 It is productive to continue to think through the algorithm as a technology which alters what is visible, particularly with Foucault's ideas of how power and surveillance affect behaviour. Rajchman points out in his discussion of Foucault that "Architecture helps "visualise" power in other ways than simply manifesting it. It is not simply a matter of what a building shows "symbolically" or "semiotically", but also of what it makes visible about us and within us".69 As Taina Bucher comments on this point, visibility can be conceived as an organisation of power in both a negative and a positive sense.70 Foucault shows that "spaces are designed to make things seeable, and seeable in a specific way".71 This is true not just of prisons and hospitals but also of social media websites. All are spaces of 'constructed visibility'.72 Panoptic architecture functions through a technical structuring of being, with an implemented awareness of a constant possibility of inspection. To restructure visibility is to highlight the "distribution of individuals in relation to one another, of hierarchical organisation, of dispositions of centres and channels of power".73 It is this notion of a technical structuring of what is visible that is relevant and applicable to social media. As Bucher puts it: "The spaces designed by the (im)material conditions of the software are similarly designed to make things visible, and thus knowable, in a specific way."74 The algorithm can now be understood more fully as an apparatus of control. The sociality which exists in virtual public spaces can be seen as operating according to the rules and laws which are embedded in its technical procedures.75

Michel Foucault, Discipline and Punish: The Birth of the Prison (Penguin UK, 2019). 68

John Rajchman, 'Foucault's Art of Seeing', October, 44 (1988), 89–117 <https://doi.org/10.2307/778976> p.103. 69

Taina Bucher, ‘Want to Be on the Top? Algorithmic Power and the Threat of Invisibility on Facebook’, New Media & Society, 14.7 (2012), 1164–80 <https://doi.org/10.1177/1461444812440159> p.1170. 70

John Rajchman, 'Foucault's Art of Seeing', October, 44 (1988), 89–117 <https://doi.org/10.2307/778976> p.103. 71

Taina Bucher, ‘Want to Be on the Top? Algorithmic Power and the Threat of Invisibility on Facebook’, New Media & Society, 14.7 (2012), 1164–80 <https://doi.org/10.1177/1461444812440159>. 72

Michel Foucault, Discipline and Punish: The Birth of the Prison (Penguin UK, 2019) p.205. 73

Taina Bucher, ‘Want to Be on the Top? Algorithmic Power and the Threat of Invisibility on Facebook’, New Media & Society, 14.7 (2012), 1164–80 <https://doi.org/10.1177/1461444812440159> p.1171. 74

Luciana Parisi, ‘Automated Thinking and the Limits of Reason’, Cultural Studies ↔ Critical Methodologies, 16.5 (2016), 471–81 <https://doi.org/10.1177/1532708616655765>. 75



Figure 4 Illustrations of the panopticon: a prison in which inmates are always under threat of the gaze of the guard. The Panopticon by Adam Simpson, 2013.


Virtual public spaces, however, are unlike the panopticon, where each individual perceives an equal threat of permanent visibility and adjusts their behaviour accordingly. The threat of visibility in relation to other individuals is not evenly dispersed, and is often not a threat but a reward for content deemed worthy by algorithm. As discussed previously, social media algorithms work to adjust the visibility of content based on its perceived relevance to specific individuals. This makes the metaphor of the panopticon less relevant; however, we could modify the analogy by understanding the rooms of the panopticon as transient, being algorithmically recomposed. Each individual here would exist within a different version of their environment to any other individual, existing in an ephemeral and relational plane, as opposed to a stable, objective one. Individuals are unintentionally informing the mechanism of control that recomposes their virtual environment.

A theory which aims to explain relations between technologies and individuals, not as distinct entities but as socio-technical ensembles, is Bruno Latour's actor-network theory.76 Actor-network theory's version of social constructivism posits socio-technical relations as both material and semiotic, with human and non-human agents "inextricably intertwined in the shaping of interactive processes."77 Using this logic we can understand social media algorithms as agents which influence us, but over which we can also exert influence. Being aware of this mechanism, at least on a conceptual level, should allow us to devise a mode of resistance which the individual can enact: to disrupt the computational process of non-inferential reasoning. The following section will discuss current calls for reform, modes of resistance and their limits before making a proposition.

Bruno Latour, Reassembling the Social: An Introduction to Actor-Network-Theory (OUP Oxford, 2005). 76

José van Dijck, ‘Facebook and the Engineering of Connectivity: A Multi-Layered Approach to Social Media Platforms’, Convergence, 19.2 (2013), 141–55 <https://doi. org/10.1177/1354856512457548>. 77


5 Resistance // Emancipation

“Supposedly, if online social practices are mostly determined by their technological infrastructure, they do not give users the opportunity to perform their social reality.” - Dounia Mahlouly78

There are many obstacles that those studying algorithmic control must contend with. Algorithms are proprietary technology, meaning independent researchers are generally unable to access their code.79 Even if researchers were able to access the written code, this in itself would not be enough to fully assess the algorithm, as it would be abstracted from its operational context.80 To understand the technology and its effects, it must be run and understood as part of a complex network. Algorithmic media is also developing in complexity itself: machine learning makes the object of study even more elusive. Researchers are therefore limited to observing algorithms in operation, as sketched below.

78. Dounia Mahlouly, ‘Rethinking the Public Sphere in a Digital Environment: Similarities between the Eighteenth and the Twenty-First Centuries’, ESharp, 20.6 (2013) p.5.

79. Fenwick McKelvey, ‘Algorithmic Media Need Algorithmic Methods: Why Publics Matter’, Canadian Journal of Communication, 39 (2014) <https://doi.org/10.22230/cjc.2014v39n4a2746>.

80. Fenwick McKelvey, ‘A Programmable Platform? Drupal, Modularity, and the Future of the Web’, The Fibreculture Journal, Trans.18 (2011) <https://eighteen.fibreculturejournal.org/2011/10/09/fcj-128-programmable-platform-drupal-modularity-and-the-future-of-the-web/> [accessed 16 April 2021].
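A minimal sketch of what such black-box observation can look like in practice is given below. Everything in it is an assumption for illustration: `platform_rank` merely stands in for a proprietary ranking function that a researcher can query but never read.

```python
# Hypothetical sketch of observing an algorithm 'in operation':
# platform_rank stands in for a black-boxed curatorial algorithm.
import random

def platform_rank(user_profile, posts):
    # Stand-in for the opaque system under study (assumed, not a real API).
    return sorted(posts,
                  key=lambda p: sum(user_profile.get(t, 0) for t in p["tags"]),
                  reverse=True)

# Audit method: vary one attribute of a synthetic profile at a time and
# record how the ranking changes -- inferring behaviour without seeing code.
random.seed(1)
posts = [{"id": i, "tags": random.sample(["politics", "sport", "music"], 2)}
         for i in range(8)]

for profile in ({"politics": 0.9, "sport": 0.1},
                {"politics": 0.1, "sport": 0.9}):
    order = [p["id"] for p in platform_rank(profile, posts)]
    print(profile, "->", order)
```

The method here is purely observational: the researcher never sees the ranking logic, only how outputs shift as controlled inputs vary, which is precisely the limitation described above.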



As a response to this, many researchers call for transparency in algorithm production, either as the topic of their writing or as a component of their conclusion.81 82 83 As put by Ganaele Langlois, Fenwick McKelvey, Greg Elmer and Kenneth Werbin in 2009: “How can we understand, map and otherwise critique emergent forms of connectivity and articulation among Web 2.0 sites, users and content, especially when the architecture and technical processes shaping communicational dynamics are black-boxed, opaque and secretive?”84

Calls for transparency are entirely justified, both so that the algorithm can be studied as an object and so that the general public can be aware of the technological processes at work in virtual public spaces. Studies such as “I Always Assumed that I Wasn’t Really that Close to [Her]” show that the general public, even undergraduates at an elite university in this example, are often unaware of these algorithmic processes.85 In this 2015 study, sixty-two percent of students said they were not aware that Facebook curated users’ News Feeds. It is reasonable to think that if more people were aware of these computational processes it could effect change. However, since 2015, and particularly after the 2016 U.S. presidential election, people have become more aware of algorithmic curation.86 Newly emerging platforms such as TikTok seem to suggest the general public are quite happy to offer their data and subjectivity willingly, as long as they enjoy the service. Despite being branded a US national security risk, TikTok is still only growing in popularity.87 88

81. José van Dijck, ‘Facebook and the Engineering of Connectivity: A Multi-Layered Approach to Social Media Platforms’, Convergence, 19.2 (2013), 141–55 <https://doi.org/10.1177/1354856512457548>.

82. Christina Blacklaws, ‘Algorithms: Transparency and Accountability’, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 376.2128 (2018), 20170351 <https://doi.org/10.1098/rsta.2017.0351>.

83. Ganaele Langlois and Fenwick McKelvey, ‘Mapping Commercial Web 2.0 Worlds: Towards a New Critical Ontogenesis’, The Fibreculture Journal, Web 2.0.14 (2009) <https://fourteen.fibreculturejournal.org/fcj-095-mapping-commercial-web-2-0-worlds-towards-a-new-critical-ontogenesis/> [accessed 16 April 2021].

84. Motahhare Eslami and others, ‘“I Always Assumed That I Wasn’t Really That Close to [Her]”: Reasoning about Invisible Algorithms in News Feeds’, in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15 (New York, NY, USA: Association for Computing Machinery, 2015), pp. 153–62 <https://doi.org/10.1145/2702123.2702556>.

85. José van Dijck, ‘Facebook and the Engineering of Connectivity: A Multi-Layered Approach to Social Media Platforms’, Convergence, 19.2 (2013), 141–55 <https://doi.org/10.1177/1354856512457548>.

86. A.S.B., ‘Why Is Mark Zuckerberg Testifying in Congress?’, The Economist, 9 April 2018 <https://www.economist.com/the-economist-explains/2018/04/09/why-is-mark-zuckerberg-testifying-in-congress> [accessed 3 February 2021].

87. Brian Fung, ‘TikTok Is a National Security Threat, US Politicians Say. Here’s What Experts Think’, CNN <https://www.cnn.com/2020/07/09/tech/tiktok-security-threat/index.html> [accessed 16 April 2021].

88. DataReportal, ‘Digital 2021: Global Overview Report’ <https://datareportal.com/reports/digital-2021-global-overview-report>.


Debates on data control are no longer just between ‘David and Goliath’, as Apple have recently entered the discussion on data ethics. The company’s CEO, Tim Cook, is calling for ‘comprehensive data privacy laws’ focused on minimising data collection and informing users about its use.89 Although Apple’s stance is being met with resistance, it is already having a positive impact.90 Companies such as Facebook are modifying their practices to conform to Apple’s wishes.91 These changes, however, still do not address the core issues affecting autonomy discussed in this essay.92

There are several criticisms of calls for algorithmic transparency in scholarly fields, which suggest they will be unavailing. It is very unlikely that high-profile platforms like Facebook and Twitter will ever reveal their proprietary algorithms, as these are a key component of their lucrative product.93 Also, as Helen Nissenbaum argues, there is a paradox in arguing for algorithmic transparency. If it were possible to explain consistently how an algorithm works, it would mean “revealing information handling practices in ways that are relevant and meaningful to the choices individuals must make.” Not only would this involve describing every process of the algorithm, its conditions, flows and exceptions; the resulting account “is unlikely to be understood, let alone read”.94

89. Jon Brodkin, ‘Tim Cook Calls for Strong US Privacy Law, Rips “Data-Industrial Complex”’, Ars Technica, 2018 <https://arstechnica.com/tech-policy/2018/10/tim-cook-calls-for-strong-us-privacy-law-rips-data-industrial-complex/> [accessed 17 April 2021].

90. Yuan Yang and Patrick McGee, ‘China’s Tech Giants Test Way around Apple’s New Privacy Rules’, Financial Times, 2021 <https://www.ft.com/content/520ccdae-202f-45f9-a516-5cbe08361c34> [accessed 13 April 2021].

91. Josh Taylor, ‘Facebook v Apple: The Looming Showdown over Data Tracking and Privacy’, The Guardian, 2021 <http://www.theguardian.com/technology/2021/feb/14/facebook-v-apple-the-looming-showdown-over-data-tracking-and-privacy> [accessed 18 April 2021].

92. Ian Bogost, ‘Apple’s Empty Grandstanding About Privacy’, The Atlantic, 2019 <https://www.theatlantic.com/technology/archive/2019/01/apples-hypocritical-defense-data-privacy/581680/> [accessed 17 April 2021].

93. Kate Crawford, ‘Can an Algorithm Be Agonistic? Ten Scenes from Life in Calculated Publics’, Science, Technology, & Human Values, 41.1 (2016), 77–92 <https://doi.org/10.1177/0162243915589635>.

94. Helen Nissenbaum, ‘A Contextual Approach to Privacy Online’, Daedalus, 140.4 (2011), 32–48 <https://doi.org/10.1162/DAED_a_00113> p.36.



Machine learning complicates calls for transparency even further, adding another layer of complexity and obscurity. Notwithstanding this, machine learning and neural network systems are often trained to optimise an objective function, which can be explicitly understood and critiqued without the workings of the algorithm itself needing to be explained. In theory, information about a neural network’s computation can also be revealed mid-operation, to understand what has been learnt and which components of the network are responsible for specific results. This functionality exists so that the designers of such technology can understand their creation, but it has the capacity to be adapted for a different audience. It could also be argued that not everyone needs to understand algorithmic media technically, just a few who can translate its operation to the public. This, however, highlights an important point: what is meant by algorithmic transparency is not consistently defined.
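As an illustration of this distinction, the toy model below (not any platform’s actual system, and with all feature names assumed) is trained on an explicitly stated objective of predicting engagement. The objective is legible and critiquable even if the trained parameters are not human-authored, and the parameters can be inspected mid-operation to see which inputs drive results.

```python
# Toy model: the *objective* -- predict engagement -- is stated explicitly,
# so it can be critiqued without explaining every internal computation.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical post features: [outrage, novelty, length] (assumed names)
X = rng.random((500, 3))
# Assumed ground truth for the toy: engagement tracks outrage most strongly.
y = (0.8 * X[:, 0] + 0.2 * X[:, 1] > 0.5).astype(float)

w = np.zeros(3)
for _ in range(2000):                  # gradient descent on logistic loss
    p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted engagement probability
    w -= 0.1 * X.T @ (p - y) / len(y)

# 'Mid-operation' inspection: which components are responsible for results?
print(dict(zip(["outrage", "novelty", "length"], np.round(w, 2))))
```

Running this surfaces a large positive weight on the outrage feature: a critique of what is being optimised becomes possible without any access to the system’s inner workings.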

Platforms such as Google and Facebook, as a response to numerous high-profile news articles on the dangers of data collection, are being more open about what information they collect on individuals and how they use it [see Figure 5].95 Google now has an option to ‘turn off’ personalised search, meaning your personal search results will no longer be tailored to you [see Figure 6]. This does not mean, however, that Google will stop collecting data on you, or that this data won’t inform the personalisation of other users’ experiences. Nor is there an option to opt out of personalisation on Google’s other services, such as YouTube. Facebook has also introduced a “Why am I seeing this ad?” option, which explains the demographic the advertiser is trying to target and what information Facebook has collected on you to inform this [see Figure 7]. You can also adjust what kinds of advertisements you want to see more or less of, but unlike Google there is no option to disable personalisation, of adverts or of other content. Both platforms also allow you to see your profiled interests and remove whichever you desire [see Figure 8]. This version of transparency is unlikely to be what was intended by the individuals calling for it; it allows for some understanding of what data is being collected but does little to explain the workings of the apparatus or its effects.

95. Fraser Moore, ‘Here Are the “Sinister” Dangers That Could Arise from Companies Collecting Our Data, According to a Computer Scientist’, Business Insider <https://www.businessinsider.com/here-are-the-dangers-of-corporations-collecting-our-user-data-viktor-mayer-schoenberger-2018-3> [accessed 17 April 2021].



^Figure 5 Google’s ‘Security and Privacy’ page on ‘Ads and Data’. Google, 2021.

^Figure 6 Google’s ‘Ad Settings’ page showing option to disable personalisation. Google, 2021.

< Figure 7 Menu option leading to a page titled ‘Why am I seeing this ad?’, on my Facebook homepage. Facebook, 2021.





Figure 8 A portion of my personal data profile, collected by Google. Each category can be disabled from influencing search personalisation. Google, 2021.


Although superficial, and some studies argue misleading, these features do allow users to access some information on the operation of the apparatus.96 So why are they not more widely used? David McRaney, in his 2012 book ‘You Are Not So Smart’, argues that social media platforms reaffirming the beliefs we already hold is in fact the reason people use them.97 Some studies into technological fixes to the problems of curatorial algorithms seem to suggest this is true. ‘Social Mirror’, developed by Deb Roy, allowed people to see how their Twitter social network existed within a wider context [see Figure 9].98 This allowed individuals to understand how their virtual environment was composed differently to other users’. On it, Adam Priore writes:

The impact of the experiment was short-lived, however. Though a week after it ended some participants were following a more diverse set of Twitter accounts than before, two to three weeks later most had gone back to homogeneity. And in another twist, people who ended up following more contrarian accounts suggested by the researchers to help them diversify their Twitter feeds subsequently reported that they’d be even less inclined to talk to people with opposing political views.99

Deb Roy’s proposed solution following this failure? A technocratic regulatory process in which scholars review algorithms to make sure we are seeing an ‘unbiased’ representation of views. If we understand the issue with the algorithm as existing not just at its point of action but in its intent and context, this further interference by individuals, with new biases, could potentially compound the problems. This logic is a continuation of the problematic belief that the world is in fact objectively categorisable; it suggests a further prescription of a particular, but different, reality.

96. Athanasios Andreou and others, ‘Investigating Ad Transparency Mechanisms in Social Media: A Case Study of Facebook’s Explanations’, 2018 <http://dx.doi.org/10.14722/ndss.2018.23191>.

97. David McRaney, You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You’re Deluding Yourself (Gotham Books, 2012).

98. ‘Project Overview ‹ Social Mirror’, MIT Media Lab <https://www.media.mit.edu/projects/social-media-mirror/overview/> [accessed 17 April 2021].

99. Adam Priore, ‘No, Big Tech Didn’t Make Us Polarized (but It Sure Helps)’, MIT Technology Review, 2018, 18–21, p.20.





Figure 9 Visualisation available through the programme ‘Social Mirror’, which allows users to view how their specific Twitter network exists in relation to other users. MIT Media Lab, 2017.

This is not the only attempt to create a technological infrastructure of emancipation. After the 2016 election, Ethan Zuckerman developed a tool for Facebook called “Gobo”.100 The intention of Gobo was to allow users to alter their ‘content filters’: to control the variety of political and social perspectives they could see. Facebook showed little interest in this (despite now having this functionality for advertisements). Zuckerman suggests this is because Facebook believes very few people would actually want to diversify their feed.101 Although this could be correct, it is likely that diversification would also decrease engagement on the site, lessening the efficacy of the curatorial algorithm: something the company would not want. As of January 2021, Facebook themselves are proposing a ‘solution’, to prevent the site from being blamed further for events such as the Capitol riot: simply to ‘de-politicise the site’.102 Even in intent this does little to address the ingrained social issues discussed throughout this essay, and it also limits (if such a thing is even possible) the positive aspects of increased communication and potential agency that digital sociality can facilitate.103

100. ‘Project Overview ‹ Gobo’, MIT Media Lab <https://www.media.mit.edu/projects/gobo/overview/> [accessed 4 February 2021].

101. Adam Priore, ‘No, Big Tech Didn’t Make Us Polarized (but It Sure Helps)’, MIT Technology Review, 2018, 18–21, p.20.

102. Motley Fool Transcribing, ‘Facebook (FB) Q4 2020 Earnings Call Transcript’, 2021 <https://www.fool.com/earnings/call-transcripts/2021/01/28/facebook-fb-q4-2020-earnings-call-transcript/> [accessed 4 February 2021].

103. Anita Breuer, The Role of Social Media in Mobilizing Political Protest: Evidence from the Tunisian Revolution (Rochester, NY: Social Science Research Network, 10 October 2012) <https://doi.org/10.2139/ssrn.2179030>.


These examples highlight the limits of current techno-infrastructural proposals of resistance: they either go against our desire for affirmation and agreement, potentially create new versions of engineered sociality (with new problems), or have motives other than emancipation. Calls for transparency are necessary, yet appear unavailing. This suggests that the mode of resistance must come from the individual.

As Andrew Feenberg argues, internet technologies are not objects but unfinished processes.104 Web 2.0 owners and users are engaged in a struggle to define a given platform’s social meaning. Although the power rests largely with its owners, the earlier discussion of actor-network theory explains a way in which users have agency to influence the technology which exerts control over them. This theory is supported by David Beer, who also argues that users’ agency contributes to the formation of each platform’s ‘performative infrastructure’.105 He explains this as the mechanism by which people express their tastes and preferences, which are translated into relational databases, which in turn inform the algorithmic curation of their experiences (a toy sketch of this loop follows below). In this process we can employ our understanding of the limits of computational logic and non-inferential reasoning to subvert the automated mechanism: namely, by individuals expressing tastes and preferences which they do not hold, but which are intended to misinform the algorithm.

104. Andrew Feenberg, ‘Critical Theory of Communication Technology: Introduction to the Special Section’, The Information Society, 25.2 (2009), 77–83 <https://doi.org/10.1080/01972240802701536>.

105. David Beer, ‘Power through the Algorithm? Participatory Web Cultures and the Technological Unconscious’, New Media & Society, 11.6 (2009), 985–1002 <https://doi.org/10.1177/1461444809336551> p.998.
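The toy sketch below renders this loop in code. It makes no claim to any platform’s real internals; the names and mechanism are assumptions chosen only to make Beer’s taste-to-database-to-curation cycle, and its point of subversion, concrete.

```python
# Toy sketch of a 'performative infrastructure' loop (assumed mechanism):
# expressed tastes update a profile; the profile drives what is shown next.
from collections import Counter

profile = Counter()              # one user's row in the 'relational database'

def register_like(topic):
    profile[topic] += 1          # an expressed taste becomes stored preference

def curate(feed):
    # curation step: surface whatever best matches the stored preferences
    return max(feed, key=lambda post: profile[post])

feed = ["architecture", "cooking", "politics"]
register_like("architecture")
print(curate(feed))              # -> 'architecture': the loop closes

# The subversion described above: expressing tastes one does not hold.
register_like("cooking"); register_like("politics")
print(curate(feed))              # all scores now tie; the 'best match' is arbitrary
```

The design point is that the user sits inside the loop as an actor: because the database only ever records expressed preference, insincere expression is indistinguishable from sincere expression, and the curation step inherits whatever distortion the user injects.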



6 Conclusion // End

“Perhaps most obviously, the claim that anything at all is perfectly knowable is perverse.” - Adam Greenfield106

The Web 2.0 sites which act as virtual public spaces cannot be considered places where information is simply transported between individuals. Instead, these spaces act as mediators of information, making claim to what should be made visible to any individual. Through this, connections are engineered and sociality is distorted. Individuals’ actions on the platforms of Web 2.0 and beyond are quantified, categorised and sorted: rendered into a data profile, in the image of a particular worldview. This profile is used to inform the algorithmic apparatus, allowing it to make claim to what the individual should like, experience and interact with. This has damaging impacts on individual agency and autonomy, but also on democratic processes in society. As individuals in virtual public spaces are likely to have their beliefs affirmed, the possibility for deliberation and consensus is inhibited.

We must understand the way in which these technologies have the capacity to shape our subjectivity, our communities and our lives. Furthermore, we must engage with them critically, both in discourse and in practice, understanding them not as something neutral, autocratic or purely technical, but as an apparatus of control enacted by individuals with an agenda. We are entangled with algorithms in virtual space: a techno-symbiotic relationship in which many actors have a degree of agency. This agency must be employed, allowing us to gain emancipation from algorithmic sovereignty.

106. Adam Greenfield, Radical Technologies: The Design of Everyday Life (Verso Books, 2017) p.51.


My proposition is as follows. First, users should follow as wide a variety of mainstream and appropriately regulated fringe news outlets as they feel happy to, in order to make their curated feed as diverse as possible. Second, users should interact with (or ‘like’) everything.

This resistance operates at the level of both the individual and the apparatus. The contributing users themselves become less discernible, less predictable and less quantifiable, freeing them from their prescribed environment, and making the data set attributed to them anomalous. This data set, being used to inform the apparatus, will contribute to the algorithm making links where they do not necessarily exist. This will help disrupt the efficacy of the apparatus overall. The quality of the particular Web 2.0 service is likely to be reduced as a result, in major part for the individual and in minor part for other users. If this loss of quality were to dissuade the individual enacting the resistance (or anyone else) from using the service, this would be a success.
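A sketch of the statistical effect of this proposition, using the same kind of toy preference profile assumed earlier (topic names and scoring are illustrative only):

```python
# Sketch of the 'like everything' effect on a toy preference vector: uniform
# interaction leaves the apparatus no signal with which to rank or predict.
import numpy as np

topics = ["architecture", "cooking", "politics", "sport"]
selective  = np.array([9.0, 1.0, 0.0, 0.0])   # a legible, predictable user
everything = np.array([5.0, 5.0, 5.0, 5.0])   # a user who 'likes' everything

for name, likes in (("selective", selective), ("like everything", everything)):
    scores = likes / likes.sum()               # normalised topic preferences
    print(name, dict(zip(topics, np.round(scores, 2))))

# The selective profile ranks topics decisively; the uniform profile scores
# every topic identically, so curation can make no meaningful distinction.
```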

Nonetheless, virtual public spaces and the algorithms that operate within them are a new reality we must contend with. It is imperative that we ensure they develop in a way in which the individual is empowered, not just their user equivalent.

END



BIBLIOGRAPHY


Ahmed, Maryam, ‘UK Passport Photo Checker Shows Bias against Dark-Skinned Women’, BBC News, 7 October 2020, section Technology <https://www.bbc.com/news/technology-54349538> [accessed 15 April 2021]

Amoore, Louise, Cloud Ethics: Algorithms and the Attributes of Ourselves and Others (Duke University Press, 2020)

Andreou, Athanasios, Giridhari Venkatadri, Oana Goga, Krishna Gummadi, Patrick Loiseau, and Alan Mislove, ‘Investigating Ad Transparency Mechanisms in Social Media: A Case Study of Facebook’s Explanations’, 2018 <http://dx.doi.org/10.14722/ndss.2018.23191>

A.S.B., ‘Why Is Mark Zuckerberg Testifying in Congress?’, The Economist, 9 April 2018 <https://www.economist.com/the-economist-explains/2018/04/09/why-is-mark-zuckerberg-testifying-in-congress> [accessed 3 February 2021]

Becker, Konrad, ‘The Power of Classification. Culture, Context, Command, Control, Communications, Computing’, Future Non Stop <http://futurenonstop.org/c/07cabb9532d6e90e68704d4d6039d04f> [accessed 2 February 2021]

Beer, David, ‘Power through the Algorithm? Participatory Web Cultures and the Technological Unconscious’, New Media & Society, 11.6 (2009), 985–1002 <https://doi.org/10.1177/1461444809336551>

Beer, David, ‘The Social Power of Algorithms’, Information, Communication & Society, 20.1 (2017), 1–13 <https://doi.org/10.1080/1369118X.2016.1216147>

Bender, Emily M., Timnit Gebru, Angelina McMillan-Major, and Shmargaret Shmitchell, ‘On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?’, in Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, FAccT ’21 (New York, NY, USA: Association for Computing Machinery, 2021), pp. 610–23 <https://doi.org/10.1145/3442188.3445922>

Blacklaws, Christina, ‘Algorithms: Transparency and Accountability’, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 376.2128 (2018), 20170351 <https://doi.org/10.1098/rsta.2017.0351>

Bourdieu, Pierre, On Television (New Press, 1999)

Bourdieu, Pierre, and Shaun Whiteside, Photography: A Middle-Brow Art (Stanford University Press, 1996)

Breuer, Anita, The Role of Social Media in Mobilizing Political Protest: Evidence from the Tunisian Revolution (Rochester, NY: Social Science Research Network, 10 October 2012) <https://doi.org/10.2139/ssrn.2179030>

Bucher, Taina, ‘Want to Be on the Top? Algorithmic Power and the Threat of Invisibility on Facebook’, New Media & Society, 14.7 (2012), 1164–80 <https://doi.org/10.1177/1461444812440159>

Callon, Michel, Pierre Lascoumes, and Yannick Barthe, Acting in an Uncertain World: An Essay on Technical Democracy (MIT Press, 2011)

Clayton, James, ‘Twitter Boss: Trump Ban Is “right” but “Dangerous”’, BBC News, 14 January 2021, section Technology <https://www.bbc.com/news/technology-55657417> [accessed 3 February 2021]

Cox, Geoff, ‘Crowd_code | Anti-Thesis’ <http://www.anti-thesis.net/crowd_code-2/> [accessed 2 February 2021]

Crawford, Kate, ‘Can an Algorithm Be Agonistic? Ten Scenes from Life in Calculated Publics’, Science, Technology, & Human Values, 41.1 (2016), 77–92 <https://doi.org/10.1177/0162243915589635>


‘Create Great Content - YouTube’ <https://creatoracademy.youtube.com/page/course/great-content> [accessed 15 April 2021]

Deleuze, Gilles, Negotiations, 1972-1990, trans. by Martin Joughin (Columbia University Press, 1997)

Dewey, John, The Later Works, 1925-1953 (SIU Press, 1981)

van Dijck, José, ‘Facebook and the Engineering of Connectivity: A Multi-Layered Approach to Social Media Platforms’, Convergence, 19.2 (2013), 141–55 <https://doi.org/10.1177/1354856512457548>

Dwoskin, Elizabeth, and Nitasha Tiku, ‘How Twitter, on the Front Lines of History, Finally Decided to Ban Trump’, Washington Post <https://www.washingtonpost.com/technology/2021/01/16/how-twitter-banned-trump/> [accessed 3 February 2021]

Ellis, Emma Grey, ‘Why Women Are Called “Influencers” and Men “Creators”’, Wired <https://www.wired.com/story/influencers-creators-gender-divide/> [accessed 15 April 2021]

Eslami, Motahhare, Aimee Rickman, Kristen Vaccaro, Amirhossein Aleyasen, Andy Vuong, Karrie Karahalios, and others, ‘“I Always Assumed That I Wasn’t Really That Close to [Her]”: Reasoning about Invisible Algorithms in News Feeds’, in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15 (New York, NY, USA: Association for Computing Machinery, 2015), pp. 153–62 <https://doi.org/10.1145/2702123.2702556>

Expert.ai Team, ‘What Is Machine Learning? A Definition - Expert System’, Expert.Ai, 2020 <https://www.expert.ai/blog/machine-learning-definition/> [accessed 15 April 2021]

‘Facebook Advertising Targeting Options’, Facebook for Business <https://en-gb.facebook.com/business/ads/ad-targeting> [accessed 14 April 2021]

Feenberg, Andrew, ‘Critical Theory of Communication Technology: Introduction to the Special Section’, The Information Society, 25.2 (2009), 77–83 <https://doi.org/10.1080/01972240802701536>

Feuz, Martin, Matthew Fuller, and Felix Stalder, ‘Personal Web Searching in the Age of Semantic Capitalism: Diagnosing the Mechanisms of Personalisation’, First Monday, 2011 <https://doi.org/10.5210/fm.v16i2.3344>

Foucault, Michel, Discipline and Punish: The Birth of the Prison (Penguin UK, 2019)

Freer, Anne, ‘TikTok Was the Most Downloaded App of 2020’, Business of Apps, 2020 <https://www.businessofapps.com/news/tiktok-was-the-most-downloaded-app-of-2020/> [accessed 4 February 2021]

Frey, Thomas, ‘Eight False Promises of the Internet’, Futurist Speaker, 2011 <https://futuristspeaker.com/business-trends/eight-false-promises-of-the-internet/> [accessed 16 April 2021]

Fuchs, Christian, ‘Labor in Informational Capitalism and on the Internet’, The Information Society, 26.3 (2010), 179–96 <https://doi.org/10.1080/01972241003712215>

Langlois, Ganaele, and Fenwick McKelvey, ‘Mapping Commercial Web 2.0 Worlds: Towards a New Critical Ontogenesis’, The Fibreculture Journal, Web 2.0.14 (2009) <https://fourteen.fibreculturejournal.org/fcj-095-mapping-commercial-web-2-0-worlds-towards-a-new-critical-ontogenesis/> [accessed 16 April 2021]

Gillespie, Tarleton, ‘The Politics of “Platforms”’, New Media & Society, 12.3 (2010), 347–64 <https://doi.org/10.1177/1461444809342738>

Gillespie, Tarleton, ‘The Relevance of Algorithms’, in Media Technologies: Essays on Communication, Materiality and Society (Cambridge, MA: MIT Press, 2013), pp. 167–94 <https://doi.org/10.7551/mitpress/9780262525374.003.0009>




‘Google Ads - Get More Customers With Easy Online Advertising’ <https://www.googleadservices.com/pagead/aclk?sa=L&ai=DChcSEwjjiK-L_f3vAhWGtO0KHaB4DVUYABAGGgJkZw&ohost=www.google.com&cid=CAESQOD2YzqAqC5ffBNECAGizo3GZ7hwDHngcJW9iRsOabE32E-cOP9dSAKBKDy6krZ7tllBvkmjZPyRS4MhlikHsQg&sig=AOD64_17rlhyKe8pX77jadcBbhx20p6k1g&q=&ved=2ahUKEwixy6eL_f3vAhXhQEEAHXk8AOMQqyQoAnoECAQQFw&adurl=> [accessed 14 April 2021]

Greenfield, Adam, Radical Technologies: The Design of Everyday Life (Verso Books, 2017)

‘How Google Search Works | Our Mission’ <https://www.google.com/intl/en/search/howsearchworks/mission/> [accessed 16 April 2021]

Kaufmann, Mareile, and Julien Jeandesboz, ‘Politics and “the Digital”: From Singularity to Specificity’, European Journal of Social Theory, 20.3 (2017), 309–28 <https://doi.org/10.1177/1368431016677976>

Latour, Bruno, Reassembling the Social: An Introduction to Actor-Network-Theory (OUP Oxford, 2005)

Lorenz, Taylor, ‘The Real Difference Between Creators and Influencers’, The Atlantic, 2019 <https://www.theatlantic.com/technology/archive/2019/05/how-creators-became-influencers/590725/> [accessed 15 April 2021]

Lucas D. Introna, Helen Nissenbaum, ‘Shaping the Web: Why the Politics of Search Engines Matters’, The Information Society, 16.3 (2000), 169–85 <https://doi.org/10.1080/01972240050133634>

Luciana Parisi, ‘Algorithmic Cultures and Security’, 2015 <https://youtu.be/3vrFLcmX6HM> [accessed 2 February 2021]

‘Machine Learning in Julia’, The Alan Turing Institute <https://www.turing.ac.uk/research/research-projects/machine-learning-julia> [accessed 14 April 2021]

Mahlouly, Dounia, ‘Rethinking the Public Sphere in a Digital Environment: Similarities between the Eighteenth and the Twenty-First Centuries’, ESharp, 20.6 (2013)

Matthew Fuller, ‘Algorithmic Cultures and Security’, 2015 <https://www.youtube.com/watch?v=sSRyG4u1fTc> [accessed 2 February 2021]

Matyszczyk, Chris, ‘Facebook and Zuckerberg Still Don’t Know the Right Thing to Do’, CNET <https://www.cnet.com/news/mark-zuckerberg-still-doesnt-know-the-right-thing-to-do/> [accessed 3 February 2021]

McFadden, Christopher, ‘YouTube’s History and Its Impact on the Internet’, 2020 <https://interestingengineering.com/youtubes-history-and-its-impact-on-the-internet> [accessed 15 April 2021]

McKelvey, Fenwick, ‘Algorithmic Media Need Algorithmic Methods: Why Publics Matter’, Canadian Journal of Communication, 39 (2014) <https://doi.org/10.22230/cjc.2014v39n4a2746>

McRaney, David, You Are Not So Smart: Why You Have Too Many Friends on Facebook, Why Your Memory Is Mostly Fiction, and 46 Other Ways You’re Deluding Yourself (Gotham Books, 2012)

Mohamed, Shakir, Marie-Therese Png, and William Isaac, ‘Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence’, Philosophy & Technology, 33.4 (2020), 659–84 <https://doi.org/10.1007/s13347-020-00405-8>

Motley Fool Transcribing, ‘Facebook (FB) Q4 2020 Earnings Call Transcript’, 2021 <https://www.fool.com/earnings/call-transcripts/2021/01/28/facebook-fb-q4-2020-earnings-call-transcript/> [accessed 4 February 2021]

Nguyen, Minh Hao, Jonathan Gruber, Jaelle Fuchs, Will Marler, Amanda Hunsaker, and Eszter Hargittai, ‘Changes in Digital Communication During the COVID-19 Global Pandemic: Implications for Digital Inequality and Future Research’, Social Media + Society, 6.3 (2020), 2056305120948255 <https://doi.org/10.1177/2056305120948255>


Obar, Jonathan, and Steven Wildman, ‘Social Media Definition and the Governance Challenge: An Introduction to the Special Issue’, SSRN Electronic Journal, 2015 <https://doi.org/10.2139/ssrn.2637879>

Pariser, Eli, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think (Penguin Publishing Group, 2011)

Parisi, Luciana, ‘Automated Thinking and the Limits of Reason’, Cultural Studies ↔ Critical Methodologies, 16.5 (2016), 471–81 <https://doi.org/10.1177/1532708616655765>

Pasquale, Frank A., and Oren Bracha, Federal Search Commission? Access, Fairness and Accountability in the Law of Search (Rochester, NY: Social Science Research Network, 26 July 2007) <https://papers.ssrn.com/abstract=1002453> [accessed 16 April 2021]

Paul, Kari, ‘Two Google Engineers Quit over Company’s Treatment of AI Researcher’, The Guardian, 2021 <http://www.theguardian.com/technology/2021/feb/04/google-timnit-gebru-ai-engineers-quit> [accessed 15 April 2021]

‘Personalized Search for Everyone’, Official Google Blog <https://googleblog.blogspot.com/2009/12/personalized-search-for-everyone.html> [accessed 14 April 2021]

Priore, Adam, ‘No, Big Tech Didn’t Make Us Polarized (but It Sure Helps)’, MIT Technology Review, 2018, 18–21

‘Project Overview ‹ Gobo’, MIT Media Lab <https://www.media.mit.edu/projects/gobo/overview/> [accessed 4 February 2021]

‘Project Overview ‹ Social Mirror’, MIT Media Lab <https://www.media.mit.edu/projects/social-media-mirror/overview/> [accessed 17 April 2021]

Rajchman, John, ‘Foucault’s Art of Seeing’, October, 44 (1988), 89–117 <https://doi.org/10.2307/778976>

Rangarajan, Sinduja, ‘Bay Area Tech Diversity: White Men Dominate Silicon Valley’, Reveal, 2018 <https://revealnews.org/article/heres-the-clearest-picture-of-silicon-valleys-diversity-yet/> [accessed 15 April 2021]

Reuters, ‘Amazon Ditched AI Recruiting Tool That Favored Men for Technical Jobs’, The Guardian, 2018 <http://www.theguardian.com/technology/2018/oct/10/amazon-hiring-ai-gender-bias-recruiting-engine> [accessed 15 April 2021]

Satariano, Adam, ‘Facebook Identifies Russia-Linked Misinformation Campaign’, The New York Times, 17 January 2019, section Business <https://www.nytimes.com/2019/01/17/business/facebook-misinformation-russia.html> [accessed 13 April 2021]

‘Search Engine Optimization (SEO)’, Optimizely <https://www.optimizely.com/uk/optimization-glossary/search-engine-optimization/> [accessed 15 April 2021]

Seetharaman, Deepa, and Jeff Horwitz, ‘Facebook Executives Shut Down Efforts to Make the Site Less Divisive’, Wall Street Journal, 26 May 2020, section Tech <https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499> [accessed 3 February 2021]

Shirky, Clay, ‘The Political Power of Social Media: Technology, the Public Sphere, and Political Change’, Foreign Affairs, 90.1 (2011), 28–41

Stalder, Felix, and Christine Mayer, ‘The Second Index. Search Engines, Personalization and Surveillance (Deep Search)’, n.n. -- Notes & Nodes on Society <http://felix.openflows.com/node/113> [accessed 2 February 2021]

Stokel-Walker, Chris, ‘We Had Experts Dissect TikTok’s Algorithm, and Their Findings Reveal Why a US Buyer Will Struggle to Replicate Its Magic’, Business Insider <https://www.businessinsider.com/why-tiktok-algorithm-bytedance-acquisition-trump-2020-9> [accessed 4 February 2021]




Sunstein, Cass R., ‘The Polarization of Extremes’, The Chronicle of Higher Education, 54.16 (2007), B9

Taylor, Josh, ‘Facebook v Apple: The Looming Showdown over Data Tracking and Privacy’, The Guardian, 2021 <http://www.theguardian.com/technology/2021/feb/14/facebook-v-apple-the-looming-showdown-over-data-tracking-and-privacy> [accessed 18 April 2021]

Financial Times, ‘TikTok Wants to Keep Tracking iPhone Users with State-Backed Workaround’, Ars Technica, 2021 <https://arstechnica.com/gadgets/2021/03/chinas-tech-giants-test-way-around-apples-new-privacy-rules/> [accessed 12 April 2021]

Tufekci, Zeynep, ‘Algorithmic Harms beyond Facebook and Google: Emergent Challenges of Computational Agency’, Colorado Technology Law Journal, 13 (2015), 203

Wong, Julia Carrie, ‘More than 1,200 Google Workers Condemn Firing of AI Scientist Timnit Gebru’, The Guardian, 2020 <http://www.theguardian.com/technology/2020/dec/04/timnit-gebru-google-ai-fired-diversity-ethics> [accessed 15 April 2021]

Yang, Yuan, and Patrick McGee, ‘China’s Tech Giants Test Way around Apple’s New Privacy Rules’, Financial Times, 2021 <https://www.ft.com/content/520ccdae-202f-45f9-a516-5cbe08361c34> [accessed 13 April 2021]




