NULL ISLAND


NULL ISLAND
Exhibition 25.04.2019-05.05.2019

‘Exploring the busiest place on earth that doesn’t exist.’

ADS4
Sluice Gallery, E9 6JY


TABLE OF CONTENTS

A ESSAY
‘Null Island and the Consequences of Classification’

B EXHIBITION
Map
1 ‘Behind Closed Doors’
2 ‘Made in Our Image(s)’
3 ‘World Wide West’

C CREDITS



NULL ISLAND AND THE CONSEQUENCES OF CLASSIFICATION

‘At the centre of the world there is a fiction; a fictional piece of land a metre wide by a metre long. It has not been thrown up from the depths; not from the violence of lava bursting up and cooling, though there is a violence in its history. It is called Null Island, and you cannot travel there.’
—Jon K Shaw & Theo Reeves-Evison

Null Island is the busiest place on Earth, but it is impossible to visit. It is the most photographed place, but impossible to find.

Null Island is located where the equator meets the meridian, where nature meets culture. (The equator: the middle of the planet, the line surrounding the earth halfway between its magnetic poles; a line established by probes and sensors, by investigation of a scientific kind. A line more found than made. The meridian: a line inscribed on the globe, tethering that globe to the capital of a faded empire whose persistence is still felt. A line that hides a history of cultural violence, that delineates time, that privileges the economies of the west over the east. A line not given but created.)

Yet the problem with Null Island is that it does not exist. This non-place is only visible to machines. It is the product of an interaction between the planet, our species and technology. A place, measuring one metre long by one metre wide, where a natural order, a constructed order and—more recently—a digital order coalesce.


The point where the lines intersect, 0° North, 0° East, perplexes machines. Computers need a piece of land on which to ground their calculations. So we feed them a fiction, throw a nonexistent island out into the ocean. In return they run the numbers for our GPS, guiding us home safely at night, tagging our photos and mapping our memories, aligning our satellites and connecting us across the globe.

For machines, Null Island is a necessary fiction. For us, it is an unnecessary fact—one that embodies the often paradoxical, sometimes dangerous, consequences of digital classification. Although this place may not exist physically, the ramifications of its existence are tangible and material—it affects our behaviour and changes the ways we interact. Neither real nor fake, but unreal, Null Island makes manifest a simple truth: the way we and our machines classify our world produces unintentional, and often problematic, new truths, objects and worlds.

To classify is human. We all spend large parts of our days creating taxonomies, making up a range of ad hoc classifications as we go. We constantly classify objects, animals, places, illnesses, occupations and ideas. We create separations and ordinances based on certain ways of categorising the world in its material and social dimensions, ways that are culturally inherited and formalised in manuals, checklists, statistics or bureaucratic procedures. From the simplest forms of personal organisation, such as how files are arranged on a computer, to more complex social and cultural forms, such as how we identify in terms of gender, sexuality, race, ethnicity and nationality, we are immersed in systems of classification.

Yet, for any individual, group or situation, classifications and standards can confer advantage or cause suffering. Jobs are made and lost; some regions benefit at the expense of others.


Behind all classification systems, however trivial or neutral they may seem, there is a certain intentionality, the consequences of which affect social relationships and, ultimately, the identity of individuals.

The difference between the myriad modes of classification that affect our lives is not that some are biased while others are not. There is always bias. The only difference is that some systems of classification are more visible than others. Yet this visibility often only occurs when the injustice of a given system becomes socially untenable. For example, the ‘diagnosis’ of homosexuality as an illness caused untold suffering for multiple generations, but was only widely acknowledged, and subsequently ‘demedicalised’, in the wake of the LGBT social movements of the 1960s.

Even where this visibility proves productive, many classification systems remain vehemently, almost innately, conservative, taking a long time to alter their taxonomies to reflect the values of society at large. For example, libraries in more than 138 countries organise their resources according to the Dewey Decimal Classification, the most widely used proprietary system in the world. But given its nineteenth-century origins, many have noted its longstanding and problematic cultural biases, including its Christian religiocentrism, racism, sexism and homophobia. Throughout the lifetime of the system, homosexuality has appeared under various categories, including 132: Mental Derangements; 159.9: Abnormal Psychology (specifically 159.9734746: Sexual Inversion/Homosexuality); and even 616.8: Neurological Disorders. It took until 1996 for it to move into 306.7: Sexual Relations.

Finding ways to ensure that insidious biases within systems of classification are minimised—and that the taxonomies which underpin everyday life reflect progressive social mores—is the core motivation behind this exhibition. By uncovering manifest absurdities, we hope to generate important (and overdue) conversations about classification and its consequences. However, the specific case of Null Island testifies to the ways in which


our contemporary condition is somewhat more complex than historic cases like the Dewey Decimal system might suggest.

Humankind has conceived and honed classifications for two main reasons: as a way to find or make some order in the world; and, more simply, as a means to satisfy the basic human need to put things in certain places, so we know where they are when we need them. Yet, as outlined above, however imbricated these organisational structures or taxonomies become in our lives, they are often self-effacing. We don’t see or fully understand them—and this invisibility prevents us from asking a series of important questions. What are these categories? Who makes them? Who may change them? How do they spread? And what is the relationship between locally generated categories, tailored to the particular space of the computer desktop, and the macro, commodified, elaborate and expensive categories generated by medical diagnoses, government regulatory bodies and technology firms?

The invisibility of classification has been exacerbated by the rise of contemporary technology. On April 1, 2004, Google rolled out a free, advertising-supported email service: Gmail. At the time, Google was still a one-product company, known primarily for its efficient search algorithm, so it is no surprise that Gmail was released touting full-text search as its main asset. Its original tagline: “Search, don’t sort”. We used to think sorting saved time. It once did, but it doesn’t anymore. Google’s logic suggests that an automated full-text search for words or numbers across a whole corpus of sources, left raw and unsorted, is a more powerful retrieval tool than the traditional manual process of first sorting items by topic, then looking for them in their respective folders.



Today taxonomies, at least in their more practical, utilitarian mode—as an information retrieval tool—are useless. A pragmatic chaos reigns. Beyond the obvious psychological implications of this new (non)order, and beyond problems of retrieval (particularly when human forms of classification interact with machines that obey no human logic—think of a worker trying to retrieve an object from an Amazon warehouse without any technological aid), this “death” of the taxonomy has deeper, more nefarious consequences: attitudes emerge which are the unwitting product of machine forms of (non-)classification. Tech companies work every day on the design, delegation and choice of classification systems, yet few see these systems as embodying moral and aesthetic values that in turn craft people’s identities, aspirations and dignity.

As the world becomes ever more automated by artificially intelligent machines, understanding the ways in which the world is classified has taken on a critical importance. Like us, machines classify in order to understand and order the world. Yet the ‘soft’ AI that powers Google, Netflix, Amazon and the like is based on a series of classifications that facilitate a crude form of machine learning—with this machine ‘learning’ going largely unchecked. Consequently, the biases that exist in society quickly become embedded in the machine.

There is now ample evidence of the destructive biases latent within technology. Recent research has shown that the 100 top recruitment websites, now powered by algorithms, present high-paying job ads six times more frequently to male candidates than to female ones. ProPublica also recently revealed that algorithmically generated risk assessments used by US judges to guide their decisions at bail hearings were unreliable and prejudiced, having discovered that black defendants were twice as likely as white ones to be flagged as future criminals. Even if these measures are built with the best intentions, their growing usage is beginning to


aggravate discrimination and impact the very minority groups who have historically been subject to unfair judicial treatment.

The manifestation of this machine bias not only reveals something about the society from which the biases came, but also about the hidden classification systems we are all, unwittingly, subjected to every day. The targeted advertising we face, the products we are recommended and the TV programmes we are encouraged to watch are all the result of assumptions made by machines under certain classification protocols. While the existence of a class-based society may be nothing new, the long-term implications of opaque, automated and algorithmic classification certainly are.

So what might Null Island teach us about this confusing new normal? As the illogical product of a machine mode of existence, it has the potential to make manifest the absurd, novel and disconcerting consequences of contemporary AI and its unique approaches to classification.

Each day countless people seeking digital directions on their computers and smartphones are diverted to an isolated spot in the Atlantic Ocean, 1,600 kilometres off the western coast of Africa, where the Prime Meridian and the equator intersect. This lonely way station in the Gulf of Guinea is, according to its (now defunct) website, a thriving republic with a population of 4,000, a roaring economy, a tourism bureau, a unique native language and the world’s highest per capita use of Segway scooters. In the realm of digital cartography, it is one of the most-visited places in the world. The only problem for its millions of visitors is that there isn’t much to see. Null Island does not exist.


This digital “island”, described by cartographers as “the default destination for mistakes”, exists as a result of programming errors in geographic information systems (GIS). Whenever you enter a location into your computer or smartphone, a program converts that information into coordinates. If there’s an error in the information you’ve entered, or if the code doesn’t understand that you’ve entered “null” or “no information”, the program is liable to get confused and default to “0,0”. Recent search-engine requests for a bike-sharing location in the Netherlands, a car-rental agency in Portugal and a polling station in Washington DC have all been sidetracked to Null Island as the result of typos or coding errors. On one day in June, GIS cartographers counted 1,708,031 misguided location requests that had landed there from a single piece of mapping software—a fraction of the daily total across all mapping services and applications worldwide. Several years ago, the crime-mapping application of the Los Angeles Police Department made LA City Hall look like the centre of a crime wave when its mapping analysts made it the default location for hundreds of crime reports with undecipherable addresses. To fix that problem, the analysts instead routed mislabelled crime reports to Null Island.

While the exact origins of “Null Island” are murky, it reached a wide audience no later than 2011, when it was drawn into Natural Earth, a public-domain map dataset developed by volunteer cartographers and GIS analysts. By creating a one-square-metre plot of land at 0°N 0°E in the digital dataset, Null Island was intended to help analysts flag errors in a process known as “geocoding”. If it was first enjoyed as a cartographers’ in-joke, the previous sections of this brief suggest a darker side to this non-place. With algorithms dutifully classifying the characteristics of our contemporary world through this openly available data, is Null Island’s apparently booming economy, thriving tourism industry and disproportionately high crime rate beginning to skew global perceptions of equatorial Africa’s circumstances?
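To make the failure mode concrete, here is a minimal sketch in Python of the geocoding default described above. It is a hypothetical illustration, not the behaviour of any particular GIS library: the gazetteer, function name and coordinates are invented for the example.

    # A minimal sketch of how a failed lookup "lands" on Null Island.
    # Hypothetical illustration only; real geocoders differ in detail.

    def geocode(address, gazetteer):
        """Convert an address into (latitude, longitude)."""
        record = gazetteer.get(address)  # None when the address is unknown
        lat = record["lat"] if record else None
        lon = record["lon"] if record else None
        # The bug: null coordinates are coerced to 0.0 instead of being
        # rejected, so every failed lookup defaults to 0°N, 0°E.
        return (lat or 0.0, lon or 0.0)

    # Invented sample data (coordinates approximate).
    gazetteer = {"171 Morning Lane, London": {"lat": 51.547, "lon": -0.048}}

    print(geocode("171 Morning Lane, London", gazetteer))  # (51.547, -0.048)
    print(geocode("171 Morning Lane, Lodnon", gazetteer))  # (0.0, 0.0): Null Island

A stricter geocoder would raise an error or return nothing for the second request; it is the silent fallback to zero that turns a typo into a visit to the Gulf of Guinea.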


Null Island is the remarkable product of conflicting forms of classification. As such, it provides a useful prism through which to gauge our contemporary condition—a vehicle for exposing the latent biases, mercurial privileges and invisible structures that (as a result of technology) govern our world today.

Tom Greenall, Nicola Koller, Matteo Mastrandrea


B EXHIBITION

Map

1 ‘Behind Closed Doors’
Justin Bean and Guy Mills

2 ‘Made in Our Image(s)’
Alex Findley and Paul Bisbrown

3 ‘World Wide West’
Ben Mehigan, Kane Carroll and Divya Patel



The exhibition comprises three works, each of which uses the data set found on Null Island in a different way. These approaches can be broadly delineated as Interpretation, Organisation and Interrogation.

[Exhibition map indicating the locations of works 1, 2 and 3]


1 BEHIND CLOSED DOORS
Organisation

In our current era, every human trait and action can be translated into data—data which can be understood by a computer. Machines are now party to every event or decision; our human world has been comprehensively adapted for, and mediated by, machines. This process may aid productivity and profit, but it is worth remembering that transposing a scenario into a dataset is not simply an act of translation. These processes have far-reaching consequences in the real world, consequences of which we should all be aware. We have access to an online library of information, exponentially growing in size, and each piece of data within it can be linked back to an individual, event or object in reality.

The Internet Nation Embassy attempts to highlight the importance of what it means to have open-source data online. The Embassy has the power to grant asylum to data, investigate the repercussions of geo-located data, and use its organisational role to represent data at null (0,0).

The exhibit takes the form of a security scanner—the universal symbol of threshold. This workstation offers an insight into the usually private processes of authentication through an autonomous office scene. Taking its references from London embassies, the work endeavours to expose the inner workings of these decision-making processes: exhibiting the archaic systems embassies typically use to process individuals, and revealing the empathetic human worker who manually organises the data and ultimately decides what can be granted the geo-location of (0,0).

Four scenarios, each an applicant for asylum, are presented:

A: The exponential increase in rhino poaching due to poachers downloading open-source image data from image-sharing websites such as Flickr. The images, often taken by tourists visiting safari parks, unwittingly give poachers daily updates on the rhinos’ locations.

B: The fitness app Strava, which compromised secret military bases in Syria and Afghanistan by publishing an open-source heat map showing the routes of all its users, thereby exposing supposedly dormant military bases.

C: The cartographic errors of the 2012 Wisconsin election, which left many addresses tagged at null (0,0) and many people fearing they were unable to vote.

D: The accelerated erosion of natural landmarks with insufficient infrastructure due to geo-tagging on Instagram, a phenomenon that produced unprecedented footfall on unregulated and exposed landscapes.

Each of these scenarios is represented by a series of inanimate objects, inviting the viewer to consider the reality behind the binary 12-digit location tag. The scanner assesses each scenario by weighing a variety of consequences and real-life events. The outcome of this process is a binary response: in or out.

We all benefit from the endless conveniences delivered by machine efficiency, yet we are increasingly falling victim to the privacy issues that have arisen as a side-effect. We each now have a digital shadow following our every move; every train we board and every cash withdrawal we make is another piece of data added to the digital image we have made of ourselves.




2 MADE IN OUR IMAGE(S)
Interpretation

Society, generally speaking, has become consumed by images. In 2014, it was estimated that a total of 1.4 billion images were uploaded to the internet every day using applications such as Facebook, Instagram and Flickr, a total set to increase exponentially. The proliferation of imagery and visual data has largely been driven by the increased accessibility, and availability, of various methods of uploading images. This has fostered the development of a superficial, ocularcentric society, continuously uploading pictures depicting every pointless nuance of quotidian life.

These images, once uploaded, are pinpointed to an ‘exact’ set of coordinates, giving a detailed description of their location through a process known as geotagging. Flickr is just one of the many photo-sharing platforms that uses geotagging. As a consequence—through misinterpreted or altered metadata—a vast number of images are misplaced globally. Since 2006, over 300,000 geotagged images uploaded to Flickr have been mislocated to Null Island. This number continues to rise.

The project, Made in Our Image(s): The First Judgement, interprets the images uploaded to 0°N, 0°E (Null Island), treating them as if they had a logic that needs to be deciphered. To process the vast amount of data, a machine-learning technique, namely generative adversarial networks, was applied to the data set. When fed thousands of images, this process learns how they are arranged and starts to generate its own versions of that data set. This allows significant numbers of images to be interpreted, and a ‘general image’ to be created, often with uncanny and obscene results.
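For readers unfamiliar with the technique, here is a minimal generative adversarial network sketch in Python (PyTorch). It is an assumption-laden toy, not the project’s actual pipeline: the network sizes, optimisers and the stand-in random ‘images’ are all invented, and the real work would train on the mislocated Flickr photographs.

    # A toy generative adversarial network (GAN): a generator G learns to
    # produce images that a discriminator D cannot tell apart from real ones.
    # Hypothetical sketch only; not the exhibited work's actual code.
    import torch
    import torch.nn as nn

    IMG = 28 * 28  # toy image size; real photographs would be far larger

    G = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, IMG), nn.Tanh())
    D = nn.Sequential(nn.Linear(IMG, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))

    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    loss = nn.BCEWithLogitsLoss()

    real = torch.rand(32, IMG) * 2 - 1  # stand-in for a batch of geotagged images

    for step in range(100):
        # Discriminator: score real images as 1, generated images as 0.
        fake = G(torch.randn(32, 64)).detach()
        d_loss = loss(D(real), torch.ones(32, 1)) + loss(D(fake), torch.zeros(32, 1))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # Generator: adjust weights so the discriminator scores fakes as real.
        g_loss = loss(D(G(torch.randn(32, 64))), torch.ones(32, 1))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    # After training, sampling G produces the 'general image': the machine's
    # composite of whatever was uploaded to 0°N, 0°E.

Because the generator only ever approximates the distribution of its training images, its samples drift between recognisable motifs, which is what gives the resulting ‘general image’ its uncanny character.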


The resultant output of Made in Our Image(s) is a fluctuating depiction of 21st-century society, culture and beliefs, brought through the lens of machine learning, in the form of an animated triptych and a series of physical artefacts. The three panels that comprise the piece represent different stages or sectors of the images found on Null Island:

A: The left panel depicts the conception of Flickr, blurring in and out of existence and beginning to reveal what people first uploaded at the platform’s origin. Because the service was then relatively new, and hence lacking in data, the scene depicts a sparse landscape revealing a few ‘fads’ of that bygone era.

B: The central panel is based, much like its primary reference, Bosch’s The Last Judgement (1486—see image on following pages), on a hierarchy of images that make up the piece in relation to the overall size of the panels. Through Flickr’s tagging feature, images are quantified and compared against each other, generating a hierarchy according to frequency of occurrence. This in turn determines the 2D size allocated to each of these popular images in relation to the original triptych (see the sketch after this list).

C: The rules and regulations that apply to Flickr, through restrictions on explicit sexual and violent images, inform the right-hand panel, with a pulsating landscape built from all that may have been removed. The panel captures how society chooses to acknowledge (or not) this content, through a process of blurring and pixellation.
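As a hedged illustration of the sizing rule behind the central panel, the sketch below allocates each image a share of the panel’s area proportional to its tag’s frequency of occurrence. The tags and proportions are invented for the example; the work’s actual weighting is not documented here.

    # Hypothetical sketch of the central panel's sizing rule: each popular
    # tag's image receives a 2D area proportional to how often it occurs.
    from collections import Counter

    tags = ["sunset", "selfie", "sunset", "cat", "sunset", "selfie"]  # invented sample
    counts = Counter(tags)
    total = sum(counts.values())

    panel_area = 1.0  # normalised area of the central panel
    for tag, n in counts.most_common():
        share = panel_area * n / total
        print(f"'{tag}': {n} occurrences -> {share:.2f} of the panel")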


[Image plates: Hieronymus Bosch, The Last Judgement (1486), referenced above]


3 WORLD WIDE WEST
Interrogation

‘The complex undertaking we call “law” requires at every turn the exercise of judgement, and that judgement must be exercised by human beings for human beings. It cannot be built into a computer.’
—Lon L. Fuller

World Wide West acts as a window into past, present and future systems of technology, allowing us to debate their neutrality and, as judiciary, formulate a verdict on their ethical and social consequences. Consciously aware of its status as an artefact created by designers, the film sets out to objectively interrogate the unknown profession of the law, following seemingly unconnected rabbit-holes of technological and ethical research.

The film is a succession of impressions: interviews and video fragments combine in a way that is less cinema and more internet. Using Null Island as a starting point, particularly the incorrectly geotagged LA COMPSTAT data located at null (0,0), it begins to question the role that technology is playing not only in crime but in policing and law.

Constructed from a series of live interviews, the narrative intends to cast a holistic perspective on the emerging and controversial landscape known as Legaltech. The interviewees, selected for their valuable perspectives across the contemporary judiciary system, offer an insight into artificial intelligence systems passing sentence, regulatory bodies, legal education and the inconsistencies found within criminal prosecution. These avenues are presented visually as a series of panning landscapes: ambiguous in their location, endless in their boundaries, these cinematic shots are an analogous representation of the uncertainty that surrounds Legaltech.


Through the considered curation of these seemingly unconnected interviews, a colloquial conversation emerges, one which is easy to understand but difficult to comprehend. The viewing experience encourages visitors to inhabit the role of the jury: with the evidence presented on screen, spectators are urged to critically engage with this nebulous legal world, analyse what is presented before them and offer a verdict on its wider implications and potentially nefarious consequences.




NULL ISLAND
‘An exhibition exploring the busiest place on earth that doesn’t exist...’

Opening hours: 25.04.2019-05.05.2019, 11am-5pm (Thursday-Sunday)
Private View: 24.04.2019, 6-9pm (Wednesday)
Location: Sluice Gallery, 171 Morning Lane, London E9 6JY

Exhibitors
Justin Bean, Paul Bisbrown, Kane Carroll, Alex Findley, Ben Mehigan, Guy Mills, Divya Patel

Roaming Projects
Will Rees

Collaborators
Hannah Barry, founder of Hannah Barry Gallery, Peckham.
Tom Clark, co-director and co-curator of Arcadia Missa Gallery, Soho; editor-in-chief and founder of Arcadia Missa Publications.
Laura Jouan, graphic designer and long-running collaborator of the Architecture Department.
Adrian Shaw, curator, Tate Britain, who introduced large-scale music programming to Tate over ten years ago, now established as ‘Late at Tate’.

ADS4
Nicola Koller, Tom Greenall, Matteo Mastrandrea

Students
Vanessa Assaf, Justin Bean, Paul Bisbrown, Kane Carroll, Larry Chan, Eleni Elia, Alex Findley, Astrid Jahncke, Lyndal Mackerras, Ben Mehigan, Guy Mills, Divya Patel, Benedict Spry

ADS4 (Architectural Design Studio 4) is a critical design studio embedded within the School of Architecture at the Royal College of Art, London. Since 2001, the studio has specialised in speculative near-futures and alternative nows. Using strategies and techniques from different disciplines to challenge conventional approaches to architectural design, representation and communication, we explore the complex and often contradictory world around us, analysing key shifts in thinking, emerging trends, technological breakthroughs and the new behaviours they bring with them. ADS4 weaves these developments into alternate future scenarios, exploring a world where the design outcomes are very different from those indicated by existing surprise-free masterplans. We accept no givens; everything must be questioned, dissected and examined, with design as our critical tool. Driven by social fiction and real people, not science fiction or broad-brush demographics, our projects make possible futures familiar, and in doing so become powerful tools that enable us, and others, to question our design values.

website: ads4.co

instagram: @rca.ads4

twitter: @ads4_rca

