all watched over by machines of loving grace
Drones in the Urban Zeitgeist: A Participatory Technology Assessment exploring the Democratic Implementation of Unmanned Aerial Vehicles (Drones) into Civilian Urban Environments. Alexander Farr, 140205543. Submitted in partial fulfilment of the Degree of MArch Architecture & Town and Regional Planning, University of Sheffield, September 2016.
Abstract
UAVs, or commonly drones, are becoming more commonplace within the civilian world. Over the last half century, UAVs have developed from a primarily military technology into a far-reaching, progressive technological system. Most notably within the fields of surveillance and logistics, the drone threatens the status quo in more ways than one. This paper documents and categorises the complex web of definitional narratives surrounding UAVs, both natural and constructed. Four critical case studies are identified as central to contemporary discussion surrounding UAVs: automated logistics; police enforcement; mass surveillance; and the use of UAVs as private-professional tool. It is argued that the UAV technological system has wide-ranging impacts on broad and tangential actants, and hence that it is the democratic and ethical right of societal actors to participate in the conversation of legislation and implementation. This is achieved through a variant on the Participatory Technology Assessment, specifically designed to account for temporal and geographical distances between participants within a small sample base. The report then critiques the relative merits of this variation. The analysis concludes that effective regulation and legislation are critical to ensuring effective implementation of unmanned aerial vehicles in civilian urban environments, and makes policy suggestions towards such goals. Additionally, it argues that education and participation should be central to further consultation toward UAV implementation, and calls for government consultation efforts to consider citizen values.
Cover image, Fig.1 Drone filming.
Left, Fig.2 Not Drone Drones, anti-drone music campaign.
Key Words Public sphere, technology assessment, participation, public involvement, consensus, citizens’ juries, drones, unmanned aerial vehicles, UAV, UAS, SECT.
Acknowledgements The author would like to thank the following people for their suggestions and contributions toward the production of this dissertation: Matthew Cotton, Amy Garrod, Sandra Shipley, Dee Turnell, Julia Udall, Aidan While, and all anonymous participants who took part in this study.
Glossary of Terms
ALS Automated Logistics System.
ANT Actor-Network Theory, as developed by Latour (2005).
pTA Participatory Technology Assessment, a variation on traditional, expert-led technology assessment.
RPA Remotely piloted aircraft, uncommon usage, near-interchangeable with drone and UAV.
SECT Socially and Ethically Contentious Technology.
UAV Unmanned Aerial Vehicle, commonly, and near-interchangeably, referred to as a drone.
UAS Unmanned Aerial System, associated with UAVs and including greater adjacent supporting technologies.
Contents
Abstract 3
Key Words 3
Acknowledgements 4
Glossary of Terms 4
Contents 5
Introduction 7
The Contextual Emergence and Development of UAV Technologies 9
Contemporary Points of Contention: Automated Logistics, Enforcement, Surveillance and Hobbyist Tool 17
Democracy, Ethics and the Emergence of UAS as Technological System 25
Designing an Introductory Stage Variant of Participatory Technology Assessment 31
Analysis of Participant Conversations and Emerging Themes 39
Meta-Analysis of Participatory Technology Assessment 53
Conclusions and Further Steps 59
References
Bibliography 65
Figures 81
Appendices
Complete List of Participants by Reference 84
Post-Conversation Questionnaire 86
I like to think (and
the sooner the better!)
of a cybernetic meadow
where mammals and computers
live together in mutually
programming harmony
like pure water
touching clear sky.

I like to think
(right now, please!)
of a cybernetic forest
filled with pines and electronics
where deer stroll peacefully
past computers
as if they were flowers
with spinning blossoms.

I like to think
(it has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace.

All Watched Over by Machines of Loving Grace, Richard Brautigan (1967)
Introduction Decades before the UAV's prevalence, Brautigan sketched a utopian image of the animal and mechanical worlds living in harmony in his 1967 poem, All Watched Over by Machines of Loving Grace. However, as we inch closer to the reality of such a singularity (Vinge, 1993; Moravec, 2000), it is becoming more apparent that considered controls and legislation are required to ease the process of integration. This report argues that it is necessary to democratise the process surrounding this legislation. As Cotton (2014, p.1) writes, 'certain forms of technology stimulate public controversy when facts about their risks and benefits are unclear, the values underpinning their governance are uncertain, and when emergent ethical issues are difficult to resolve'. Part of a long heritage of socially and ethically contentious technologies (SECT), unmanned aerial vehicles (UAVs, or commonly drones) present a complex knot of issues regarding ethical and considered implementation. Similar spatially-freeing technologies such as cars and aeroplanes have fundamentally altered the way humans interact with their environment. This report argues that UAVs sit alongside these technologies in offering the potential to affect civilian urban life in both predicted and unforeseen ways at a large scale. At the time of writing, UAV use by professionals and hobbyists in urban areas is limited by stringent regulations (Civil Aviation Authority, 2015; FAA, 2013; FAA, 2015; DroneFlight). Pressure from private companies and hobbyist groups (Heisler, 2015; Kimchi et al, 2015; Misener, 2014) could see these regulations revisited in the coming years. Indeed, conversations with public officials have revealed that the UK government is holding a consultation on drone flight in the summer of 2016. By comparison, in the early 1900s the rise of the car was seen as a 'technological oddity', one used as an object of tinkering by 'wealthy sportsmen' (Rothstein, 2015, p.4-5). Critics derided the technology as 'a fad, as dangerous, or as a luxury for the rich'. These same critics later came to complain about the lack of parking. But while cars came to symbolise our ability to 'actively engage with systems of the world to our own benefit' (ibid., p.123), drones manifest our 'technological interrelation with each other'. Where a car 'contains the image of a single person […], drones might tell the story of multiple people becoming part of a network'. However, you cannot just 'flip
a switch to make a technology invisible' (Carr, 2014). It disappears 'only after a slow process of cultural and personal acclimation'. This report asks: during this acclimation process, where do contemporary cultural and personal views lie? This report locates itself within the precedent of participatory technology assessments (pTA) (Cotton, 2014; Hamlett, Cobb and Guston, 2008) to collect and review contemporary views on the presence of drones within civilian environments from a selection of relevant professionals, interested amateurs and expert lay citizens. Opening with a brief overview of the contextual emergence and development of UAV technologies, the first chapters of this report explore the key definitional characteristics of a UAV and identify a number of contemporary case studies. Chapters three and four explore the composition of the UAV as a technological system (Sclove, 1995) and ask what the technology means for democracy and personal rights. The report then explores in more depth the framework of pTAs, and makes the case for the research design used in this dissertation. Chapters five and six present the findings of a series of conversational interviews with participants and place them within a framework of academic points of contention and ethical considerations regarding implementation. The report closes with a brief overview of the findings and potential avenues of further investigation.
Fig.3 The most common narrative in the public eye is that of an unmanned drone committing military strikes abroad.
Chapter 1
The Contextual Emergence and Development of UAV Technologies Unmanned Aerial Vehicles, UAVs or commonly drones, have exploded onto the public scene within the last decade with large-scale media coverage of the Obama administration's, and others', legacy of drone strikes. Tangential developments, such as the revelations around mass surveillance from Snowden's NSA files in June 2013 (Al Jazeera; Macaskill and Dance, 2013), have converged alongside established media narratives to present drones as a complex, challenging object. However, UAVs have existed since the 1960s, largely as part of a military-industrial research program in America, long before they emerged into the civilian zeitgeist. In this chapter we briefly investigate the heritage of UAVs, the narratives surrounding them, and key definitional terms.
Military-Industrial Development The drone emerged from the military-industrial 'projectization' system of 'Total Package Procurement', and is a result of decades of shifting regulations through which the military's allocation of funding granted it control over private companies' research and development goals. What drones look like, how they work, and how they have traditionally been used all stem directly from this history (Rothstein, 2015, p.26). In this manner, the military-to-consumer trajectory of the drone reverses the historical narratives of the car, aeroplane, and computer, which arose from workshops around the world and were appropriated by the military as part of the war effort. This conceptual reversal shifts the zeitgeist from the civilian-beneficial world of the car toward one informed more by military threats of surveillance, anonymised war and terrorism. The earliest successful model for the drone was a modified Ryan Aeronautics' Firebee, commissioned by the US military as a reconnaissance drone, leading to dozens of variants of the 'Lightning Bug', a hugely flexible model (Rothstein, 2015; Wagner, 1982). However, following the Vietnam War, the Lightning Bug faded from prominence due
to the increasing capability of satellite imagery. It wasn't until the General Atomics Predator in 1995 (Yenne, 2004) that the military role of the drone was cemented in the public view. The process of testing ideas in exceptional military conditions and later importing them into the normality of urban life is not new. One can identify legacies of military technologies in our urban forms: green buffer zones surrounding urban targets; fortifications finding their new home in suburban decorations. In the words of Jacob (2015), 'first Helmand, then Hampstead': the battlefields of Afghanistan have been 'de facto experiments in new forms of networked spatial practice'. Apple's new campus, and Google's mega-bottery complete with robotic-construction drones, exemplify Silicon Valley's 21st-century versions of utopian settlements as we enter the post-digital age of urban planning (Jacob, 2015; Lange, 2012). With this origin, it is perhaps unsurprising that Rothstein (2015, p.78) concludes that despite the surveillance technologies within drone systems being similar to those of Stingray (a phone monitoring system), or CCTV, 'something' about the drone as a platform is considered more worthy of regulation and concern. Echoing Brautigan, Jacob worries 'are we watched over with loving grace but also by machines with suspicion in their eyes?' [emphasis mine]. He compares UAV systems again with the Roomba, which, despite being the most popular service robot in the world, is considered a harmless backdrop to civilian life.
Defining a Drone So what sets the UAV apart from the CCTV camera, the guided missile, the hobbyist aircraft or the Roomba? The first paper (2014a) of Clarke's four-part investigation into the legal ramifications of civilian drones outlines the difficulties of any such one-stop definition. One strict interpretation is that of 'an unmanned aircraft that can fly autonomously' (Villasenor, 2012), while a longer definition from the FAA defines a UAV as 'a device used or intended to be used for flight in the air that has no onboard pilot. This device excludes missiles, weapons, or exploding warheads, but includes all
classes of airplanes, helicopters, airships, and power-lift aircraft without an onboard pilot. UA do not include traditional balloons […], rockets, tethered aircraft and unpowered gliders' (FAA, 2013, p.A-5). While exhaustive, such definitions come from regulatory agencies that are, Clarke (2014a, p.235) notes, 'constrained by constitutional limitations […] and this results in warped definitions of limited value for policy assessment'. Meanwhile, the Unmanned Aerial Vehicle Systems Association lists a number of distinctive definitions separating the UAV from other systems (UAVS, n.d.). In particular, it distinguishes UAVs from ordnance in that they can be re-used; from small hobby planes and remotely-controlled aircraft in that they can operate out of visual sight; and in that they operate with a remote pilot as opposed to an onboard pilot. What is apparent from these analyses is that even the acronym is disputed: UAVs, UASs, RPA (Remotely Piloted Aircraft) and others are all in use. There does, however, appear to be some movement toward agreement on UAS (Unmanned Aerial System) as the term of choice. Clarke, in his analysis (2014a, p.236), outlines four elements of a drone: the device must be heavier-than-air; it must have the capacity for sustained and reliable flight; there must be no human onboard; and there must be a sufficient degree of control to enable performance of useful functions. By excluding size from the definitional terminology, UASs come to form a scale stretching from plane- and helicopter-sized systems right down to the model aircraft of the consumer market, with a few pushing into even smaller territory. Since expanding from their military origins, UAVs have branched into multiple civilian spheres of use, including load delivery; passenger transport; hobby and entertainment use; journalism and 'voyeurnalism'; and law enforcement (2014a, p.239). In the next chapter, we identify the critical points of contention among these uses and develop a number of case studies to highlight UAVs' potential impact.
Drone as Robot, as Sensing Device Located within the umbrella term of robotics and IT artefacts, in that they gather, compute, and disseminate data, but also act in and on the world (Clarke, 2014b,
p.252), UAVs symbolise a progression of 'ubiquitous computing', or, more recently, 'pervasive computing', with the notion that they could come to be everywhere through invisible infrastructure (Weiser, 1991; Gershenfeld et al., 2004; Davis, 2001). Therefore, it is critical to understand the limitations and potential of computing and robotic systems in order to form decisions regarding drone use. For a full discussion of the drone-as-computer system and robotics device see Clarke (2014b), but within this document we will discuss briefly the drone's characteristics as a mobile computer with data sensing and analysing capabilities. Two key elements that Clarke (ibid., p.252) identifies are those of programmability ('a robot is a computer') and mechanical capability ('that a robot is a machine'). As a computing object, drones are 'utterly dependent on local sensors, remote data feeds and other sources' (Clarke, 2014b, p.252). Due to the sometimes fallible nature of communications of this kind, drones need to be able to cope with 'erroneous feeds [and the] absence of data on which it depends' (ibid.). However, recent developments indicate that drones are close to achieving 'situational awareness' (Kolodny, 2016), wherein UAVs process visual data in real time to avoid static objects and potential obstacles, rather than relying solely on waypoints mapped using older technologies such as GPS (ibid.; Iris Automation; Manaugh, 2016). For lay observers, the robot is frequently conceived of as taking human form (Clarke, 2014b, p.253). The drone's flexibility to appear as 'inhuman other' (Rothstein, 2015), naturally occupy inhuman space (the sky), and perform jobs that humans are physically incapable of performing mark the drone as 'the first robot that physically surpassed us'. The discussions to emerge within this report therefore need to consider the extent to which participants view the drone as 'other' or as a naturalised part of our urban environments. Drone as Automated Logistics Device A wider understanding of UAVs as forms of 'distributed robotics' (Clarke, 2014b,
p.254) widens the definitional net to include 'independent elements that both process data and act in the real world' (ibid.). This new definition then encompasses just-in-time logistics and various other forms of intelligent transportation systems. This report adopts the term automated logistics system (ALS) to include both the UAV system, as physical nodal manifestation of a wider automated logistics network, and the driverless cars and automated delivery vehicles that carry out roles which would otherwise traditionally be performed by human drivers on road infrastructures. This conception of ALS asks to what extent the progress being made in driverless vehicles and schemes such as Amazon Prime is understood and welcomed (or otherwise) by citizens within urban centres.
Drone as Surveyor The history of UAVs outlined above highlights prominent applications of both military and civilian drones in observation and data collection. Surveillance of humans, while not new, has typically been resource intensive, limiting its use. However, it has also come under pressure as its 'negative impacts' become much more substantial and more difficult to manage (Lyon, 1994; Lyon 2008), and as privacy becomes more of a mainstream concern. Increasing levels of data collection made possible by the adoption of the mobile phone; the ATM; and wearable and portable computing devices reporting 'GPS-derived coordinates' and Wi-Fi network-based location techniques (Clarke and Wigan, 2011; Michael and Clarke, 2013; Stanley, 2013) have prompted what Michael and Michael (2007) term überveillance. Drones within this überveillance umbrella can provide constant, or near constant, surveillance of individuals or groups of citizens with much lower resource requirements. Most concerning, the reduced cost profile of long-term surveillance makes feasible the surveillance not just of those who have been identified as committing a criminal act, but also of those inferred likely to commit one. Unless checked, the use of UAVs in this way enables the continuation of embedded prejudices against marginalised groups. Emerging against the increasing threat of surveillance, Mann (Mann et al., 2003; Mann, 2009) proposes a concept of 'sous'veillance. Initially conceived through
'wearcams' such as Google Glass, 'sous'veillance enables the monitoring of the powerful by the weak. In his analysis, however, Clarke (2014b, p.260) deems it 'naïve to anticipate that 'sous'veillance […] can provide a sufficient degree of counterbalance against 'sur'veillance by powerful organisations'. Despite Clarke's misgivings, this poses the question of whether civilians feel this technology will restrict personal privacy or instead enable the reciprocation of monitoring onto powerful organisations.
Contemporary Drone Regulations The final defining framework required to enable discussion of contemporary issues is a brief outline of the regulations currently in place, both from UK regulatory bodies (the CAA) and from foreign and international bodies (the FAA in America; EASA in Europe). One definition of regulation (Black, 2008; Brownsword and Goodwin, 2012, in Clarke and Moses, 2014) is given as 'the sustained and focused attempt to alter the behaviour of others according to standards or goals with the intention of producing a broadly identified outcome or outcomes, which may involve mechanisms of standard-setting, information-gathering and behaviour modification'. Currently, many laws governing airspace are applicable to UAVs and unmanned systems, despite having been created for the purpose of onboard-piloted aircraft. Clarke and Moses (2014, p.272) point out in their analysis a number of challenges to the assumption that drones can be included within the traditional air traffic control (ATC) regime. In particular, they write that the physical separation of UAV and pilot heightens the importance of reliable communication of commands across space, and risks diluting the pilot's appreciation of the aircraft's surroundings. Materially, the reduced physical size of UAVs makes the aircraft difficult to identify with the naked or assisted eye, and also by radar. The European Aviation Safety Agency (EASA, 2015) has outlined, though currently only as guidance, a number of categories for the eventual categorisation of unmanned aircraft: open (low risk); specific (medium risk); and certified (higher risk). In short,
within agreed limits, open-category drones fly without further regulation. Specific-category drones, expected to be used by small and medium enterprises using UAVs as a 'tool' to replace traditional equipment, would require a Specific Operation Risk Assessment (SORA) (ibid., p.24). The certified category applies when the risks rise to a level similar to those of normal manned aviation (ibid., p.27), with certifications required to fly. Both CASA in Australia and the International Civil Aviation Organization (ICAO) follow similar categorising frameworks. 'Sport, recreation and education' and/or 'private use' is distinct from 'air work' and/or 'commercial "hire and reward"' (CASA, 2013; in Clarke, 2014a, p.239) in Australia. The ICAO, meanwhile, classes 'other-than-commercial', 'recreational' and 'model' uses as out-of-scope (ICAO, 2011, p.3). The CAA regulations within the UK generally follow these guidelines (Civil Aviation Authority, 2015). However, ex-military craft, research or scientific aircraft, and those under 150kg are currently considered out of scope of the EASA basic regulation, although they are subject to CAP722 and national regulation. For a more exhaustive examination of the forms of regulation and regulatory actions, see Clarke and Moses (2014). Notably, drone flight is not permitted out of line of sight, in non-segregated airspace, without an 'acceptable Detect and Avoid system' (ibid., p.99, para 1.18). In other words, the CAA regulations for flight without D&A systems restrict the UAS to within 400ft of the surface; to within 500 metres of the pilot, or line of sight, whichever is less; and to outside controlled airspace or aerodrome traffic zones unless permission is granted by the ATC. The CAA additionally makes no distinction between the requirements for non-military state aircraft (police, search and rescue, firefighting) and those for civil usage (para 2.14). To conclude, the regulatory frameworks currently in place pose questions that challenge the use of drones in cities. In particular, this paper seeks to explore whether citizens understand the regulations currently in place, as well as the distinction (if any) between UAVs and model aircraft. Further, do current regulations effectively or appropriately control UAV flight now, or in the near future?
Conclusions Within this chapter we have outlined the key heritage timeline for the drone, and key definitional terms from the literature. These terms, as shown, are not definitively defined, and different jurisdictions and bodies ring-fence different aspects of physical and operational characteristics. With formal bodies not reaching agreement on terminology, to what extent are these terms understood by professional, hobbyist and lay citizens?
Fig.4 Amazon’s Prime Air service offers delivery of packages up to 5kg.
Fig.5 Amazon are proposing drones with brightly coloured promotional markings.
Chapter 2
Contemporary Points of Contention: Automated Logistics, Enforcement, Surveillance and Hobbyist Tool With the core trajectory and definitional terms of UAVs explored, we come to look more closely at contemporary points of contention. These contentions help shape the development of technologies through 'pushing', 'pulling', and 'contextual' factors (Williams, 2007, p.59). These factors, Rothstein (2015, p.71) develops, help group points of contention into determining forces. Techniques and processes within the technology itself help 'push' the technology to new capabilities. Societal and cultural wants and pressures 'pull' the technology into new spheres of use. And finally, 'contextual' factors are the organisational, economic and systemic issues that create the backdrop against which the technology exists. Following the identification of key terms and impacting factors in the previous section, this chapter develops a key selection of contemporary issues. These case studies (automated logistics; enforcement by police forces; the threat of increasing mass surveillance; and the use of drones as private-professional tool) are developed in sufficient detail to sketch out cases for discussion with participants later in the study. It is acknowledged that contemporary reporting of developments in these cases often falls within the remit of grey literature, representing official stances and reports. However, the focus of this chapter is establishing groundwork for an educated narrative, and the author therefore argues that the use of such literature to establish the state of play is acceptable and necessary in this instance.
Automated Logistics Services One of the most widely covered potential positive impacts of UAV technology is automated doorstep delivery (DroneFlight, n.d.), championed by Amazon through its Prime Air service (Heisler, 2015; Malone, 2013; Kimchi et al., 2015). These Automated
Logistics Services distinguish themselves from the more commonly used term 'automated logistics' through the distinction between automation in the organisation of logistics and automation of the physical delivery vehicle itself. ALS technologies enable the discussion of automated workforces, on-demand services, and autonomous systems. Both driverless cars and delivery UAVs present similar benefits, and drawbacks, to consumer society. Osborne and Frey (2013) predict that 47% of all jobs are at risk of automation within the next two decades, with skilled but repetitive jobs at high risk. Driverless cars in particular have had an upswing of attention in recent years (Bishop, in Messner, 2014, p.15), with automated lorries threatening to replace long-distance haulage (Williams, in Messner, 2014, p.62). In an assessment of driverless haulage in the context of Australian surface mining, Bellamy and Pravica (2011) conclude that while such technology will save costs and increase productivity, knock-on social implications would include declining populations in remote towns and a decrease in lower-skilled labour requirements. Prime Air, as a symbol of potential airborne ALS programs, represents the potential for UAVs to encroach further on jobs currently held by humans. Meanwhile, the challenges faced by driverless cars reflect the extent of the difficulties faced by UAV systems. Importantly, the two-dimensional navigation currently being developed for cars becomes more challenging in the three-dimensional environment inhabited by UAV systems, with its much greater degrees of freedom. While it is widely believed that three-dimensional navigation will prove a greater challenge, this view is not firmly held. Murray (2007) considers the 'freedom' of the three-dimensional sphere, without the constraints of the existing ground network, easier to navigate. For both, the technology has not yet overcome issues of cost, uncertainty and the complex relationship between autonomous and manual control (Lin, 2013; Knight, 2013). A recent incident involving a Tesla automated driving system (Tesla Team, 2016) highlights the complexity of this challenge. In mistaking the white side of a tractor-trailer for the bright sky behind it, in what Manaugh (2016a) calls an 'elision', the system saw the trailer as indistinguishable from the wider spatial environment. These events offer 'momentary glimpses [into] where robotic perception has failed' (ibid.).
In late 2015, Amazon announced an expansion of its radical Prime Air service into tests in UK airspace. Despite the initial announcement in 2013 in the USA, tight FAA regulations curbed most testing outside of privately owned land with low flight ceilings, much to corporate bodies' chagrin (Misener, 2014). Recently, approval from the government has allowed Amazon to step up tests of its 'sense-and-avoid' technology in the UK (Titcombe, 2016) in a relaxation of the CAA's current regulations.
Police Enforcement and the Threat of Mass Surveillance In a quintessentially ironic news story that summarises the contentious dichotomy surrounding enforcement, a surveillance drone was grounded days after a successful use of the technology in stopping a criminal in Merseyside (Sharpe, 2010). This case study outlines two main issues surrounding the use of UAVs in police surveillance: eligibility and accountability. In the case of Merseyside, the UAV was utilised for a specific incident, tracking an identified suspect following an outstanding crime. Alternatively, we ask whether UAVs should be used for pre-empting crime, or for anonymous big-data collection at the urban scale. In the first instance, we ask participants: is drone surveillance acceptable in situations where criminals have already committed a crime and are at large? And if so, what methods are in place, or should be, to regulate when a drone is eligible to be used? Without sufficient oversight and accountability, existing groups of 'usual suspects' and 'undesirables' can be placed under increased surveillance by resource-light 'covert use' (Clarke, 2014c, p.290) of UAVs, undermining 'social cohesion' (Finn and Wright, 2012, in Clarke, 2014c, p.290). Clarke (2014c, p.300) ultimately concludes that the natural controls, organisational self-regulation and formal regulatory systems currently in place are not sufficient to monitor UAV use in privacy terms. He concludes that formal regulation is essential. With the proclaimed threat of increasing terrorism, governments are proposing increasing infringements of civil liberties surrounding privacy and surveillance. In the wake of Snowden's revelations regarding NSA mass surveillance and the omnipresent
threat of CCTV in the UK (Giannoulopoulos, 2010), UAV-mounted cameras offer the opportunity to progress further. Graham (2010, p.127) makes clear that as the 'new notions of permanent and boundless infrastructural war' become an 'overpowering obsession', there will be a legitimisation of the 'reengineering of the everyday systems' to create a 'wholesale hollowing out, or potentially even eradication, of democratic societies'. As a technology, UAVs can effectively be read as a convergence of the police helicopter and the CCTV camera. Where the CCTV camera is static but resource-light, and the helicopter resource-intensive but mobile, the UAV can be mobile without the resource load. The helicopter's 'distant and abstract' 'machinic-prosthetic' (Adey, 2010; Bishop and Phillips, 2002) perspective is then relevant to UAVs, surveying the symbolically abandoned, 'non-citizen' populace below. The technology in this manner seeks to achieve 'verticalised omniscience', observing the city 'in its structure' (Barthes, in Adey, 2010, p.54). The UAV-as-CCTV, therefore, is a panoptic device (Clarke, 2009c, p.287; also, Bentham, 1791; Gandy, 1993), collecting constant, potentially anonymised, big data on all the 'non-citizens' within its purview. One potential use of this technology can already be seen in the abuse of automated number-plate recognition, most notably by the UK government (Clarke, 2009).
Drones as Private-Professional Tool A number of private-professional sectors have expressed interest in using UAVs as a tool in their services, including topographic mapping (Azmi, Ahmad and Ahmad, 2014), forest monitoring (Wallace, Lucieer, Watson and Turner, 2012; Paneque-Galvez, McCall, Napoletano, Wich and Pin Koh, 2014), fish habitat monitoring (Tamminga, Hugenholtz, Eaton and Lapointe, 2014), agricultural survey (Honkavaara et al., 2013) and constructional safety inspection (Irizarry, Gheisari and Walker, 2012). Other potential fields include architecture and urban planning, in 3D scanning and point clouds, and town and regional planning, with relevance to GIS data and real-time data transference on site visits. Even where video is not the primary intent, as
in UAV-LiDAR producing 3D point clouds (Wallace, Lucieer, Watson and Turner, 2012), high-definition video is used alongside GPS receivers and other technologies. For private-professional uses that focus upon visual data collection, user participation studies in safety inspection (Irizarry, Gheisari and Walker, 2012) show that remote viewing through screens of iPad size and larger comes close to equalling plain-view results. Other noted benefits include the potential for photos from non-standard viewpoints to cover blind spots (Ito et al., 2011), a factor which also benefits drone use in unsafe pipe inspection (Shukla and Karki, 2016). In both these ways, UAV integration into professional service workflows has been shown to increase efficiency and safety among workers. However, this consistent attachment of a video camera poses the ethical challenge: when does surveillance become acceptable? What distinguishes surveillance UAV systems from private-professional tools is the collection of visual data on citizens, whether warranted or not, while the intent of the tools is often data collection on other objects, whether oil pipes, fields, or construction sites. Beyond visual data collection, UAVs can operate as collectors of data for scientific and research purposes, including weather and temperature data. For these non-visual purposes, the drone offers a higher degree of flexibility, stability, and detailed control than other systems, including weather balloons (de Boisblanc et al., 2014). Removing the requirement for a visual-based data collection system further enables the discussion of citizens' ethical concerns regarding the technology: is the drone the symbol of objection, or the camera?
Redesigning the Urban Landscape It is not only the collection of data that requires UAVs to be outfitted with sensors. Sensors also control the way in which the drone interacts with and senses the world it navigates. Currently, UAVs typically rely upon human operators within direct line of sight, often coupled with GPS systems for orientation. However, once issues of cost have been overcome (Lin, 2013; Knight, 2013), new technologies offer the potential for UAVs to become increasingly autonomous, utilising simulated sight (Kolodny, 2016) and similar live-detection systems. However, to enable this level of automated technology,
airborne or otherwise, the future design of the urban landscape needs to be carefully considered. In an interview with Geoff Manaugh (2015), robotics researcher John Rogers discusses at length the implications of spatial design for these technologies. He highlights readability issues of reflective materials throwing off 'his robots' ability to navigate', commenting 'it actually appeared that there was a virtual world beyond the mirror', creating the illusion of a labyrinth of rooms. In a typical civilian urban environment, heavily glazed towers are increasingly becoming the norm. To a UAV, therefore, this extant cityscape would appear labyrinthine, a baffling maze to blindly stumble through. As technology improves, however, simulated sight will enable nimbler navigation. Artist Liam Young explores this potential cityscape in his project City of Drones (Young and Cale, 2014), allowing participants to fly through a geometric abstraction of a city through the eyes of a drone. This fragmented vision reveals a landscape of planes and voids, conspicuously indifferent to inhabitants and security. Meanwhile, artists and futurists are drawing potential futures for our cities in the wake of redesigning our streets for inhuman eyes. Ledgard and Foster (2015) write about the notion of a 'droneport', a physical node-point for a cargo drone route. These new typologies will not be alone, however, as Superflux's (Anab, 2015) research project outlines a future co-habited city. In this city, drones will function as billboards for humans, while existing urban infrastructure will present drone-legible graphics and data. Two cities overlaid, visible, but not always interpretable by both sides. Ultimately, Manaugh (2016) envisions that 'the American highway system, as well as all of the vehicles that will be permitted to travel on it, will be remade as one of the first pieces of truly robot-legible public infrastructure'. In this future, the design of our urban environments could be altered to make way for, or to obstruct, the implementation of UAVs. Proponents, governments, and corporations could tailor façades and billboards to the new drone occupants, while protestors and opposition could mask their presence using disruptive patterns (Harvey, 2016) or sensor-baffling colour schemes (see Tesla, 2016 for adverse effects arising from blinding sensors).
Conclusions Building upon the previous chapter’s core definitions, we have outlined here a number of key cases in more detail. These cases are designed to be easy to understand by the lay-person, yet provide divisive, complex, and knotty problems to discuss. UAVs as surveillance tool can be understood in terms of civil liberties, whilst automated logistics challenge the notion of menial tasks requiring a human workforce. The points of contention discussed in this chapter highlight the broad reach of this technology and its effects on traditional boundaries of societal behaviour.
Fig.6 UAVs equipped with cameras present significant ethical challenges surrounding surveillance of urban areas.
Fig.7 The UAV technological system.
[Diagram: Civilian-sphere unmanned aerial systems as a technological system. Particular technologies (hobbyist, research, professional, delivery and surveillance UAVs) combine technological artefacts (the UAV, remote pilot, GPS, on-board automation, on-board storage, consumer-level cameras, professional DSLRs, laser scanning, atmospheric sensors, holding capability) with technological practices (recreational flying, 3D scanning, amateur and professional photography, professional aerial filming, infrastructure monitoring, wind-level and air-temperature data collection, depot and doorstep delivery, city-scale surveillance, state police surveillance, 'sous'-veillance). These sit against background conditions (pilots, legislators, the GPS network, electricity, open space) and future background conditions (UAV-legible signage, dedicated landing areas, allocated airspace, air control staff, CCTV equivalents), alongside other systems, together forming part of a wider technological order.]
Chapter 3
Democracy, Ethics and the Emergence of UAS as Technological System In the preceding chapters an evidence base has been established regarding UAV technologies as originating within a military-industrial context, a heritage which has since directed both the academic definitions, of which there are multiple, and the media narrative. It has also been established that UAV technologies act as a manifestation of multiple societal fears, not least surveillance and privacy. In this chapter we explore the reach of the UAV as technological system, as defined by Sclove (1995), and its impacts on tangential actants and stakeholders. We then go on to explore the impact of this networked system on democracy and the ethical rights of societal actors to participate within the conversation of legislation and implementation.
UAS as Technological System In line with Sclove's definition of technological systems (1995, p.12), UAVs present background impacts, artefacts and practices both unique and shared. In this section, the evidence base outlined in prior chapters is contextualised within this sociotechnical system (Latour, 1987; Callon and Hughes, 1987) to illustrate its wide-reaching effects. Sclove's illustrative diagram connects actants, both 'people and social groups, but also artefacts, devices and entities', within a science and technology network (Cotton, 2014, p.31; also, Latour, 2005). The background conditions for each particular technology provide the support on which the system operates. For UAVs, these are currently physically light, requiring only enough open space to land, and the air, which remains physically, socially and politically contentious (Graham and Hewitt, 2012; Adey, 2013). As illustrated in the previous chapter, as the technology increases in potency and seeks new uses, the background condition of our urban environments could see further physical interventions. While the extent is up for debate, our environments could see urban design accommodate
UAV simulated sight and landing capabilities. In addition to physical requirements, background conditions require various forms of control, legislation and regulation – elements that have been called lacking (Clarke, 2014c; Farber, 2014). The present scenario for the background conditions of a UAS is therefore minimal, but expected to require greater resources in the future as technological integration increases. More specific to the system, particular technologies collectively combine 'artefacts', being the physical manifestations of the system, and 'practices', being the activity translating the artefacts into use. Detailed analysis of relevant artefacts is beyond the scope of this paper; however, the impact of technological practice has been sketched out in the previous chapter. The key point, however, is the idea that artefact and practice are intrinsically linked in the embodiment of a particular technology. This idea of practice as central opens the discussion up from the capabilities of the technology toward the will and intent of society for its use. Intent, therefore, becomes a question of participant ethics and a question for strong democratic involvement. While user intent is a critical contention point, the potential use of a UAS has a more complicated network of affected parties, including potential 'non-users'. As Hill explains (2014c), these complex systems often produce unintended side effects: 'when you invent the car, you also invent the traffic jam'. Economists typically label these phenomena 'spillover effects' or 'externalities' (Sclove, 1995, p.14). In a lawn mower metaphor, a conscious non-user may argue, "since I'm suffering from the noise anyway, why not buy my own power mower and at least benefit from the convenience?" (ibid.). In this way, non-users are affected and agglomerated, and eventually society is transformed by this system. This transformed society, populated in this instance by UAVs of various intent, can no longer be avoided by citizens who choose not to directly participate. This technological-system-as-social-structure therefore necessitates that all citizens be eligible to participate in deliberative discourse regarding the suitable implementation of this technology.
Democracy and Ethics This second portion of the chapter outlines the democratic responsibility of lawmakers and planners to consider citizen viewpoints due to the impact of UAS upon citizens, both direct participants and otherwise. As Winner (1986) writes, 'technological innovations are similar to legislative acts or political foundings that establish a framework for public order that will endure over many generations'. The longevity of impact dictates that the decision not be left in the hands of those interested in the potential mainly for 'economic reasons'. These parties tend to 'push the development, whereas other actors are lagging behind', write van Eijndhoven and van Est (in Kluver et al., 2000, p.116). They continue that a 'lack of scrutiny from the part of these actors may be to the detriment of the direction in which the development takes place'. In this section we look briefly at the 'meta-ethics' of decision-making processes, and the ways in which these can include or exclude certain voices (Cotton, 2014, p.26). As opposed to disruptive technologies (Udall, 2016), which bring completely new approaches, thus making prior developments redundant, the UAS can be seen as an iterative development, building upon the prior histories and trajectories of cars, planes, and CCTV camera systems. This ability of the UAS to shift the existing situation through an intervention warrants some form of participation, write van Eijndhoven and van Est (in Kluver et al., 2000, p.117). They conclude that a participatory arrangement can be seen as an effort to 'broaden the conceptual […] basis for decision-making by bringing more actors into play'. Cotton (2014, p.168) agrees, writing that 'the elicitation of ethical judgements and selection of ethical principles should remain participant-controlled in order to satisfy the bottom-up legitimacy of the ethical evaluation argued for'. This bottom-up legitimacy presents an alternative to more traditional top-down forms of technology assessment, wherein the board is comprised of experts from related fields (Siegetsleitner, 2011). However, Rawls (1995) argued that in moral matters there are no experts, and philosophers have no more moral authority than others. Therefore, whether professional philosophers or not, the advice of experts alone is insufficient to provide balanced judgements (Reber, 2006).
Further, the insufficiency of what Clarke (2014c, p.291) calls 'natural controls' on the application of drones questions the effectiveness of 'normative ethics' in determining civic response to UAS integration. Where Phillips (1979) describes the normative as the effect of the particular structure of culture regulating the functions of social activity, Clarke (2014c, p.292) notes that characteristics of UAV technology render these structures deficient. In particular, he notes that any harm to corporate reputation has little impact, due to the surveilling party having either 'substantial institutional power', 'market power', or 'little regard for their reputation' (ibid., p.291). Soft regulatory controls, such as organisational self-regulation, are also currently lacking. Clarke (2014c, p.292) notes that, despite individuals' 'moral right' to privacy, Privacy Impact Assessments are virtually non-existent in the literature regarding drone surveillance. He concludes (ibid.) that in the 'absence of any evidence of commitments by organisations' it is difficult to regard 'self-regulation playing any role in the control of drone surveillance'. In the absence of effective organisational control, Cotton (2014, p.30) notes that issues such as whistle-blowing predominate in situations where stringent ethical codes of practice are in place for professionals such as engineers. However, these ethical codes are not sufficiently protected, leading to retaliation (Rober and Kleiner, 2005) and a patchwork availability of legal protection (Oliver, 2003). This results in many engineers being insufficiently protected by formal regulation (Peterson and Farrell, 1986). Therefore, neither soft moral regulatory controls nor organisational self-regulation is sufficient to ethically control drone usage and intent. Cotton (2014, p.35) meanwhile notes that in normative ethical justification, 'negotiation is at the very least undesirable', and that 'processes of negotiation are at best inappropriate, and at worst, counter-intuitive to the search for objective ethical truth'. As noted, however, in the context of UAVs, existing systems are ineffective in establishing a regulatory structure. Rejecting the notion of an objective ethical truth in favour of case-specific ethical structures presents the potential for negotiation between parties to become a more desirable route to a regulatory baseline for UAS. However, despite the potential for negotiation and bottom-up legitimisation to create effective new forms of regulatory structure, the ethical distinction between different
forms of value that citizen actors hold becomes a significant factor. As Cotton notes (2014, p.26), the ‘question over the ethical legitimacy of SECTs stands independently of the strength of public concerns over safety’. He notes in particular that intensifying public fears does not equate to increasing technological danger. It is critical, therefore, that in the discussion of such technologies, the assessment not be swayed by rhetoric: that decision making not be swayed by ‘moral panic without room for independent philosophical justification’ (ibid., p.27).
Conclusions This chapter condenses the expert evidence base of previous chapters into a succinct image in which technological artefact and practice are inseparable. Outlining the democratic and ethical responsibility of consultation, this chapter argues that consideration must be paid not just to the technocratic elite who develop and control these technologies. Instead, consultation should be sought from those who are deliberate users; from future potential agglomerated users created through the proliferation of the technology; and from non-users, wilful or otherwise. It is additionally argued that current natural controls and soft regulations are insufficient to create effective ethical roadblocks to improper usage of UAV technology. Given this insufficiency, normative and objective ethics, reflecting cultural moral activity, do not adequately regulate UAVs within their technological system. Instead, we argue here for a form of case-specific assessment, toward a goal of effective regulation through deliberative discussion with citizen participants.
Fig.8 Citizens protesting the ethical concerns raised by the military use of drones by the Obama administration, 2013.
Chapter 4
Designing an Introductory Stage Variant of Participatory Technology Assessment 'Whether a technology requires political scrutiny, and if so, where and how exhaustively, should correspond roughly to the degree to which it promises, fundamentally or enduringly, to affect social life', writes Sclove (1995, p.28). This report has shown that unmanned aerial vehicles constitute a new technological system, one that reimagines the use of existing logistical and infrastructural networks and challenges presupposed notions of public, private, and rights to privacy. Rothstein notes in his analysis of drones (2015, p.78) that despite the technologies of surveillance within drone systems being similar to existing systems, something about drones as a platform is considered more worthy of regulation and concern. This concern regarding UAVs as platform poses a question to legislators and governing bodies: is the technology at issue technically workable? Sclove writes (1995, p.5) that for a technology to be workable in this regard, consideration must be paid to its economic costs and benefits, and their distribution; its associated environmental, health, and safety risks; and its implications for national security. Where we have previously outlined key issues, this chapter outlines the proposed methods of consideration and the design of a participatory technology assessment to promote deliberative discussion. With the news that the government is due to hold a consultation on drone flight in summer 2016, this report argues that political scrutiny be opened to the democratic collective of laypersons and hobbyists to consider, argue and illuminate critical contentions for the contemporary urban zeitgeist. This paper therefore asks the critical question: what do lay citizens, amateur hobbyists and relevant professionals consider to be the pressing issues surrounding the effective and ethical introduction of UAV technological systems into civilian urban environments?
Participatory Technology Assessments As Hamlett, Cobb, and Guston (2008) found in their holding of the National Citizens' Technology Forum to discuss nanotechnologies and human enhancements, stakeholders saw a need for 'informed citizen input early in the process of developing such technologies' (p.1, emphasis original). The authors found that average citizens 'very much want to be involved in the decisions that shape technologies that, in turn, shape their lives' (ibid., p.10), which led to increased 'feelings of efficacy and trust' (ibid., p.2). They conclude that information and oversight in transformational technology research should 'not be restricted to environmental health and safety but should include social risks such as equity, access, and civil rights' (ibid.). To encourage informed input from citizens, the research design proposed utilises an adapted form of Participatory Technology Assessment, henceforth pTA. pTAs emerged from a narrative of formal technology assessment, shifting from the 'largely closed, institutional tool of policy analysis and advice' toward a 'tool for the social assessment of scientific-technological issues at the interface between politics and public discourse' (Joss, 2002, p.229). Moving from a singular, spatially limited point towards a continuum, including processes aimed at assessing and debating sociotechnological issues, begins to draw citizens, scientific experts, planners, and consumers into a once discrete dialogue. There is a large academic background on the implementation of pTAs, including Cotton (2014); Dietz and Stern (2008); Guston and Sarewitz (2001); Hamlett, Cobb and Guston (2008); Kluver et al. (2000); and Sclove (2010). These pTAs advance traditional technology assessments, which aim to 'speak truth to power', toward a new aim of 'finding solutions together' and 'generating dialogue', in the words of the European Participatory Technology Assessment (EUROPTA; see Kluver et al, 2000, p.7; also Joss and Bellucci, 2002). EUROPTA defines the aim of pTAs as a 'direct, interactive inclusion in the TA process of affected social actors, such as interest groups, consumers and members of the general public, alongside professional experts and policy makers' (ibid., p.9) toward the creation of a 'pluralistic, two-way communication between technical and non-technical actors' (Cotton, 2014, p.10).
Kluver et al. differentiate two forms of pTA; this report situates itself in the latter – public pTA – deemed 'more appropriate when ethical-moral issues are discussed' (ibid.). However, Cotton (2014, p.93) writes that there is no single benchmark for evaluating the effectiveness of any specific deliberative method in academic or policy literatures. The remainder of this chapter explores the research design process in the context of selecting a suitable dialogue for a specific purpose and circumstance. The structure and assessment protocol for this report follows the toolset laid out in flexible terms by Cotton (2014, p.95). Working through the process, this report establishes a participant-led dialogue wherein the researcher prompts first for definitions and then follows the flow of participant responses. In this sense, the questions remain deliberately high level, opening with personal definitions of UAVs, followed by the greatest fears about, and positives of, the technology. Responses to these questions dictate the flow of conversation, wherein the researcher builds upon raised issues through prepared case studies (Chapter 2). As discussion progresses, flexibility within the prepared evidence base enables a pseudo-deliberative forum to emerge with the researcher as mediator. The identified actants within the sphere of UAV impact are wide-ranging. Case studies present UAV technologies as affecting the military-industrial complex; road and air logistics networks; crime enforcement; architecture and urban design; and civil service sectors. In tangential relations, the technology prompts reassessments of road technologies and surveillance equipment including satellites, helicopters, and traditional planes. UAV technologies also present a reimagining of hobbyist and play objects through their integration into traditional model aircraft. With this broad impact network, UAV technologies affect lay citizens within urban environments, the police force, logistics networks, amateur hobbyists, and related professionals. Within this framework, this report comprises a number of interviews, individual and paired, with identified stakeholders from a range of backgrounds – professional, amateur and lay citizen. The sample covers a diverse selection of participants across multiple demographics, including gender, education, employment and political leaning, and ranges in age from late teens to fifties. In the interest of protecting participants' privacy, transcripts were anonymised, with participants referred to by sector and
The second professional, then, is referred to as PR-002; the remaining signifiers are HO (hobbyist) and LP (layperson). All participants had the opportunity to review transcripts of the recorded interviews and to comment on and redact information as appropriate.

Reflecting the widely distributed network of UAV impacts, participants were drawn from a number of geographical locations. While the majority of participants came from urban areas and backgrounds, as prompted by the research objective, participants were additionally selected from differing geographical locations, namely rural areas near cities and primarily rural counties. This enables the cross-analysis of views and issues of import from both within and without city limits. Due to the wide distribution of geographical locations covered in the study, group forums involving multiple actors were impractical. As a compromise, interviews took the form of educated conversations, wherein points of contention and detail from other interviews were integrated into the dialogue to explore raised topics in more depth and across discussions. This conception of educated conversation situates the research design as a sort of citizens' jury, progressing past the tendency of conventional opinion polls to gather, to quote Fishkin, 'what people think'. The integration of detail to unpack opinions between conversations moves the dialogue beyond a statement of preferences toward encouraging a reflection on core issues (Fishkin, 1995; also Fishkin et al., 2000). While this paper does not seek to arrive at objective ethical truth, the framing of the discussion in normative terms focuses the narrative analysis on arriving at unifying themes rather than reaching 'consensus about dissent' (Raiffa, 1994) or compromise and mutual learning.

Cotton's fourth stage indicates the assessment of 'socio-economic, political and technoscientific information' (2014, p.95). This data has been reviewed in prior chapters to contextualise and place UAV technologies within a historical narrative of militaristic development, stated fears of surveillance and privacy, and future private interests. Following examples set by the NCTF (Hamlett, Cobb and Guston, 2008, Appendix C), preparatory background materials sketch out case studies based upon journalistic reports and private corporation proposals. These sketches, developed in Chapter 2, present the opportunity to tease out deeper responses to topics raised by participants.
In this manner, the researcher is embedded within the research as an active participant, mediating topics and exploring cross-interview themes between individuals. Comparisons can be drawn between this method and that employed in the Netherlands in the Gideon NL pTA (Loeber, in Kluver et al., 2000, p.61), wherein the objective was to provide an overview of the 'various existing views on (future) [technologies]'. As educated, informal conversation, the protocol was deliberately loose (Jacob and Furgerson, 2012, p.5). Early, easy-to-answer questions (ibid., p.4) regarding background, relationship to UAV technology, and personal understanding of the term 'drone' shaped the progression of later topics, which were keyed to the individual participant's responses. Thus, the perspectives of the participants served as the starting point for information gathering and analysis (see also the Gideon NL case study by Loeber, in Kluver et al., 2000, p.61). In this manner, the researcher, listening for generative themes, encourages participants to construct stories whilst constructing change (Gubrium, 2009; Freire, 2000) through a sequential decision process (Cotton, 2014, p.96), from 'general discussion, identification of issues, affected actors and artefacts, [to] the drawing out of implicit ethical issues, reflection on relevant principles and personal reflections in the form of moral judgements'. The final stage of the decision process, a weighting and decision procedure, is omitted from the interview process as out of the scope of this report, due to the segregated nature of the discussions. It is, however, the objective of this report to provoke further reflection in subsequent deliberative workshops.
Research Design

Through the methodological framework of teasing out conversational narratives in the protocol, this report argues from a constructivist, relativist worldview. It argues that narratives regarding UAVs, and technological systems in general, are social phenomena and that 'their meanings are continually being accomplished by social actors' (Bryman, 2001, pp.16-18). In an anti-foundationalist manner (Marsh and Smith, 2001), it is expected that the social bonds presently entwined around UAVs, as observable phenomena, cannot be observed directly.
Therefore, it is posited, these bonds exist only in dependent relationships between UAVs, the observer, and the social structures in place around the technology. Their emergence through social interaction and constant revision has been revealed in earlier chapters through discussion of media narratives, creation myths and the impact of the new aesthetic (Bridle, 2011; Jones, 2011; Ellis, 2011) on social bonds.

Once collated and transcribed, the collected data was coded through thematic analysis (Lichtman, 2013, p.246) to produce a 'restoried' (Creswell, 2014, p.13), chronological and coherent argument. This generic coding methodology, as defined by Creswell (2009, p.184; Cope, 2003), enables the reporting of the lateral appearance of themes between participants. Agreeing with Creswell's (2014, p.8) observation that 'individuals develop subjective meanings of their experiences', the cross-interrogation of subjective stances offers a rich evidence base for further consideration of drone introduction.
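To make this lateral reporting of themes concrete, the following minimal sketch (in Python) tallies the participants across which a coded theme appears; the participant codes and theme labels shown are purely illustrative assumptions, not the coding frame actually used in this study.

```python
from collections import defaultdict

# Hypothetical coded transcripts: participant code -> set of theme labels
# assigned during thematic analysis (labels are illustrative only).
coded_transcripts = {
    "PR-001": {"surveillance", "sensors", "regulation"},
    "LP-003": {"safety", "deliveries", "regulation"},
    "LP-009": {"surveillance", "deliveries"},
}

# Tally the lateral appearance of each theme across participants.
theme_appearances = defaultdict(list)
for participant, themes in coded_transcripts.items():
    for theme in themes:
        theme_appearances[theme].append(participant)

for theme, participants in sorted(theme_appearances.items()):
    print(f"{theme}: {len(participants)} participant(s) ({', '.join(sorted(participants))})")
```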
Conclusions

This chapter has outlined a variation on pTA that constitutes the foundation of the research design of this paper. Utilising Cotton's 'toolkit', a fourteen-stage process, as a base reference, the design follows the early stages of a typical pTA, with the objective of identifying and clarifying key issues. In the discussions, the researcher is placed as mediator between specialist knowledge (in the form of prepared case studies, together with threads from other conversations) and the participant in question. The following chapters present the findings of the interviews carried out as described in this chapter. This analysis is presented in two parts: first, the report discusses the raised points of contention, explores the range of views expressed and links these to the established expert literatures; second, it critiques in greater depth the meta-analysis of participants' reactions to the structure of the pTA generally.
Fig.9 Citizens cower under the perceived threat of mass surveillance by technology in this satirical piece.
Fig.10 Confusion over the use of drones in civilian areas leads to destructive ends in this satirical piece.
Fig.11 Full responses to the post-conversation questionnaire: each respondent's scores across seventeen questions, ranked from -4 (strongly disagree) to +4 (strongly agree), together with totals and mean averages per question. The survey can be found in Appendix B; the mean averages are cited throughout the analysis in Chapters 5 and 6.
Chapter 5
Analysis of Participant Conversations and Emerging Themes

Taking the pTA variant as a foundation for a researcher-led discussion into contemporary points of contention surrounding UAVs, 14 participants were interviewed from a range of backgrounds, and are identified in this report by a code for their sector and a number. The three sectors represented are hobbyist and amateur interest (HO), covering non-professional users; relevant professionals (PR), covering professionals and academics in directly related fields; and layperson (LP), covering all other parties. For a full list of participant numbers and outline interests, refer to Appendix A. Relating back to the contemporary points of contention identified in Chapter 2, participants were challenged with case studies designed to draw out opinions through storytelling techniques. This chapter explores their responses to these issues, and delves deeper into a number of threads developed between participants to develop participatory solutions to these issues. Following the discussion, participants were invited to complete a short questionnaire (Appendix B) after consideration of the cases. Responses ranked from -4 (strongly disagree) to +4 (strongly agree), and are presented here as mean averages of the responses received (Fig.11).
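As a minimal illustration of how these mean averages are derived, the sketch below computes per-question means from responses on the -4 to +4 scale; the response values shown are placeholders rather than the actual data recorded in Fig.11.

```python
# Post-conversation questionnaire responses, scored from -4 (strongly
# disagree) to +4 (strongly agree). Values are placeholders only.
responses = {
    "LP-001": [0, -1, -2],   # scores for questions 1, 2, 3, ...
    "LP-002": [-2, 0, 2],
    "LP-003": [0, 2, -2],
}

num_questions = len(next(iter(responses.values())))
for q in range(num_questions):
    scores = [answers[q] for answers in responses.values()]
    mean = sum(scores) / len(scores)
    print(f"Question {q + 1}: mean average {mean:+.2f}")
```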
Defining a Drone

Participants were first asked to summarise their relevant background and interests, and to briefly define what they understood a drone to be. While avoiding the technical specifics of Clarke's (2014a, p.36) four elements, the majority of participants broadly met the generally understood industry guidelines, such as PR-005, who would 'consider a drone to be remotely piloted aircraft, either controlled or autonomous'. Two interesting developments emerged from this discussion: intent, and the distinction from model aircraft.
While the UAVS (n.d.) distinguishes UAVs from remotely controlled 'model' aircraft by their ability to operate out of sight, for participants this line was less clear. PR-001, for instance, compared the model aircraft he had as a child to a drone his son owns: 'I look at those and think they are almost identical […] to the things that I had as a kid'. Others agreed that drones and model aircraft are completely the same thing, 'it is the person that uses it in a different way' (LP-001), distinguishing model aircraft as 'pleasure utensils' from UAVs as 'purposeful utensils' (LP-004). This distinction can be partially seen in guidance set out by the EASA (2015), where the use of a UAV as a 'tool' requires a SORA. A similar discourse emerged with a small number of participants placing the distinction not on the UAV's physical attributes, but on its attachments. PR-001's 'biggest break […] is the sensors, that capability to capture data in a very high resolution and higher quality'. PR-005 concurred, 'the attachment [gives] it some intent, gives the drone the definition'. However, among lay participants this was disputed, indicating a lack of effective public discourse. LP-001 didn't 'feel as though attachments or feedback contribute to the classification' whilst LP-009 believed that was the only distinction, 'that it watches and records stuff'.
Sensors

Not all attachments were considered equal by all participants, however. PR-001 in particular talked at length about 'the range of inexpensive and lightweight sensors that you could equip these things with', and the idea of the UAV as 'a kind of extension to the internet of things so you have this tool for data capture that is not fixed in one place'. LP-006 succinctly frames the lay participant view that 'it's the visual that causes the problems isn't it?' Overall, participant reactions to non-visual data collection were positive, with an average response of 2.42, reflecting PR-001's comment that 'I don't think on the whole there is in society any negative association with sensors'. However, the visual capturing potential of UAVs coloured discussion of non-visual opportunities: as LP-004 comments, 'you don't feel the surveillance equipment on a hot air balloon', yet 'people might not know that there isn't a camera on it' (LP-007).
This difficulty in separating UAVs from their recording components is unsurprising, reflecting existing media narratives (Rothstein, 2015; Jacob, 2015), and it resurfaced in later discussion of surveillance and personal security.
UAV Deliveries

Overall, participants felt that the potential for UAV use in deliveries was mixed, with an average response of 0.75. A majority of participants felt that the implementation of schemes such as Prime Air was 'half-baked' and not 'fluent on the details' (LP-003). LP-003 felt that even if the scheme were implemented, when asked whether he would use a drone to get an item a few hours quicker, the 'simple answer is no'. Talking in more detail about landing provision for these services, LP-003 noted that 'it would worry [him] that if they got the wrong address and you land in the back garden with a bunch of kids', someone could get injured. PR-005 lent a more cautious eye too, sketching out that 'you would have to geofence an area in your back garden so that if anything breached the geofence the drone would instantly rise up and not deliver the package'. Participant reaction more generally was in favour of a hub system, particularly from PR-005, who confirmed 'they are talking more about the hub system, […] like a courier's premises, so the final delivery realistically would still be a man in a van'. LP-011 warned that 'if drones are going to be used as transport, […] they need to be a certain height up [and] they would need to come down in certain spots, kind of like a post office'. This hub-and-spoke system would circumvent the weight-based priority system developed by Zhang et al. (2014) in favour of multi-delivery at a single point. However, participants did express concerns regarding the over-proliferation of drones, particularly by larger companies, within a free-flying airspace scenario. Where LP-009 looks up and 'it's just like a sea of flies', HO-001 comments that 'the sky is going to be black. It's like every house on this street is full of six students, and each one of them […] is going to want a pizza. That's a lot of people and a lot of drones'.
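PR-005's geofencing suggestion can be sketched in minimal terms as follows; the coordinates, detection mechanism and five-metre radius are hypothetical assumptions for illustration, not a description of any existing delivery system.

```python
import math

def distance_m(a, b):
    """Approximate distance in metres between two (lat, lon) points,
    using an equirectangular approximation adequate at garden scale."""
    lat1, lon1 = a
    lat2, lon2 = b
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6_371_000

def landing_decision(landing_point, detected_objects, geofence_radius_m=5.0):
    """Rise and abort the delivery if anything has breached the geofenced
    landing area; otherwise continue the descent."""
    for obj in detected_objects:
        if distance_m(obj, landing_point) <= geofence_radius_m:
            return "breach detected: rise and do not deliver"
    return "area clear: continue descent and deliver"

# Hypothetical example: a person detected roughly 4 m from the landing point.
print(landing_decision((53.38110, -1.47020), [(53.38113, -1.47024)]))
```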
Despite concerns over wide usage, narrower markets were more warmly received. Two that emerged were 'the advantage of delivery items to remote places' (LP-003) and the use of UAVs in emergencies. LP-009 speaks of 'where [her] nan lives now, that is remote if it snows. They had four foot of snow [and if] the supermarket said we do a drone delivery service […] at least she would be fed'. This scenario is not dissimilar to that explored by Ledgard and Foster (2015) in Africa, delivering essentials to those in need and with insufficient, or entirely lacking, infrastructure. Another comparison to Ledgard and Foster is the potential for the delivery of medical supplies. Where motorcycle couriers are currently the norm, HO-001 can 'imagine drones taking hearts and livers […] going at high speeds and saving lives'. Eight participants directly mentioned emergency usage, all being generally in favour of the scheme. PR-005 mentions that 'it's easy for a hospital to have a mini helipad […] to deliver medicines, blood samples, that sort of thing', while LP-010 and LP-011 were happy for hospitals to have 'extenuating circumstances' within their idea of flight paths, with LP-011 concluding that it's an 'amazing use for it'. In this distinction, participants' comments differ from the CAA's current lack of distinction between civil and state aircraft, and suggest policy exemptions in certain emergency situations.
ALS and Autonomy

Despite the majority of participants defining a UAV system as remotely controlled, there was significant discussion regarding the autonomous potential of this technology. A number of participants felt they would be 'instinctively happier if it was a person flying it' (LP-006; LP-003; LP-008; LP-010), while LP-011 pointed out that 'if you compare drones to other technology we have like planes and cars […] it demystifies it a lot more'. This 'inhuman other' (Rothstein, 2015; Clarke, 2014b) was a sticking point for multiple participants, with LP-003 supposing 'it's the unknown. We fear the unknown; we fear change'. LP-003 spoke at length regarding autonomous vehicles in ALS due to 'previous employment'.
His worries included the failings of reference mapping services that are 'only as accurate as the information that is put into it.' Regarding simulated sight developments (Kolodny, 2016), LP-003 remained unconvinced, speaking of the Tesla automated car crash (Tesla Team, 2016; Manaugh, 2016a): 'we'll get there eventually, but we're not there yet'. Despite an increase in literature regarding ALS services and employment (Bellamy and Pravica, 2011; Williams, in Messner, 2014; Osborne and Frey, 2013), participants' responses to these topics were largely muted. LP-005 and LP-004 briefly discussed the impacts of taking 'away the employment of that little man', but felt that 'the roads would be a lot safer than having lunatic delivery drivers racing up and down'. HO-001 reflected more on the impact on wider society, that it's 'going to lead to the universal salary […] eventually'. Ultimately, participants' concerns about autonomy were based in unknown fears, largely distinct from the 'ethical legitimacy' of SECT deliberation raised in Chapter 3.
Personal Agency

A vision by HO-001 of finding the self in an automated and increasingly efficient world was built upon by participants into a narrative of choice and agency surrounding UAV technologies. He speaks of how 'we're in a new place, and I think that the public are more aware of it but frankly, what can they do? I don't think that people think they have much power, and it's going to get worse'. The reclamation of, or choice to reclaim, agency was raised by others, predominantly surrounding 'opting in' to surveillance. As PR-001 notes, 'I make an unconscious decision every day to accept. I'm giving up a lot of my privacy, my private data, in return for convenience, in return for services'. This idea of exchange was repeated by LP-002, who found the 'idea that a drone could track where you are […] a bit scary, but […] your phone tracks where you are at the moment'. His issue was 'the fact that no one knew' it was happening. PR-001 found that if 'you make the option [to track] the default […] it's not a good form of democracy'.
Without this agency, HO-001 foresees a future where greater surveillance could hold people more accountable, where 'even demonstrating about [a company] could be seen as losing a company money'. LP-002 agrees that the choice to opt out is potentially even illusory, and that 'if these technologies are the reserve of a certain class of people and they're more visible, […] that could make the distinction between demographics bigger'. He describes that 'if there was a more obvious difference, a drone area that was always surveillance drones and a drone area that was always delivery drones, then it would portray a bigger distinction between the two areas of a city'. As the agency of 'opting in' relies heavily on the ethics of education, significant questions are raised about the legitimacy of UAVs to carry out surveillance and the borders of acceptability.
Surveillance

With such a focus on agency and accountability, surveillance was unsurprisingly a major topic of discussion. Two case studies were prepared for this topic, guiding participants to talk firstly about police use in specific cases with pre-identified criminals. Here reactions were wary, but overall in support, with an average response of 1.67. The second asked participants for their views on mass, anonymised surveillance. Participants were overall against this proposal, with an average response of -1.08. In the discussion, HO-001 was incredulous, thinking that 'it's extraordinary […], "Orwell will never believe that we're the ones who want the cameras" [sic]'. In answer, LP-004 believes the use of drones in crime stopping is a good idea, where 'anything like that, I would say, "go use it"'. Particular comparison was made between UAVs and police helicopters and CCTV systems. LP-006 notes that 'if it's a resource […] that could save money […] I think it is a shame that it is not being used'. LP-010 agreed, that 'they'd been identified as a criminal, and they're using some resource to track him'. The idea that the UAV was acceptable once the crime had been committed was popular, in that 'it wasn't like they were using the drone to find the crime' (LP-011).
Support was also raised by LP-010, who noted a potential use for police and ambulance services following 'a shooting in the area' where 'we now need to identify quickly, in an emergency setting, is it safe for our people to go in?' LP-010 and LP-011 both agreed with the statement that 'it's like an overwatch form of bodycam'. Some participants spoke of the common phrase, 'if you have nothing to hide, you have nothing to fear' (LP-002; LP-006). For some, however, the threat to their privacy was insurmountable, with LP-002, LP-003, and LP-006 against the idea of UAVs near their properties. A few participants, meanwhile, expanded on the basic idea, with a level of agreement being reached that some surveillance was acceptable in 'public areas', provided notice was given. Comparison was drawn to the proliferation of CCTV surveillance, where 'they will have little placards' announcing a surveyed area (LP-003). PR-001 spoke of a mass-surveillance scheme capturing CCTV-like 'high resolution aerial photographs in urban areas […] being sold to police forces'. While he believes 'the opportunities for abuse are massive', neither LP-006 nor LP-007 found any issue. PR-001 argued that while 'it's very easy to anonymise data by eliminating direct identifiers of an individual, it's a really trivial type of anonymity'. LP-006 notes that if this trivial anonymity were broken, it could 'identify people and their whereabouts if they are in refuge'. This raises an ethical concern that the use of UAV technology could unintentionally reveal information that is private for personal or legal reasons, and that this should be considered in any surveillance policy. Despite worries about accountability for UAV policing and surveillance, Mann's (2009; Mann et al., 2003) concept of sousveillance was largely uninteresting to participants. Only LP-001 thought that a 'dual pluralistic interest' of 'organisations who were also observing other organisations through the drones' would be suitable. Whether this is true or, in Clarke's view (2014b, p.260), 'naïve' to assume it would provide a 'sufficient degree of counterbalance' remains to be seen in practice.
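PR-001's observation that removing direct identifiers produces only a 'really trivial type of anonymity' can be illustrated with a minimal sketch; the records, fields and outside knowledge below are invented for illustration, and show how retained quasi-identifiers can still single an individual out.

```python
# Invented, illustrative records from an aerial capture with the direct
# identifier (the individual's name) already removed -- the 'trivial'
# anonymisation PR-001 describes. All fields and values are hypothetical.
anonymised_records = [
    {"home_area": "S7",  "seen_at": "refuge", "time": "08:10"},
    {"home_area": "S10", "seen_at": "school gates", "time": "08:10"},
    {"home_area": "S7",  "seen_at": "allotments", "time": "08:15"},
]

def reidentify(records, home_area, seen_at):
    """Combine retained quasi-identifiers with outside knowledge
    (e.g. knowing one person from a given area attends the refuge)."""
    return [r for r in records
            if r["home_area"] == home_area and r["seen_at"] == seen_at]

# A single matching record singles the individual out despite the missing name.
print(reidentify(anonymised_records, "S7", "refuge"))
```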
Professional Usage

In other-than-surveillance uses, such as those outlined in Chapter 2, participants were generally more willing to welcome UAV technology.
An average post-discussion score of 2.58 was in favour of the use of UAV technology in mapping and GIS systems, while the response to the use of drones in aerial filming and photography was surprisingly more muted, at 1.83. As LP-005 comments, 'I think it is a good thing. It's cheaper, it's safer. If you want to abseil down a building in these places, the health and safety implications are huge'. With first-hand experience of UAV use in the film and infrastructure monitoring sectors, PR-005 concurs, sketching out two examples: 'a drone mounted camera could be more effective and quicker than a man driving up a closed runway with a pickup truck looking out of the window, which is how it's done now'. A longer sketch, drawing parallels to Zhang, Walters and Kovacs's (2014) study, expands: 'with agriculture […] looking at things like targeted weed killing, […] makes an awful lot of sense to send a drone up automatically and have sensors to identify that broadleaf weed, hover, descend, and single dose an application of weedkiller […]. You suddenly realised that a 5kg tank could do a very big field'. These ecological gains were generally welcomed, but LP-011 believes that the current approval process is the minimum case, that 'they're not one-off things that you just decide, you have to organise it and have a professional licence and have the land you're flying on […] be informed'.
Safety and Security

Beyond the prepared studies, safety became a recurring concern, despite the literature reflecting that most incidents have resulted in 'little to no harm to the public' (Clarke, 2014c, p.264; BBC, 2011 in the UK; La Franchi, 2006). LP-003's fear that a 'malfunction, or a lack of communication between the machine and its control' could bring down a drone near his family echoes Marks's report (2012) of a death caused by a GPS data-feed loss. He continues that there are 'blackspots for phone signals, despite technology', and that if 'they still can't get round that, then a drone is going to be the same'.
In addition to man-made interference, participants noted the impact of natural forces on UAVs, and of UAVs on the natural world. LP-003 noted that 'weather, wind, rain, sunshine' will have an effect, while also noting the 'unintended consequences' of the 'buzzing things' on wildlife. LP-007 and LP-006 agreed, noting the effects on horses, farms, and sheep, while LP-011 spoke of personal experience with signage telling users to avoid coves with cliffs so as not to disturb endangered birds. Participants felt that reports of issues around airports and similar areas lent validation to ideas of 'no-fly zones', through the use of what PR-005 called 'DJAI software', which 'automatically prohibits the air product from flying over areas they designate as off limits'. LP-006 in particular noted that a 'headteacher' wouldn't be happy if there was a drone over the school, 'because she would want to know who was in charge of it […], what was going to happen to the information.' PR-001 agreed that schools and neighbourhoods were welcome exclusions, noting that no-fly zones 'would feel more democratic if it was not just government buildings'. Other areas were felt better able to accommodate UAV flight, however, with LP-010 and LP-011 agreeing that 'under licence', 'recreation zones' and 'hobbyist fly zones' that anybody could use would be acceptable, with the public aware that 'this is a zone, so [they] know what to expect here'. PR-001 warns, though, that no-fly zones would be unable to completely counteract this technology, and that 'to really avoid radio frequencies you would have to go out into the desert and start a community of likeminded people'. The narrative of intentional attack was also present, picked up from television; as LP-002 notes, 'I would think of them as being quite scary things', 'as being a military type'. Most responses on this topic picked up the contemporary terrorist fear (LP-006; LP-003; LP-004), where 'the day will eventually arrive when you've got the modern equivalent of a flying bomb', worried LP-003. As Luke (2004, p.113) writes, 'the operational architectures of modern urbanism by their own necessities […] deploy […] what ironically are tremendous assets for destruction as part […] of mobilising materiel for economic production'. In other words, the presence of new and existing technology in civilian centres has long offered destructive potential, not only UAVs (Lewis, 2016). As LP-011 noted, you'd 'have to know who was operating it. […] It's the same as a car, the machine itself is not dangerous, it depends who is in charge of it'.
In this sense, the perceived threat of attack on personal safety should not be seen as material grounds for an ethical claim against the implementation of this technology. However, valid qualms were raised over the obligation of policy to protect wildlife and ecological areas.
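The 'no-fly zone' software behaviour described above can be sketched minimally as a pre-flight route check; the zones, coordinates and radii are hypothetical assumptions, not values from any manufacturer's actual geofencing product.

```python
import math

# Hypothetical exclusion zones: (name, centre latitude, centre longitude, radius in metres).
NO_FLY_ZONES = [
    ("Example Primary School", 53.3800, -1.4700, 400),
    ("Example Airfield", 53.3940, -1.3880, 5000),
]

def distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation, adequate at these distances."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6_371_000

def preflight_check(route):
    """Refuse a planned route if any waypoint falls inside a designated zone."""
    for lat, lon in route:
        for name, zlat, zlon, radius in NO_FLY_ZONES:
            if distance_m(lat, lon, zlat, zlon) <= radius:
                return f"flight refused: waypoint inside no-fly zone ({name})"
    return "flight permitted"

print(preflight_check([(53.3812, -1.4705), (53.3860, -1.4600)]))
```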
Regulation

This increasing concern was reflected in participants' responses to regulation. An average of 2.83 indicated a strong desire for increased regulation of flight provision for UAVs, while the response to providing more flight opportunities was mixed (average -0.08). Despite the context of a 'post-regulation age' where it's 'political will to deregulate' (HO-001), PR-001 is concerned about 'the way that technology is way ahead of the laws […], there's this gap at the moment.' The majority of discussion within the field of regulation was concerned with the closure of this gap between the capabilities of the technology and the law, with concerns about monopolisation and exploitation by both companies and governments. LP-009 comments that 'each supermarket [should only be] allowed X number of drones per square footage', while PR-001 was most concerned about organisations like the NSA who, he felt, 'jump in and massively take advantage of not just the absence of regulation, but the absence of enforcement'. Ensuring enforcement was upheld became the focus of many participants' discussions, focusing on ensuring UAVs were identifiable and owners held accountable for their actions. The difficulty with markings, explained PR-001, is that despite different regulations regarding military versus civilian aircraft, 'the problem [with drones] is, they're not really visible'. One solution raised was the display of CCTV-surveillance-like flyers on posts in areas with UAV flyovers, or, as LP-002 described, 'you [could] get a thing on your phone saying "you're being droned"', 'because you told it to for a parcel'. An alternative form of identification was developed, in line with current aircraft control, concisely explained by PR-005.
'SSR is secondary surveillance radar […], effectively an identifying code, instead of an aircraft just being a blip on a controller's radar, it's an identified blip, it has a four number code which identifies that aircraft'. Multiple lay participants agreed with this system, LP-004 and LP-009 in particular commenting that a server with IP-address-equivalent logging of drones could link particular codes to owners' licences. These codes would be picked up by local towers, LP-004 explains, so the CAA and the local air ground 'know what's coming in and out'. As such, 'you could check that one [UAV] came over some villages at 1 o'clock'. PR-005's view on the linkage between 'unique identifier' and 'pilot licence' was that the 'default setting would be that it wouldn't work', requiring a potential pilot to enter their details, log the place of purchase, and 'read basic rules of the air' before confirming through 'positive feedback to hopefully show that you have read and understood'. This expansion of air traffic control (or an equivalent) challenges Clarke and Moses's (2014, p.272) notion that the system would be inefficient due to the physical separation of pilot and aircraft. However, where Moses and Clarke are concerned with physical safety, participants' suggestions tap into non-spatial linkages of pilot licence and drone to increase accountability around personal privacy. By keeping a log and linking it with licences, accountability could be more effectively controlled. LP-001 was particularly concerned about this technology as 'there's just the anonymity with who's using it […], it's already there in police helicopters. You can't touch those people […] and they're not as obvious as just security cameras would be'. LP-011, however, 'would rather the police were using it than a private company', with the difficulty being to guarantee 'what it's being used for'. With personal experience, PR-005 found that being aware of who was flying the drone, and for what, alleviated most concerns on the ground. Speaking of one encounter, 'his attitude at first was very much you have no right to film my property'. However, 'once we gave him all the information he calmed down and went away. It's about informing people'. He believes that as drones become more prevalent, the surveillance issue will 'not just disappear, but it will recede' and a 'natural level [will] organically happen'. Communication and education in this way were considered important by most participants, with a number feeling these were currently lacking. LP-002 in particular claimed 'it's not very well articulated as to what a drone is allowed to do', or 'where you can fly' (LP-011).
Existing barriers to entry are also hindering efforts: within educated communities, PR-005 found that engaged participants were often unwilling to welcome newcomers, in that 'some people think there is some kind of rite of passage.' In ethical terms, LP-001 felt that audits of the use of UAVs need to be carried out and made public, but also that education be offered such that 'the polis [can] be encouraged to take more of an interest in these things that govern their lives'. These barriers contribute to the finding that participant knowledge of current regulations was mixed, with only PR-005 having first-hand experience. He thinks that the CAA will find the 'fine line between regulation and innovation' and that 'a lot of regulation will spring from the technology'. Where PR-005 hopes the regulations will not be 'too harsh', LP-002 thinks the public should have reassurance 'that it has been properly regulated and thought through'. In this manner, LP-004 thinks drones 'should be licensed as in a gun licence', such that 'there's a record of who owns what, what type of drone' (LP-006), organised into 'different categories' by 'size' and 'quality, or level of detail of surveillance' (LP-009). Despite the lack of clarity in defining these terminologies, by raising categories of use, participants showed general agreement with the current suggestions in place from EASA (2015) and other bodies.
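The identification scheme participants converge on here (an SSR-like code tied to a licensed pilot, with flight logs that can later be consulted) might be structured along the following minimal lines; all identifiers, field names and the query interface are hypothetical assumptions rather than any existing CAA system.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Registration:
    uav_code: str        # SSR-like unique identifier broadcast by the UAV
    pilot_licence: str   # licence to which the code is tied on activation

@dataclass
class Sighting:
    uav_code: str
    location: str
    timestamp: datetime

@dataclass
class UAVRegistry:
    registrations: dict = field(default_factory=dict)   # uav_code -> Registration
    log: list = field(default_factory=list)             # consultable flight log

    def register(self, uav_code: str, pilot_licence: str) -> None:
        # Per PR-005's sketch, the UAV 'wouldn't work' until tied to a licence.
        self.registrations[uav_code] = Registration(uav_code, pilot_licence)

    def record_sighting(self, uav_code: str, location: str, when: datetime) -> None:
        self.log.append(Sighting(uav_code, location, when))

    def query(self, location: str, hour: int):
        """Answer 'which UAV came over some villages at 1 o'clock, and whose is it?'"""
        return [
            (s.uav_code, self.registrations.get(s.uav_code))
            for s in self.log
            if s.location == location and s.timestamp.hour == hour
        ]

# Hypothetical usage
registry = UAVRegistry()
registry.register("UAV-7421", "PILOT-GB-0031")
registry.record_sighting("UAV-7421", "Example Village", datetime(2016, 8, 1, 13, 5))
print(registry.query("Example Village", 13))
```

A structure of this kind would also support the accountability and publicly consultable logging that participants return to in the concluding policy suggestions.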
Conclusions

Ultimately, while participants' contentions sometimes strayed into weakly grounded public concerns, a number of questions regarding the ethical legitimacy of the UAV technological system were raised. LP-002's statement that 'transparency, education and regulation are my three buzzwords' captures most participants' worries. With generally lively discussion surrounding the case studies, surveillance and delivery services raised many concerns. Tangents of civil liberty, agency, ethics, privacy and necessity were commonly discussed, while more theoretical propositions, such as job replacement by ALS and urban reconfiguration due to sensor requirements, often failed to engage participants for a meaningful length of time. The ethical rights of natural habitats were discussed, notably the designation of endangered habitats as out-of-bounds for UAV systems.
Specific ethical concerns raised included the right to be surveyed, and the agency of choice to opt in to, or out of, any surveillance, by UAV or otherwise. As a result of these legitimate contentions, participants developed a potential policy framework for accountability and identification, linking a uniquely identified UAV to a licensed pilot and resulting in publicly consultable logs should a specific UAV need to be queried.
Chapter 6
Meta-Analysis of Participatory Technology Assessment

Analysis of the themes emerging from the discussions found that, overall, participants engaged more effectively with case studies related to personal experiences; comparatively, theoretical discussion was found to be less engaging. In this chapter, analysis of the pTA variant utilised discusses the contribution of the pTA structure and the relative successes and failings of individual versus group discussions.
Value of Participatory Deliberation

Following the discussion sessions, participants were invited to complete a post-discussion questionnaire (see Appendix B) reflecting on the cases discussed and their overall feelings toward UAV implementation. Of the 14 participants, 12 returned questionnaires, and their responses were used to calculate the mean average responses utilised throughout this and the previous chapter. Participants, when asked to clarify how certain they were regarding the potential benefits of UAV technologies, were initially uncertain, with an average response of -0.67. Following discussion, participants became more positive regarding the benefits, averaging 1.42. However, they were less certain of experts' confidence, which remained roughly neutral at 0.67. This contrasts with Hamlett, Cobb and Guston's (2008, p.2) finding that, following deliberation, participants held reduced certainty regarding the benefits of technologies. It is posited that this is due in part to reduced uncertainty in the capabilities of UAVs revealed in discussion surrounding UAV regulation.
Fig.12 While WCEC achieved a unique perspective of Park Hill with drone photography, did the variant pTA perform as successfully?

As a result of lowered confidence in experts, when asked whether they were more concerned or hopeful about the integration of the technology, responses were mixed. Participants were somewhat more concerned regarding integration, at 1.25, while a muted response of 0.75 indicated participants did see some future hope. Overall, with an average response of -1.08, participants who completed the questionnaire felt that the potential risks exceeded the potential benefits.
However, when asked whether they felt that citizens should be encouraged to participate in UAV regulation, participants were overwhelmingly positive, with an average response of 3.08 and no response falling below 2 (agree). This directly correlates with Hamlett, Cobb and Guston's (2008, p.10) conclusion that 'average citizens very much want to be involved in the decisions that shape technologies that, in turn, shape their lives'. However, it does not reflect their finding (ibid., p.2) that participants' feelings of efficacy and trust increased following their role in consultation. While participants in this research showed increased feelings of efficacy from being involved in participation, trust remained low. A number of factors could influence this outcome, the first being participants' trust in government prior to consultation, particularly as the two studies occurred in different countries. Secondly, while the NCTF (Hamlett, Cobb and Guston, 2008) fed into a more formal, national-scale study run between universities, this study was more obviously distinct from government interaction; participants perhaps felt their voices were not yet being heard.
Individual Depth and Group Deliberation

Discussion formats were divided roughly evenly between individual sessions (HO-001; PR-001; PR-005; LP-001; LP-002; LP-003) and paired group discussions (LP-004 and LP-005; LP-006 and LP-007; LP-008 and LP-009; and LP-010 and LP-011). As outlined in Chapter 4, the researcher acted as mediator to enable a pseudo-deliberative forum through the utilisation of emergent lateral themes. However, the research uncovered varying degrees of success in terms of the deliberative potential of the researcher-as-mediator pTA variant undertaken in this report. The research found that individual interviews, particularly those with HO-001, PR-001 and PR-005, tended toward offering deeper, more specific views on case studies. These particular interviewees, holding pre-existing relevant interests, sought to discuss topics along individual agendas, with only the researcher to offer a counterweight. In group discussion, however, back-and-forth dialogue between participants often broadened the topics discussed, guiding these discussions toward reflection on core issues.
Lay participants additionally engaged in back-and-forth dialogue to educate one another on topics in which they were not expert, for example when debating an agreed definition of a UAV. In a collaborative move, LP-005 informed LP-004, who 'suspects they don't have cameras', that 'they can'. This form of shared learning promoted loose, conversation-like interviews where opinions could be raised and discussed, as opposed to only offering 'correct' responses (Raiffa, 1994). In addition to shared learning, discussion pairings provided space to deliberate and reach conclusions, the results of which can be seen throughout the previous chapter where drone definition and identification techniques were discussed. As an illustrative example, LP-009 and LP-008 held lengthy deliberative passages, including the following exchange:

LP-009: 'Where it doesn't actually have any benefit and it doesn't help the wider people, it's purely just because we can, we do.'
LP-008: 'Yeah, I agree with that.'
LP-009: 'I think it's like saying everyone should have personal drones that's linked to their phone that flies above them for personal security, is that really necessary?'
[…]
LP-008: 'What I'm thinking about is my brother, because he rides a motorbike, he wears one of them headcams. It's effectively the same thing, if that's the route you want to take for personal security effectively that's what he's doing.'
LP-009: 'I'm not saying that's a good thing.'
LP-008: 'But for me, he's been knocked off his bike three times.'
LP-009: 'But that camera, would you say it would be better if he had a drone than the camera?'
LP-008: 'I think it would depend on the size of the drone.'
Within this exchange, LP-009 initially does not see the benefit of the UAV as a personal surveillance tool, but following anecdotal evidence from LP-008, a deliberation is undertaken and ultimately the pair arrive at a mutual conclusion. Throughout these discussions, anecdotal storytelling was utilised alongside verbal scenario sketches to communicate responses and deliberate the presented case studies (Hubble, 2006; Holert, 2011). The majority of these proved effective, especially in paired discussions. As Hemmings (2011) explores, storying can be a critical technique to effect change, especially amongst lay citizens. However, in individual cases, specific relevant knowledge sometimes led to tangential anecdotes that weakened the overall focus of the discussion. HO-001 talked at length regarding his 'off-grid' project, his 'slight intolerance to milk' and its relation to 'industrial corn products', feeding back to the drone as a manifestation of the 'military-industrial context'. While interesting, particularly in the development of the sensor and personal agency dialogues, these tangents in themselves lacked deliberative conclusions. Following the research design set out in Chapter 4, relevant threads were continued into later conversations to develop and explore related opinions from others. However, given the successes of paired deliberation shown above, it is felt that more would have been gained by deliberating these tangents live with multiple participants, as opposed to through a mediating researcher. Therefore, in cases without the geographical and temporal limitations that necessitated a variant pTA in this research, it is recommended to follow methods of 'finding solutions together' and reaching 'consensus by dissent' (Raiffa, 1994) in group forums, as set out in the pTA literature by Kluver et al. (2000) and others.
Conclusions

Reflection upon the variant pTA utilised in this research revealed a number of key findings. Firstly, participants continue to value inclusion within the participation process. Secondly, group forums more effectively drove discussion and conclusion-focused deliberation than individual interviews. In comparison, individual discussions resulted in deeper, anecdotal storytelling that necessitated a mediator to draw conclusions between conversations.
Lastly, participants opined that inclusion within this process increased certainty of, and knowledge regarding, UAVs. As LP-010 effectively concluded, 'there's a lot, when we've talked about it today […] my eyes have been opened. Even from just discussion.'
Chapter 7
Conclusions and Further Steps
Through this report, we have argued the case for participant integration into consultation processes surrounding the implementation of UAVs in civilian urban environments. A review of the literature revealed a complex historical narrative that is shaping how UAVs are seen in the public eye as civilian usage expands. It shows a theoretical threat to jobs through increasing automation, while currently uncertain regulatory processes present an unclear framework to guide implementation. Framed as a technological system, these technologies are shown to be potentially unavoidable in urban life. Therefore, this report argues, there is an ethical and democratic responsibility to include citizen viewpoints in deliberation through case-specific structures. As such, a variant of pTA was deemed appropriate, utilising pre-prepared case studies, informed by an expert evidence base developed through the literature, through which potential usages and opportunities could be analysed. Citizen lay participants organised and offered case-specific ethical judgements, working toward a number of regulatory suggestions, which are reflected upon and summarised in this chapter. Conclusions drawn from meta-analysis of these discussions showed that group conversations produced greater deliberation and more effective conclusions than individual discussions, despite the researcher-as-mediator coordination designed to mitigate these failings.

Conclusions drawn from participant discussions
Fig.13 Superflux's Drone Aviary for the V&A design festival 2014 images a future where drones and people live together, albeit in controlled, zoo-like environments.

LP-002 summarises the findings from the participant interviews most effectively, stating 'I think transparency, education and regulation are my three buzzwords'. Stemming from case study discussion, the majority of participants concluded that further regulation, or a maintenance of the current tight regulation, was necessary to control drone use.
Through a narrative developed across a number of discussions, participants argued the case for the digital marking of UAV systems with identification numbers tied to particular licensed pilots. Whenever a UAV was then spotted in an area, a local air tower could be consulted and logs requested to identify the owner and use of the local drone. In such a manner, every UAV could be identifiable, and held to account. A related dialogue made clear a concern about accountability and ownership of a drone, though the concern itself was pluralistic: terrorist threat; surveillance by police, or others; and monopolising corporations all featured as key reasons to be aware of the skies. While some of these concerns were largely groundless, some legitimate ethical concerns were raised, particularly the infringement of civil liberties through surveillance. These legitimate concerns considered a balance of potentials before arriving at a deliberated suggestion, as opposed to an outright rejection of an avenue of technology use. The requirement to tie each ID to a licensed pilot emerged from a strong case for greater education surrounding UAV systems. The deficit was plural: from non-participants observing a drone near their property, to interested parties struggling to get into the techno-ecosystem, clarity of information was lacking as to what a UAV was legally allowed to do, and where it could fly. Policy suggestions made during the discussions, including no-fly zones, designated hobbyist areas, emergency usage, and delivery corridors, would be ineffective without a strong educational policy drive, given the non-visually demarcated nature of the sky. As LP-001 explained, 'there definitely needs to be audits on the use of these technologies in areas and what the products are. And that needs to be published.' But the bigger issue is, 'how can the polis be encouraged to take more of an interest in these things that govern their lives?' With regard to new usage, there was strong positive feedback for the use of UAVs in emergency circumstances, including the delivery of tissue and organ transplants. In emergency situations, UAVs were proposed to survey an area to ensure safety before services moved in. This stance extended to surveillance use: participants were accepting of police use of UAVs in a manner akin to helicopters, in tracking and apprehending criminals for whom there was a warrant. Reaction was significantly less positive with regard to mass surveillance, particularly constant anonymised policing of neighbourhoods and private areas.
Finally, reaction to the potential for UAV delivery was muted, with a number of participants believing doorstep delivery was unnecessary. Greater consensus was reached over the proposal to allocate 'UAV corridors' between larger depots to prevent 'swarming skies'. Interestingly, this proposal closely echoes that sketched out by logistics companies such as DPD (n.d.).
Policy Suggestions

In summary, this report makes the following policy suggestions relating to the implementation of UAVs into civilian urban environments:

• UAV networks must be sufficiently developed to provide an identifiable log for each drone near habitation.
• Pilots need to complete a test to determine flight worthiness and register the UAV to their licence and ID prior to commencing flight.
• Public bodies need to develop and provide clear, easily accessible and understood documentation explaining uses, categories, and legal flight areas for both pilots and lay citizens.
• Public bodies should explore the potential of no-fly zones and geofencing technologies to control UAV flight.
• Public bodies should explore the potential for UAV usage in emergency tissue delivery.
• Policing services should encourage the use of UAVs in the tracking and apprehension of criminals with a warrant.
• UAVs should not be utilised to survey or monitor private dwellings or areas, for any purpose, without the express written permission of residents at that location and neighbouring locations.
• UAV flight should be limited to allocated flight corridors or designated areas. Further research is needed to determine such appropriate corridors.
• Lay citizens should be consulted alongside technical experts in forms of participatory technology assessment in the ongoing deliberation over UAV integration.
Further avenues for investigation

The issues and contentions derived from these interviews are not intended to be generalisable, and are specific to the context of the integration of the UAV technological system. As Farber (1999) writes, these judgements and principles are not immutable, and these tools should be considered mechanisms for evaluating the complex problem. It is the stated objective of this report to identify actants, socio-economic, political and techno-scientific information, and ethical issues surrounding the introduction of UAVs into civilian urban environments. As outlined in the research design, the final weighting and decision process of Cotton's (2014) toolkit was out of the scope of this report. This report sought to present an evidence base sketching the groundwork for further participant-led dialogues, particularly forums and workshops wherein participants' ethical values, constructed through dialogue, can be explored in a richer, highly interactive and iterative process (ibid., p.95) toward a greater weighting of policy proposals.
Closing Thoughts

In closing, UAVs represent one part of an ongoing increase in technological systems impacting our lives. They are complex, multi-faceted systems that touch on many ethical and personal issues. This report sets out the case for including the citizenry within deliberations on this subject matter and outlines a number of policy suggestions raised through the participatory discussion process. As PR-001 explained, 'the technology is way ahead of the law […], there's this gap at the moment, which is a concern'. This report agrees with his statement, and encourages increased movement in the regulation and controls in place in order to push effectively for the democratic implementation of this technology. Without prior knowledge of the government's plans regarding UAV consultation during Summer 2016, it is hoped that the conclusions drawn from this report will be considered and implemented in the ethical consultation and discussion due to take place.
Fig.14 Superflux's conceptual future for UAV road signage. Perhaps regulators should take heed and 'pull over' UAV advancement before a consulted regulatory framework is agreed upon?
References
Bibliography
Books

Abel, C., 2003. Sky High: Vertical Architecture. Vicenza: Graphicom.
Augé, M., 2008. Non-Places: An Introduction to Supermodernity. 2nd ed. New York: Verso.
Bentham, J., 1791. Panopticon; or, the inspection house. London.
Bryman, A., 2001. Social Research Methods. Oxford: Oxford University Press.
Callon, M., Hughes, T.P., 1987. Society in the Making: The Study of Technology as a Tool for Sociological Analysis. In: Bijker, W.E., Hughes, T.P., Pinch, T.J., eds., 1987. The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. Cambridge: MIT Press, pp.83-103.
Cope, M., 2003. Coding transcripts and diaries. In: Clifford, N., and Valentine, G., eds., 2010. Key Methods in Geography. 2nd ed. Thousand Oaks: Sage, pp.445-459.
Creswell, J.W., 2009. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. 3rd ed. Thousand Oaks, CA: Sage.
Creswell, J.W., 2014. Research Design. Thousand Oaks: Sage.
Dietz, T., and Stern, P.C., eds., 2008. Public Participation in Environmental Assessment and Decision Making. Washington, D.C.: The National Academies Press.
Farber, D., 1999. Eco-Pragmatism: Making Sensible Environmental Decisions in an Uncertain World. London: University of Chicago Press.
Fishkin, J., 1995. The Voice of the People. New Haven: Yale University Press.
Freire, P., 2000. Pedagogy of the Oppressed. Ramos, M.B., trans. 30th anniversary ed. New York: Continuum.
Gandy, O.H., 1993. The Panoptic Sort: Critical Studies in Communication and in the Cultural Industries. Boulder, CO: Westview.
Graham, S., ed., 2010. Disrupted Cities: When Infrastructure Fails. London: Routledge.
Grix, J., 2002. Introducing Students to the Generic Terminology of Social Research. Politics 22(3), pp.175-186.
Hemmings, C., 2011. Why Stories Matter: The Political Grammar of Feminist Theory. Durham, N.C.: Duke University Press.
Hill, D., 2012. Dark Matter and Trojan Horses. Moscow: Strelka.
Hill, D., 2014a. The Commodification of Everything. In: Space Caviar, ed., 2014. SQM: The Quantified Home. Zurich: Lars Muller. [draft online] Available at: <http://www.cityofsound.com/blog/2015/10/essay-the-commodification-of-everything-for-sqmby-space-caviar-lars-muller.html> [Accessed 6 October 2015].
Holert, T., 2011. Distributed Agency, Design's Potentiality. London: Bedford Press.
Hubble, N., 2006. Mass-Observation and Everyday Life. Basingstoke: Palgrave Macmillan.
Joss, S., and Bellucci, S., 2002. Participatory Technology Assessment: European Perspectives. Gateshead: Athenaeum Press.
Lange, A., 2012. The Dot-Com City: Silicon Valley Urbanism. Moscow: Strelka.
Latour, B., 1987. Science in Action: How to Follow Scientists and Engineers Through Society. London: Harvard University Press.
Latour, B., 2005. Reassembling the Social: An Introduction to Actor-Network Theory. Oxford: Oxford University Press.
Lichtman, M., 2013. Qualitative Research in Education. 3rd ed. London: Sage.
Luke, T., 2004. Everyday Technics. In: Graham, S., ed., 2004. Cities, War and Terrorism. Oxford: Blackwell.
Messner, W., ed., 2014. Autonomous Technology: Applications that Matter. Warrendale, PA: Society of Automotive Engineers.
Moravec, H., 2000. Robot: Mere Machine to Transcendent Mind. Oxford: Oxford University Press.
Peterson, J.C., Farrell, D., 1986. Whistleblowing: Ethical and Legal Issues in Expressing Dissent. Dubuque, Iowa: Kendall/Hunt.
Phillips, D.L., 1979. Equality, Justice and Rectification: An Exploration in Normative Sociology. London: Academic Press.
Raiffa, H., 1994. The Art and Science of Negotiation. Cambridge: Cambridge University Press.
Rothstein, A., 2015. Drone. New York: Bloomsbury.
Sclove, R., 1995. Democracy and Technology. New York: The Guilford Press.
Siegetsleitner, A., 2011. Ethics in Trouble: A Philosopher's Role in Moral Practice and the Expert Model of National Bioethics Commissions. In: Garner, B., Pavlenko, S., Shaheen, S., Wolanski, A., eds., 2011. Cultural and Ethical Turns: Interdisciplinary Reflections on Culture, Politics and Ethics. Oxford: Inter-Disciplinary Press, pp.41-50.
Townsend, A.M., 2014. Smart Cities. New York: Norton.
Wagner, W., 1982. Lightning Bugs and Other Reconnaissance Drones. Washington, DC: Armed Forces Journal.
Williams, K.W., 2007. An Assessment of Pilot Control Interfaces for Unmanned Aircraft. Washington, DC: FAA, Office of Aerospace Medicine.
Winner, L., 1986. The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago: University of Chicago Press.
Yenne, B., 2004. Attack of the Drones: A History of Unmanned Aerial Combat. Aberdeen, MD: Army Research Laboratory.
Journal Articles

Adey, P., 2010. Vertical Security in the Megacity: legibility, mobility and aerial politics. Theory, Culture and Society 27(6), pp.51-67.
Adey, P., 2013. Air/Atmospheres of the Megacity. Theory, Culture and Society 30(7/8), pp.291-308.
Azmi, S.M., Ahmad, B., and Ahmad, A., 2014. Accuracy assessment of topographic mapping using UAV image integrated with satellite images. IOP Conf. Series: Earth and Environmental Science 18(2014).
Bellamy, D., and Pravica, L., 2011. Assessing the impact of driverless haul trucks in Australian surface mining. Resources Policy 36(2), pp.149-158.
Bishop, R., and Phillips, J., 2002. Sighted Weapons and Modernist Opacity: Aesthetics, Poetics, Prosthetics. Boundary 2 29(2), pp.157-179.
Calo, M.R., 2011. The Drone as Privacy Catalyst. Stanford Law Review [online]. Available at: <http://www.stanfordlawreview.org/online/drone-privacy-catalyst> [Accessed 28 August 2015].
Clarke, R., 2014a. Understanding the Drone Epidemic. Computer Law & Security Review 30, pp.230-246.
Clarke, R., 2014b. What drones inherit from their ancestors. Computer Law & Security Review 30, pp.247-262.
Clarke, R., 2014c. The regulation of civilian drones' impacts on behavioural privacy. Computer Law & Security Review 30, pp.286-305.
Clarke, R., and Moses, L.B., 2014. The regulation of civilian drones' impacts on public safety. Computer Law & Security Review 30, pp.263-285.
68
public safety. Computer Law & Security Review 30, pp.263-285. Clarke, R., and Wigan, M.R., 2011. You are where you’ve been: the privacy implications of location and tracking technology. Journal of Location Based Services 5(3-4) pp.13855 Cwener, S. B., 2006. Vertical Flight and Urban Mobilities: the Promise and Reality of Helicopter Travel. Mobilities 1(2), pp.191-215. Davis, P., and West, K., 2009. What Do Public Values Mean for Public Action? Putting Public Values in Their Plural Place. The American Review of Public Administration. 39(6) pp.602-618. Durant, J., 1999. Public Understanding: participatory technology assessment and the democratic model of the public understanding of science. Science and Public Policy 26(5) pp.313-319. Farber, H.B., 2014. Eyes in the Sky: Constitutional and regulatory approaches to domestic drone deployment. Syracuse Law Review 64(1) [Online] Available at: <http://papers.ssrn.com/sol3/paper.cfm?abstract_id=2350421> [Accessed 17 August 2016]. Fishkin, J.S. Luskin, R.C., and Jowell, R., 2000. Deliberative polling and public consultation National Centre for Social Research, London 53(4) Gerhsenfeld, N., Krikorian, R., and Cohen, D., 2004. The Internet of Things. Scientific American. October, p.76. Giannoulopoulos, D., 2010. CCTV in the UK the omnipresent camera: a sign of a move towards a ‘surveillance society’. Archives de Politique Criminelle 32, pp.245-267. Graham, S., and Hewitt, L., 2012. Getting off the ground: on the politics of urban verticality. Progress in Human Geography 31(1), pp.72-92. Gubrium, A.C., 2009. Digital Storytelling: An Emergent Method for Health Promotion Research and Practice. Health Promotion Practice 10(2), pp.186-191.
Guston, D.H., and Sarewitz, D., 2002. Real-time technology assessment. Technology in Society 24, pp.93-109.
Hennen, L., 1999. Uncertainty and Modernity: Participatory Technology Assessment: a response to technical modernity. Science and Public Policy 26(5), pp.303-312.
Hill, D., 2014b. Urban Parasites, Data-Driven Urbanism, and the Case for Architecture. Architecture + Urbanism, 2014:11. [draft online] Available at: <http://www.cityofsound.com/blog/2015/10/urban-parasites-data-driven-urbanism-andthe-case-for-architecture.html> [Accessed 7 October 2015].
Hill, D., 2015. A Sketchbook for the City to Come: The Pop-Up as R&D. Architectural Design Special Issue: Pavilions, Pop-Ups and Parasols: The Impact of Real and Virtual Meeting on Physical Space, 85(3). [draft online] Available at: <http://www.cityofsound.com/blog/2015/10/sketchbook-for-the-city-to-come-ad-popups-parasols.html> [Accessed 6 October 2015].
Honkavaara, E., Saari, H., Kaivosoja, J., Polonen, I., Hakala, T., Litkey, P., Makynen, J., and Pesonen, L., 2013. Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sensing 5, pp.5006-5039.
Irizarry, J., Gheisari, M., and Walker, B.N., 2012. Usability assessment of drone technology as safety inspection tools. Journal of Information Technology in Construction 17, pp.194-212.
Joss, S., 2002. Toward the public sphere – Reflections on the development of participatory technology assessment. Bulletin of Science, Technology and Society 22(3), pp.220-231.
Marsh, D., and Smith, M. J., 2001. There is more than one way to do Political Science: On different ways to study Policy Networks. Political Studies 49, pp.528-541.
Michael, K., and Clarke, R., 2013. Location and tracking of mobile devices: uberveillance stalks the streets. Computer Law & Security Review 29(3), pp.216-228.
Michael, M.G., and Michael, K., 2009. Uberveillance: microchipping people and the assault on privacy. Quadrant LIII(3), pp.85-89.
Murray, R., 2007. Driverless Cars. IET Computing and Control Engineering 18(3), pp.14-17.
Oliver, D., 2003. Whistle-Blowing Engineer. Journal of Professional Issues in Engineering Education and Practice 129(4), pp.246-256.
Osborne, M.A., and Frey, C.B., 2013. The Future of Employment: How Susceptible are Jobs to Computerisation? Oxford Martin Programme on Technology and Employment. Available at: <http://www.oxfordmartin.ox.ac.uk/downloads/academic/future-of-employment.pdf>
Paneque-Galvez, J., McCall, M.K., Napoletano, B.M., Wich, S.A., and Pin Koh, L., 2014. Small drones for community-based forest monitoring: An assessment of their feasibility and potential in tropical areas. Forests 2014(5), pp.1481-1507.
Rawls, J., 1995. Reply to Habermas. The Journal of Philosophy 92(3), pp.131-180.
Reber, B., 2006. The Ethics of Participatory Technology Assessment. Technikfolgenabschätzung – Theorie und Praxis 2(15), pp.73-81.
Rober, E., and Kleiner, B.H., 2005. To Blow or Not to Blow? That is the Question. Management Research News 28(11/12), pp.80-87.
Sclove, R., 2010. Reinventing technology assessment. Issues in Science and Technology 27(1), pp.34-38.
Shukla, A., and Karki, H., 2016. Application of robotics in onshore oil and gas industry – a review, part I. Robotics and Autonomous Systems, January 2016, 75, pp.490-507.
Tamminga, A., Hugenholtz, C., Eaton, B., and Lapointe, M., 2014. Hyperspatial remote sensing of channel reach morphology and hydraulic fish habitat using an unmanned aerial vehicle (UAV): A first assessment in the context of river research and management. River Research and Applications 2014(31), pp.379-391.
Wallace, L., Lucieer, A., Watson, C., and Turner, D., 2012. Development of a UAV-LiDAR system with application to forest inventory. Remote Sensing 2012(4), pp.1519-1543.
Weiser, M., 1991. The Computer for the 21st Century. Scientific American, September, p.94.
Zhang, C., Walters, D., and Kovacs, J.M., 2014. Applications of Low Altitude Remote Sensing in Agriculture upon Farmer's Requests: A Case Study in Northeastern Ontario, Canada. PLoS One 9(11): e112894. doi: 10.1371/journal.pone.0112894.
Newspaper Articles
BBC, 2011. Police Drone crashes into River Mersey. BBC News [online] 31 October. Available at: <http://www.bbc.co.uk/news/uk-england-merseyside-15520279> [Accessed 26 August 2016].
Carter, T., 2012. Drones over America: Infrastructure of US Police State. Global Research [online] 21 June. Available at: <http://www.globalresearch.ca/drones-over-americainfrastructure-of-us-police-state/31535> [Accessed 28 August 2015].
Chow, E. K., 2012. Hello Drones, Goodbye Privacy. Huffington Post [online] 7 May. Available at: <http://www.huffingtonpost.com/eugene-k-chow/domestic-drone-surveillance_b_1324546.html> [Accessed 28 August 2015].
Curtis, S., 2015. Drone laws in the UK - what are the rules? The Telegraph [online] 18 April. Available at: <http://www.telegraph.co.uk/technology/news/11541504/Where-is-the-legal-line-in-flying-drones.html> [Accessed 28 August 2015].
Eaton, J., n.d. Timeline of Edward Snowden's revelations. Al Jazeera [online] Available at: <http://america.aljazeera.com/articles/multimedia/timeline-edward-snowdenrevelations.html> [Accessed 27 July 2016].
Huffington Post, 2013. Daniel Biss Drone Bill: State Senator wants to regulate drones used for crime fighting in Illinois. Huffington Post [online] 25 February. Available at: <http://www.huffingtonpost.com/2013/02/25/daniel-biss-drone-bill-illinois_n_2760471.html> [Accessed 28 August 2015].
La Franchi, P., 2006. Kinshasa UAV accident highlights need for standards development. Flight Global [online] 9 October. Available at: <http://www.flightglobal.com/news/articles/kinshasa-uav-accident-highlights-needs-for-standards-209768> [Accessed 26 August 2016].
Lewis, S., 2016. What we know about the attack in Nice. Time [online] 14 July. Available at: <http://www.time.com/4407407/nice-france-truck-attack-what-we-know> [Accessed 26 August 2016].
Lin, P., 2013. The ethics of autonomous cars. The Atlantic [online] 8 October. Available at: <http://www.theatlantic.com/technology/archive/2013/10/the-ethics-of-autonomous-cars/280360> [Accessed 5 August 2016].
Macaskill, E., and Dance, G., 2013. NSA Files Decoded. The Guardian [online] 1 November. Available at: <http://www.theguardian.com/world/interactive/2013/nov/01/snowden-nsa-files-surveillance-revelations-decoded#section/1> [Accessed 27 July 2016].
Marks, P., 2012. GPS loss kicked off fatal drone crash. New Scientist [online] 18 May. Available at: <http://www.newscientist.com/blogs/onepercent/2012/05/gps-loss-kicked-off-fatal-dron.html> [Accessed 26 August 2016].
Russon, M-A., 2015. Drone no-fly zones in the UK explained – Where in Britain can you pilot a UAV? International Business Times [online] 15 April. Available at: <http://www.ibtimes.co.uk/drone-no-fly-zones-uk-explained-where-britain-can-you-pilotuav-1496386> [Accessed 5 July 2016].
Stone, A., 2012. Drone privacy bill would put safeguards on surveillance. Huffington Post [online] 1 August. Available at: <http://www.huffingtonpost.com/2012/08/01/droneprivacy-bill_n_1728109.html> [Accessed 28 August 2015].
Titcomb, J., 2016. Amazon to step up UK tests of delivery drones. The Telegraph [online] 26 July. Available at: <http://www.telegraph.co.uk/technology/2016/07/26/amazonto-step-up-uk-tests-of-delivery-drones/> [Accessed 5 August 2016].
Online Articles & Websites
Anab, 2015. The Drone Aviary. Superflux Blog, Research Blog, [blog] 9 April 2015. Available at: <http://www.superflux.in/blog/the-drone-aviary> [Accessed 28 August 2015].
Bliss, L., 2014. 'Ubiquitous as Pigeons': Imagining Life in the City of Drones. [blog] Available at: <http://www.citylab.com/tech/2014/08/ubiquitous-as-pigeons-imagining-life-inthe-city-of-drones/375568/> [Accessed 28 August 2015].
Bridle, J., 2011. The New Aesthetic. Short Term Memory Loss. [blog] Available at: <http://www.shorttermmemoryloss.com/portfolio/project/the-new-aesthetic> [Accessed 02 September 2016].
Carr, N., 2014. Your Inner Drone: The politics of the Automated Future. Long Reads. Journalism and Literature Blog, [blog] October. Available at: <http://blog.longreads.com/2014/09/30/your-inner-drone-the-politics-of-the-automatedfuture/> [Accessed 28 August 2015].
DPD, (n.d.). The DPDgroup Drone: Parcel Delivery 2.0 [online] Available through: <http://www.dpd.com/home/insights/delivery_drones> [Accessed 30 August 2016].
DroneFlight, (n.d.). What's the legal position with drones / UAV / RPAs? [online] Available at: <http://www.droneflight.co.uk/whats-the-legal-position-with-drones-uav-rpas/> [Accessed 28 August 2015].
DroneLife News, 2014. Drone Technology emerging as New Infrastructure Solution. [online] Available at: <http://dronelife.com/2014/07/09/drone-technology-emerging-newinfrastructure-solution/> [Accessed 28 August 2015].
Ellis, W., 2011. The New Aesthetic. [blog] Available at: <http://www.warrenellis.com/?p=12811> [Accessed 02 September 2016].
Finoki, B., 2007. The City in the Crosshairs - A Conversation with Stephen Graham. Subtopia, Urbanism Blog, [blog] 6 August. Available at: <http://subtopia.blogspot.co.uk/2007/08/city-in-crosshairs-conversation-with.html> [Accessed 28 August 2015].
Gettinger, D., 2015. Drone Geography: Mapping a System of Intelligence. [online] Available at: <http://dronecenter.bard.edu/drone-geography/> [Accessed 28 August 2015].
Hahn, J., 2014. Poll: 43% of Americans oppose the commercial use of drones. Digital Trends. Technology Blog, [blog] 20 December. Available at: <http://www.digitaltrends.com/cool-tech/poll-43-americans-oppose-use-drones-especially-deliver-smallpackages/> [Accessed 28 August 2015].
Harvey, A., 2016. CV Dazzle. [website] Available at: <https://cvdazzle.com/> [Accessed 15 August 2016].
Heisler, Y., 2015. Amazon Prime Air: Patent sheds light on drone delivery service. BGR. [blog] Available at: <http://bgr.com/2015/05/08/amazon-prime-air-patent-dronedelivery/> [Accessed 28 August 2015].
Hill, D., 2014c. Clockwork City, Responsive City, Predictive City and Adjacent Incumbents. City of Sound. [blog] Available at: <http://www.cityofsound.com/blog/2014/11/essay-clockwork-city-responsive-city-predictive-city.html> [Accessed 27 July 2016].
Iris Automation, n.d. Iris Automation – Industrial Drone Collision Avoidance. [online] Available at: <http://www.irisautomation.ca/> [Accessed 4 August 2016].
Jacob, S., 2015. Machines of Loving Grace. Uncube n.36, [blog] Available at: <www.uncubemagazine.com/articles/15799833> [Accessed 28 August 2015].
Knight, W., 2013. Driverless cars are further away than you think. MIT Technology Review [blog] Available at: <http://www.technologyreview.com/featuredstory/520431/driverless-cars-are-further-away-than-you-think> [Accessed 5 August 2016].
Jones, M., 2011. Sensor-Vernacular. BERG. [blog] Available at: <http://www.berglondon.com/blog/2011/05/13/sensor-vernacular> [Accessed 02 September 2016].
Kolodny, L., 2016. Iris Automation is bringing eyes, and situational awareness, to drones. Techcrunch [blog] Available at: <https://techcrunch.com/2016/06/27/iris-automation-is-bringing-eyes-and-situational-awareness-to-drones/> [Accessed 4 August 2016].
Ledgard, J. M. & Foster, N., 2015. Our Robot Sky. [blog] Available at: <https://medium.com/backchannel/our-robot-sky-4f9281b17233> [Accessed 8 October 2015].
Malone, N., 2013. Amazon's Drones and American Infrastructure. [blog] Available at: <http://www.newrepublic.com/article/115786/amazons-drones-and-americaninfrastructure> [Accessed 28 August 2015].
Manaugh, G., 2015. New Urbanist: Home is where the robots live. New Scientist [blog] Available at: <https://www.newscientist.com/article/dn28059-new-urbanist-homeis-where-the-robots-live/> [Accessed 6 July 2016].
Manaugh, G., 2016a. Robot War and the Future of Perceptual Deception. BLDGBLOG [blog] Available at: <http://www.bldgblog.com/2016/07/robot-war-and-the-futureof-perceptual-deception/> [Accessed 6 July 2016].
Manaugh, G., 2016b. The World as a Hieroglyph of Spatial Relationships Yet to be Interpreted. BLDGBLOG [blog] Available at: <http://www.bldgblog.com/2016/06/situationalawareness-in-the-sky/> [Accessed 4 August 2016].
Sharpe, D., 2010. Surveillance Drone grounded days after 'success'. [blog] Available at: <https://www.bigbrotherwatch.org.uk/2010/02/surveillance-drone-grounded-daysafter-success/> [Accessed 5 July 2016].
Sipus, M., 2014. Zoning and Urban Land Use Planning for Drones. Humanitarian Space. Urbanism Blog, [blog] 18 August. Available at: <http://www.thehumanitarianspace.com/2014/08/zoning-and-urban-land-use-planning-for.html> [Accessed 28 August 2015].
Stanley, J., 2013. Meet Jack. Or what the government could do with all that location data. American Civil Liberties Union. 5 December [blog] Available at: <https://www.aclu.org/feature/meet-jack?redirect=meet-jack-or-what-government-could-do-alllocation-data> [Accessed 4 August 2016].
Tesla Team, 2016. A Tragic Loss. [blog] Available at: <https://www.teslamotors.com/blog/tragic-loss> [Accessed 6 July 2016].
UAVS, n.d. UAV or UAS? Unmanned Aerial Vehicles Systems Association. [online] Available at: <http://www.uavs.org/index.php?page=what_is> [Accessed 27 July 2016].
Udall, J., 2016. Energy, Disruptive Technologies. [blog] 31 May. Available at: <http://storiesfutureworks.wordpress.com/2016/05/31/energy-disruptive-technologies/> [Accessed 17 August 2016].
Villasenor, J., 2012. 'What is a drone anyway?' Scientific American [blog] 12 April. Available at: <http://blogs.scientificamerican.com/guest-blog/what-is-a-drone-anyway/> [Accessed 27 July 2016].
Vinge, V., 1993. The coming technological singularity: how to survive in the post-human era. Whole Earth Review [online] Winter. Available at: <http://www-rohan.sdsu.edu/faculty/vinge/misc/singularity.html> [Accessed 5 August 2016].
Young, L. and Cale, J., 2014. Loop 60Hz: City of Drones. [blog] Available at: <http://cityofdrones.io> [Accessed 28 August 2015].
Patents
Kimchi, G., et al., Amazon Technologies, Inc., 2015. Unmanned Aerial Vehicle Delivery System. U.S. Pat. 20,150,120,094 (Application, ready for examination). [http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.html&r=1&f=G&l=50&s1=%2220150120094%22.PGNR.&OS=DN/20150120094&RS=DN/20150120094]
Press Releases & Corporate Statements
Michigan Technological University, 2014. Michigan Tech researches feasibility of drone use in transportation. [press release] 13 January 2014. Available at: <http://www.mtu.edu/news/stories/2014/january/michigan-tech-researches-feasibility-drone-usetransportation.html> [Accessed 28 August 2015].
Misener, P., 2014. Re: Amazon petition for exemption. [online, letter] 9 July. Available at: <http://g-ecx.images-amazon.com/images/G/01/rowland/AmazonPetitionforExemption_July92014.pdf> [Accessed 28 August 2015].
Reports, Conference Papers, and Policy
Clarke, R., 2009. The covert implementation of mass vehicle surveillance in Australia. In: Proceedings of the 4th workshop on the social implications of national security: covert policing. ANU: April 2009. [Preprint online] Available at: <http://www.rogerclarke.com/DV/ANPR-Surv.html>
De Boisblanc, I., Dodbele, N., Kussmann, L., Mukherji, R., Chesnut, D., Phelps, S., Lewin, G.C., and de Wekker, S., 2014. Designing a hexacopter for the collection of atmospheric air flow. In: Systems and Information Engineering Design Symposium (SIEDS). Charlottesville, VA, 25 April 2014, pp.147-152.
Hamlett, P., Cobb, M.D., and Guston, D.H., 2008. National Citizens' Technology Forum: Nanotechnologies and Human Enhancement. CNS-ASU Report #R08-0003. Arizona State University: The Centre for Nanotechnology in Society.
Ito, T., Taniguchi, M., and Ichikawa, T., 2011. Regeneration of 3D profile line using a combination of photo images and target markers. In: Improving Complex Systems Today. Proceedings of the 18th ISPE International Conference on Concurrent Engineering, 2011. Part 4, pp.293-300. DOI: 10.1007/978-0-85729-799-0_34.
Kluver, L., Nentwich, M., Peissl, W., Torgersen, H., van Eijndhoven, J., van Est, R., Joss, S., Bellucci, S., and Butschi, D., 2000. European Participatory Technology Assessment: Participatory Methods in Technology Assessment and Technology Decision-Making. Copenhagen: The Danish Board of Technology.
Michael, M.G., and Michael, K., 2007. Uberveillance: 24/7 x 365 people tracking and monitoring. In: EDPS (European Data Protection Supervisor), 29th International Conference of Data Protection and Privacy Commissioners. Montreal, Canada, 26-28 September 2007.
Zhang, H., Wei, S., Yei, W., Blasch, E., Chen, G., Shen, D., and Pham, K., 2014. Scheduling Methods for Unmanned Aerial Vehicle Based Delivery Systems. In: DASC (Digital Avionics Systems Conference), 33rd DASC. Colorado Springs, CO, 5-9 October 2014.
Policy Documents
Civil Aviation Authority, 2015. Air Navigation: The Order and Regulations (CAP 393) [online]. London: TSO (The Stationery Office). Available at: <https://www.caa.co.uk/cap393> [Accessed 28 August 2015].
Civil Aviation Authority, 2015. Unmanned Aircraft System Operations in UK Airspace – Guidance. 6th Edition (CAP 722) [online]. London: TSO (The Stationery Office). Available at: <http://publicapps.caa.co.uk/modalapplication.aspx?appid=11&mode=detail&id=415> [Accessed 27 July 2016].
European Aviation Safety Agency, 2016. Technical Opinion: Introduction of a regulatory framework for the operation of unmanned aircraft. TE.RPRO.00036-003.
Federal Aviation Authority, 2013. Unmanned Aerial Systems (UAS) Operational Approval. N 8900.227. Federal Aviation.
Federal Aviation Authority, 2015. Unmanned Aircraft Operations in the National Airspace System (NAS). N JO 7210.889. Federal Aviation.
ICAO, 2011. Unmanned Aircraft Systems (UAS). ICAO Circular 328. International Civil Aviation Organization. Available at: <http://www.icao.int/Meetings/UAS/Documents/Circular%20328_en.pdf> [Accessed 5 August 2016].
Theses
Davis, J., 1998. An Ambient Computing System. MSc. B.S.C.S. University of Kansas, Lawrence, Kansas. Available at: <http://fiasco.ittc.ku.edu/research/thesis/documents/jesse_davis_thesis.pdf> [Accessed 4 August 2016].
Poems Brautigan, R., 1967. All Watched Over by Machines of Loving Grace [online]. Available at: <http://www.brautigan.net/machines.html> [Accessed 27 June 2016].
References
List of Figures
Fig. 1. [Drone filming] n.d. [Image online]. Available at: <http://4.bp.blogspot.com/-Psgsms3GKDk/VHb6ZY4CTNI/AAAAAAAAF6Q/ZvufLfgiBrM/s1600/Drone_foto.jpg> [Accessed 2 September 2016].
Fig. 2. [Drone Not Drones Poster] n.d. [Image online]. Available at: <https://f4.bcbits.com/img/a1645048826_10.jpg> [Accessed 2 September 2016].
Fig. 3. Higgins, A., 2015. Drone Strike. [Image online]. Available at: <http://www.alexanderhiggins.com/wp-content/uploads/2015/06/246.jpg> [Accessed 3 September 2016].
Fig. 4. [Amazon Prime Air Promotion Picture 1] n.d. [Image online]. Available at: <http://www.amazon.com/primeair> [Accessed 3 September 2016].
Fig. 5. [Amazon Prime Air Promotion Picture 2] n.d. [Image online]. Available at: <http://www.amazon.com/primeair> [Accessed 3 September 2016].
Fig. 6. [Octocopter UAV with camera] n.d. [Image online]. Available at: <https://cdn2.img.sputniknews.com/images/102054/91/1020549158.jpg> [Accessed 3 September 2016].
Fig. 7. Farr, A., 2016. The UAV Technological System. [diagram].
Fig. 8. Sweet, D., 2013. Protestors march against President Obama's drone wars on the day of his second inauguration on January 21, 2013. [Photograph]. Available at: <http://www.commondreams.org/sites/default/files/styles/cd_large/public/views-article/drones_ethics.jpg> [Accessed 3 September 2016].
Fig. 9. Horsey, D., 2015. America's Near Future. [Political comic]. Available at: <http://www.weeklystorybook.com/.a/6a0105369e6edf970b019b024d36a3970d-800wi> [Accessed 3 September 2016].
Fig. 10. Keefe, M., 2013. Drone Surveillance in America. [Political comic]. Available at: <http://dronewarsuk.files.wordpress.com/2014/10/m_keefe.jpg> [Accessed 3 September 2016].
Fig. 11. Farr, A., 2016. Post-Questionnaire Survey Responses. [table].
Fig. 12. WCEC Group, 2016. Unique Perspective of Park Hill. [Photograph]. Available at: <https://pbs.twimg.com/media/CowrPwRWgAAQMW0.jpg> [Accessed 3 September 2016].
Fig. 13. Superflux, 2014. Drone Aviary. [Image online]. Available at: <http://cdn4.digitalartsonline.co.uk/cmsdata/slideshow/3517083/VA-Courtyard-Final-HighRes.jpg> [Accessed 3 September 2016].
Fig. 14. Superflux, 2014. RouteHawk, a Traffic Control Drone. [Image online]. Available at: <http://motherboard-images.vice.com/content-images/contentimage/20848/142891109540137.jpg> [Accessed 3 September 2016].
Fig. 15. [Kansas crop circles] 2016. [Image online]. Available at: <http://www.iorise.com/blog/wp-content/uploads/2016/06/2016-06-29_KansasCropCircles_ROW9813676892_1920x1200-550x344.jpg> [Accessed 2 September 2016].
Appendix A
Complete List of Participants by Reference

Relevant Professionals
PR-001 Academic in spatial design field; interest in interactive buildings and sensors.
PR-002 Professional in public planning; interest in GIS information systems.
PR-003 Unused.
PR-004 Unused.
PR-005 Professional in film sector; camera operator with direct UAV experience.
Amateurs and Hobbyist Interests
HO-001 Arts charity and technologist; interest in surveillance and self-sustainability.
Lay Citizens
LP-001 Student in spatial design field; interest in big data and democracy.
LP-002 Student in spatial design field; rural, self-identified NIMBYist background.
LP-003 Citizen in rural community; no relevant interest.
LP-004 Worker in education sector with young children; no relevant interest.
LP-005 Professional in unrelated engineering field; experience with UAVs in indirect capacity.
LP-006 Professional in education sector with young children; no relevant interest.
LP-007 Pupil at pre-university level; no relevant interest.
LP-008 Citizen in urban community; no relevant interest.
LP-009 Young professional in spatial design field; no relevant interest.
LP-010 Young professional in health care; interest in information technology.
LP-011 Young professional in film sector; experience with UAVs in indirect capacity.
Back Cover Fig.15 Crop circles in Kansas taken by an aerial photography drone.
Appendix B
Post-Conversation Questionnaire

Participants responded to statements 1-17 on the following scale: Strongly agree (+4); Agree (+2); Neither agree nor disagree (0); Disagree (-2); Strongly disagree (-4).

1. Prior to the conversation, you were certain about the benefits of UAV technologies.
2. Following our conversation, you are more certain about the benefits of UAV technologies.
3. You believe that scientists and technologists are confident in the benefits of UAV technologies.
4. The government should provide tighter regulation on the flight provision of UAV technologies.
5. The government should provide greater opportunities for drone flight (professional and amateur) in cities.
6. The government should encourage citizens to have their say in the regulation of UAV technologies.
7. The government should provide more investment into UAV technologies.
8. Following our conversation, you are more concerned about the integration of UAV technologies.
9. Following our conversation, you are more hopeful about the integration of UAV technologies.
10. You support the use of UAV technologies in policing and surveillance of known criminals.
11. You support the use of UAV technologies in automated and personal logistics services.
12. You support the use of UAV technologies in filming of aerial shots and in aerial photography.
13. You support the use of UAV technologies in hobbyist and amateur products (e.g. remote control aircraft).
14. You support the use of UAV technologies in non-visual data collection (e.g. wind or weather).
15. You support the use of UAV technologies in scanning and mapping programs (e.g. GPS and maps).
16. You support the use of UAV technologies in constant anonymised surveillance of large areas of cities.
17. You believe the benefits of UAV technologies exceed the potential risks.
18. Do you have any further comments on whether participating in this dialogue changed your views on drone (UAV) technology, and do you think dialogue such as this is necessary in effective government implementation of this technology?