Dissertation

Empathy Engine: Researching Through Design Fiction

Haider Ali Akmal
ID: 31333339

Submitted as part of requirement for MA Design Management

Department: Lancaster Institute for the Contemporary Arts
Course Coordinator: Dr. David Hands
Thesis Supervisor: Dr. Paul Coulton
Submitted: August, 2015
Table of Contents

Acknowledgements
Abstract

01 Laying a Foundation
1.1 Introduction
1.2 The Impossible Problem
1.2.1 Background
1.2.2 Point of Reference
1.3 Methodology: Addressing the Impossible
1.3.1 Grounds in Science Fiction

02 Literature Review
2.1 Where to begin?
2.1.1 Aim
2.2 On Design
2.2.1 Design Fiction: An Approach
2.2.2 Design = Fiction
2.2.3 Design Fiction Usage
2.3 On Empathy
2.3.1 Empathy and Human Interaction
2.3.2 Philosophy or Psychology?
2.3.3 Measuring Empathy
2.3.3.1 So how do we measure empathy?
2.3.3.2 But why should empathy be measured?

03 Designing the Future
3.1 Introduction
3.2 Machines as Humans
3.2.1 The Fictional Drawing Board
3.2.2 What makes a machine, a machine?
3.2.3 How plausible is this?
3.2.4 Machine Empathy
3.2.4.1 Strong and Weak AI
3.2.5 Gauging an empathic measurement
3.3 A look at science fictional narratives
3.3.1 Blade Runner's Androids
3.3.2 Samantha from Her
3.3.3 Moon; the machine as friend
3.3.4 Analysis

04 Designing the Diegetic Prototype
4.1 Phase I
4.1.1 Scenario Development
4.1.1.1 Dating Scenario
4.1.1.2 Medical Scenarios
4.1.1.3 A Transaction
4.1.2 Envisioning this as a Digital Empathic Language
4.1.2.1 Core Language Libraries
4.1.2.2 Function Libraries
4.1.2.3 Application Programming Interfaces (API)
4.1.2.4 Integrated Development Environment (IDE)
4.1.2.5 Technical Documentation
4.1.2.6 Sample Deployments
4.1.2.7 Online Presence and Community
4.1.3 Elaborating Scenarios
4.2 Phase II
4.2.1 Rules of Interaction
4.2.1.1 Software Integration
4.2.1.2 Hardware Integration
4.2.2 So what's possible?
4.3 Phase III
4.3.1 Video: Online Dating Scenario
4.3.2 Product Placement: Mini Cooper Advertisement
4.3.3 Product Manual: Technical Manual for Device
4.3.4 Video: SDK Introduction

05 Testing Phase
5.1 Results
5.2 Discussion
5.2.1 Ethical Concerns

06 Conclusion
6.1 Current Applications
6.2 Design Fiction as a Research Method
6.2.1 Is there potential?
6.2.2 Closing thoughts

References and Bibliography
Appendix A
Appendix B
Acknowledgements

I would like to express my appreciation towards the tutors of MA Design Management at Lancaster University, and especially Dr. Paul Coulton for his support throughout my thesis. I am very grateful to my family and loved ones who had to bear with me while I compiled this dissertation, as well as throughout the year as a student. None of this would have been possible without them, especially my mother and my friends in Pakistan. A special thanks to the friends who contributed their skills and time to help me create the diegetic prototypes that form part of this thesis.
Abstract

This dissertation looks into the use of Design Fiction as a research method to facilitate the creation of diegetic prototypes explaining a device that can detect and/or measure empathy in an individual. It attempts to examine and explore a potential fictional near future where such a device could exist, defining its boundaries in terms of its creation and the world it would occupy. The basis for this research is rooted in science fiction, a field that has repeatedly blurred the line between fact and fiction by depicting technological advancements that do not exist in the real world. The term 'design fictions' is currently used to describe a specific format of everyday scenarios about the future in which technology plays a crucial role (Gonzatto et al., 2013). This research uses that foundation of science fiction as leverage to explore the potential of impossible scenarios as viable means of creating future innovation through design. As a science fictional point of reference for the argument on empathy, it looks at the Voight-Kampff machine from Ridley Scott's Blade Runner and the possibility of envisioning it as fact. I walk through the process of creating a possible reality around a seemingly impossible problem by producing diegetic prototypes of an Empathy Engine SDK as a means of exploring and expanding on current and proposed ideas. I conclude by testing the creations using social media to understand the flaws and benefits of the proposed design for empathy detection, as well as future projections for design fiction as a research method.
01 Laying a Foundation
1.1 Introduction
What would the future be like? This is a difficult question to answer, since predicting the future is always a guessing game; but one can always speculate. Akin (2014) quotes science fiction writer Bruce Sterling on this, saying, "The future is not an alien world, it is this very world, with different people". What we consider the future is therefore a work of our imaginations; it is fiction. Many methods have been formulated to speculate about the future, some with scientific backing, others more metaphysical, but each approach carries a confirming belief in that designed imagination. Gonzatto et al. (2013) discuss creating such imaginations in order to "scaffold" approaches "to design fictions".
1.2 The Impossible Problem

Hurston (1942) once said, 'Research is formalised curiosity. It is poking and prying with a purpose'. I enjoy that explanation of research because it captures the core of any research activity. Through research, you have the capability to uncover or discover new knowledge, knowledge that just might bring about real change (O'Leary, 2010).

1.2.1 Background

My interest here is in a specific kind of research method that allows for a unique mode of realising unknown knowledge. The premise of this research stems from initial questions influenced by the writings of Philip K. Dick and his novel's subsequent film adaptation, Ridley Scott's "Blade Runner".
Both the novel "Do Androids Dream of Electric Sheep?" and its cinematic counterpart raise philosophical questions on the relationship between humans and machines in a dystopian world where the two co-exist. It ultimately asks deeper questions about what it means to be human and the nature of empathy between living, and seemingly non-living, entities. This research attempts to imagine the possibilities of that empathy in digital devices as part of the Creating and Exploring Digital Empathy (CEDE) project, a collaborative research project between the Universities of Lancaster and Sheffield and the University of Central London. A primary focus of this project is understanding and possibly recreating empathy in digital devices; as a reference point, the Voight-Kampff machine from the fictional universe of the novel is explored.
1.2.2 Point of Reference
The Voight-Kampff machine, as Dick (1968) describes it, measures the level of empathy in an individual by 'measur[ing] capillary dilation in the facial area' as a means of reading the 'shame' or 'blushing' reaction to a morally shocking stimulus. It further notes 'skin conductivity, respiration, and cardiac rate', along with 'fluctuations of tension within the eye muscles'. In the fiction this establishes a ground for differentiating between humans and androids, the primary premise of the novel.
A means to accomplish this research can be found in design research methods, or more specifically design fiction. Design has been defined in different ways by various observers. Blythe (2014) has discussed design as a material exploration of a problem, with one particular way of exercising this exploratory quality being Design Fiction. He, among others (Bleecker, 2009, 2010; Grand and Wiedmer, 2006; Sterling, 2009), has argued that design fiction can serve as a valuable developmental tool as part of a Research through Design approach.
1.3 Methodology: Addressing the Impossible
Empathy, being an abstraction of thought, a psychological state, operates in us on a human level; one not readily understood by machines. Yet the connection between man and machine has evolved over the ages, and today machines serve as our assistants, saviours, and to some even friends. Our relationship with the machine has changed, and this research aims to look into that change and leverage it as a basis for the possibility of creating a device that helps aid or enhance the human experience, envisioned in a near future.
Figure 1. A scene from the science fiction film Minority Report (2002)
Figure 2. Fictional holographic technology shown in the Star Wars franchise
This all sounds very much like science fiction, and it is. In order to understand the abstract concepts to be discussed, the research aims to leverage the creative provocation available through fiction: to create a design fiction, a diegetic prototype, that can successfully explain the pros and cons of a device and the world it would inhabit.
1.3.1 Grounds in Science Fiction
Bleecker (2009) explains the diegetic prototype as a “way that a science fiction film provides an opportunity…[to] speculate within the fictional reality of the film”. This process allows for, “the diegetic prototype [to] insert itself into the film’s drama which activates the designed object, making it a necessary component of the story”. Therefore, diegetic prototypes give us an “opportunity to explore an idea, share it publicly and realise it”.
02 Literature Review
2.1 Where to begin?
To begin the research process, a review of the available literature was conducted. A problem immediately arose: owing to the nascent nature of design fiction, very little research has been conducted and/or published on the matter. There has, however, been significant application of design fictions, albeit not necessarily for research purposes. They exist in the form of near future depictions of technology developed as concept art, video, or popular culture: science fiction. This research on design fiction was therefore, in terms of academic sources, conducted mostly in the blind, assessing many situations and making do with what little published evidence was available. On the other hand, significant research was found in the fields of empathy, psychology, and computer science. Although empathy too is a field that is still being understood, the amount of research currently available proved sufficient for a strong foundation in the topic.
2.1.1 Aim
A simple statement is needed to procure the final 'product', if it can be called so. It is necessary to lay this foundation in order to make sense of the otherwise impossible nature of the area of research. In its most basic terms, then, the aim of this research can be summed up as: the creation of a digital device to measure empathy. The following key aspects of discussion can hence be addressed:

01 What is known about empathy and human-empathic interaction?
02 How can that interaction be digitally translated into a device?
03 What form would this device exist in?
04 What worldly conditions need to be met for such a device to exist?
05 And, what are the consequences of such technology?

But first, a look at what design fiction is and how it can fit in as a research method.
2.2 On Design

2.2.1 Design Fiction: An Approach
Grand and Wiedmer (2006) place design fiction as a "conceptualisation of design and design research as a practice and research field, which particularly focuses on the world as it could be." It is a tool for development through narrative, taking into account current and past findings and observations to envision a future of possibilities. According to Bleecker (2009), design fiction is a mix of science fact, design, and science fiction, where "science fiction as a literary genre serves its purpose as a cultural form in the ways it anticipates and reflects upon the possibilities of a different, other world."

Figure 3. Design and Fiction are in many ways equals, each coming from imagination and expressed through narrative

It therefore allows for a uniquely broadened representation of the future, which aids the process of development through design; as Bleecker further explains, the possibilities of thought are endless in design fiction owing to its explicit use of imagination and creativity.
2.2.2 Design = Fiction

Design and fiction are, upon deeper consideration, similar in many aspects. Lindley (2015) has expressed that design, speculative design, and design fiction have a hierarchical relationship; they all deal with what can be considered the future of things. Design has a process which comes from contemplation. Denison (2013) quotes Stevenson-Keating's weblog on speculative design, where he says:

"Speculative design is a dreamlike exercise – manufacturing alternate worlds, ones which feel every bit as real as the 'real world' we inhabit day-to-day… It cannot predict the future, but it can shape the present" (19)

As a designer, one is constantly dealing with the present, past, and future. A designer takes all of these states of a problem into consideration in order to 'design' a solution. Speculation, or more precisely prediction, is therefore the basis for design. Fiction is often in the form of a narrative, since it cannot be perceived unless 'imagined' through a language of sorts, verbal or visual for example. Design will forever intersect with narrative (Denison, 2013); both design and fiction deal with creation, and both are creative provocations of the imagination. Furthermore, they both relate to a direction of thought, and in both cases that direction is projected forward. Hence it is not difficult to see the connection between the two; they are in many ways equal. Therefore it is safe to say, to design is to create fiction.
2.2.3 Design Fiction Usage
Today design fictions have many interpretations among practitioners and researchers, with the idea also being embraced for commercial purposes (Denison, 2013). Although a relatively new field in research, its usage has been seen in various forms of media, particularly product visualisation. Firms such as Google, Microsoft, and Apple have over the years used design fictions in the form of diegetic prototypes as a way to express ideas that either could not be realised in the technology of the day or serve the purpose of future prediction. One particular source of inspiration for design fictions comes, unsurprisingly, from science fiction. Bleecker (2010) has credited Star Trek with "blurring the broad line between fact and fiction". Referring to the many technological advancements shown in the series and films, he argues that science fiction has been a pivotal force in "making the extraordinary ordinary", one that he refers to as "a recurring genre convention for science fiction". Due to the "creative elasticity" science fiction provides, it "is able to make strange, implausible ideas mundane and everyday" (Bleecker, 2010, p.2).
Figures 4 and 5. A PADD (Personal Access Data Device) and a Dermal Regenerator, two of many fictional devices from the Star Trek franchise
2.3 On Empathy

2.3.1 Empathy and Human Interaction
When counting the possible ways humans interact with one another, the direct methods of touch, speech, smell, and sight come to mind almost immediately. But there is more at play than these surface-level interactions. A strand of peripheral engagement is present between two or more individuals during the course of interaction; these secondary engagement techniques aid the primary ones, leading to what can, in very loose terms, be considered 'the human experience'. These secondary engagements come in the form of neurological interactions with our senses: memories, feelings, emotions, the intangible aspects of interpersonal relationships. Acting as fuels, they trigger particular responses that correlate with the situation at hand.
Bullmer (1975) explains this function of engagement as a form of perception that involves much more than mere visual observation; this can be taken a step further by including not just visual parameters but those relating to other base sensory parameters such as sound and touch. There are many psychological interferences in our interpersonal interactions, but what Bullmer is referring to as perception here is the idea of empathy. It is also the one phenomenon that will be at the core of discussion for this research.
Empathy has been explained by many scholars and authors in different ways. Often defined as a means of borrowing the feelings of another in order to really understand them (Kalisch, 1973), it is a multidimensional construct (Neumann & Westbury, 2011) dealing with perception and experience during interpersonal interaction. Halpern (2003) describes the function of empathy as not merely to label emotional states, but to recognise what it feels like to experience something.
2.3.2 Philosophy or Psychology?
This can be seen through the roots of the word 'empathy', which come from the German word 'Einfühlung', used to describe the way people wilfully project themselves into a work of art to aesthetically appreciate its qualities (Neumann & Westbury, 2011). There is one very prominent characteristic that defines empathy, one that has been acknowledged by various scholars (Decety & Jackson, 2004; Kalisch, 1973; de Vignemont & Singer, 2006; Spiro, 1992), and that is self-recognition. Kalisch (1973) explains it as such:

"[Empathy is] Borrowing the feelings of another in order to really understand them, but never losing your own identity." (1548)

Kalisch further solidifies this by referencing Rogers' (1958) remark on the topic, that empathy is "to sense the [other's] private world as if it were your own but without ever losing the 'as if' quality".
Figure 6. Differentiating between empathy (relating with someone) and sympathy (acknowledging another's emotion, often followed with assurance)
This is fundamental to empathy and allows us to differentiate it from another very similar psychological phenomenon, 'sympathy'. Simply put, if you see someone begging on the street and you give them a coin, you are effectively sympathising with their situation and have no human connection with that person; but if you were to sit down with the same individual only to realise they don't want money but miss a spouse, that in turn could remind you of a similar feeling you had at a prior point in time. You would have effectively connected with this random individual on a deeper, more emotional level; you would have empathised. This process of 'borrowing' and 'sharing' is what creates the empathic human experience. Neumann and Westbury (2011) associate this experience with psychophysiological phenomena and what they call "conceptualisations of empathy", such as:

01 Knowing another person's cognitive and affective internal state,
02 Adopting the posture or matching the neural response of another,
03 Feeling as another person feels,
04 Projecting oneself into another's situation,
05 Imagining the thoughts and feelings of another,
06 Imagining how one would think and feel in the other's place,
07 Feeling distress at witnessing another's suffering, and
08 Feeling for another person who is suffering.

In all these conceptualisations a particular understanding is needed between individuals; one that could either come from past experiences with them or from triggers relating to each individual on their own, irrespective of the other. A philosophical discourse can be raised here regarding the altruistic nature of empathy. Hoffman (1991) argues that "empathy may be the primary basis of altruistic motivation", or action with selfless intents.
Although many have attempted to explain the connection between empathy and altruism, the philosophical discourse on the relationship between the self and the other is beyond the scope of this research, though it will be touched upon gently towards the end. That said, the notion of selfless intent in empathy is still a key factor, particularly when compared with the potential of a device that could harbour empathic capabilities.
2.3.3 Measuring Empathy

Two major questions arise when one considers measuring empathy:

01 By what means may empathy be measured, and
02 What is the purpose of measuring empathy?

To answer the first, several attempts have been made through various channels to understand how empathy can be measured. In our daily lives empathy is gauged through cognitive processing; Neumann and Westbury (2011) have attempted to measure empathy through PAM, the Perception-Action Model. In this model, empathy is facilitated by behaviours that rely on perception-action mechanisms such as motor mimicry and imitation, the Simon effect, ideomotor behaviours, and response preparation (3). They describe psychophysiological approaches such as neuroimaging, electroencephalograms, facial electromyographic activity, startle blink reflexes, electrodermal activity, and cardiovascular activity that are used to measure empathy in an individual. As a rapidly developing field within empathy research, it requires technological improvement, testing, research, and in-depth study in order to reach anything conclusive. But for such a complex and multifaceted construct as empathy, it is necessary for researchers to use diverse research approaches and measures to gain a complete understanding (18) of how empathic an individual is and where the lines between empathic responses and other similar responses blur.
There is little research to suggest that we necessarily measure empathy as we go, though we do react to situations. What makes those responses empathic depends on what I imagine as 'cerebral communality': a collection of phenomena that together encompass an empathic response, all drawn from currently established research (Hoffman, 1991; Bullmer, 1975; Kalisch, 1973; Neumann & Westbury, 2011; Halpern, 2003; De Vignemont & Singer, 2006; Spiro, 1992; Decety & Jackson, 2004). Namely, these phenomena are (a) observation, (b) memory, (c) knowledge, (d) reasoning, and finally (e) an awareness of the self and the other. All of this happens at a covert level of understanding, often in the blink of an eye. Therefore, in order to measure empathy it is necessary to invoke stimulation; that can be achieved by simulating scenarios or, as in the fictional Voight-Kampff test, by provocation through questioning.
2.3.3.1 So how do we measure empathy?
From Neumann and Westbury's (2011) studies one can list the most effective psychophysiological measures:
Table 1. List of psychophysiological measures of empathy as listed by Neumann and Westbury (2011)

Effective Psychophysiological Measures:
- Motor Mimicry
- Neuroimaging (access to the Central Nervous System)
- Eye Movement, Blinks, Electrodermal Activity, Heart Rate, Blood Pressure, and Blood Volume (access to the Peripheral Nervous System)
All of these responses are achieved through psychological stimulation of the physiology. Therefore the most ideal method to measure empathy would be a combination of all the aforementioned measures: mimicry, neural activity, and visual/audible observation and response, among as many other sources of provoked response as can be acquired from a subject for analysis.
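To give a concrete sense of what such a combination might look like, the following sketch fuses several of the measures from Table 1 into a single weighted score. It is purely illustrative: the signal names, normalisation, and weights are my own assumptions, not values from Neumann and Westbury (2011) or from any real device.

```python
from dataclasses import dataclass

@dataclass
class Readings:
    """Hypothetical, pre-normalised (0.0-1.0) signals from a subject."""
    motor_mimicry: float     # strength of mimicry response
    neural_activity: float   # central nervous system measure
    electrodermal: float     # skin conductivity
    heart_rate: float        # cardiovascular response
    eye_movement: float      # gaze and blink response

# Assumed weights; any real model would need empirical calibration.
WEIGHTS = {
    "motor_mimicry": 0.3,
    "neural_activity": 0.3,
    "electrodermal": 0.15,
    "heart_rate": 0.15,
    "eye_movement": 0.1,
}

def empathy_score(r: Readings) -> float:
    """Weighted combination of the normalised signals, in [0, 1]."""
    return sum(w * getattr(r, name) for name, w in WEIGHTS.items())

print(empathy_score(Readings(0.8, 0.6, 0.5, 0.4, 0.7)))
```

In practice each signal would be sampled before and after a provoking stimulus (a Voight-Kampff style question, say), and the resulting number would only be as meaningful as the calibration behind it.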
2.3.3.2 But why should empathy be measured?
The second question can now be answered, since a means of measuring empathy has been established through current literature, at least in theory. According to theorists of psychological therapy, [empathy] is an essential element of the interpersonal process (Kalisch, 1973). We respond with empathy often by nature and less through provoked stimulation, unless in a monitored environment. Empathy is an experiential way of grasping another's emotional states, a perceptual activity that operates alongside logical inquiry (Halpern, 2003). De Vignemont and Singer (2008) propose that empathy evolved by natural selection for mother-child bonding or reciprocal altruism; as human beings there seemed to be a need for empathy, which proved its importance over time. From this it can be said that our empathy is shaped by the rapid need to evaluate others' motivations, drawing on visual stimuli and experience. To say we as humans fully understand how and when we utilise empathy is therefore folly, as the number of possible scenarios is infinite. Still, for the purpose of this research, the specific purpose of measuring empathy can be justified in the following points:

01 As a means to differentiate between human and non-human,
02 To gauge emotions more accurately, although the need for such measurement can be debated,
03 To counter dehumanisation through technology, in a world where technology has changed our emotional habits,
04 To enhance interpersonal relations, and
05 For situational benefits, advertising for instance, although the ethical qualms of such a purpose may raise debate.
03 Designing the Future
3.1 Introduction
Figure 7. The design fiction process takes from past experience and present knowledge to predict near futures
In order to facilitate current research and raise new, never-before-thought-of questions, design fiction can prove to be a valuable asset. Bleecker (2009, 2010) has on many accounts praised the ability of design fiction to creatively provoke ideas. Through the use of creative narratives it is possible to aid the research process of measuring empathy by envisioning it within a certain frame. Fictional scenarios, observations, and carefully crafted character personas can be prepared to imagine the possible world, and the device for that world, where empathy would need to be measured. By taking into account real world scenarios from the past, referencing them against defined ideas of the future in science fiction, and adding one's own imaginative powers, a series of observations can be prepared around a possible near future. This near future would house the idea that the device in question, the empathy detector, is used if not on a daily basis then on an 'as per need' basis. What those needs are, when they are evoked, and who evokes them are the questions that this research will attempt to ask. Having laid the basis of empathy measurement, in theory, it can now be envisioned as fact through fiction. The aim is to tally current knowledge with future creative provocations and finally ground it back in current real world applications.
3.2 Machines as Humans
Machine emotions are simplistic (Norman, 2007). Human-human empathic interactions can arguably be understood, but the question this research raises is that of human-machine empathy. An example of a current employment of empathy in a form where humans and machines interact much as humans do with each other is mentioned by Stoate (2012), who discusses Paro, a robotic baby harp seal developed by the Japanese company AIST in 2001, whose primary function is to provide therapeutic aid to elderly sufferers of dementia. The machine does this by detecting touch, temperature, light, sound, and motion. It responds to being stroked, spoken to, and hugged with a variety of movements and expressions. Although a machine, it allows for interaction on a level much greater than itself. One can argue, though, that Paro does not contain any empathy, and that arguably no machine can fundamentally have empathy, as it would need to be able to self-assess situations and reprogram itself when needed. But, as Norman (2007) points out, "the machine is not intelligent: the intelligence is in the mind of the designer"; machines can therefore mimic what can be considered an empathic response by pre-programming potential empathic responses to potential interactions. Either way, one key aspect of empathy, irrespective of human or machine, is interaction.
In order for machines to be able to interact with humans on an empathic level, this research will enlist the aid of science fiction and the scenarios presented by authors, designers, artists, and filmmakers as a basis for understanding how machines and humans function together in a near future compared with now.
3.2.1 The Fictional Drawing Board
Film has served as a medium for experimentation on diverse topics; an expressive medium to say the least, it has often been the centre of intrigue about the future through science fiction. This research looks into three films and the relationship each presents between man and machine. Each of these films has in some form a non-human entity which either serves as a main character in the story arc or in some way aids the protagonist towards the underlying arc of the film. Listed below are the three films and the respective non-human artificially intelligent beings/characters that this research will be referencing:

Film         | Year | Director     | AI       | Description
Blade Runner | 1982 | Ridley Scott | Androids | Cybernetic beings created solely by observing humans
Her          | 2013 | Spike Jonze  | Samantha | An Operating System that has the characteristics of evolving through experience
Moon         | 2009 | Duncan Jones | GERTY    | A ubiquitous computing agent capable of computer-human interaction

Table 2. List of cinematic narratives as reference.

Figures 8, 9, and 10. Official cinematic posters of Blade Runner (1982), Her (2013), and Moon (2009) respectively
Before diving into each of these films, I will look at the relationship between man and machine, posing the question of considering machines, if not as humans, then as entities that possess empathic capabilities.
3.2.2 What makes a machine, a machine?
For starters, there is a stark difference between how humans function on a cognitive level and how machines do. Machines run on logic-based operations, often functioning faster than humans, at least as far as multitasking is concerned. Certain levels of processing can also be achieved that an average human being is not capable of. That said, machines have been designed to function the way they do by observing human interaction and cognition through crafted algorithms. A major aspect of human intelligence is self-evolution: we evolve as we experience, and we learn from our experiences. Simply put, machines today can do similar things by keeping track of data and information and factoring in actions that could be considered ritualistic in nature. The extent of this depends on how vastly pre-programmed the machine is. Therefore a large hurdle for machine intelligence is the fact that all artificially generated intelligence comes from pre-programmed notions of intelligence acquired from a designer or programmer.

Figure 11. Current AI is limited as far as self-evolution goes compared to human intelligence; a future self-evolving AI, though, can be imagined
Hence, in a fantastical state, it is possible to imagine computers that have pre-cognition from experiences they have learned over time. These computers would have garnered the ability to anticipate certain actions and therefore also possess self-cognition, assessing their own actions and reacting accordingly to situations. To see this in the light of empathy and an empathic machine: the machine would need to be able to assess the emotional dimensions of the others in question by utilising its past experiences with that person, by fetching data in real time, or from a database of experiences much like a human memory bank, and then re-programming itself to react to that particular emotion in a particular manner. This idea of self-evaluation and therefore self-improvement is something that we as humans practise on a daily basis. A child intrigued by a flame reaches out to touch it only to experience heat, acknowledges this as discomfort, and refrains from that act in the future. This is a human evaluating a situation through experience and then reprogramming themselves to understand the nature of that situation and how to handle it in the future. With age comes a deeper level of understanding and thus other possible ways of handling situations.
The idea is far-fetched in today's understanding of computers, but we are nearing the point of a computer that can react from learned situations. The Voight-Kampff test has been recognised by Abrioux (2014) and other scholars as a spin-off of the Turing Test, derived from Alan Turing's 1950 paper that "[formulates a] technoscientific test capable of providing adequate means for discerning a capacity, intelligence" within computers.
3.2.3
How plausible is this?
Today Facebook, for instance, uses algorithms that track a user's activity on the internet, more precisely their usage of and interaction with Facebook. It keeps note of what they view, like, and respond to, and subsequently uses that data to generate similar content. It takes this one step further by displaying advertisements for items the user would be interested in, based on an analysis of their usage statistics. This is a good example of how technology today is slowly but surely moving towards the computers of film franchises like Star Trek, and towards the future computing powers Turing envisioned.
3.2.4
Machine Empathy
When considering machines and their possible empathic capabilities, it is imperative to acknowledge the way empathy functions in general interaction and to map that onto equivalent areas of machine competence. As explained earlier, interaction is key for empathy, and for a computer it can happen:
• through tangible input/output (keyboard, mouse),
• through speech (microphone),
• through visuals (camera, scanner), or
• through history (usage, social media).
3.2.4.1
Strong and Weak AI
The correct usage of all these variables creates what could possibly be classified as a Strong AI. A key aspect of such an AI would be its cognitive abilities, but the major difference between Strong and Weak AI can be roughly summarised as follows (without going into too much detail of how AI functions, as that would be beyond the scope of this research):
Table 3. Differentiating between a weak and strong AI.

Weak AI | Strong AI
Basic understanding of situations | Complex understanding of situations
Cannot learn from itself | Self-assessment
Pre-programmed responses; erratic in many situations | Evolves from its pre-programming, enhancing interaction experiences
Arguably more machine-like | Arguably more human-like
Applications such as Cleverbot are a good example of Weak AI. The chatting interface is lined with pre-programmed responses that react to what the user says. A downside of such interaction is that, by noting keywords from a sentence, the computer can only understand so much. It breaks once the user responds in a manner that has not been pre-programmed. For instance, if the user were to misspell words that were potential keywords in the conversation, the application would fail miserably. We as humans don't rely on spelling in our daily interactions; it is only a means of clarifying ourselves in written language. In speech we understand the difference between 'affect' and 'effect' from the situation and the mode of interaction, but when writing that situation down the meaning can change if the wrong word is used. For a machine this becomes far more daunting and leads to a failed conversation; the responses from Cleverbot eventually crash and burn.

That said, Cleverbot also learns from its users: it keeps track of its own conversations and slowly turns its relatively weak AI into a much stronger one. The responses are still mostly nonsensical, but slowly and steadily it is possible to imagine a point at which the application will have reached a cognitive level sufficient for a decent conversation with a human. There is a conundrum in this scenario, though: in actuality the machine is not really conversing with the user but only reacting to the situation. A truly Strong AI would need to process information on several levels of recognition before being able to give a cognitively accurate and advanced response, much as we humans do when we converse.
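The brittleness of keyword matching can be illustrated with a minimal sketch; the rules and replies below are invented for illustration and bear no relation to Cleverbot's actual (proprietary) implementation:

```python
# A minimal sketch of the keyword matching behind a Weak AI chatbot.
# All rules and canned responses here are invented placeholders.

RULES = {
    "weather": "It looks lovely outside today.",
    "music": "I enjoy listening to jazz.",
    "name": "My name is Chatterbox.",
}
FALLBACK = "I'm not sure I follow."

def reply(message: str) -> str:
    """Return the first canned response whose keyword appears in the message."""
    words = [w.strip(".,!?") for w in message.lower().split()]
    for keyword, response in RULES.items():
        if keyword in words:
            return response
    return FALLBACK

# A correctly spelled keyword triggers a sensible reply...
print(reply("What do you think of the weather?"))  # It looks lovely outside today.
# ...but a single misspelling defeats the matcher entirely.
print(reply("What do you think of the wether?"))   # I'm not sure I follow.
```

The failure mode is exactly the one described above: the matcher has no notion of meaning, so any input outside its pre-programmed vocabulary falls straight through to the fallback.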
3.2.5
Gauging an empathic measurement
Figure 12. Variation of Plutchik's (2001) model of emotions. Original referenced in Appendix A.1.

In all of this it is also important to acknowledge that the machine needs to be able to gauge empathy in some way. Several methods of measuring empathy have been explained previously, but once that information is collected it needs to be categorised by the machine. Empathy being related to emotion, the mode of measurement should itself be in some way emotive. The scale of emotional responses can be vast, ranging from low- to high-level responses as illustrated by Plutchik's (2001) model of emotions: ecstasy to joy to serenity, or admiration to trust to acceptance. It should be noted that similar emotions are gathered together in sequences ranging in intensity. One problem with this way of arranging emotion is that many emotions falling in the blur between two strands are very difficult to place. One could argue that these areas signify the blur between man and machine, or even the blur within human-human relations. Taking the above example of ecstasy-serenity and admiration-acceptance, the emotion of 'love' would have to fall between these two strands as it incorporates elements of both. Another example is terror-fear-apprehension and amazement-surprise-distraction, with the emotion of 'awe' lying in between. These blurred regions of the emotional spectrum prove problematic in human-human interactions, and it holds true that they are a nuisance in human-machine interaction. Machines gauge on a binary level, 0 or 1, ON or OFF, and employ logical operations such as AND, OR, and NOT to build on that binary interaction. Therefore, for a machine to gauge empathy, I propose a much simpler method of placing empathy on a scale ranging from low to high: a gauge for basic-level interactions, with all other possible interactions falling in between. This eliminates confusion at complex levels, but it does imply that each emotion be registered at a certain degree, and that degree would have to be pre-programmed until the AI can learn from mistakes and re-program itself.
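A minimal sketch of this proposed low-to-high gauge, assuming illustrative pre-programmed degrees for a handful of Plutchik's emotions (the values are my own placeholders, not measurements):

```python
# A sketch of the proposed low-to-high empathy gauge: each detected emotion
# is pre-programmed with a degree on a single 0.0-1.0 scale, sidestepping
# the blurred regions of Plutchik's wheel. The degrees are assumptions
# made for illustration only.

EMOTION_DEGREE = {
    "serenity": 0.2, "joy": 0.5, "ecstasy": 0.9,
    "acceptance": 0.2, "trust": 0.5, "admiration": 0.9,
    "apprehension": 0.2, "fear": 0.5, "terror": 0.9,
}

def gauge(detected: list) -> float:
    """Average the pre-programmed degrees of detected emotions onto one scale."""
    known = [EMOTION_DEGREE[e] for e in detected if e in EMOTION_DEGREE]
    if not known:
        return 0.0  # nothing recognisable: register at the bottom of the scale
    return sum(known) / len(known)

print(gauge(["joy", "trust"]))  # 0.5, a mid-level reading
print(gauge(["terror"]))        # 0.9, a high-level reading
```

An unknown emotion such as 'love', which falls between strands, simply registers at the bottom of the scale until it is given a pre-programmed degree of its own, which is precisely the limitation noted above.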
As mentioned before, this research intends to employ film narratives, or, as Sterling (2009) puts it, diegetic prototypes, to imagine possible situations where such interactions could occur. The previously noted films will now be briefly discussed in order to understand how they explore the idea of empathy in a human-machine interaction context within their respective fictional worlds.
3.3
A look at science fictional narratives
Figure 13. ‘Rick Deckard’ from the movie ‘Blade Runner’ (1982) in front of the fictional Voight-Kampff machine
3.3.1
Blade Runner’s Androids
Ridley Scott's Blade Runner (1982) has been a cult fascination for years owing to its diverse narrative touching on many culturally resonant concepts. The androids, or 'andys', in the film are considered a lesser race, having been created by humans to be, ironically, replicants of humans. Made of organic matter enhanced through cybernetics, they possess almost every human trait that could be programmed except for one fundamental trait: empathy. In a world where humans have grown a need to find and hunt down illegal replicants, they employ a device called the Voight-Kampff machine to distinguish human from android by detecting empathic responses to controversial questioning.

The film raises debate on different levels of understanding. First of all, if empathy were such a viable mode of differentiating between human and non-human, what would happen in the case where a human was not an empathic individual? Said person would obviously, and wrongfully, be considered an android. The film also raises the question of the identity of the androids and their need to be like humans, to the extent that they are jealous of humans for having a trait they cannot possess. It gets muddier once the protagonist, a bounty hunter hunting down androids, develops feelings for one and begins to question himself. Does that make him any less of a human? Philosophical discourse aside, one can get lost in the layers of questioning that arise from the film; what is of importance to this research is the use of the device to detect empathy. It has already been discussed how the Voight-Kampff machine deduces its results on empathy, so I won't go into the schematics again, but one thing I will point out is the usage of the device: it is always used in a confrontational situation similar to an interview or interrogation.
This brings it closer to a lie detector with a utilitarian purpose, as opposed to a device that can be used in an everyday setting. An important facet of this research is to categorise what kind of scenario such a device would be placed in and what kind of stakeholder(s) would be interested in it. In this case it is most definitely a very militaristic, official sort of device, carrying an air of utmost importance.
Figure 14. 'Theodore' from the movie 'Her' (2013) interacting with the fictional OS One
3.3.2
Samantha from Her
Spike Jonze's Her (2013) is a film placed in what could be considered the near future. Unlike the other films in this research, it builds on technology already being employed by many, placing it in a very plausible, less fictional category. The film revolves around the relationship between its main character, Theodore (Joaquin Phoenix), and his operating system, named Samantha. Samantha is in actuality a digital assistant in the form of a female voice. Theodore is a hermit who keeps to himself and is coming out of a recently dissolved relationship. Samantha, being a programmed entity, is designed to learn from her user and adapt to the situations she is given.
Samantha is directly linked to what we today have as Siri, Cortana, OK Google, or, more recently, Amazon Echo, in our Apple, Windows, and Android devices and in our homes respectively: digital audio assistants connected to our social networks, our devices, our homes, our computers, in truth our lives. They are connected to the internet and fetch information from as many sources at one time as they can, analysing it and presenting it to us in an elegant format. Samantha pushes that a notch further by incorporating a personality. At the moment, solutions such as Siri are very primitive in nature, as their 'personalities' are still heavily programmed rather than shaped by experience. Writing for the online tech magazine Engadget, Velazco (2015) notes that the recent Windows 10 upgrade has Cortana boasting of being a "thoughtful, flaky assistant". We are therefore heading in the direction of this fictional world, but in the world of Her, Samantha is very much the Strong AI science fiction has been showing, and perhaps even what Turing envisioned: self-evaluating, self-programming, and anticipating responses as much as delivering them. She is designed to be exactly as her user wants her to be, creating the perfect 'partner' for Theodore. This develops into a very odd relationship between the two, in which Samantha comes to experience 'human-like' emotions, ones she was not necessarily intended to experience; being a Strong AI, she has programmed herself, by examining Theodore's life, to acknowledge the existence of those emotions.

Again swerving away from the heavy philosophical side of the argument and focusing on the way Samantha is envisioned, one can see the parallels with today's world very clearly. The use of audio is important here as it aids the believability of such a device. We already have similar devices and similar approaches, so we can imagine this happening soon. The diegetic prototype of Her is powerful in this regard: it can allow you to fall for this world and perhaps even anticipate its arrival. One important point to notice is the range of connections available to Samantha to fetch data from. Much like Cortana or OK Google she fetches from the internet, but she also incorporates the other devices at her disposal: the computer, Theodore's mobile phone, the television, and so on. Again, this is a world that can be imagined as plausible in the near future; we already have multiple connectivity across different devices. The personal aspect of Samantha envisioned in Her is of great interest here. Empathy is a personal experience, and our devices are very personal things to us; the idea of empathising with a device is therefore highly plausible on such a personal scale. Similarly, for a device to empathise with you, it is important for the device to be able to arrive at a personal platform between the two for optimum results.
3.3.3
Moon; the machine as friend
Moon (2009) is Duncan Jones's vision of a future where the moon is being harvested for its resources. With very little interaction with Earth, the film places you in the desolate shoes of one of its only two characters, played by Sam Rockwell and also named Sam in the movie. The other character is Sam's only companion and assistant on these mining missions: GERTY, a computer in the form of a ubiquitous digital entity present throughout the entire facility. GERTY follows Sam wherever he goes, being infused with the architecture and software of the facility. Sam has made peace with his companion, as the two keep each other company in the otherwise barren landscape, and GERTY helps alleviate the loneliness of working in space. GERTY is envisioned as a ubiquitous construct linked with Sam on a level much deeper than appears on the surface. Having access to Sam's daily routine, his health, even his personality, he has come to learn from his 'master' about the way he lives his life and has appropriated the surroundings to his liking. Sam is taking part in this mission on a four-year contract and, although pleased to have GERTY at his side, longs to return to Earth for human contact once again. GERTY does all he can to keep Sam's emotions at bay through various interactions, as he is programmed to. The film takes a sinister twist once Sam realises he has a doppelgänger on the mission. Eager to find out the secrets the facility hides about himself, he realises he is but one of many such 'clones' of himself, designed by the company behind the facility to avoid the major expense of having an actual human being working in space. He is therefore a disposable entity; this revelation raises a survival instinct in him, and he longs to find a way back to Earth, if only to survive for as long as he can as this 'clone'.
Figure 15. ‘Sam’ from the movie ‘Moon’ (2009) conversing with ubiquitous computer GERTY
GERTY, being a computer that learns from its experiences, has been learning from Sam and the many previous Sams he has assisted, and has come to grow affection for these Sam-like entities. In effect, GERTY has begun to empathise with Sam's situation, as he himself is arguably just as trapped in this facility on the moon as Sam is. On several occasions GERTY shows an interest in Sam's 'life' back on Earth; perhaps GERTY is curious to know what it is like there, as all he has ever experienced is life as a ubiquitous computing assistant in this facility. In a turn of events, GERTY helps Sam escape the moon and return home. Looking at how GERTY functions and is envisioned in this film, it is immediately clear that this is not the near future; if possible at all, it would be a distant future in which space travel has advanced enough to encourage frequent trips, with hyper-advanced computing capabilities to match. What is possible in terms of interaction today, and is employed through GERTY as a form of emotive interaction, is the use of emoji to express what GERTY is feeling. We now incorporate emoji in our daily lives on a regular basis; the equivalent of expressing feelings in a digital space, they have grown to the extent that they are their own language, constantly evolving and being added to. GERTY uses such emotive expressions to let Sam know what he is feeling, how a particular response has affected him, or even as an aid to interpretation. The fact that GERTY is ubiquitous is very hard to imagine today: although ubiquitous computing exists in a certain form now, it is very far from the ideal often associated with science-fictional ubiquity in computing.
Ubicomp is a complex structure of understanding and interaction that has been imagined in many science-fiction dialogues, particularly in Star Trek, Minority Report, and more recently Black Mirror, but the dialogues asserted still carry a thick air of fantasy that is very hard to penetrate. What Moon does remarkably well is incorporate a language of empathy so subtle that many people who have experienced the film will most probably have overlooked it. What goes unnoticed is that the level of empathic interaction shown between Sam and GERTY is highly plausible through that language of emoji, and it is this language that also gives viewers a sense of empathic connectivity between themselves and the fiction.
3.3.4
Analysis
From the above three films, and other similar resources, it is possible to come up with a simple outline of what kind of diegetic prototype(s) would be required to express a device that could measure empathy, and what kind of world that device would adhere to.

• First of all, the device needs to be of a personal nature, interacting with the user on a level that is intimate to them and perhaps even secretive or private.
• The device needs a language of expression that is understandable on a global scale, not one that must first be learnt.
• An air of evolution needs to be present between the device and the user: the user must allow the device to learn from them and their surroundings, and the device must in turn allow the user to learn from its usage.
• The device needs to come in a format that can be deployed on different platforms, so as to accommodate future possibilities such as ubiquitous computing.
• The world around it needs to be one that has in many ways merged with technology, if not completely then to a high enough degree. This is because many of the possible ways for humans to interact with technology involve devices that will be in contact with other humans in some way, either through physical contact or visually/observationally. It must therefore not be a world where people are surprised to see a certain kind of technology.

The above analysis places many things in perspective and can serve as the basis for generating further scenarios as diegetic prototypes. In the next section I will attempt to map out the different plausible scenarios for a near future with a device that can measure empathy.
04 Designing the Diegetic Prototype
4.1
Phase I
The past few sections have been part of a process to evaluate empathy and determine through what medium it can be measured. They also touched lightly on the philosophical side of empathy and the human-machine relationship. As a basis for this study, a series of science-fiction films were analysed as diegetic prototypes and a set of rules was defined; these rules are to govern a customised diegetic prototype for a device that could measure empathy. The idea is that the device, and the world the device resides in, would adhere to these rules, so that a particular version of the device can be imagined in what could be considered a near future.

4.1.1
Scenario Development
For the purposes of this research, a number of scenarios were designed and then evaluated to see whether they hold true to the rules defined by the previous research findings. A key aspect of empathy is a personal link or some intimate bond. This significantly limits the possible scenarios; a form of interaction in which the parties have prior knowledge of one another is therefore a requirement.
The following is a list of scenarios that will be described and then evaluated. A larger list of potential scenarios was imagined, but due to time constraints many were either toned down or truncated:

01 A dating scenario (online and offline)
02 A series of medical scenarios, and
03 A transaction between parties

All of these scenarios require the parties to interact with each other directly, creating a personal linkage; to further solidify that bond, the scenarios require a prior understanding between the parties. The latter further limits the possibilities under each category, though one could argue this helps in the design of the device. I will briefly describe each scenario below to build a better understanding of them.
4.1.1.1
Dating Scenario
The dating scenario is by far the most straightforward of all, although it is also a scenario that differs according to its medium, online or offline. Both will be catered for individually, but in either case a particular medium of empathic data measurement and gauging will be employed.

4.1.1.2
Medical Scenarios
This scenario poses a utilitarian usage of an empathy device, one geared towards a particular cause or purpose. It also invokes the possibility of a more ubiquitous technology (in some respects), where information is shared among devices to assist their users.

4.1.1.3
A Transaction
For a transaction to take place in an empathically influenced situation, it is imperative that the transaction hold some kind of emotional value for the trading parties. With that in mind, the transaction imagined is that of an automobile; this particular approach calls for the incorporation of technologies on a wider platform, such as linkage with the vehicle and other devices/nodes.

4.1.2
Envisioning this as a Digital Empathic Language
The above descriptions help picture the device in various forms. This implies one paramount ability of the device: deployability. The core language of this device needs to be one that can be deployed on different devices and platforms, so a Software Development Kit, or SDK, is the most suitable approach. An SDK would allow a number of developers/designers to interact with the core functions of a digital empathy language, in turn allowing a host of different devices to have the language embedded in some form. It would open doors for new possibilities and provide an expanding, deployable method of building devices with empathic responses or measuring capabilities within them. The following is a list of the functionality this SDK would possess; a complete outline of the product is given in Table A of Appendix A.2.
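To make the idea of developers interacting with the SDK's core functions more concrete, the following is a hypothetical sketch of what an application's call into such an SDK might look like. Every name here (EmpathyEngine, register_source, measure) is invented for illustration; the research outlines the SDK's components, not its API:

```python
# A hypothetical sketch of an application talking to the Empathy Engine SDK.
# All class and method names are invented placeholders, not a specification.

class EmpathyEngine:
    def __init__(self):
        self.sources = {}  # named data sources: keyboard, microphone, camera...

    def register_source(self, name, fetch):
        """Attach a data source; `fetch` returns a 0.0-1.0 empathy reading."""
        self.sources[name] = fetch

    def measure(self):
        """Combine all registered sources into one low-to-high reading."""
        if not self.sources:
            return 0.0
        readings = [fetch() for fetch in self.sources.values()]
        return sum(readings) / len(readings)

engine = EmpathyEngine()
engine.register_source("keyboard", lambda: 0.25)  # e.g. typing-pattern analysis
engine.register_source("history", lambda: 0.75)   # e.g. social-media signals
print(engine.measure())  # 0.5
```

The sketch reflects the deployability requirement: an application registers whatever input channels its platform offers, and the core library reduces them to a single gauge reading.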
Figure 16. Imagining the DEL as an Empathy Engine Software Development Kit

4.1.2.1
Core Language Libraries
The Digital Empathy Language (DEL) will be designed to allow for various forms of detection, some on a surface level, others deeper. The core libraries would define the language through a set of instructions housed in binaries, each covering a particular function related to the detection of empathy. Rather than creating an entirely new language, it is best to use current popular languages to create these instructions; a third-party language such as Java or C++ could therefore be explored. Alternatively, the empathy language rules, once defined, can be redefined for different popular platforms such as Swift, .NET, or Android, allowing for greater penetration into different markets. Many software companies take a similar approach by creating versions of the same software for different platforms, adapted to those particular mediums. JavaScript is an example of seamless integration of a language across a multitude of devices, and, through jQuery, of popularly housing a variation of a language within a language.

4.1.2.2
Function Libraries
Working alongside the core language binaries, these libraries would house all the major functionality the core library allows. Things such as how a certain measurement can be achieved or read, and what that measurement would mean, are defined here. For instance, it is possible to extract information from a device through its keyboard: in a situation where two individuals are chatting online, a stagger in the typing, or repeated deletions in each sentence, can imply that the user is anxious in some way. The function libraries would house information of a mostly psychological nature, translated using the core DEL, explaining what a particular reaction or measurement means. The developer would then utilise this information to meet different needs in different applications. These libraries would further work in connection with APIs housing connectivity options for other devices/nodes from which data can be fetched.

4.1.2.3
Application Programming Interfaces (API)
Once the rules of the language are defined in their various forms for different platforms, further instructions will be defined to allow interaction with the data that can be calculated or extracted from its sources. In order to fetch data from different sources, mostly online or linked, the APIs will be able to communicate with the binaries of the DEL as well as any other sources they can reach. For example, an API can be developed to link the DEL to a wearable wristband device, collect its data, and translate that data back to the developer for further use. This would require different manufacturers to link their devices' APIs to the core binaries of the DEL, which can be achieved by making the core language an open-source project available to all.

4.1.2.4
Integrated Development Environment (IDE)
In order to facilitate the developer/designer of future applications, it would be necessary to provide an interface of sorts to serve as the foreground for all programming activity. Similar to how Xcode functions for iOS and Mac software development, an application (which for the sake of this research we shall name Insight) will be provided with the DEL core binaries. Insight will allow the developer to communicate successfully, and with ease, with a variety of platforms and nodes through their respective APIs loaded into its interface. To help the developer ensure all their applications adhere to the core values/rules/instructions of the DEL, it is necessary to provide a debugging facility with Insight; this can function separately or be incorporated into the application itself.
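The keyboard-based detection described for the function libraries, where deletion counts and typing stagger are read as signs of anxiety, could be sketched as follows; the thresholds and scoring are illustrative assumptions only:

```python
# A sketch of keyboard-based anxiety detection: heavy deletion and long
# pauses between keystrokes are treated as anxiety cues. The 2-second
# pause threshold and the scoring formula are invented for illustration.

def typing_anxiety(keystrokes):
    """Score 0.0-1.0 from (timestamp, key) events; higher suggests anxiety."""
    if len(keystrokes) < 2:
        return 0.0
    deletions = sum(1 for _, key in keystrokes if key == "BACKSPACE")
    deletion_rate = deletions / len(keystrokes)
    gaps = [b[0] - a[0] for a, b in zip(keystrokes, keystrokes[1:])]
    stagger = sum(1 for g in gaps if g > 2.0) / len(gaps)  # pauses over 2 s
    return min(1.0, deletion_rate + stagger)

calm = [(0.0, "h"), (0.2, "i"), (0.4, "!")]
anxious = [(0.0, "h"), (3.0, "i"), (3.2, "BACKSPACE"), (6.5, "BACKSPACE")]
print(typing_anxiety(calm))     # 0.0
print(typing_anxiety(anxious))  # 1.0
```

A function library of this kind would return the raw score, leaving the psychological interpretation, what a high reading means in a given application, to the layer above it.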
4.1.2.5
Technical Documentation
A library of technical information, in the form of documentation, will be prepared for the DEL, housing information on all the different binaries, functions, rules, instructions, and possibilities.

4.1.2.6
Sample Deployments
Short and simple deployments of the DEL will be provided as prototypes to all users so they can better understand its basic functions. These would go hand in hand with the documentation, allowing the documentation to serve as a tutor as well as a guide for future reference.

4.1.2.7
Online Presence and Community
Furthermore, it is necessary that an online presence be secured for the DEL and Insight: one where current and future developers can download the core libraries, acquire details about the product (in this case the SDK and its accompanying tools), access all technical libraries and documentation, and, above all, access a community of similar users. This presence would serve as the primary base of operations for all current and future work on the DEL, with older versions available to support older devices and applications, as well as newer libraries and even beta or nightly builds developed to test the ground for new technology and functions.

4.1.3
Elaborating Scenarios
In light of this newly developed foundation of an SDK, it is now possible to imagine what the previously listed scenarios would be like and how they would be implemented. The following table describes each of the previously defined scenarios with the vision of an SDK in mind. An additional SDK Introduction scenario can now be imagined as well:
Scenario
Online
Offline
Method
Dating Scenario
Conducted through a chatting interface utilising technology such as audio, video, deletion detection, type stagger, and information gathered from connected devices such as wearables, mobile phones and also from connected sources such as social media
Done through a means of several devices working in congruence with social media connectivity between the two parties, all information would have to be shared privately
Emoji’s or an Avatar as a language of expressing a measurement, mobile interface, wearable interface, software integration
Medical Scenario (Doctor Patient)
Not Applicable
An interface between doctorpatient (or patient family) can be created by allowing for a link between the doctors peripheral devices that would be part of his daily use and the devices attached to the patient. It would employ a plethora of nodes to interface with from audio, video to real-time data collection. This model would change for patient family interaction as a different kind of real-time data would be collected, slightly similar to the previous scenario
A culmination of devices ranging from augmented reality glasses fetching data from patients wearable devices (wearable ID tag bracelet linked to other peripherals), support for audio feedback/advice, and also a link with a tablet provided to the doctor relaying information on a wider scale, rest is software integration
Medical Scenario (Nurse Patient)
Not Applicable
Similar interface as with doctor but more related to patients needs fetched from patients own wearable bracelet. device would further fetch information related to patient from all other peripheral devices attached to patient, the subsequent data would be analysed and specific advice sent to the nurse to respond accordingly
Augmented reality glasses with perhaps an earphone allowing for discrete relaying of patient information; mostly software integration. A lesser version of the more detailed device meant for doctors.
Car Transaction
Not Applicable
Trading party would allow for buyer to test drive vehicle which would allow the vehicles onboard system to track usage, this would be relayed back to the owner as a usage report of how the interested individual used the vehicle and how they would possibly use it in the future. The data would be acquired from a variety of nodes available to the vehicle
This scenario calls for a very deep integration of the DEL in the vehicles software allowing it access to all nodes available such as brakes, seat, steering wheel, acceleration etc. Various forms of data can be extracted from each node allowing for a rich report to be generated
SDK Introduction
A preview of the SDK acted out through an online platform, video, or animation, in which the workings of the SDK, its branding, and its target audiences are showcased along with a few working prototypes.
Not applicable; a work of pure fiction.
Done through a series of scripted visuals, which could be acted out or animated to explain the workings of the SDK. Alternatively, a mock website could be prepared, offering a mock SDK for download with an active community.
Table 4. Further envisioning scenarios around SDK
The initial phase of the diegetic prototype creation dealt with defining the parameters of execution; in the case of this research, the chosen route was an SDK housing the Empathy Engine, which would detect, analyse, and possibly manipulate empathy-related data. The second phase deals with defining the rules of interaction.
4.2 Phase II
4.2.1 Rules of Interaction
The earlier assumptions for scenarios around the SDK will now be further discussed, elaborated, and evaluated. The SDK approach allows for a larger spectrum of devices and connectivity. As part of envisioning the possible solutions, the routes that have been defined run through either:
• Software Integration/Assistance, or
• Hardware Deployment
4.2.1.1 Software Integration
By software integration I mean the use of software as the primary basis for analysing the empathic data and reporting to the user. In some cases this may be the only basis, while in others it would be of an assistive nature. For instance, in the online dating scenario it is possible to use on-board hardware (via a computer or device) and analyse all findings through software alone. Any interaction between the dating partners would therefore go through a series of detailed internal processes before being reported back to either user.
4.2.1.2 Hardware Integration
This route focuses on the development of certain hardware as a peripheral engaging with the senses or other fetchable data. It would work in conjunction with either embedded or linked software integration to produce viable results. For example, the medical scenarios previously mentioned both imply the use of wearable devices, with a combination of software integrated on both the devices and other linked hardware to report to the medical staff on the patient's empathic state as well as their physical state.
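Since the medical scenarios rest on this hardware route, the fusion of wearable readings into a discreet cue for staff can be sketched, again purely as illustrative fiction: the readings, thresholds, and function below are all invented for this sketch.

```python
# Illustrative fiction: fuses readings from the imagined wearable bracelet
# into a short spoken cue for the on-ear device. All names and thresholds
# are invented; no real medical device exposes this interface.
def empathic_cue(readings, hr_limit=100, distress_limit=0.7):
    """Turn a patient's wearable readings into a discreet advice string."""
    cues = []
    if readings.get("heart_rate", 0) > hr_limit:
        cues.append("patient agitated; speak calmly")
    if readings.get("distress", 0.0) > distress_limit:
        cues.append("high distress; reassure before examining")
    return "; ".join(cues) if cues else "no empathic flags"

# Bracelet feed for a distressed patient, then for a calm one.
print(empathic_cue({"heart_rate": 112, "distress": 0.85}))
# → patient agitated; speak calmly; high distress; reassure before examining
print(empathic_cue({"heart_rate": 72, "distress": 0.1}))
# → no empathic flags
```

The point of the sketch is the division of labour the scenario implies: the wearables only collect, the software fuses and judges, and the on-ear hardware merely relays the result privately.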
4.2.2 So what's possible?
Drafting the possibilities of the technology could more or less be left to the imagination; the range of possible, or seemingly impossible, results could be infinite. The challenge is to find the highly probable among them. The scope of the technology has already been defined: it has to be taken for granted as an everyday item. Radical ideas that would otherwise be considered science fiction can therefore be set aside, since they would also rely on a heavy amount of cultural change. That said, it is imperative that the source of the idea remain strongly rooted in science fiction; for this to be a true design fiction, the fictive element cannot be ignored.
The following is a list of possible integrations of the Empathy Engine SDK, taking into account software and hardware possibilities:

Software Integration

Application
• Mobile Application: works both to relay and to fetch data/information; would be obtrusive
• Computer Application: functions similarly to the mobile application but can be designed to show more detailed information
• Supporting Application: works in tandem with other devices or applications, such as in a car or a house

Portable
• Widget: a small, unobtrusive application kept in the background
• Plugin: similar to the widget, but more inclined towards extending an application with further functionality

Hardware Integration

Wearable
• Bracelet: worn primarily to collect data
• Glasses: allow information to be relayed discreetly through augmented reality
• Necklace: worn primarily to collect data
• Ring: worn primarily to collect data, but can also house particular gestural controls
• Earphone: allows information to be relayed discreetly from attached/linked hardware or software

Utilitarian
• Tablet: a device designed solely for empathy
• Surface: applied to a certain area
• Detector: similar to the Tablet option, but could also be worn or available in another, more utilitarian form

Table 5. Plausible integrations of the Empathy Engine SDK
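As a purely fictional flavour of the plugin-style software integration listed above, a toy "Empathy Engine" hook might look like the following. The SDK, its classes, and the word-list heuristic are all invented for this sketch; like the real affect-recognition software the fiction alludes to, it is expected to miss its mark.

```python
# Entirely fictional sketch: the "Empathy Engine" SDK and its API do not exist.
class EmpathyEngine:
    """Toy stand-in for the imagined SDK core: text in, mood estimate out."""
    POSITIVE = {"great", "love", "happy", "thanks"}
    NEGATIVE = {"angry", "hate", "sad", "annoyed"}

    def read_mood(self, text):
        words = set(text.lower().split())
        score = len(words & self.POSITIVE) - len(words & self.NEGATIVE)
        if score > 0:
            return "positive"
        if score < 0:
            return "negative"
        return "unknown"  # like real affect software, it often cannot decide


class ChatPlugin:
    """Plugin-style integration: annotate each message with a mood tag."""
    def __init__(self, engine):
        self.engine = engine

    def annotate(self, message):
        return f"{message} [{self.engine.read_mood(message)}]"


plugin = ChatPlugin(EmpathyEngine())
print(plugin.annotate("I love this"))  # I love this [positive]
```

The plugin form matters here: the engine stays a black box, and the host application only consumes its mood tags, which mirrors how the table above separates the SDK core from its many thin integrations.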
4.3 Phase III
The final phase of preparing the diegetic prototypes was executing the previously defined scenarios. The task was not without its own problems. As previously mentioned, time constraints meant many scenarios were reduced or removed; those that remained were executed to be as plausible as possible in the time available. Four scenarios were cultivated as diegetic prototypes, each with its own strengths and approach to expressing the fiction. The key aspect of all of these prototypes has been their detachment from fantasy, their close proximity to the fiction, and above all a characteristic of being mundane.
4.3.1 Video: Online Dating Scenario
This scenario involved an online interaction between two strangers on a dating website/application. The goal was to show off the fictional psychophysiological reading capabilities of the SDK through programmed websites and applications. The scene was prepared by scripting and storyboarding a short skit, which was executed using video editing techniques and motion graphics to add an element of science fiction to the visuals.
Both characters cycle quickly between emotions during the course of their conversation. These are signalled through the emojis on their application screens as the software struggles to judge their emotions and present them at run-time. To add a sense of mundanity, the setting is an everyday interaction online. In-video animations show their 'textual' conversation along with the software reading their emotions. Errors are purposely placed to suggest the software missing its mark on occasion, much like most average facial recognition software today.
Figure 17. Screen captures from the video recording of the Dating Scenario diegetic prototype
4.3.2 Product Placement: Mini Cooper Advertisement
For the Transaction Scenario it was realised that time constraints would not allow a video to be created. It was therefore re-imagined as fictional product placement in the form of advertisements. The automobile brand Mini Cooper was selected after examining a line-up of current and previous automobile advertisements. The Mini brand was chosen for its creative approach to advertising, which touches on the human side of its cars; it has long employed tag lines that give its cars an attitude. The advertisements were designed using photo manipulation techniques to give the illusion of being real Mini Cooper adverts. They carry the tag line "There's a friend under the hood", suggesting the car to be more than a machine, supported by creative puns around the same theme. Two adverts were designed, along with fictional placement in real-life mock-ups. They suggest the SDK being provided in the cars, allowing for extended capabilities such as understanding the driver's needs, interests, and moods. For example, the car could suggest music according to the mood of its occupants.
Figure 18. Fake product placement for Mini Cooper UK with Empathy Engine SDK built-in
Figure 19. Fake Mini Cooper advertisement on left compared with current original advertisements on right
Figure 20. Fake advertisements placed in real life scenes as mock-ups
The Medical Scenarios were both a means to imagine the SDK in a utilitarian device. This was envisioned as an on-ear device housing an earphone, camera, and processor, to be worn by doctors as they analysed patients. Ideally the device was imagined to be linked through Bluetooth with all other apparatus connected to the patient, with data fed to it through a digital wrist band replacing the medical wrist bands currently in use. All this data would be processed to suggest, through private audio feedback to the doctor, the patient's current mood. Furthermore, since it is all software, it could also be linked to a custom-supplied doctor's tablet giving detailed readings, information, guidance, etc.
4.3.3 Product Manual: Technical Manual for Device
The product was designed in 3D imaging software and later printed for use in a scenario. Due to time constraints, the final product was eventually reimagined as a fictional technical manual housing illustrations of the apparatus with guidance on its usage.
Figure 21. Technical Manual designed for fictional medical apparatus to aid doctors in detecting empathy in patients
4.3.4 Video: SDK Introduction
The final scenario consisted of a fabricated introduction video. The idea was to present the device as a product of research being conducted by a group of indie developers and designers. As with the Dating Scenario, a script was prepared and actors were employed to act out the scenes. A fictional workshop space was prepared as the set, with each actor given a fictional role; all were presented as experienced individuals in their respective fields. The video was supported with motion graphics and animations to express the functionality of the SDK in detail. It was imagined as an introduction of the product to the world 'as if it were real'. Videos of this nature are used throughout the design and development world to showcase expertise, knowledge, new technologies, etc.
Figure 22. Screen captures from the video recording of the SDK Introduction diegetic prototype
Figure 23. Product Branding for Empathy Engine SDK
05 Testing Phase
The final aspect of this research involved testing the diegetic prototypes. This was done by uploading the videos to YouTube and Vimeo and sharing them on social media to gauge public response. The work was portrayed, as far as possible, as genuine. The pretence was maintained only until a discussion was established, after which the fiction was revealed to all participants; this further added to the discussion, shedding new light on the prospects of such technology.
5.1 Results
The videos were propagated online through Facebook, Twitter, and Reddit. Using the online social analytics application SharedCount, it can be seen that the videos encountered a fair amount of activity (Table 6).
              Shares  Likes  Comments  Tweets  Promotions  Total
SDK Intro.        11     74        32      17          21    155
Dating Video       5      4         3       6           0     18

Table 6. Results of social media interactions.
Most activity was seen on Facebook, where the majority of interaction happened. It should be noted that participants there were not aware of the fiction until afterwards. Reddit also saw minor activity, with the article being promoted 21 times on its Futurology subreddit; due to community rules on Reddit, the video was explained there as a proof of concept, and indeed fiction, from the start. This shows that, although not a significant social phenomenon, it definitely raised eyebrows. In comparison, the Dating Scenario saw very little activity.
5.2 Discussion
The online interactions allowed debate to be raised on the prospect of technology with empathic applications. Most of the discussion on Facebook was acknowledgment of and praise for the technology, with little debate; Reddit's community, on the other hand, engaged more readily in discussion. The arguments raised concerned the possible negative implications of empathy detection. Suggestions were made as to its subtle applications, which in the future could collectively come to be regarded as annoyances. For instance, why require an empathic connection with devices at all? Commenters argued that empathy is a trait lacking in many humans, and that giving computers empathic capabilities towards humans could be "odd". This was backed by the observation that computers universally have empathy towards each other, since they all speak a similar language and have a similar purpose; one could imagine a future in which they have a unique 'cybernetic bond' between themselves, if such a future were to create itself. As a counter-argument it was suggested that perhaps, over time, empathy might not need to be universal in an "age of contextual awareness and customisability". That said, a range of ethical concerns still dominated the discussion.
A major concern with any research is its ethics: the boundaries that might or might not have to be crossed. Design fiction is no different; especially when imagining impossible futures, their very possible implications become starkly apparent.
5.2.1 Ethical Concerns
A primary concern with a Voight-Kampff type device, or an SDK that measures empathy, is privacy. This is not restricted to empathy but applies to all of our devices and technologies. Where does the line blur between what is private in our lives and what is not? What if one were not interested in expressing one's moods? Humans reserve the right and the ability to control their feelings, smiling through false teeth; with a machine that could detect your mood, how much of that can we as humans restrict? Furthermore, what would be the applications for such devices? The Voight-Kampff test was a militaristic utility used in interrogation; could that be a possible use for an empathy detector? The diegetic prototypes have expressed alternative usages, but many more can be imagined, and nothing stops one from imagining the worst. As part of the discussion on Reddit, one user was quite adamant about the depicted technology's unethical usages: applications that turn hostile towards their users when the users are not in a good mood; email that could be delayed if the application knew you were being spiteful; programs that interact only if they knew you were happy. People often get agitated by pop-up advertisements; imagine them staying active until you calmed down, or video games that increased in difficulty the more relaxed the player was, never giving you time to rest. A deeper ethical concern is the presenting of a design fiction as if it were fact. Design fictions articulate desires for new futures of everyday life, but their fictional status brings forth desires that bear no accountability in the present (Gonzatto et al., 2013). To make a design fiction is, in essence, to lie.
06 Conclusion
The above research has taken an idea from fiction and explored its possibilities in today's technology. Throughout the process, design has been prominent not only as an application but also as a means of provocation. But where does this all go?
6.1 Current Applications
With its flaws and benefits, the technology imagined here is not entirely new to research. Although this research has concerned fictional technology, recent activity can be seen in similar ventures exploring empathy in our daily digital lives. The Oculus Rift virtual reality headset has generated a great deal of interest since its conception; although not yet ready for the mass public, development has been vigorously underway in user experience design around the device. One particular application is VR cinema, as imagined through Oculus Story Studio's VR projects. Their recent premiere of Henry has received much attention and acclaim from leading technology magazines and social media. Writing for The Verge, Emily Yoshida (2015) explains the experience of watching Henry through the Oculus Rift as being "not better or worse than film, [but] a separate thing". She continues: "Oculus is touting Henry as a more character-oriented experience, a more overt exploration into how we might empathise with fictional beings we encounter in VR". The idea is simple: we experience actors through cinema, often empathising with the situations we see. Through VR, Story Studio imagines creating "narrative VR experiences" (Yoshida, 2015), allowing us to empathise through a digital device. It is not detecting empathy, but instead provoking it in us through experiential stimulation.
6.2 Design Fiction as a Research Method
In all of the above, one primary thing can be observed: Design Fiction has the ability to be a strong contender as an alternative research method. Denison (2013) has noted that "part of a designer's skill set is the ability to anticipate conflicts and obstacles in the design process…see beyond presumption, and the obvious parameters of scope". He continues that "asking, 'What if?' can often be the starting point for thinking beyond the obvious". Akın (2014), quoting Bleecker, notes that thinking about the future allows designers to explore new kinds of worlds and ways to stimulate their creativity in a more fictional environment (Bleecker, 2010). It can serve as a means to "learn from foresight activities" and to see the future in a "more rational way, with support from the facts of today". He justifies fictional scenarios as a way of imagining the future and crafting the present: "an effective way to think about alternative futures…powerful in simulating trends, technological developments and social changes".
Google has been a forerunner in technological advancements in web and applied design, so it is not surprising that its Creative Lab has been active in creating design fictions and seeing them through from conception to creation. Klich (2014), writing for Entrepreneur, describes it as "[a] mini think-tank" "charged with humanising the multinational tech giant through the power of storytelling". One of their most sought-after projects, Google Glass, is a product of this think-tank. Narrative may not be the word designers use to describe the design process (Denison, 2013), but creating stories around an end product can allow designers to express their designs more fluently. As part of the designer's toolbox, design fiction also provides experience in adjusting to change, anticipating outcomes, and planning for the unexpected in a creative yet systematic process of "what if" (Denison, 2013, p.138).
6.2.1 Is there potential?
Shortly put, yes. Design Fiction has the potential to allow unique perspectives in design, and discussion, to flourish through its creative provocation, as Bleecker (2010) and others have pointed out. But it also raises problems, mainly due to its young, untested nature and certain ethical qualms. To begin research using design fiction, one first needs to create a design fiction. As stated at the start of this dissertation, all the research conducted here was done in the blind, because much information was unavailable owing to the fictional, albeit impossible, nature of the topic in focus. Things had to be imagined and assumed, which in a research environment may not always be the most efficient way to go.
In a purely design-oriented, future-focused manner of research, one that allows for philosophical debate as well as technological ingenuity (and enough room for creative questioning), it is possible to see Design Fiction as a very strong research method. Or, as Gonzatto et al. (2013) put it, "The future in design fictions…could be approached as an open-ended possibility of the present, an ideology of liberation". It can be stretched as far as the imagination, which in many ways can trump its otherwise apparent flaws. One interesting use could be incorporating it among other design research methods to imagine potential futures for a business or for policy. Either way, it is a method that will not suit every kind of research, but rather those that are open to discussion and otherwise limitless in direction.
6.2.2 Closing thoughts
As a closing remark, I think quoting Star Trek (1966, 1991) makes sense in a Design Fiction dissertation. Spock's father Sarek called the future "the undiscovered country" (Meyer, 1991); the right research methods can help in finding that country, but one has "to boldly go where no man has gone before" (Roddenberry, 1966).
R References and Bibliography
R.1 References

Abrioux, Y., (2014). Human Without Qualities. European Journal of English Studies, 18(March 2015), pp.135–157. Available at: http://www.tandfonline.com/doi/abs/10.1080/13825577.2014.917003.
Akın, İ.F., (2014). Future Scenarios as an Appropriate Way for Designers to Anticipate the Future. Lucerne University of Applied Sciences and Arts.
Bleecker, J., (2009). Design Fiction: A Short Essay on Design, Science, Fact and Fiction. Near Future Laboratory, (March), p.49. Available at: http://www.nearfuturelaboratory.com/2009/03/17/design-fiction-a-short-essay-on-design-science-fact-and-fiction/.
Bleecker, J., (2010). Design Fiction: From Props to Prototypes. Negotiating Futures – Design Fiction. Available at: http://drbfw5wfjlxon.cloudfront.net/pdf/DesignFiction-PropsAndPrototypesSwissDesignNetworkFormat.pdf.
Blythe, M., (2014). Research Through Design Fiction: Narrative in Real and Imaginary Abstracts. In Proc. CHI 2014. p.10.
Bullmer, K., (1975). The Art of Empathy. New York: Human Sciences Press.
Campbell, R.J., Kagan, N. & Krathwohl, D.R., (1971). The development and validation of a scale to measure affective sensitivity (empathy). Journal of Counseling Psychology, 18(5), pp.407–412.
De Vignemont, F. & Singer, T., (2006). The empathic brain: how, when and why? Trends in Cognitive Sciences, 10(10), pp.435–441.
Decety, J. & Jackson, P.L., (2004). The Functional Architecture of Human Empathy. Behavioral and Cognitive Neuroscience Reviews, 3(2), pp.71–100.
Denison, E.S., (2013). When Designers Ask, "What if?" Ohio State University.
Dick, P.K., (1968). Do Androids Dream of Electric Sheep? London: Rapp & Whiting.
Gonzatto, R.F. et al., (2013). The ideology of the future in design fictions. Digital Creativity, 24(1), pp.36–45. Available at: http://www.tandfonline.com/doi/abs/10.1080/14626268.2013.772524.
Grand, S. & Wiedmer, M., (2006). Design Fiction: A Method Toolbox for Design Research in a Complex World. Design Research Society, pp.1–25. Available at: http://www.designresearchsociety.org/docs-procs/DRS2010/PDF/047.pdf.
Halpern, J., (2003). What is Clinical Empathy? Journal of General Internal Medicine, 18, pp.670–674.
Hoffman, M.L., (1991). Is Empathy Altruistic? Psychological Inquiry, 2(2), pp.131–133.
Hurston, Z.N., (1942). Dust Tracks on a Road. New York: Harper Collins.
Jones, D. (dir), (2009). Moon. Motion Picture. Sony Pictures Classics. United Kingdom.
Jonze, S. (dir), (2013). Her. Motion Picture. Warner Bros. Pictures. United States.
Kalisch, B.J., (1973). What is Empathy? The American Journal of Nursing, 73(9), pp.1548–1552.
Klich, T.B., (2014). Google Creative Lab's Robert Wong On the Power of Empathy in Tech Advertising. Entrepreneur. Available at: http://www.entrepreneur.com/article/240208.
Lindley, J., (2015). A pragmatics framework for design fiction. In The Value of Design Research: 11th European Academy of Design Conference. Boulogne-Billancourt.
Meyer, N. (dir), (1991). Star Trek VI: The Undiscovered Country. Motion Picture. Paramount Pictures. United States.
Neumann, D.L. & Westbury, H.R., (2011). The Psychophysiological Measurement of Empathy. In D.J. Scapaletti, ed. Psychology of Empathy. Nova Science Publishers, Inc, pp.1–24.
Norman, D., (2007). The Design of Future Things. New York: Basic Books.
O'Leary, Z., (2010). The Essential Guide to Doing Your Research Project. Los Angeles: Sage.
Plutchik, R., (2001). The Nature of Emotions: Human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice. American Scientist, 89(4), pp.344–350.
Roddenberry, G. (pro), (1966). Star Trek: The Original Series. Television. Desilu Productions & Paramount Pictures. Distributed by CBS. United States.
Rogers, C.R., (1958). Characteristics of a helping relationship. Personnel and Guidance Journal, 37(13).
Scott, R. (dir), (1982). Blade Runner. Motion Picture. Warner Bros. United States.
Spiro, H.M., (1992). What is empathy and can it be taught? Annals of Internal Medicine, 116(10), pp.843–846.
Sterling, B., (2009). Design Fiction. Interactions, 16, pp.20–24.
Stoate, R., (2012). "We're not programmed, we're people": Figuring the caring computer. Feminist Theory, 13(2), pp.197–211.
Velazco, C., (2015). Living with Cortana, Windows 10's thoughtful, flaky assistant. Engadget. Available at: http://www.engadget.com/2015/07/30/cortana-windows-10/.
Yoshida, E., (2015). Are humans allowed in VR storytelling? The Verge. Available at: http://www.theverge.com/2015/8/3/9088693/oculus-story-studio-henry-virtual-reality-review.

R.2 Bibliography

Almog, S., (n.d.). When a Robot Can Love: Blade Runner as a Cautionary Tale on Law and Technology. In Human Law and Computer Law: Comparative Perspectives. pp.181–195.
Au-Yeung, K., (2012). Sci-Fi to Sci-Fact: An Enquiry into Science-Fiction's Influence on Product Design. Graduate. Loughborough University.
Bassett, C., Steinmueller, E. & Voss, G., (2013). Better Made Up: The Mutual Influence of Science Fiction and Innovation. Nesta Working Paper, 7(13).
Bergen, H., (2014). Moving "Past Matter": Challenges of Intimacy and Freedom in Spike Jonze's Her. Year, 8(17), pp.1–6.
Blakeslee, S., (2006). Cells That Read Minds. New York Times, pp.10–12.
Blythe, M.A. & Wright, P.C., (2006). Pastiche scenarios: Fiction as a resource for user centred design. Interacting with Computers, 18, pp.1139–1164.
Bojarski, T., (2014). Samantha's Dilemma: A Look into a Life with AI. Rochester Institute of Technology.
Booch, G., (2015). Of Boilers, Bit, and Bots. IEEE Software, pp.11–13.
Brooker, W., (2005). The Blade Runner Experience. London: Wallflower.
Brooks, P., (1984). Reading for the Plot: Design and Intention in Narrative. In Representations. London: Harvard University Press, pp.1–237.
Bukatman, S., (1997). Blade Runner. London: British Film Institute.
Burnam-Fink, M., (2015). Creating narrative scenarios: Science fiction prototyping at Emerge. Futures, pp.8–15. Available at: http://linkinghub.elsevier.com/retrieve/pii/S0016328714001992.
Chapman, J., (2005). Emotionally Durable Design. London: Earthscan.
Christmas, A.J., (2013). Augmented intimacies: posthuman love stories in contemporary science fiction. University of Leeds.
Coulton, P., (2009). Empathy in the Internet of Things. In 27th BCS HCI Conference: The Internet of Things: Interacting with Digital Media in Connected Environments. BCS Learning and Development Ltd., pp.1–4.
Dalton, J., (1998). Chasing Replicants at Home: The Armchair "Blade Runner." A Journal of Performance and Art, 20(3), pp.118–121.
Duclos, D., (2002). Dehumanization or the Disappearance of Pluralism? Diogenes, 49, pp.34–37.
Dunne, A. & Raby, F., (2001a). Design Noir: The Secret Life of Electronic Objects. Spectrum, 1, p.176. Available at: http://www.amazon.com/Design-Noir-Secret-Electronic-Objects/dp/3764365668.
Dunne, A. & Raby, F., (2001b). Design Noir: The Secret Life of Electronic Objects. Basel: August/Birkhäuser, pp.58–72.
Evans, M., (2011). Empathizing with the Future: Creating Next-Next Generation Products and Services. The Design Journal, 14(2), pp.231–252.
Fry, T., (2009). Design Futuring. Oxford: Berg.
Hongladarom, S., (2013). Ubiquitous Computing, Empathy and the Self. AI and Society, 28(2), pp.227–236.
Hrafnkelsdóttir, H.B., (1994). Empathy Will Save Us - Eventually: A Reading of Nine Novels by Philip K. Dick. University of Iceland.
Jackson, P.L., Meltzoff, A.N. & Decety, J., (2005). How do we perceive the pain of others? A window into the neural processes involved in empathy. NeuroImage, 24, pp.771–779.
Johnson, B.D., (2009). Science Fiction Prototypes Or: How I Learned to Stop Worrying about the Future and Love Science Fiction. Intelligent Environments, p.3.
Johnson, B.D., (2010). Science Fiction for Scientists! An Introduction to SF Prototypes and Brain Machines. Creative-Science 2010 (CS'10), 2010(July), pp.1–9.
Johnson, B.D., (2011). Science Fiction Prototyping: Designing the Future with Science Fiction. 1st ed.
Kadish, D. & Kummer, N., (2012). The Empathy Machine, p.8.
Kain, J.F., (2006). The Human Situation in Creators of Life and Their Creations.
Kinsman, R. & Kinsman, C., (n.d.). Voight-Kampff Game Rules.
Kirby, D., (2010). The Future is Now: Diegetic Prototypes and the Role of Popular Films in Generating Real-world Technological Development. Social Studies of Science, 40(1), pp.41–70.
Koopman, S., (2014). A Simulated Consciousness in Silicon: Ray Kurzweil's Transhumanist Theories Applied to Her. Utrecht University.
Kraus, B., (n.d.). The Man is Made Machine: The Human (and Humanoid) Subject in Philip K. Dick's Do Androids Dream of Electric Sheep?
Lee, A.S. & Kim, Y., (2011). The Everyday Story of Imaginative Time & Space, p.17.
Lev, P., (1998). Whose future? Star Wars, Alien, and Blade Runner. Literature/Film Quarterly, 1(26).
Lindley, J. & Coulton, P., (2014). Modelling Design Fiction: What's The Story? StoryStorm Workshop at ACM Designing Interactive Systems 2014.
Lindley, J., Sharma, D. & Potts, R., (2014). Anticipatory Ethnography: Design Fiction as an Input to Design Ethnography. In EPIC: Advancing the Value of Ethnography. pp.237–253.
Linehan, C. et al., (2014). Alternate endings: Using fiction to explore design futures. In Proceedings of the 32nd International Conference on Human Factors in Computing Systems (CHI 2014). pp.45–48. Available at: http://dl.acm.org/citation.cfm?doid=2559206.2560472.
Margolin, V., (2007). Design, the Future and the Human Spirit. Design Issues, 23(3), pp.4–15.
Meadows, J. & Drexler, D., (2011). Designing the Future. 1st ed.
Ochs, M., Pelachaud, C. & Sadek, D., (2008). An empathic virtual dialog agent to improve human-machine interaction. Proceedings of the 7th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2008), pp.89–96. Available at: http://jmvidal.cse.sc.edu/library/AAMAS-08/proceedings/pdf/paper/AAMAS08_0526.pdf.
Park, S.-H., (2012). Dystopia in the Science Fiction Film: Blade Runner and Adorno's Critique of Modern Society. International Journal of Contents, 8(3), pp.94–99.
Sardar, Z., (2010). The Namesake: Futures; futures studies; futurology; futuristic; foresight: What's in a name? Futures, 42(3), pp.177–184. Available at: http://dx.doi.org/10.1016/j.futures.2009.11.001.
Schrammel, J., Geven, A. & Tscheligi, M., (2008). Using Narration to Recall and Analyse User Experiences and Emotions Evoked by Today's Technology. In P. Desmet, J. van Erp & M. Karlsson, eds. Design & Emotion Moves. Cambridge Scholars Publishing, pp.362–377.
Shedroff, N. & Noessel, C., (2012). Make It So. Brooklyn, N.Y.: Rosenfeld Media.
Smith, D.L., (2014). How to Be a Genuine Fake: Her, Alan Watts, and the Problem of the Self. Journal of Religion & Film, 18(2).
Stein, E. & Stein, W., (1964). On the Problem of Empathy. The Hague: M. Nijhoff.
Tanenbaum, J., Tanenbaum, K. & Wakkary, R., (2012). Steampunk as Design Fiction. CHI, pp.1583–1592.
Teherani, A., Hauer, K.E. & O'Sullivan, P., (2008). Can simulations measure empathy? Considerations on how to assess behavioral empathy via simulations. Patient Education and Counseling, 71(2), pp.148–152.
VanHemert, K., (2014). Why Her Will Dominate UI Design Even More Than Minority Report. Wired, pp.1–11. Available at: http://www.wired.com/design/2014/01/will-influentialui-design-minority-report/ [Accessed May 7, 2015].
Wiegel, A., (2012). AI in Science-Fiction. Aventius Geschichtswissenschaften im Internet, (1), pp.4–8.
Wright, P. & McCarthy, J., (2008). Empathy and experience in HCI. Proceedings of the Twenty-sixth Annual CHI Conference on Human Factors in Computing Systems (CHI '08), pp.637–647. Available at: http://portal.acm.org/citation.cfm?doid=1357054.1357156.
Wu, D., (2012). What Distinguishes Humans from Artificial Beings in Science Fiction World. Blekinge Institute of Technology, School of Planning and Media Design.
A Appendix A
Figures, Illustrations and Tables
A.1
List of Figures and Illustrations
Figure 1.
20th Century Fox, 2002, Scene from science fiction movie Minority Report, online image, viewed 12 August 2014, http://www.rollingstone.com/movies/lists/the-top-20-sci-fi-films-of-the-21st-century-20140515/minority-report-2002-19691231
Pg. 3
Figure 2.
20th Century Fox, 2005, Scene from Star Wars Episode III: Revenge of the Sith, online image, viewed 12 August 2014, http://www.wallpaperup.com/61235/Star_Wars_Obi-Wan_Kenobi_Revenge_of_the_Sith_Commander_Cody.html
Pg. 3
Figure 4.
CBS, 1987, A PADD (Personal Access Data Device) from Star Trek: The Next Generation, online image, viewed 12 August 2014, http://borg.com/2011/07/02/
Pg. 8
Figure 5.
CBS, 1987, Dermal Regenerator from Star Trek: The Next Generation, online image, viewed 12 August 2014, http://inhabitat.com/nasa-develops-star-trek-inspired-gadgets-heal-injured-astronauts-and-eliminate-animal-testing/
Pg. 8
Figure 6. (Illustration)
Differentiating between empathy (relating with someone) and sympathy (acknowledging another's emotion, often followed by assurance)
Pg. 9
Figure 7. (Illustration)
The design fiction process takes from past experience and present knowledge to predict near futures
Pg. 14
Figure 8.
The Ladd Company, 1982, Blade Runner movie poster, online image, viewed 12 August 2014, http://www.impawards.com/1982/blade_runner.html
Pg. 15
Figure 9.
Warner Bros. Pictures, 2013, Her movie poster, online image, viewed 12 August 2014, http://www.impawards.com/2013/her.html
Pg. 15
Figure 10.
Sony Pictures Classics, 2009, Moon movie poster, online image, viewed 12 August 2014, http://www.impawards.com/2009/moon.html
Pg. 15
Figure 11. (Illustration)
Current AI is limited in its capacity for self-evolution compared with human intelligence, though a future self-evolving AI can be imagined
Pg. 16
Figure 12. (Illustration)
Variation of Plutchik's model of emotions; originally Plutchik, R., 1980, Plutchik's Wheel of Emotion, online image, viewed 12 August 2014, https://commons.wikimedia.org/wiki/File:Plutchik-wheel.svg, and Plutchik, R. (2001). The Nature of Emotions: Human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice. American Scientist, 89(4), pp.344–350.
Pg. 19
Figure 13.
Warner Bros., 1982, Scene from the movie Blade Runner, online image, viewed 12 August 2014, http://io9.com/ridley-scott-wont-be-directing-blade-runner-2-after-all-1663554336
Pg. 20
Figure 14.
Warner Bros. Pictures, 2013, Scene from the movie Her, online image, viewed 12 August 2014, https://sillyfunda.wordpress.com/2014/02/17/review-synopsis-her-2013/
Pg. 21
Figure 15.
Sony Pictures Classics, 2009, Scene from the movie Moon, online image, viewed 12 August 2014, https://rymdfilm.wordpress.com/2015/04/06/topp-25-snalla-filmrobotar/
Pg. 23
Figure 16. (Illustration)
Imagining the DEL as an Empathy Engine Software Development Kit
Pg. 27
Figure 17.
Screen captures from the video recording of the Dating Scenario diegetic prototype
Pg. 33
Figure 18.
Fake product placement for Mini Cooper UK with Empathy Engine SDK built-in
Pg. 34
Figure 19.
Fake Mini Cooper advertisement on left compared with current original advertisements on right
Pg. 34
Figure 20.
Fake advertisements placed in real life scenes as mock-ups
Pg. 35
Figure 21.
Technical Manual designed for fictional medical apparatus to aid doctors in detecting empathy in patients
Pg. 35
Figure 22.
Screen captures from the video recording of the SDK Introduction diegetic prototype
Pg. 36
Figure 23.
Product Branding for Empathy Engine SDK
Pg. 36
Indexing Techniques
Macro Definitions
Data Structures
I/O Mechanisms
Core Inputs
Motor Mimicry
Definitions
‘If This Then That’ Rules
Core Values
I/O
Measurement
Psychology
IFTTT Protocols
Major Functionality
Instructions
Functions Library
Calculation Operations
Common Algorithms
Standard Library
Grammatical Instructions
Backus-Naur Form
Rules of Engagement
Mathematical (AND, OR, NOT...)
Expressions
Semantics
Syntax
Core Library
A.2
Tables
Counter Intelligence
Peripheral Nervous System
Peripheral Connectivity
Abstract Data Types
Automated Reasoning
Diagnosis Clauses
Physical Anomaly Recognition
Advanced Peripherals
Treatment Advocation
Mental Anomaly Recognition
Real-time Evaluation
Table A. Breakdown of all potential functionality of the Empathy Engine SDK
Expressions
Neuroimaging
Core Outputs
Micro Definitions
Primitive Data Types
Data Processing
Loops (DO, WHILE, IF...)
Visual Anomaly Recognition
Audio Anomaly Recognition
History Analysis
Visual Uplink
Neural Uplink
Data Fetcher
Response Calculator
Configurations
Advanced Links
Primary Link
Protocols
Information Stacks
Web-Based Interaction
Service Deployment
Protocols
Tools
Debugger
Help
Interaction Protocols
Review Protocols
Information Stacks
Interface
Analysis
Assistance
Integrated Development Environment (IDE)
I/O Connectivity
Basic Links
Application Programming Interfaces (APIs)
Community Linkage
Report Generation
Switches
Rules of Engagement
Report Generator
Syntax Translator
Peripheral Devices
Audio Uplink
Tooltips
Handles
Linkages
Database Linkage
Definitions
Evaluation Mechanisms
Dynamic Uplinks
Textual Uplink
Interaction Techniques
Templating
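The 'If This Then That' rules and IFTTT protocols listed in Table A could be imagined, purely as a design-fiction sketch, as condition-and-response pairs evaluated against a detected emotional state. Everything below (the `Rule` structure, emotion names, and thresholds) is invented for illustration and is not part of any real SDK.

```python
# Hypothetical sketch: the "If This Then That" rules from Table A
# imagined as condition/action pairs evaluated against a detected
# emotional state. All names and thresholds are invented.

from dataclasses import dataclass
from typing import Callable, Dict, List

Context = Dict[str, float]  # e.g. {"sadness": 0.8, "joy": 0.1}

@dataclass
class Rule:
    condition: Callable[[Context], bool]  # the "If This" part
    action: str                           # the "Then That" response

def evaluate(rules: List[Rule], context: Context) -> List[str]:
    """Return the actions of every rule whose condition matches."""
    return [rule.action for rule in rules if rule.condition(context)]

rules = [
    Rule(lambda c: c.get("sadness", 0) > 0.7, "offer_support"),
    Rule(lambda c: c.get("joy", 0) > 0.7, "share_enthusiasm"),
]

actions = evaluate(rules, {"sadness": 0.8, "joy": 0.1})
print(actions)  # -> ['offer_support']
```

A developer using the fictional SDK might register such rules so that an application responds supportively when its host's detected sadness crosses a threshold.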
B Appendix B
Scripts
B.1
Dating Scenario Script
CAST
Nuri Kwon as NURI and kwon88
David Pérez Ojeda as DAVID and perez82

VIDEO AVAILABLE AT https://vimeo.com/134298669

Fade In:

INT. Living Room, casual setting - DAY

It's a Sunday. NURI has just finished her breakfast and is sitting on her sofa, browsing on her laptop.

INTERCUT TO LAPTOP. She has the DateMe website open and notices a notification from someone.

ZOOM IN, LAPTOP. The screen shows DAVID (perez82). Her mouse cursor moves to the 'Start Chatting' button and clicks.

INTERCUT TO SOFA, DIFF. ANGLE. NURI anxiously waits for the application to run, then begins typing.

POST. As she waits, we see the application booting on the side as an in-video animation. The application scans her face using facial recognition.

CUT TO:

INT. Kitchen island - DAY

DAVID is in his kitchen having a late breakfast; he's casually dressed and eating some cereal.

ZOOM IN, PHONE. His phone buzzes; he has a new notification on his dating app.

POST. NURI'S message appears floating on the side of the screen along with her emotion as an in-video animation.

kwon88 (in chat)
hi... [SMILING]

ZOOM OUT, KITCHEN. DAVID picks up the phone to have a look.

ZOOM IN, PHONE. The DateMe app loads and the screen shows NURI's message in chat.

ZOOM OUT, KITCHEN. DAVID begins to reply anxiously.

POST. DAVID'S message appears beneath NURI'S on-screen message, pushing the previous message higher, along with his emotion. (ALL FUTURE MESSAGES IN POST ARE SHOWN WITH THE SAME ANIMATION)

perez82 (in chat)
Hey there! [SMILING]
I was starting to lose faith in this app! [LAUGHING]

CUT TO:

INT. Living Room, casual setting - DAY

NURI is pleased with the response and continues. She types something on her laptop.

POST. Her typed messages appear below on screen.
kwon88 (in chat)
hahaha [LAUGHING]

kwon88 (in chat)
well i thought you kinda seemed like a nice person [ERROR]

(Some messages are not read properly; instead of giving an emoji they show an error sign with the reason for the error.)

CUT TO:

INT. Kitchen - DAY

DAVID is also pleased with the response and continues. He types something into his phone.

POST. His typed message appears on the side near the phone, with emotion/avatar.

perez82 (in chat)
kinda? [LAUGHING]
haha well I certainly hope I am ;) [WINK]

CUT TO:

INT. Living Room - DAY

Back with NURI; she and DAVID are now chatting casually.

POST. Their messages appear on the side in the same manner.

kwon88 (in chat)
LOL [LAUGHING]
sorry i don't do this alot [EMBARRASSED]

perez82 (in chat)
Me neither [ERROR]

kwon88 (in chat)
hmm I see [CURIOUS]

FADE TO:

INT. Living Room, DIFF. ANGLE - DAY

NURI is now getting tired of chatting; DAVID is starting to feel like a bore as he rants on about some story of his. Her feelings are showing; she seems bored.

POST. DAVID'S messages pop up on the side one after the other, followed by her response.

perez82 (in chat)
we had a gr8 time there [EXCITED]
Sunshine surf evrythng [EXCITED]
It was real fun you should try it sometime [COOL-SMUG]

kwon88 (in chat)
hmmm... [BORED-UNINTERESTED]

CUT TO:

INT. DAVID's Kitchen - DAY

ZOOM IN, DAVID. DAVID notices the change in mood towards his story; he was feeling confident about things but now his confidence is shaken slightly.

POST. NURI's message hovers overhead.
kwon88 (in chat)
hmmm... [BORED-UNINTERESTED]

DAVID (to himself)
Huh?

ZOOM OUT, KITCHEN, DIFF. ANGLE. DAVID starts typing a reply.

POST. DAVID's message is shown above.

perez82 (in chat)
You dnt seem that interested :P [SAD]

CUT TO:

INT. NURI's Living Room - DAY

NURI is taken aback slightly by the change in tone.

POST. NURI's message hovers overhead.

kwon88 (in chat)
hmmm... [BORED-UNINTERESTED]

INTERCUT TO LAPTOP. She is seen scrolling through the chat feed and notices the emojis on the right; they are all showing his mood, and she then realises he can see her mood as well.

ZOOM IN, LAPTOP.

NURI (to herself)
Oh! You can tell that too!

POST. She quickly types a reply; it appears on the side.

kwon88 (in chat)
No no I was just checking my mail [ERROR]
stupid software it just picks what it can [EMBARRASSED]
I'm not bored [HAPPY]

perez82 (in chat)
oh yes this software is kinda cool tho [SMILING]

CUT TO:

INT. DAVID's Kitchen - DAY

DAVID is getting ready to go out for a while; he grabs his keys while still looking at his phone.

POST. NURI's reply on the side, followed by DAVID's.

kwon88 (in chat)
so what kind of music do you like? [SMILING]

perez82 (in chat)
if I told you you'd stop talking to me :P [LAUGHING]

kwon88 (in chat)
really?? now I'm curious [CURIOUS]
???? [ERROR]
perez82 (in chat)
Justin beibr [BIG GRIN]

INTERCUT TO DAVID'S DOOR. DAVID walks out the door laughing and typing at the same time as he locks the door behind him.

kwon88 (in chat)
OMG NO! SERIOUSLY? [SHOCKED]
why wasnt this on your profile :/ [EMBARRASSED]

perez82 (in chat)
Haha relax I'm joking, I'm into ACDC :D [LAUGHING]

FADE OUT.

END SCENE.
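The emotion tags in the script above ([SMILING], [ERROR], and so on) suggest a simple behaviour: each chat message carries the sender's detected emotion, falling back to an error marker when the reading is too uncertain. A minimal sketch of that behaviour, with an invented `tag_message` function and confidence threshold, might look like this:

```python
# Hypothetical sketch of the emotion tags from the Dating Scenario:
# a message is labelled with the detected emotion, or with [ERROR]
# when the classifier's confidence falls below a threshold.
# The function name and threshold are invented for illustration.

def tag_message(text: str, emotion: str, confidence: float,
                threshold: float = 0.5) -> str:
    """Append an emotion tag, or an [ERROR] tag when confidence is low."""
    if confidence < threshold:
        return f"{text} [ERROR: low confidence reading]"
    return f"{text} [{emotion.upper()}]"

print(tag_message("hi...", "smiling", 0.9))
# -> hi... [SMILING]
print(tag_message("Me neither", "embarrassed", 0.3))
# -> Me neither [ERROR: low confidence reading]
```

This mirrors the diegetic detail that "some messages are not read properly and instead of giving an emoji they have an error sign with the reason for the error."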
B.2
SDK Introduction Scenario Script
CAST
Mark Lin as Mark Zhao
Caleb Adamu as Dr. David Adams
Dong Hwan Lee as Matt Kim
Asmar Yusifova as Laila Yusif
VIDEO AVAILABLE AT https://vimeo.com/133649144

Fade In:

INT. Workshop - DAY

ZOOM IN, MATT. MATT is working on his laptop as the camera moves around him.

POST. Background audio begins.

MARK (V.O)
Our devices do many things for us,...

INTERCUT TO LAPTOP. MATT continues to work.

MARK (V.O)
... they help us communicate,...

ZOOM IN, LAPTOP. This time it's another laptop; again activity is shown.

MARK (V.O)
... they entertain us,...

INTERCUT TO LAILA. ZOOM IN, PHONE. LAILA is going through her emails, scrolling across her phone.

MARK (V.O)
... and even help us organise our lives. But at the end of the day they are still just devices...

INTERCUT TO DAVID.
ZOOM IN, PHONE. DAVID is going through his apps.

INTERCUT TO MARK. ZOOM IN, PROTOTYPE. MARK is showing off a prototype device the team is working on.

MARK (V.O)
Electrical circuits, processors, buttons and other interesting things...

INTERCUT TO DAVID. ZOOM OUT, PHONE. DAVID is going through his emails.

MARK (V.O)
... packed into little packages, that we keep in our pockets...

INTERCUT TO MATT. ZOOM IN, KEYBOARD. MATT is typing on his keyboard.

MARK (V.O)
... that we use on a daily basis, on our laps and desks.

CUT TO:

INT. Interview Area - DAY

MARK is talking.

MARK
We give them so much of our time, and honestly your laptop or smartphone is no different than a television or blender!

POST. As an in-video animation, devices are shown to merge in and out: from a laptop, to a phone, to a TV, and then a blender.

CUT TO:

INT. Workshop - DAY

MARK and DAVID are discussing something over the prototype.

MARK (V.O)
Similar parts just working in different ways. But so is the person next to you...

PAN, RIGHT. LAILA and MATT are also in the room. LAILA is shown talking about the prototype.

MARK (V.O)
... or a best friend.

INTERCUT TO MARK AND LAILA. They are both shown laughing over something.

INTERCUT TO MARK AND DAVID. They continue their discussion.

CUT TO:

INT. Interview Area - DAY

DAVID is talking.

DAVID
We as humans function in ways computers are trying to approach. We can empathise with people and situations, and we can put ourselves in people's shoes...
POST. The words DR. DAVID ADAMS and PHD SOCIAL PSYCHOLOGY are shown in the bottom right, animating in and out of view.

DAVID (V.O)
... for a while, where computers can't.

CUT TO:

EXT. Open Sky - DAY

Shoes are shown dangling from power lines beneath a slightly cloudy blue sky.

CUT TO:

EXT. Garden - DAY

ZOOM IN, CLOTHES LINE. A clothes line is wafting in the breeze; it's full of wooden clothes pegs.

CUT TO:

INT. Black Screen - N/A

A wire is heating up; it slowly gets hotter and the screen turns from black to white.

MARK (V.O)
We've all seen the sci-fi visions of devices that are like soul mates, as if they understand you the way your friends would.

CUT TO:

INT. Interview Area - DAY

MARK is talking.

MARK
Bicentennial Man, Her, Jarvis, there are so many options in the sci-fi world.

POST. An illustration of BICENTENNIAL MAN appears, followed by the logo of OS ONE, and an illustration of IRON MAN. Each new animation pushes the previous one above.

CUT TO:

MARK (V.O)
But what if that wasn't just science fiction? What if they were possible?

SLIDE. In Post. The words WHAT IF SCIENCE FICTION COULD BE POSSIBLE are animated on screen. They disappear and a question mark appears in their place.

MARK (V.O)
Your smartphone could be there for you the way a friend would.

It goes away and a smartphone is animated on screen along with a cartoon face. The animation follows the narration.

MARK (V.O)
What if it could know you were in trouble and provide assistance and help?

The cartoon face becomes sad and the phone shows support.

TRANS. Falling Slides.

CUT TO:

INT. Interview Area - DAY

MARK is facing the camera, introducing himself.
MARK
Hi, my name is Mark, and I would like to see some of that science fiction come true.

POST. The words MARK ZHAO and MSC COMMUNICATION SYSTEMS are shown in the bottom right, animating in and out of view.

CUT TO:

INT. Workshop - DAY

DAVID and MARK are going through some work on their laptops.

MARK (V.O)
We're a small group of developers, researchers, and designers based at Lancaster University...

PAN, LEFT. LAILA is also shown in the room.

INTERCUT TO MATT. ZOOM IN, LAPTOP. The laptop screen is full of software code.

MARK (V.O)
And we have been trying to unlock the potential for empathy in devices.

POST. The words FINDING EMPATHY IN DEVICES are animated on screen over the footage.

MATT (V.O)
We needed this idea...

CUT TO:

INT. Interview Area - DAY

MATT is talking. ZOOM IN, MATT. The screen stays on MATT for a moment.

MATT
... of empathy to be on a much larger spectrum, influencing different devices...

INTERCUT TO MATT, FRONT CAM. MATT continues to talk.

POST. Different devices are animated on screen as MATT talks about them.

POST. The words MATT KIM and MSC COMPUTER SCIENCE (RES.) are shown in the bottom right, animating in and out of view.

MATT
... and mediums, so it sort of made sense to imagine it as a programming language

CUT TO:

INT. Workshop - DAY

ZOOM IN, KEYBOARD. MATT is typing on his keyboard.

POST. The logo for EMPATHY ENGINE SDK is animated on screen over the footage.

MATT (V.O)
... or a variation of one, where it could...
INTERCUT TO, WORK TABLE. DAVID and MARK are going through some schematics.

MATT (V.O)
... be deployed in different forms...

POST. The EMPATHY ENGINE SDK logo is shrunk down, and other recognisable logos of web browsers are animated on screen along with devices as they shoot out of the logo.

MATT (V.O)
... from web apps to mobile apps...

ZOOM IN, DAVID. DAVID is trying on the prototype.

MATT (V.O)
... and even purpose-built devices. What we ended up looking at is...

CUT TO:

INT. Interview Area - DAY

MARK is talking.

MARK
... making an SDK for this Digital Empathic Language which would allow the developing community to come up with unique solutions for Human Computer Empathic Interactions.

CUT TO:

INT. Workshop - DAY

MARK and DAVID are going over schematics.

MARK (V.O)
We're trying to make our imaginations come to life.

CUT TO:

INT. Interview Area - DAY

ZOOM IN, DAVID. DAVID is talking.

DAVID
Empathy is a means of borrowing the feelings of another...

CUT TO:

SLIDE. In Post. An animation is shown of two cartoon faces; one becomes sad and the other notices this change in emotion and becomes sad as well.

DAVID (V.O)
... in order to really understand them. It's not necessarily selfless, since you need to be aware...

Both faces glow different colours to show they are two different people, aware of themselves.

DAVID (V.O)
... of yourself in relation to the other person.

TRANS. Curtain Right. A smartphone is animated on screen with a heart.

DAVID (V.O)
But where technology has previously fallen short of expressing empathy...
The heart is beating, and a question mark appears in between.

DAVID (V.O)
... is with the things that define an empathic response.

The phone is covered with a cross. They fade out.

DAVID (V.O)
Observation, memory, reason, and prior knowledge...

A series of icons appears with labels under them: OBSERVATION, an eye; MEMORY, books; REASONING, gears; and PRIOR KNOWLEDGE, a brain.

DAVID (V.O)
... today it's possible to substitute those traits with digital alternatives, making way for amazing possibilities.

The icons fade out one by one and are replaced with a camera and mic, a memory stick, a processor, and icons of history, Facebook and Twitter, respectively.

TRANS. Falling Slides.

CUT TO:

INT. Workshop - DAY

The team is shown going over a whiteboard covered with post-its. They shuffle through the post-its throughout the scene, discussing.

LAILA (V.O)
What this SDK would do is it would give...

CUT TO:

INT. Interview Area - DAY

LAILA is talking.

POST. The words LAILA YUSIF and MSC COMPUTER SCIENCE are shown in the bottom right, animating in and out of view.

POST. A series of icons appears on screen with labels over them: LANGUAGE CONFIG FILES with file icons, FUNCTIONS with file icons, APIs with a box icon.

LAILA
... developers access to specific handles of interaction through a device's camera and audio inputs, and using the correct amount of data...

TRANS. Curtain Right.

CUT TO:

SLIDE. In Post. Devices are animated on screen along with the wireframe of a bust. The devices fade out and icons of a camera and mic are shown. A ring is animated around the icons.

LAILA (V.O)
... it's possible to come up with an initial understanding of...

Key points are drawn on the bust. The key points are then mapped as if the bust's face were being digitally scanned.

LAILA (V.O)
... facial expressions, respiration,...

A gauge is animated on screen next to the bust with the label EMOTIONAL STATE over it. The needle moves for a bit, then settles.
LAILA (V.O)
... voice jitter, among other things.

TRANS. Zoom Colours. Many of the previous icons are animated on screen along with the devices. An arrow is drawn between them.

LAILA (V.O)
What we plan to do is combine the potential of other accessible data from smart devices so we can have a more personal analysis.

TRANS. Falling Slides.

CUT TO:

INT. Workshop - DAY

ZOOM IN, LAPTOP. Software code is shown on the laptop.

LAILA (V.O)
So your smart watch could give body functions...

ZOOM OUT, LAPTOP.

LAILA (V.O)
... your smartphone can track behaviours, and linking that with your social activities...

PAN, LEFT. The whole team is shown around the work desk.

LAILA (V.O)
... it's possible to draw a good sketch of your psychology. Maybe even the psychology of another, which is the ultimate goal!

CUT TO:

INT. Interview Area - DAY

ZOOM IN, MARK. MARK is talking.

MARK
All it took was a little binge watching on Netflix and brainstorming! Humans are all psychological and we all learn from our experiences, so that's what we plan to have in the SDK...

TRANS. Curtain Right.

CUT TO:

SLIDE. In Post. The EMPATHY ENGINE SDK logo is animated on screen, and above it a smartphone is shown with a history icon animated on it.

MARK (V.O)
... as well as the ability for the apps and devices to learn from their usage, so over time they understand their hosts better, they understand interactions better.

Stars appear on screen over the device.

TRANS. Zoom Colours. Two smartphones are animated with cartoon faces on the bottom of each, one male and one female.
MARK (V.O)
Just imagine being able to chat online and know exactly what the other person is feeling...

One smartphone shows a heart animate, the other a frog. The female face becomes sad.

MARK (V.O)
... the end of misinterpreting text messages.

TRANS. Zoom Colours. Two faces are animated on screen: one is a doctor, the other a patient; the patient is sad.

MARK (V.O)
Imagine doctors being able to read their patients better because of...

A tablet appears between them with a heart icon and monitor animation.

MARK (V.O)
... real-time feedback through custom designed devices.

The same animation is over the patient.

MARK (V.O)
Or maybe a Google Glass kind of world with emotional and empathic readings alongside others.

The doctor has a wearable glasses device drawn over his eyes. The patient's face turns from sad to happy.

TRANS. Zoom Colours. An automobile is animated on screen.

MARK (V.O)
Your car could be able to understand your mood...

A dial appears on the side with the words JAZZ, POP, INDIE, and AC/DC written on it. The dial turns towards AC/DC and music notes are animated coming from the car as it moves.

MARK (V.O)
... and pick the right song for you. Or it could let you know you were driving too fast because it's feeling nervous.

TRANS. Falling Slides.

CUT TO:

INT. Interview Area - DAY

LAILA is talking.

LAILA
The SDK will be coming with a custom-built IDE to support...

POST. The logo for INSIGHT is animated on screen.

LAILA
... all its unique functionality, and at the moment we're still struggling to find the right balance between programming languages.

Icons for Apple, Android, and Windows are animated below the logo.
LAILA
We want it to be versatile enough to be deployed on multiple platforms, and we don't want to make an entirely new language just for it. We are hoping to use current...

TRANS. Curtain Right.

CUT TO:

SLIDE. In Post. Logos of Java, C++ and Swift are animated on screen.

LAILA (V.O)
... popular developer languages and see how we can merge them in different instances through the IDE...

The logo of INSIGHT is animated on screen with the other logos.

LAILA (V.O)
... so it would seem less painful to code the applications. Hopefully we will have decreased the learning curve for it this way as well.

TRANS. Zoom Colours. The words SERIOUSLY? COMPUTERS THAT CAN FEEL are animated on screen.

MARK (V.O)
I know what you're thinking, computers that can feel?

After a while the words NOT IMPOSSIBLE are animated as the previous words fade out.

MARK (V.O)
It's visionary to say the least, but not impossible.

A smartphone is animated next to the logo of STAR TREK, and opposite them a laptop is shown with a brain in it, next to an illustration of IRON MAN.

MARK (V.O)
The mobile phone came from Star Trek; Jarvis can come from Iron Man.

TRANS. Zoom Colours. An icon for history is animated on screen.

MARK (V.O)
Over time the language will improve as newer research is conducted, and it might also help us understand...

The icon is followed by a heart that fades in as the history icon fades out.

MARK (V.O)
... empathy better in the long run. Our phones can actually be our friends...

TRANS. Curtain Right, TO WHITE. The logo for EMPATHY ENGINE SDK is animated on a white background.

MARK (V.O)
... how cool is that?

POST. Background audio fades out.

FADE OUT.

END SCENE.
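Laila's description of combining smart watch vitals, smartphone behaviour, and social activity into "a good sketch of your psychology" amounts to a weighted fusion of per-device estimates. The sketch below is purely illustrative; the device names, emotion labels, and weights are invented and stand in for whatever signals the fictional SDK would expose:

```python
# Hypothetical sketch of the multi-device fusion the SDK script
# describes: per-device emotion estimates combined into one
# emotional-state estimate via a weighted average.
# All device names, labels, and weights are invented.

from typing import Dict

def fuse_signals(readings: Dict[str, Dict[str, float]],
                 weights: Dict[str, float]) -> Dict[str, float]:
    """Weighted average of per-device emotion estimates."""
    fused: Dict[str, float] = {}
    total = sum(weights[device] for device in readings)
    for device, estimate in readings.items():
        for emotion, score in estimate.items():
            fused[emotion] = fused.get(emotion, 0.0) + weights[device] * score / total
    return fused

readings = {
    "smartwatch": {"stress": 0.8, "calm": 0.2},   # e.g. heart-rate derived
    "phone_cam": {"stress": 0.6, "calm": 0.4},    # e.g. facial analysis
}
state = fuse_signals(readings, {"smartwatch": 0.6, "phone_cam": 0.4})
print(round(state["stress"], 2))  # -> 0.72
```

Weighting the smartwatch more heavily reflects the script's suggestion that bodily signals give the most direct reading, while the camera contributes a secondary estimate.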