Tap2Hue - Design for Focused and Peripheral Interaction


DCM411 - Design for Focused and Peripheral Interaction

Group Report Semester A - Quartile 2 2015-2016

Teaching Professor: Saskia Bakker

Submitted by

Eleni Economidou, Grandhi Venkata Somnath Sudheer, David Verweij

A report submitted in fulfilment of the requirements for Course DCM411, Department of Industrial Design, Eindhoven University of Technology.

submission date: 14/01/2016




TABLE OF CONTENTS

Collective definition of the terms ........ 3
Discussion following the video analysis ........ 6
Examples of existing interaction designs ........ 9
    i. Eleni Economidou ........ 10
    ii. Grandhi Venkata Somnath Sudheer ........ 12
    iii. David Verweij ........ 14
Final Design Description ........ 16
Critical discussion of final design ........ 19
    i. Eleni Economidou ........ 20
    ii. Grandhi Venkata Somnath Sudheer ........ 22
    iii. David Verweij ........ 24
Appendices ........ 26




Collective definition of the terms: ‘centre of attention’ and ‘periphery of attention’

When observing the physical world, one can comprehend that a significant amount of the information our senses receive is unobtrusively embedded in our surroundings. Information such as the weather, the passing of time and the activities of people around us exists in the periphery of our attention (Bakker et al, 2010) without interfering with our focus [1]. Technology nowadays, on the other hand, demands our full focus by occupying our centre of attention through display devices.

Attention can be interpreted as the allocation of attentional mental resources over prospective activities. The term ‘centre of attention’ can thus be defined as the activity to which the majority of such resources are allocated [1]. In this case, the received (multi)modal stimuli are considered and acted upon after a conscious decision (from perceptual to motor resources). On the contrary, the term ‘periphery of attention’ denotes all remaining possible activities which are not in the focus of attention. These activities may occasionally take place subconsciously, since the process requires far fewer mental resources. However, if during an unrelated activity the brain receives primed stimuli (James, 1890), the received signals will be prioritised and therefore introduced into the brain’s thought process [2]. The focus, in this instance, shifts from the periphery of attention towards the centre. Additionally, an unintentional shift in focus may be triggered by an incident of salience (Knudsen, 2007) [3]. Unexpected loud noises, pain, the mention of one’s name: any instance that draws one’s attention from the periphery to the centre of attention can be perceived as a potential salient catalyst.
According to the notion of calm technology (Weiser and Brown, 1997), the focus of attention may be controlled by users through opting to ignore input received from their surroundings [4] and allocating their resources to one or two tasks of their choice [5], otherwise known as selective attention (Norman, 1976). The theories of selective attention and divided attention (Sternberg, 1999) are directly related to the decision-making of users, since they may direct their attention and resources according to their will [6]. Tasks can shift between the centre and periphery of attention according to the user’s experience. Divided attention reduces the efficiency of performing two or more tasks simultaneously, since the mental resources required, i.e. the working memory, are also divided between the separate tasks.

An example of divided attention is expecting visitors while performing a task on a computer. If one is eagerly anticipating the visit, listening for the doorbell will claim more mental resources. If, on the contrary, the main focus is on the task being performed, the ringing doorbell will trigger salience. In the case of multitasking, i.e. paying attention to all auditory input while performing another task, the individual is less able to perform the primary task effectively.

References:
[1] Bakker, S., van den Hoven, E. and Eggen, B. Design for the Periphery. Proceedings of the Eurohaptics 2010 Symposium on Haptic and Audio-Visual Stimuli: Enhancing Experiences and Interaction, Eindhoven: 71, 2010.
[2] James, W. The Principles of Psychology. Dover Publications, Inc., New York, NY, 1890.
[3] Knudsen, E.I. Fundamental components of attention. Annual Review of Neuroscience 30: 57-78, 2007.
[4] Weiser, M. and Brown, J.S. The Coming Age of Calm Technology. In P.J. Denning and R.M. Metcalfe (Eds.), Beyond Calculation: The Next Fifty Years of Computing. Springer-Verlag, New York, NY: 75-85, 1997.
[5] Norman, D.A. Memory and Attention: An Introduction to Human Information Processing. John Wiley & Sons, Inc., New York, NY, 1976.
[6] Sternberg, R.J. Cognitive Psychology. Harcourt College Publishers, Orlando, FL, 1999.






Discussion following the video analysis, in addition to reasoning behind recognised attention theories.

Following the recording of a 30-minute video by each group member, a thorough video analysis was performed in order to observe and address activities based in the periphery and centre of attention (Bakker et al, 2010) during everyday routines. While analysing the videos, it was observed that the group utilised multimodal input during the performed activities. Interestingly, different actions were performed by various paired limbs and organs, such as the hands, the feet and the head. Each team member performed tasks familiar to them, for instance cooking, making the bed and following a recipe from a book. There were many instances where multitasking was required; however, attention was always focused on one specific task at a time. Shifts between focused and peripheral interaction depended on the situation and the user’s activity.

Video recording of a morning ritual of making the bed and preparing coffee: In the first instance, making the bed, the individual encountered a salient event (Knudsen, 2007) when the traffic outdoors caught her attention, so looking outside the window became the main activity in focus (see Appendix A.i). The individual then describes noticing the sunlight while still making the bed, another shift between the centre and periphery of attention (Bakker et al, 2010). In the second instance, waiting for the coffee to brew becomes an activity in the periphery of attention, since it does not require many resources, while reading from a screen (in this example, reading emails) becomes the centre of attention. During the first activity, making the bed used most of the person’s mental resources, but after the salient occurrence the resources were continually divided, since looking and listening out of the window requires as many mental resources as making the bed, if not more.
Video recording of a meal preparation routine: The second video follows an individual while preparing and having a meal (see Appendix A.ii). Throughout the process, instances of shifts in the focus of attention (Bakker et al, 2010) were observed. For example, the individual closed the cupboard door with his left hand while looking at it, and opened the fridge door with his other hand. The focused attention was aimed at the cupboard door, while in the periphery of his attention he was able to estimate the position of the fridge door and open it without even looking at it. Another telling instance is one in which the individual retrieved ingredients from the cupboard whilst fiddling with a knife at the same time, without even noticing. This action was carried out subconsciously, without the individual’s awareness; such actions are typically performed in the periphery of attention.


Video recording of following a recipe: The group member in the third video (see Appendix A.iii) was chopping vegetables while listening to music; there were instances where he had to shift between reading the recipe book and keeping an eye on the flame. An exemplary shift of focus occurred when he closed the cupboard without looking at it while operating the remote control of the radio. As with the other group members, keeping track of time, or of indicators of time, occurred in the periphery of attention. To verify his estimate of the time, a quick, focused glance at the dinner preparations was performed; thereafter, tracking time shifted back to the periphery of attention.

References:
[1] Bakker, S., van den Hoven, E. and Eggen, B. Design for the Periphery. Proceedings of the Eurohaptics 2010 Symposium on Haptic and Audio-Visual Stimuli: Enhancing Experiences and Interaction, Eindhoven: 71, 2010.
[2] Knudsen, E.I. Fundamental components of attention. Annual Review of Neuroscience 30: 57-78, 2007.


Examples of existing interaction designs (Individual) including at least two interaction styles.


i. Examples of existing interaction designs, including at least two interaction styles. Eleni Economidou

Design interactions can be distinguished into three styles: focused, peripheral and implicit [1]. Focused interaction generally happens intentionally and consciously, taking place in the centre of the user’s attention and demanding direct and precise control. Peripheral interaction likewise occurs intentionally but, by contrast, happens subconsciously in the periphery of the attention (Bakker et al, 2010) and requires direct but not precise control. In our era, implicit interactions manifest the ideal relation the user can have with a system: computers blend into the background, do not overburden the user with information and appear only when desired [2]. Implicit interactions occur subconsciously and unintentionally outside the attentional field, without the explicit awareness or control of the user, providing smoother day-to-day intercommunication [3].

Figure 2: Dezeen Magazine. Pulse by Christian Ferrara and Jon McTaggart. 2012.

Calm technology (Weiser and Brown, 1997) “engages both the centre and periphery of attention” (Weiser and Brown, 1997, pp. 79). Pulse (McTaggart and Ferrara, 2012), an apparatus by C. Ferrara and J. McTaggart that visualises analogue data unobtrusively [4], is an indicative example. The device simplifies complex information by utilising visual reference points. Users have to consciously set up and switch between three data feeds they are interested in, such as sports scores, the weather forecast or even currency rates, by logging in to the Pulse website. This interaction is considered focused, since the users intentionally and consciously set their preferences and have direct control if and when they choose to. Pulse converts online information into a tangible graph of red thread [5], which can be placed on walls or other vertical surfaces and filters out the overload of data from social media, informing users on personalised matters. Even a glance at the clear representation of data, intentional but subconscious, updates users in their periphery of attention, making this a peripheral interaction.


Figure 1: iRobot.com. Roomba sweep and pre-soak. 2015.

Roomba (iRobot, 2002), the autonomous robotic vacuum cleaner by iRobot [6], portrays an example of focused and implicit interaction. Through its sensors, the device performs tasks such as changing direction near obstacles, detecting dirt and sensing stairs. The interaction is implicit, since Roomba blends into its surroundings without requiring the user’s awareness: an interaction outside his or her attentional field. In case of error, the device stops, sounds an alarm, reports the issue and suggests solutions; such salient events bring the device into the user’s centre of attention and thus constitute focused interactions.

References:
[1] Bakker, S., van den Hoven, E. and Eggen, B. Design for the Periphery. Proceedings of the Eurohaptics 2010 Symposium on Haptic and Audio-Visual Stimuli: Enhancing Experiences and Interaction, Eindhoven: 71, 2010.
[2] Weiser, M. and Brown, J.S. The Coming Age of Calm Technology. In P.J. Denning and R.M. Metcalfe (Eds.), Beyond Calculation: The Next Fifty Years of Computing. Springer-Verlag, New York, NY: 75-85, 1997.
[3] Ju, W. and Leifer, L. The Design of Implicit Interactions: Making Interactive Systems Less Obnoxious. Design Issues: Special Issue on Design Research in Interaction Design, 24(3): 72-84, 2008.
[4] Dezeen Magazine. Pulse by Christian Ferrara and Jon McTaggart. 2012. http://www.dezeen.com/2012/07/30/pulse-by-christian-ferrera-and-jon-mctaggart/
[5] McTaggart, J. and Ferrara, C. Pulse [video]. 2012. https://vimeo.com/45980795
[6] iRobot Corporation: Our History. 2015. www.irobot.com.


ii. Examples of existing interaction designs, including at least two interaction styles. Grandhi Venkata Somnath Sudheer

The first example is the Nokia Glance screen, an interaction design used in Nokia Lumia series mobile handsets. The Glance screen can be configured by time or by feature, such as peek, always on, or a set duration. It works on the principle of providing information on the home screen whenever the user desires to see it. Hovering a hand over the locked handset makes the phone recognise movement over the screen and display the home screen, which contains notifications, the time and the date. The same information can be seen by unlocking the phone, but this increases the user’s effort. With the Glance screen, the user does not have to focus attention on unlocking the phone; a simple hovering action over the screen displays all the desired information. This is a case of peripheral interaction [2], as the user has to be neither very precise nor very conscious in his action. Unlocking the phone to see the desired information, by contrast, requires a certain amount of resources and is a case of focused interaction [2].

Figure 1: Litchfield, S. Nokia Lumia 930 with Lumia Denim. All About Windows Phone, 2015.

The second example is rain-sensing wipers, which detect the presence and amount of rain using a rain sensor. The sensor automatically adjusts the speed and frequency of the blades according to the amount of rain detected; these controls usually have a manual override [5]. Rain-sensing wipers are installed in many automobiles nowadays, as they do not require the driver’s attention or resources to switch on or off: the rain sensor senses the amount of rain and operates automatically. This is a case of implicit interaction [2], where the entire automated process is neither in the control of the driver nor in need of his focused attention. The wiper also has configurations through which the driver can opt to set the frequency and speed of the wiping to intermittent or to a more frequent action. The interaction then becomes focused [2], as the driver is conscious of the action and it is well within his field of attention. Rain-sensing wipers have evolved over time to reduce the driver’s effort, especially while driving, as setting the wiper configuration is an additional workload on the driver.
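The control logic described above can be sketched as a simple mapping from sensed rain level to wiper mode, with a manual override taking precedence. The thresholds, value ranges and mode names below are illustrative assumptions, not taken from any particular vehicle:

```python
# Sketch of a rain-sensing wiper controller. The sensor is assumed to
# report a normalised rain level (0.0 = dry, 1.0 = downpour); thresholds
# and mode names are illustrative, not from a real automotive system.

def wiper_mode(rain_level, manual_override=None):
    """Pick a wiper mode implicitly from the rain sensor, unless the
    driver has made a focused, manual choice."""
    if manual_override is not None:
        return manual_override      # focused interaction: driver's setting wins
    if rain_level < 0.05:
        return "off"                # implicit: no driver attention required
    if rain_level < 0.4:
        return "intermittent"
    if rain_level < 0.8:
        return "low"
    return "high"
```

The override parameter captures the shift between interaction styles: when it is unset, the loop runs implicitly; when the driver sets it, the interaction becomes focused.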

Figures 2 and 3: Nevonprojects.com. Rain Sensing Automatic Car Wiper Project | NevonProjects. 2016.

References:
[1] Litchfield, S. 2015 update: Nokia Lumia 930 with Lumia Denim. All About Windows Phone, 2015. http://allaboutwindowsphone.com/reviews/item/20530_2015_update_Nokia_Lumia_930_wi.php
[2] Bakker, S., van den Hoven, E. and Eggen, B. Peripheral interaction: characteristics and considerations. Personal and Ubiquitous Computing 19(1): 239-254, 2014.
[3] PakWheels Discussion Forums. Rain sensing wipers setting. 2011. http://www.pakwheels.com/forums/mechanical-electrical/159980-rain-sensing-wipers-setting
[4] Nevonprojects.com. Rain Sensing Automatic Car Wiper Project | NevonProjects. 2016. http://nevonprojects.com/rain-sensing-automatic-car-wiper-project/
[5] Wikipedia. Windscreen wiper. 2016. https://en.wikipedia.org/wiki/Windscreen_wiper


iii. Examples of existing interaction designs, including at least two interaction styles. David Verweij

The first example is the Bruno SmartCan [1]. Bruno is a regular trashcan with an integrated vacuum inlet and a motion-sensing lid (to open the trashcan), and it keeps track of the amount of trash as well as the number of trash bags left. The latter shifts these interactions towards implicit interaction. With regular trashcans, the state of the can and the stash of bags is perceived indirectly, possibly within the periphery of attention, when throwing away trash or grabbing something out of the cupboard. The implicit interaction with Bruno shifts towards a focused interaction, however, when the user’s attention is drawn to the notification system on the phone.

Figure 1:​ Bruno. ​ Bruno SmartCan​ . 2015.

The second example is a radio control situated on the steering wheel of a car. Interaction with the car radio, especially its volume, can clearly shift between all three interaction styles. When a specific volume is required, users might focus on the turning knob on the dashboard radio; this occurs, for example, before the user has started driving. As some cars offer limited control over the radio from the steering wheel, the user is able to adjust the settings (volume or radio channel) without looking and whilst driving. This has the potential to become a peripheral interaction. Thirdly, some car radios have adaptive volume control based on the speed of the car. This indicates an implicit interaction, initiated by the car radio.

The last example is Drops [2]. Drops is a result of the Tangible User Interface class given by the Copenhagen Institute of Interaction Design (CIID) in 2010. It is a physical, ambient display of one’s closest and most important contacts. Using physical tokens representing the contacts, the user can organise and select the group most fitting the current setting to interact with. Through stroking and poking, attention is requested from the contacts who use the same system. Requesting attention is a focused interaction, since selecting and interacting with the tokens is a detailed operation (the space in which to perform it is narrow and small). Receiving and perceiving requests for interaction happens within the periphery of attention, as the display is ambient and non-intrusive. However, responding to the requests shifts the user’s attention towards a focused interaction.

Figure 2: Andrade, P. Drops. 2010.

References:
[1] Brunosmartcan.com. Introducing Bruno - the world's first smartcan. 2015. http://www.brunosmartcan.com
[2] Andrade, P. Drops - Pedro Nakazato Andrade. Pedroandrade.com, 2010. http://www.pedroandrade.com/Drops


Final Design Description and motivation behind the proposed design.

In order to free up the hands, which are in direct sight and thus demand the user’s focus of attention [1], the team decided to shift the interaction towards the lower extremities: the feet. The design is additionally an endeavour to solve the familiar problem of having one’s hands full while attempting to switch on the lights. The proposed final design functions on the principle of capacitive proximity sensing. These whole-floor sensors, first used by SensFloor® (Future Shape, 2014), measure capacitance: changes to the local electric field caused by a person or any other conductive object that approaches them [2]. The same phenomenon is applied in touchscreen devices [3] to detect where a finger is tapping (Keller, 2014). The concept of utilising the floor as a medium of interaction was based on the fact that human beings are in direct contact with the floor or other walking surfaces most of the time. As a result, users require no previous training to get used to the required movements. A demonstration of the final design can be found in the video the team produced [4].

Paradoxically, in this era of automation, everyday routines may demand our full focus as well as rely on carrying out certain tasks with both hands. Many mobile interaction devices have to be carried around the house to be activated, while with stationary interaction devices, such as the light switch, the user has to pause an activity and walk over to the device. Even though the suggested interaction between the feet and the floor is focused the first few times, it could eventually be considered peripheral (Bakker et al, 2010) after a short period of time.

During this adjustment period, the user becomes accustomed to the system and, through repetition of the movements, masters it even unconsciously, since the process comes to require minimal attentional resources. From the video analysis it was deduced that many tasks are executed using our hands; these tasks at times consume most of the mental resources, as the user ends up fully focused on performing them with his hands. Because the peripheral capacity of the hands is thereby drastically reduced, the idea arose to employ the feet as a mode of interaction, an as yet untapped modality. Initially, the team suggested an interactive design of innovative footwear that would enable users to tap and rotate their feet, sending commands to the lighting system through movements of the footwear. Through further design development and iterations of the interactions (see Appendix C), it became apparent that some of these movements were too complicated and demanding for the basic gestures, or caused confusion in the sequences for changing individual light units. Further discussion led to the idea of using the whole floor as the interaction device, since the footwear/shoe attachment had significant drawbacks in terms of accidental activation (i.e. stepping on it), the need to know its position in order to attach it to the feet, water resistance, storage and charging.
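As a rough illustration of how floor taps could be turned into commands, the sketch below groups timestamped tap events into sequences and maps the tap count of each sequence to a lighting command. The time window and the gesture-to-command mapping are assumptions for illustration, not the team's actual implementation:

```python
# Hypothetical sketch: classifying foot-tap events from a capacitive floor
# into lighting commands. TAP_WINDOW and the command mapping are assumed
# values for illustration only.

TAP_WINDOW = 0.6  # max gap in seconds between taps of one sequence (assumed)

COMMANDS = {      # assumed mapping from tap count to a lighting command
    2: "toggle_on_off",
    3: "next_scene",
}

def group_taps(timestamps, window=TAP_WINDOW):
    """Split an ascending stream of tap timestamps into sequences."""
    sequences, current = [], []
    for t in timestamps:
        if current and t - current[-1] > window:
            sequences.append(current)
            current = []
        current.append(t)
    if current:
        sequences.append(current)
    return sequences

def classify(timestamps):
    """Map each detected tap sequence to a command (None if unrecognised)."""
    return [COMMANDS.get(len(seq)) for seq in group_taps(timestamps)]
```

Requiring a specific tap count within a short window is one way to realise the report's goal of preventing accidental activation: an isolated step produces an unrecognised sequence and no command.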


During the user tests, the majority of the participants stated that some of the gestures assigned to them (changing the scene and the on/off mode) were not only too complicated but also unnatural to execute, and demanded a significant amount of their mental resources. In contrast, a few of them found some gestures (changing intensity and temperature) fairly simple, effortless to perform and feasible to master in quite a short span of time. The participants also noted that they felt at ease interacting with their feet while simultaneously observing the direct feedback from the lighting system. Suggestions from the user evaluation prompted the team to hold brainstorming sessions to ideate further and look into much simpler, recursive gestures. Moreover, the user tests provided the group with invaluable feedback concerning the design in general, as well as circumstances in which the system might malfunction through accidental activation, wrong input or difficulties in setting it up. Should there be user demand for more complex, specified settings, such as controlling individual light bulbs, the Philips Hue smartphone application is the most suitable alternative.
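For reference, a recognised gesture could be forwarded to the lighting system through the Philips Hue bridge's REST API (version 1), where group 0 addresses all lights and the `bri_inc`/`ct_inc` fields apply relative changes. The bridge address, username and gesture names below are placeholders, and the gesture set is an illustrative assumption:

```python
import json

# Sketch of translating assumed gesture names into Philips Hue bridge
# requests (API v1). BRIDGE and USERNAME are placeholders; "bri_inc" and
# "ct_inc" apply relative brightness / colour-temperature changes.

BRIDGE = "192.168.1.10"    # placeholder bridge IP address
USERNAME = "tap2hue-user"  # placeholder API username

def hue_request(gesture):
    """Return (url, body) for a PUT to the bridge; group 0 = all lights."""
    url = "http://%s/api/%s/groups/0/action" % (BRIDGE, USERNAME)
    bodies = {
        "intensity_up":   {"bri_inc": 32},   # step brightness up
        "intensity_down": {"bri_inc": -32},  # step brightness down
        "warmer":         {"ct_inc": 50},    # warmer colour temperature (mired)
        "cooler":         {"ct_inc": -50},   # cooler colour temperature
    }
    if gesture not in bodies:
        raise ValueError("unknown gesture: %s" % gesture)
    return url, json.dumps(bodies[gesture])
```

Relative increments suit peripheral interaction well: the user can nudge the lighting repeatedly without having to recall or target an absolute value.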

References:
[1] Bakker, S., van den Hoven, E. and Eggen, B. Design for the Periphery. Proceedings of the Eurohaptics 2010 Symposium on Haptic and Audio-Visual Stimuli: Enhancing Experiences and Interaction, Eindhoven: 71, 2010.
[2] Future Shape GmbH. Future Shape | Technology | SensFloor® – large-area sensor system. Future-Shape.com, 2016. http://future-shape.com/en/technologies/23/sensfloor-large-area-sensor-system
[3] Keller, M. Electronic Floor Sensors Turn Whole Rooms Into Immersive Touchscreens. Gizmodo, 2014. http://gizmodo.com/electronic-floor-sensors-turn-whole-rooms-into-immersiv-1531835835
[4] Economidou, E., Grandhi Venkata Somnath, S. and Verweij, D. Tap2Hue - A Philips Hue controller embedded in your floor. 2016. https://vimeo.com/151711290


Critical discussion of final design (Individual) addressing its location on the interaction - attention continuum.


i. Critical discussion of final design addressing its location on the interaction - attention continuum. Eleni Economidou

Final design
The final design was modified to correspond to the type of interaction the team aimed for. The final version was presented at the client demonstration (see Appendix C.v).

Insights
Based on the user evaluation tests (6 participants) and the stimulated-recall interviews, seven different insights were deduced. To begin with, the participants articulated that (i) some of the tap sequences (switching on/off and changing the lighting scene) were too long and convoluted to be memorised and executed correctly right away. Some movements, such as bending the body or lifting one foot, were described as unnatural to act out: they required a lot of effort but offered little to no reward to counterbalance it. Furthermore, a user mentioned that (ii) feet usually have slower reflexes than arms, which led to delays when performing the given scenario. Two participants stated that (iii) the level of difficulty differed between the sitting and the standing position. Two participants claimed that (iv), although the interaction seemed difficult in the beginning, with practice it could become a habitual movement. One participant said that (v) the system could be quite useful when her hands were full. Conversely, all users asserted that (vi) the tapping patterns related to changing the intensity and colour of the lighting required almost no mental effort and could thus occur in the periphery of one’s attention (Bakker et al, 2014). Half of the users noted that (vii) they expected lighting in opposite corners of the room, i.e. on either side of the couch, to be paired and controlled simultaneously.

Modifications
Following the user tests, the team decided to modify the design, simplifying specifically the switching on/off functions and the change-of-scene sequence, as illustrated in Appendix C (v). In my opinion, these interactions can be more easily performed and recalled than those of the previous iteration. Indeed, if an interaction is complicated, the system should provide some type of reward so that users have an incentive to follow the procedure. Fewer steps are involved (one foot, two taps); consequently, the gestures could be executed in the periphery of the user’s attention.

Categorisation of the final design
Based on the user tests, it is concluded that the aforementioned gesture sequences were too complicated. This resulted in a focused interaction, since too much mental effort was required. However, the team conducted only six user evaluation tests in a controlled, lab-like environment: a relatively small sample from which to extract concrete conclusions.


From my own point of view, the updated design belongs to all three categories on the interaction - attention continuum. The proposed interaction can initially be considered focused, since the first time users come into contact with it they require some time to adjust to the new information, also known as the ‘novelty effect’. This first-time experience thus takes place in the centre of the users’ attention, since the actions require the majority of their mental resources and the movement is enacted consciously and intentionally, with direct, precise control according to their intent [1]. Nonetheless, with a small amount of training, or simply by habit, the interaction can also be considered peripheral (Bakker et al, 2014). The participants mentioned that the gestures designated for changing hue and intensity required minimal mental effort. I believe that if the team performed a field study in which the user tests utilised the updated design in real-life conditions, where the novelty effect had faded, the results would indicate that the interactions require significantly less mental effort. On that account, the interaction, although intentional, would be subconscious, taking place in the periphery of attention. Additionally, one of the proposed design’s characteristics, the ‘follow lights’, could be considered an implicit interaction (Ju and Leifer, 2008). ‘Follow lights’ is the interaction in which the light bulbs switch on or off according to the proximity of the user’s feet: an unintentional, subconscious interaction that happens outside the user’s attentional field and over which the user has no direct control [2].

Potential improvements
Haptic, visual (light pulsing) or auditory feedback could notify the user of the system’s activation. Control of individual light bulbs was not possible, as it over-complicated the interaction.
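The ‘follow lights’ behaviour described above can be illustrated with a small sketch: each lamp switches on when the user's feet are within an activation radius and off once they leave it. The radius, lamp names and coordinates are assumptions for illustration only:

```python
# Hypothetical sketch of the 'follow lights' implicit interaction: lamps
# turn on when the user's floor position is within RADIUS metres of them.
# The radius and lamp layout are illustrative assumptions.

RADIUS = 1.5  # assumed activation radius in metres

def follow_lights(user_pos, lamps, radius=RADIUS):
    """Return {lamp_name: True/False} given the user's (x, y) position
    on the floor and a dict of lamp positions."""
    ux, uy = user_pos
    return {
        name: ((ux - lx) ** 2 + (uy - ly) ** 2) ** 0.5 <= radius
        for name, (lx, ly) in lamps.items()
    }
```

Because the state is derived purely from the sensed position, no deliberate gesture is involved, which is what places this behaviour on the implicit end of the continuum.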
Value of proposal
The interaction is perceived as innovative, as it uses the feet instead of the hands. The user is not required to carry a device at all times, whilst the lighting system can be controlled from any position as long as the user’s feet are in direct contact with the floor. The system reacts only to specific tapping gestures, preventing accidental input. The suggested gestures can be performed whilst multitasking, in parallel with primary tasks, i.e. whilst one’s hands are in use, thus making the gestures an actual peripheral interaction [3].

References:
[1] Bakker, S., van den Hoven, E. and Eggen, B. Design for the Periphery. Proceedings of the Eurohaptics 2010 Symposium on Haptic and Audio-Visual Stimuli: Enhancing Experiences and Interaction, Eindhoven: 71, 2010.
[2] Ju, W. and Leifer, L. The Design of Implicit Interactions: Making Interactive Systems Less Obnoxious. Design Issues: Special Issue on Design Research in Interaction Design, 24(3): 72-84, 2008.
[3] Edge, D. and Blackwell, A.F. Peripheral tangible interaction by analytic design. In: Proceedings of TEI, 69-76, 2009.


ii. Critical discussion of final design addressing its location on the interaction - attention continuum. Grandhi Venkata Somnath Sudheer Six participants took part in the user evaluation of Interaction design (Bakker et al., 2015) concept of Tap-2-Hue. All of them witnessed the interaction design for the first time. The participants were assigned a set of gestures along with sequence in which they have to be performed. The sequence varied among the participants. After few trial runs of the given sequence, the participant was sought to perform a real-life situation like reading a book and operating the lights. They were not restricted to the given gestures but were encouraged to try various other gestures with their feet to observe the reaction of the lighting system. After the completion of the sequence, user feedback about each gesture was gathered. Based upon the user feedback, few suggestions and remarks about the interaction design were also made. Few suggestions shared were matching the number of taps to characters in the name of a mood, symmetrical lights could have coupled effects when being operated, simple gestures, the feel could also be used to toggle the lights, hard tap could be used to toggle the lights and use of voice commands would enhance the experience. Some of the remarks shared were number of taps, few gestures being too complicated to remember, usage of both feet to activate a mode is not natural, gestures by feet are slow, difficulty to get into a habit of using the feet to control the lights and the individual lights being operated only when in proximity is a smart way to control the lights. After discussing over the suggestions and remarks about the interaction design concept, there were few modification rendered to enhance the experience of interaction and to restrict the interaction to field of ​ peripheral interaction​ [1]​ ​ . 
The purpose of this interaction medium (the floor) was to control the lights using the feet, an otherwise largely unused modality, in the periphery of attention [3]. Complex or long tapping sequences could demand more mental resources, and the idea of tapping rhythms posed a serious problem, as the user had to recall the rhythm, which required more effort. Follow lights were activated based on the proximity of the user to the light source and would turn off when the user exited the specified range. Interaction with the lights through the feet and the floor would fall short when higher functionality was demanded of the system: controlling individual lights during a mood activation and retrieving the altered setting was not feasible, and modifying the hue of individual lights within a group or a mood would be a tough task requiring the user's mental resources to the fullest. This interaction design concept [1] has an advantage over other interaction devices in that the user need not carry an interaction device at all times and can control the lights from any corner of the room, as long as the feet are on the floor.
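The follow-light behaviour described above can be sketched in a few lines. The light names, positions and activation range below are hypothetical illustrations, not values from the actual prototype.

```python
import math

# Sketch of the "follow light" behaviour: a light switches on when the
# user is within range of it and off again when the user leaves that
# range. Positions are in metres on the floor plane (assumed units).
FOLLOW_RANGE = 1.5  # assumed activation radius, not from the report


def follow_lights(user_pos, light_positions, follow_range=FOLLOW_RANGE):
    """Return the on/off state for each light given the user's position."""
    ux, uy = user_pos
    states = {}
    for light_id, (lx, ly) in light_positions.items():
        distance = math.hypot(lx - ux, ly - uy)
        states[light_id] = distance <= follow_range
    return states


# Example: a user standing near the "desk" light but far from the "sofa" light.
lights = {"desk": (1.0, 1.0), "sofa": (4.0, 3.0)}
print(follow_lights((1.2, 0.8), lights))  # desk on, sofa off
```

Because the state is recomputed purely from proximity, no explicit user action is needed, which is what makes this part of the design implicit rather than focused.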



The feedback of the system is quick, and the user realises when a wrong tapping sequence has been performed. Additional functionality or a few more complex tapping gestures could have been implemented, but any accidental tap sequence might then be registered as input. Auditory feedback for the activation of a mood or the toggling of the lights would enhance the experience. Evaluated on its peripheral-ness, the interaction design [1] is satisfactory for the given functionality: the user can multitask and still operate the lights in the periphery of attention [3], provided he or she remembers the set sequence. The purpose of this interaction design is to use a largely unused modality, the feet, for a few simple and natural movements that operate the lights in the periphery of attention [3], which can be achieved through repeated practice. In everyday household use, such a system could be beneficial wherever mood lighting is wanted, and controlling the lighting system becomes easier as the user gains experience and skill over time. This interaction design concept covers several types of interaction: focused or explicit interaction [1], peripheral interaction [1] and implicit interaction [1]. Controlling the lights for the first few instances could require a high amount of mental resources, placing the interaction under focused interaction [1]. Over time, the gestures become embedded in long-term memory [2] and the interaction with the lighting system can be done in the periphery of attention [3], making it a peripheral interaction [1]. The follow lights require no control or actuation at all, making that part of the interaction implicit [1] in nature.
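The recognition of tap sequences, including the rejection of a wrong or accidental sequence, could be sketched as follows. The mood names, tap patterns and timing threshold are assumptions for illustration only; they are not the patterns used in the prototype.

```python
# Sketch: raw tap timestamps are grouped into bursts of quick taps, and
# the resulting pattern is looked up in a table of mood patterns. An
# unknown pattern returns None, so the system can signal a wrong sequence.
MOODS = {(1, 1): "relax", (2, 1): "reading", (3,): "energize"}  # assumed patterns
MAX_GAP = 0.6  # assumed max seconds between taps within one burst


def group_taps(timestamps):
    """Group a non-empty list of tap timestamps into bursts of quick taps."""
    bursts, current = [], 1
    for prev, nxt in zip(timestamps, timestamps[1:]):
        if nxt - prev <= MAX_GAP:
            current += 1          # tap belongs to the current burst
        else:
            bursts.append(current)
            current = 1           # long pause starts a new burst
    bursts.append(current)
    return tuple(bursts)


def recognise(timestamps):
    """Return the matching mood name, or None for an unrecognised sequence."""
    return MOODS.get(group_taps(timestamps))


print(recognise([0.0, 0.2, 0.4]))  # three quick taps -> "energize"
print(recognise([0.0, 0.2, 1.5]))  # 2-tap burst, pause, 1 tap -> "reading"
```

A timing threshold of this kind is one simple way to keep an accidental single tap from being registered as a command, at the cost of requiring some rhythm from the user.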

References:
[1] Bakker, S., van den Hoven, E. and Eggen, B. Peripheral Interaction: Characteristics and Considerations. Personal and Ubiquitous Computing, 19(1):239-254, 2014.
[2] Wickens, C., Gordon, S. and Liu, Y. An Introduction to Human Factors Engineering. Pearson Prentice Hall, Upper Saddle River, N.J., 2004.
[3] Weiser, M. and Brown, J.S. The Coming Age of Calm Technology. In P.J. Denning and R.M. Metcalfe (Eds.), Beyond Calculation: The Next Fifty Years of Computing. Springer-Verlag, New York, NY:75-85, 1997.


iii. Critical discussion of final design addressing its location on the interaction - attention continuum. David Verweij

Controlling a lighting system with our feet offers many possibilities, especially since it uses limbs that we do not (yet) use for many instructive tasks. Despite feeling and looking silly, the design based on tapping gestures was iterated into a convincing and promising design with a high ease of use. In my eyes, the design has great potential to be(come) a peripheral interaction [1]. Looking at the video analysis, it seems that most current peripheral interactions owe their peripheral-ness to simplicity and memorability, i.e. the interactions require little or no direct or precise control. This light controller design requires a simple interaction and has a clear mapping between interaction and result.

The user evaluation consisted of six participants, who received a general introduction, some time to get used to the system and five tasks to perform. The evaluation used the Rating Scale Mental Effort [2] and a stimulated recall interview. The results showed one deviating task that required less mental effort, while the other four tasks had similar results. However, in all cases the spread of answers was extremely broad, so it is difficult to draw strong conclusions about improvements from this user evaluation. In addition, during each session a slightly different introduction was given, different (verbal) support was provided and, at first, a slightly defensive position was taken during the stimulated recall interview. These improved over time, yet might have influenced the results. Nevertheless, the user evaluation offered a few insights, both supporting the first paragraph and raising questions at the same time.
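The broad spread mentioned above could be made explicit by summarising the RSME scores per task. The scores below are invented purely to illustrate the computation; they are not the actual study data.

```python
import statistics

# Hypothetical RSME scores (0-150 scale) for two of the five tasks,
# invented for illustration only -- NOT the actual evaluation data.
rsme_scores = {
    "increase/decrease intensity": [12, 25, 8, 40, 15, 20],
    "activate a specific mood":    [55, 20, 90, 35, 70, 45],
}

# A large standard deviation relative to the mean is what makes it hard
# to draw strong conclusions from only six participants.
for task, scores in rsme_scores.items():
    mean = statistics.mean(scores)
    spread = statistics.stdev(scores)
    print(f"{task}: mean={mean:.1f}, sd={spread:.1f} (n={len(scores)})")
```

With samples this small and this spread out, reporting the standard deviation alongside the mean makes the uncertainty of the comparison between tasks visible.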
The results of the Rating Scale Mental Effort [2] showed that the interaction to increase or decrease intensity required less mental effort [3], whereas the interactions to set a specific mode required noticeably more mental effort. In the stimulated recall interview, users confirmed this difference in effort. They also all found the specific patterns rather complex to operate. Most of them believed the patterns could be learned and mastered, but found them too complex and too time-consuming to execute. This raised the question whether this interaction offers an advantage over a regular light switch or the Philips Hue app. In addition, the final design (Tap-2-Hue) is not easy to operate whilst walking. This is not a requirement for the design, yet if it were possible, it would decrease the required amount of mental resources. In response to the feedback from the user evaluation, the concept shifted the balance between the simplicity of patterns and the possibility of accidental interaction. By simplifying a few patterns and introducing a different movement (turning on/off), more distinction between interactions was made, whilst the ease of learning these patterns was increased. Although the interaction is improved, the mental effort is perceived as low, and the interactions require neither looking nor the use of the hands, it is still unclear whether this interaction is possible with little mental resources [3]. Looking at the scale from focused interaction to implicit interaction [1], these interactions are intentional and offer direct, imprecise control. Whether they are subconsciously executable is as yet unknown. I am curious whether it is



possible, mainly because our legs are currently adapted to walking, which we perform subconsciously. The interactions for Tap-2-Hue seem to require little mental effort; however, it remains to be seen whether users can adapt to a second subconscious use of their feet. In particular, we are not used to performing these movements in our everyday lives. This might require some unlearning, which counteracts the ease of learning. An additional concern for this interaction design is its reliance on feedback through the lighting installation. In the current test setup the Philips Hue showed some delay and occasionally even an incorrect response. The Tap-2-Hue system relies fully on the visual feedback (potentially in the periphery of attention) provided by the lighting result, whereas tactile interaction devices provide some feedback through their own state, regardless of whether the result has been applied. If this delay decreases trust in the interaction, or is impossible to get used to, then the Tap-2-Hue interaction design will never become a peripheral interaction. In conclusion, this interaction design contains focused interaction, an implicit interaction (the follow lights) and the potential for some focused interactions to become peripheral interactions. The use of the feet is interesting because their potential is currently underused, opening up possible peripheral interactions, in this case for the Philips Hue.

References:
[1] Bakker, S., van den Hoven, E. and Eggen, B. Peripheral Interaction: Characteristics and Considerations. Personal and Ubiquitous Computing, 19(1):239-254, 2014.
[2] Zijlstra, F.R.H. Efficiency in Work Behaviour: A Design Approach for Modern Tools. PhD Thesis, Technische Universiteit Delft, 1993.
[3] Kahneman, D. Attention and Effort. Prentice-Hall, 1973.


Appendices


APPENDIX A

i. SCREENSHOTS - ELENI ECONOMIDOU


ii. SCREENSHOTS - SUDHEER GRANDHI


iii. SCREENSHOTS - DAVID VERWEIJ


APPENDIX B - WEEKLY DESIGN ALTERATIONS

i. FIRST ITERATION - Shoe attachment


ii. SECOND ITERATION - Shoe attachment


iii. THIRD ITERATION - Shoe attachment


iv. FOURTH ITERATION - Floor Embedded Sensors


v. FINALISED INTERACTIONS - As Presented At The Demonstration - 07/01/2016


APPENDIX C - USER TEST SCREENSHOTS

Screenshot A

Screenshot B


APPENDIX D - USER TEST RESULTS
