Report M2.1: The Haptic Toolkit
Student: Attalan Mailvaganam (s081101)
Coach: Miguel Bruns
Theme: Changing Behavior
TABLE OF CONTENTS
Introduction
1. The History of HCI and Interaction Design
2. Haptics
3. Explorations
4. Discussing Haptic Toolkit
5. User Session Haptic Toolkit
6. Haptic Toolkit Version 2
7. Planning
8. References
INTRODUCTION
“a breeze may flutter a single leaf on a whole tree, leaving the other leaves silent and unmoved”
“I began to see and to hear in a manner I never had before. (...) I learned to notice the ray of sunlight that was then pouring through a chink in the roof, illuminating a column of drifting dust, and to realize that that column of light was indeed a power, influencing the air currents by its warmth and indeed influencing the whole mood of the room. (..) My ears began to attend, in a new way, to the songs of birds - no longer just a melodic background to human speech, but meaningful speech in its own right, responding to and commenting on events in the surrounding earth. I became a student of subtle differences: the way a breeze may flutter a single leaf on a whole tree, leaving the other leaves silent and unmoved (...) Walking along the dirt paths, I learned to slow my pace in order to feel the differences between the nearby hill and the next or to taste the presence of a particular field at a certain time of day when (...)” [1, page 20]

David Abram wrote this passage in The Spell of the Sensuous. It confronted me with my own experiences: when was the last time that I was able to perceive "information" like this? Or, better said, when were you last able to perceive "data" like this? Have we ever perceived like this in our lives? The point I want to bring forward is that in our current society we are not invited to use our senses in this particular way. On the contrary, our senses are overloaded by computing displays in such a way that we can no longer attune to the value of our own perception. To reflect on this trend, it is important to understand the history of Interaction Design and Human-Computer Interaction, which is discussed in the next section.
1 THE HISTORY OF HCI AND INTERACTION DESIGN
[Figure 1: human metaphors vs. history of HCI. Bill Verplank's growth of human metaphors (haptic, iconic, symbolic) set against Paul Dourish's growth of HCI (electronic, symbolic, graphical, embodiment).]
“Embodied Interaction is the creation, manipulation, and sharing of meaning through engaged interaction with artifacts.”
Paul Dourish describes the history of Human-Computer Interaction (HCI) with computing devices as follows [4]: from electrical, manipulating the functions of the computer by adjusting analog circuits; to symbolic, interacting through codes and eventually through a one-dimensional stream of textual characters; to graphical, interacting in a two-dimensional space with visual representations; and currently to research into embodiment, interacting with artifacts augmented with computational power. In comparison, Bill Verplank discusses human development in terms of paradigms and metaphors as follows [14]: the human is born with haptic skills first, being able to pick things up and to move; second, the iconic, visual language develops, the ability to, for example, distinguish a chair from a table; and finally symbolism develops, the knowledge to write, speak and learn. There is a clear connection between these two theories, illustrated in Figure 1: human metaphors vs. history of HCI. The growth described by Bill Verplank moves from the development of perceptual-motor skills (haptics) toward the development of cognitive skills (symbolism), while in HCI the growth moves from analog circuits (electronic) to intelligent everyday objects (embodiment). These two trajectories move in exactly opposite directions.
But why is this development of HCI, or in my case Interaction Design, moving toward embodiment, described by Paul Dourish as: "Embodied Interaction is the creation, manipulation, and sharing of meaning through engaged interaction with artifacts." [4, page 126] Weiser [16] already argued that computing as we know it, in the form of intelligent displays (the PC), is not centred around human needs and capabilities. Early computing was also centred around the cognitive skills of humans, whereas the critique of Descartes's division into res cogitans and res extensa [12] recalls that there should be no separation between mind and matter, in other words between being and acting. Both these arguments bring forward that current computing is not focused on solving human needs and does not "use" our full range of skills. If we go back to the human development described by Bill Verplank, it is logical to argue that because haptics develops first, it is probably our most developed skill; in current computing, our best-developed skill is not used effectively enough. For all these reasons it is a very logical step for Interaction Design to research embodiment, as it could bring computing into our environment in the form of tangible objects, solving our needs and creating new needs by using both our cognitive and perceptual-motor skills.
2 HAPTICS
“it can perceive two stimuli in 5 ms, which is 20 times faster than vision”

In the previous sections I connected haptics to embodiment, because this proposal is centred around haptics. To remove any doubts, I want to stress that haptics is a part of embodiment; there are several possibilities to create embodiment, but for this proposal my interest lies in the area of haptics. Think back for a moment to the anecdote of David Abram, in which he is able to isolate his senses, showing how sensitive human senses are. In contrast, computing products of our current age, referred to as the Information Age, do not appeal to this sensitivity; they even limit our senses, because they do not require our full sensory capability. These products are mainly vision- and audio-based. As Paul Dourish already described in the history of Human-Computer Interaction [4], iconic metaphors are used most often to trigger the interaction between user and product. Why are the other senses left out of Interaction Design, while technology keeps growing and could develop into a richer source for the human body? Coming back to haptics, I believe that haptics is one of the senses least appealed to in computing, used only for pressing a button or touching a screen, while we can do much more with our haptic sense. The haptic sense is very sensitive: it can perceive two stimuli in 5 ms, which is 20 times faster than vision, and the fingertips can sense a displacement of 0.2 microns (2.0×10−4 mm) [6]. Why is the haptic sense then left out? This question brings forward two types of interaction: efficient interaction and rich interaction. Efficient interaction, as I see it, is based on handling an activity with the product as quickly, or as efficiently, as possible. Rich interaction is based on respecting the product and creating a qualitative interaction between user and product, provoking a positive, joyful experience. Currently the focus in Interaction Design is on efficiency, which is why the richness of the haptic sense is left out of product design.
“Imagine the experience of hammering a nail into the wall”
As a designer I do not favor this development; I believe that the interaction between user and product should be more fun and should demand more from our senses. If we have so many rich senses, why limit them in the design of interactive products? At the very least, multi-sense interaction should be researched to understand whether it could be valuable for Interaction Design. But how could haptics add more value to interactive products? Imagine the experience of "hammering a nail into the wall". When I hold a hammer in my hand and hammer a nail into the wall, the hammer can be seen as an extension of my body: I am able to feel the "hits" on the nail. This phenomenon is referred to by Heidegger as zuhanden, ready-to-hand [15]; the tool becomes an extension of the body. In this example, haptics gives enough feedback for the user to gauge the force needed to hammer the nail into the wall, again showing the sensitivity of the haptic sense. Therefore I believe that haptics should be researched in relation to Interaction Design, as it is a powerful human sense that can enhance our sensory capacity in relation to interactive products, making intelligent products more attuned to human qualities.
3 EXPLORATIONS
[Figure 2: Iterations. Iteration 1: haptic input vs. visual output (piezo as haptic sensor, laser as visual output). Iteration 2: haptic input vs. haptic output (haptic property: tactile reception; haptic illusions: weight after-effect, bumps and holes; new haptic inputs: shaking, scratching, slamming, pulling). Iteration 3: haptic toolkit v1, transforming any surface into a haptic input/output.]
“haptics is a rare sense in Interaction design, it should first be learned”
After gaining some basic knowledge about haptics, I could see two possible directions for this proposal. The first direction focused on developing more understanding of haptics through literature research only. The second focused on creating more understanding by developing physical prototypes to experience haptics. Holding both directions against my identity as a designer, I prefer the second, because it belongs to who I am as a designer. I believe, especially in the case of haptics, that it cannot be analyzed "on paper", where I can only imagine; it should be experienced. Moreover, haptics is a rare sense in Interaction Design; it should first be learned, which can only be done by experiencing it. In many previous projects I used this identity, and all these explorations led to interesting outcomes. In my M1.1 I was able to come up with an interactive stained-glass display, using the sun as a light source (https://vimeo.com/43929497). If I had not explored with polarizers, I could never have created it. I am a very intuitive designer with an expertise in exploring through making. In this FMP I want to show who I am as a designer, which should also be reflected in the design process; in my case, one part of my process should be exploration through making. Next to this point, I want to address a quote from Camille Moussette's Simple Haptics regarding prototyping:
“when physical haptic objects come to center stage, I will be able to really discuss their quality”
“Often the prototypes themselves, i.e. the objects or materials outcomes, take center stage when we elaborate about prototyping in interaction design. We tend to discuss extensively on the attributes and qualities of the objects and how well or not they attain or speak to various technical or theoretical ideals or objectives.” [9, page 72]

I agree with this viewpoint: when physical haptic objects come to center stage, I will be able to really discuss their quality, which I cannot do by only reading literature, where I can merely imagine. Therefore my direction will focus on realizing different iterations of physical prototypes, with a reflective session between each iteration to define the next iteration or plan. Figure 2: Iterations illustrates the three iterations. Each iteration was performed within a timespan of one week.
“that a simple piezo could register these small vibration differences made it very interesting for measuring haptics”
Iteration 1
In iteration 1, I first searched for an interesting sensor that would be able to receive haptic input. I was inspired by my visit to the Tangible, Embedded and Embodied Interaction 2013 conference (TEI '13). During this visit I saw the opening keynote by Norbert Schnell and Frederic Bevilacqua, Body and Sound: Tangible Interfaces in Music Listening and Performance [13], where I was struck by the project MO: Modular Musical Objects [8]. When MO was connected to piezos, users could scratch, knock and slam on the surface on which the piezo was placed to influence sounds. This sensitivity was quite mind-blowing; that a simple piezo could register these small vibration differences made it very interesting for measuring haptics. Figure 3: Haptic inputs shows the different haptic inputs. Using this haptic sensor, I first created a visual output, in my case a laser and a thermochromic image. The sensitivity was well visualized, and the laser in particular showed how sensitive the piezo is, but I did not get a zuhanden experience: the sensor did not really extend my senses. Instead I was constantly confronted with the sensor and its output, vorhanden [15].
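To make the setup concrete, here is a minimal sketch of how such a piezo could be read as a vibration sensor on an Arduino, in the spirit of this iteration; the analog pin, the parallel resistor and the threshold value are illustrative assumptions, not the exact wiring of the exploration.

const int PIEZO_PIN = A0;          // piezo between A0 and GND, with a 1 MOhm resistor in parallel (assumed wiring)
const int TAP_THRESHOLD = 40;      // raw ADC units; needs tuning per surface

void setup() {
  Serial.begin(115200);
}

void loop() {
  int reading = analogRead(PIEZO_PIN);   // 0..1023, spikes when the surface is knocked, scratched or slammed
  if (reading > TAP_THRESHOLD) {
    Serial.print("vibration ");
    Serial.println(reading);             // stream detected events, e.g. toward a sound or light output
  }
}

In the actual exploration this signal drove a laser and a thermochromic image; the sketch only illustrates the sensing side.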
[Figure 3: Haptic inputs: slamming, scratching, shaking, pulling.]
Iteration 2
In the second iteration I kept the input the same, but changed the output to haptics. In this case the output was a vibration motor attached to the same surface (a metal strip) as the input (the piezo sensor). When interacting with this surface, I received different subtle feedback; it felt as if the metal strip extended my fingertips, making me able to drum in the air. This particular effect reminded me of Camille Moussette's thesis Simple Haptics, in which he talks about haptic illusions: “The beauty of illusions is that they can be experienced even as we are fully aware of their functioning. As perception takes over human reasoning, these forms of deception or trickery also allow us to explore some of the inner workings of our senses.” [9, page 56] Moussette also argues that the commonly known illusions are vision-based, but that, just as in vision, illusions exist in the other senses; he therefore points to Hayward's list of tactile illusions [5]. In this case the metal strip creates a weight after-effect, the trickery of a weight shift. Next to the piezo, I also explored with other sensors, placing the vibration motors on a slider and a potentiometer. Both these sensors also created a very interesting haptic illusion, bumps and holes: the perception of feeling, for example, bumps and holes in a road. This iteration was very useful; I was able to transfer a sensitive haptic signal to a haptic output, making it possible to receive information on the human skin. In the next iteration I went a step further: how would the haptic sense react when the setup is placed on different surfaces, such as paper, wood, a metal plate or cups? For this I made a simple kit to easily mount haptic inputs and outputs on different surfaces.
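A minimal sketch of the input-to-output coupling of this iteration, assuming the piezo is read on A0 and the vibration motor is driven through a transistor on PWM pin 9; the pins and the linear mapping are illustrative assumptions, not the exact implementation.

const int PIEZO_PIN = A0;   // haptic input: piezo on the metal strip (assumed pin)
const int MOTOR_PIN = 9;    // haptic output: vibration motor on the same strip (assumed pin)

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  int vibration = analogRead(PIEZO_PIN);            // 0..1023
  int strength = map(vibration, 0, 1023, 0, 255);   // scale to the PWM range
  analogWrite(MOTOR_PIN, strength);                 // a stronger tap gives a stronger buzz
  delay(5);                                         // short settle time between updates
}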
Iteration 3
During this iteration I developed a haptic toolkit: a toolkit that makes it possible to attach a haptic sensor and output to any surface. Instead of explaining its function and possibilities verbally, it is better to show it in action; therefore click on the following link or scan the QR code with your iPad: https://vimeo.com/62501578. The possibilities of this toolkit are large: I can connect it to almost any surface I want (as shown in the video). I developed a tool that can assist me in designing for haptics. But then the next question arises: in what context am I going to use this toolkit? Is my target to inspire or assist other co-designers in using this kit to develop different products, or am I going to use it to develop my own product?
4 DISCUSSING HAPTIC TOOLKIT
“Compared to other types of interaction, designing for haptics is more complex.”

During this design process, the focus went from haptic input and visual output, to haptic input and output, and eventually to a toolkit that can be used to explore haptics. The next logical discussion point therefore concerns the use and value of this Haptic Toolkit. Compared to other types of interaction, designing for haptics is more complex. I am not designing for something that already exists; when designing a car, for example, there are clear guidelines, but I am designing for an experience, and before designing for it, it is important to understand that experience. Here is where the toolkit comes in. Firstly, it is a tool to understand the "abstract" language of haptics: by exploring with the toolkit it is possible to experience haptics, and the user is exposed to its qualities. Secondly, it is a tool to design with: when I need to draw a sketch of a car, I have sketching tools; in haptics there are no such tools yet, so the Haptic Toolkit is a new medium that functions as a tool to design for haptics. Of course the toolkit only covers the basics of designing for haptics. Metaphorically speaking, when learning to sketch you first have to learn to draw straight lines; afterwards you can draw a cube. The same holds for the Haptic Toolkit: it supports the basics of designing for haptics, so it is, metaphorically, only possible to draw a cube. But do not forget that it will then be possible to draw two cubes, ten cubes, and so on; it is up to the playfulness of the designer to be creative with the toolkit. The relation of the Haptic Toolkit to other designers is therefore also an interesting point of discussion. I believe that letting designers use the Haptic Toolkit not only stimulates them to develop ideas, but also lets them see the opportunities of haptic interaction products by letting them experience haptics. Another innovative aspect of the toolkit is that it is not trying to create a haptic experience for a virtual world, as various gaming devices do; it can be assembled in the real world, on any object, to measure and actuate haptics.
This aspect opens up many opportunities for Interaction Design, which I believe should be researched. Compared to virtuality, it becomes possible to perceive and react to haptic information in the real world, just like the haptic sense does right now. By using the Haptic Toolkit to experience haptics I was able to see its different qualities. It showed me that haptics affords rich interaction. I find it more joyful to interact haptically with the toolkit; the interaction and experiences amaze me. I can suddenly feel pressure and patterns and react to them. It gave me the feeling that I was able to perceive more with my senses, a magical experience: objects that were not there before are suddenly there. Through this toolkit I was exposed to these different qualities. Haptics can make interactions between user and product much richer, exposing users to their full sensory capacity. Why should Interaction Design limit the user's sensory capacity? If more is becoming possible technologically, why not make the most of our sensory capacity and make the interaction between user and product more magical, joyful and precise? The Haptic Toolkit certainly has use and value: it can help people experience haptics, showing its qualities and values and thereby helping to understand haptics, and it can support the design for haptics as a tool for "sketching" haptics. But how do I see this toolkit in relation to my FMP? I see two directions in which I could go with it. The first direction focuses on the development of the Haptic Toolkit itself. In this case my target group is the designer: I will create a toolkit that assists designers in designing for haptics, and the toolkit will also be a way to experience the qualities of haptics. This FMP would make me a missionary of haptics, showing all its qualities to the world. The second direction concerns the development of a haptic product. In this case the toolkit will assist me in the development of that product. My interest goes to a comfort product, such as an audio hi-fi system, as it could expose the qualities of haptics to the world and persuade not just researchers but also customers. For these users it is important that haptics is concretized into a final product.
Only then can its qualities and values really be "defined and exposed" to the world, which will not happen if haptics stays at an abstract level. Of course both directions are very interesting. The first would be a research project based on developing an elaborate version of the Haptic Toolkit for designers; the second would be a design project based on developing a haptic audio hi-fi set. I believe that haptics should first be understood, then supported in its design, and eventually developed into a product that can be exposed to the world; therefore aspects of both directions are important. The step I am leaning towards is to research how the Haptic Toolkit is used by other designers: can it let them see the values and qualities of haptics, and are they able to use it as a tool to design for haptics? After such a user session, it is possible to reflect further on the purpose of the toolkit in my FMP.
5 USER SESSION HAPTIC TOOLKIT
A user session was performed with the Haptic Toolkit to see how designers would experience haptics. The session took place during a module about input and output given by Stephan Wensveen to the UCI group, consisting of PDEng Industrial Design students. The module started off with a presentation about the Haptic Toolkit: the story behind the concept and the functionalities of the toolkit. The UCI group was then divided into four groups of five students and asked to explore with the toolkit. They had two hours for this exploration, after which they were asked to present their findings in a five-minute presentation concerning the use, value and possibilities of the kit.
Toolkit
The toolkit consisted of a slider, a potentiometer, a pressure sensor and a piezo to measure haptics, and a vibration motor to actuate the received information. All components could be mounted on each other and on various surfaces, and were controlled through a Max/MSP application.
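As an indication of how the components could reach the Max/MSP application, here is a minimal Arduino sketch that streams all sensor values over serial; the pin assignments, baud rate and space-separated frame format are illustrative assumptions, not the toolkit's actual protocol.

const int SLIDER_PIN   = A0;
const int POT_PIN      = A1;
const int PRESSURE_PIN = A2;
const int PIEZO_PIN    = A3;

void setup() {
  Serial.begin(115200);
}

void loop() {
  // One line per frame: "slider pot pressure piezo", readable in Max/MSP via its serial object
  Serial.print(analogRead(SLIDER_PIN));   Serial.print(' ');
  Serial.print(analogRead(POT_PIN));      Serial.print(' ');
  Serial.print(analogRead(PRESSURE_PIN)); Serial.print(' ');
  Serial.println(analogRead(PIEZO_PIN));
  delay(10);                              // roughly 100 frames per second
}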
“The toolkit as an instrument to experience and to philosophize about Haptics”
Evaluation
During this workshop, three different types of designers could be distinguished: concept developers, programmers and visionists. Each group used the toolkit in a different manner.

Concept developers
This group presented different concepts in which haptics could be implemented. They tried to place haptics in a context, using the toolkit as an instrument to learn about haptics and as inspiration to develop concepts. The toolkit as an instrument to learn about haptics and to ideate with.

Programmers
The programmers were more interested in the possibilities of the kit. They tried to explore with it: what can it measure, how does it measure, what happens when we put it in water? The toolkit as an instrument to understand the possibilities of haptics.

Visionists
The visionists reflected on the toolkit, on its purpose and on the general purpose of haptics. They also thought about scenarios in which the toolkit could be useful and could add value. The toolkit as an instrument to experience and to philosophize about haptics.
These three groups all reacted differently to the Haptic Toolkit and used different qualities of it. The concept developers were most interested in its ideation qualities, the programmers in its measuring qualities, while the visionists used it as a medium to philosophize about haptics. It is clear that these different designers value different qualities of the Haptic Toolkit. They were exposed to the qualities of haptics and were enthusiastic about them. By exploring with the kit they built up an understanding of haptics. In the beginning they had difficulty understanding what it was that they were perceiving, but by exploring they started to understand the haptic experience and interaction, which eventually showed them new possibilities in Interaction Design. This user session made clear that the toolkit can certainly show designers the values and qualities of haptics and can support them as a tool in designing for haptics. Referring back to the discussion of the previous section: what would be the next steps for my FMP? I currently see the most promise in a research project focused on developing a Haptic Toolkit for designers, yet I believe that during this FMP a clear example should also be given of a concretized haptic product, as it would provide evidence that the toolkit can be used as a tool to develop haptic products and would show the value of haptics in our society. I made this choice because haptics is still in its infancy; it should first be understood and explored with. If I were to focus on the haptic product now, it would be a step too fast. The next step in the process is an improved Haptic Toolkit to support a new user session with a clear design task. In the current user session no concrete tasks were given; the participants were simply asked to explore with the toolkit. In the future user session the users should be given a clear design task, designing an audio hi-fi set, and asked to use the Haptic Toolkit to develop this product. In this way it would be possible to evaluate the three main values of the toolkit:
1. The toolkit as an instrument to learn about haptics and to ideate with it.
2. The toolkit as an instrument to understand the possibilities of haptics.
3. The toolkit as an instrument to experience and to philosophize about haptics.

As for the design of the toolkit, some points extracted from the user session should be taken into account: a bigger range of actuators and sound reduction. Currently only a vibration motor is used; could force-feedback modules be added? Also, the sound of the vibration motor could be a problem: sometimes the user focused on the sound instead of on the haptic vibration actuation. Improving these points would enhance the possibilities of the Haptic Toolkit, making it possible to develop and explore more qualitative haptic experiences and therefore to develop a wider variety of haptic modules. The goal of this FMP should be to motivate why haptics should be implemented in Interaction Design. I will do this by presenting the Haptic Toolkit as a missionary for haptics, by explaining my findings about its use and by presenting a possible output, in my case an audio hi-fi set. The Haptic Toolkit should be a strategy to stimulate and inspire designers to develop haptic interaction products.
6 HAPTIC TOOLKIT VERSION 2
Force feedback slider
The first module in the kit is a motorized slider. This slider can measure its own position and can adjust its position through an integrated motor, which makes it possible to add force feedback to the slider. Figure 4: Slider Graphs shows how the slider can be controlled. In Force Motor Graph A, the force of the motor is pulsed over the length of the slider: in segments 1, 3 and 5 the motor force is at its maximum and no force is exerted in segments 2 and 4. So in segment 1 the motor pushes back, in segment 2 it does not react, and in segment 3 it starts pushing back again. This effect is reminiscent of a "bumpy road" and makes it possible to feel relief. The second graph, Force Motor Graph B, is somewhat different: here the force of the motor is always at its maximum, so the slider is always pushing back. It is reminiscent of an "elastic": when the slider is moved, it snaps back to its initial position.
[Figure 4: Slider Graphs. Motor force (0 to max) plotted against slider position. Force Motor Graph A, "Bumpy Road": the force alternates between maximum (segments 1, 3, 5) and zero (segments 2, 4). Force Motor Graph B, "Elastic": the force is at its maximum over the whole slider length.]
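The two force profiles could be implemented roughly as follows, assuming the slider position is read on A0 and the motor is driven through an H-bridge with a PWM pin and a direction pin; the pins, the five-segment division and the force levels are illustrative assumptions, not the toolkit's actual code.

const int POS_PIN = A0;            // slider position, 0..1023
const int PWM_PIN = 5;             // motor force (assumed H-bridge enable pin)
const int DIR_PIN = 4;             // motor direction (assumed H-bridge direction pin)
const bool ELASTIC_MODE = false;   // false: Graph A "bumpy road", true: Graph B "elastic"

void setup() {
  pinMode(PWM_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
}

void loop() {
  int pos = analogRead(POS_PIN);
  int force;
  if (ELASTIC_MODE) {
    force = 255;                              // Graph B: always push back at full force
  } else {
    int segment = pos / 205;                  // divide the travel into five segments
    force = (segment % 2 == 0) ? 255 : 0;     // Graph A: full force in segments 1, 3, 5; none in 2, 4
  }
  digitalWrite(DIR_PIN, LOW);                 // push back toward the start position
  analogWrite(PWM_PIN, force);
}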
Force feedback potmeter
The second module in the kit is a motorized potentiometer. It works in the same way as the motorized slider; the only difference is that this sensor is operated by rotation. Figure 4: Slider Graphs also applies to this module.
Force feedback button
The next module is a force feedback button. In this module the resistance of the button can be adjusted: when the force is set to its maximum, more force is needed to press the button. The module is based on an electromagnet (Figure 5: Force button); a coil is wound around a socket with a magnet in the middle. When the coil is powered, the magnet floats in the middle of the socket, effectively turning it into a spring whose stiffness can be adjusted.
[Figure 5: Force button. Labelled parts: button, coil, magnet.]
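A minimal sketch of how this adjustable "magnetic spring" could be driven, assuming the coil is powered through a transistor on PWM pin 6 and the desired stiffness arrives over serial as a single byte; both the pin and the control scheme are illustrative assumptions.

const int COIL_PIN = 6;            // electromagnet coil via transistor (assumed pin)

void setup() {
  pinMode(COIL_PIN, OUTPUT);
  Serial.begin(115200);
  analogWrite(COIL_PIN, 128);      // start with a medium stiffness
}

void loop() {
  if (Serial.available() > 0) {
    int stiffness = Serial.read();     // one byte: 0 = button presses freely, 255 = hardest to press
    analogWrite(COIL_PIN, stiffness);  // more coil current, stiffer "magnetic spring"
  }
}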
Piezo sensor
The piezo sensor was already introduced in the previous toolkit. It can be placed on various surfaces and turns that particular surface into a haptic sensor. The difference with the previous version is that the sensitivity has been enhanced.
Vibration feedback
The fifth module is a vibration motor. It was the only actuator in the previous toolkit; in this toolkit it is the fourth actuator and can be connected to the slider, the potentiometer and the piezo.
Connection points
All modules can be mounted on each other. The top of each module (Figure 6a: Male Connection) can be placed in the bottom (Figure 6b: Female Connection) of any other module, and by turning one module it locks onto the other. To connect the modules to different surfaces, a magnet connector can be attached to a module. The magnet sticks to metal surfaces, and by using a support magnet it can be placed on various other surfaces as well.
Figure 6a: Male Connection
Figure 6b: Female Connection
Still to develop
The platform is not complete. All sensors and actuators function well and can be connected through basic Arduino code, but there is not yet a clear, user-friendly interface to easily link the different modules to each other. The next step for this toolkit is therefore to develop such a user-friendly interface program, which designers can use to explore, understand and design for haptics.
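As a sketch of what such a linking layer could look like on the Arduino side, the snippet below keeps a small routing table that maps each sensor to an actuator; the future interface program would then only need to edit this table over serial. The pin numbers, the table and the two-byte command are hypothetical illustrations, not an existing part of the toolkit.

const int SENSOR_PINS[]   = { A0, A1, A2, A3 };  // slider, potentiometer, pressure sensor, piezo (assumed pins)
const int ACTUATOR_PINS[] = { 9, 5, 6, 3 };      // vibration motor, slider motor, button coil, potmeter motor (assumed pins)
int routing[4] = { 0, 1, 2, 3 };                 // sensor i drives actuator routing[i]

void setup() {
  for (int i = 0; i < 4; i++) pinMode(ACTUATOR_PINS[i], OUTPUT);
  Serial.begin(115200);
}

void loop() {
  // Re-route on command: two bytes, a sensor index followed by an actuator index.
  if (Serial.available() >= 2) {
    int sensor = Serial.read() % 4;
    routing[sensor] = Serial.read() % 4;
  }
  // Forward every sensor reading to its linked actuator as a PWM level.
  for (int i = 0; i < 4; i++) {
    int value = analogRead(SENSOR_PINS[i]);
    analogWrite(ACTUATOR_PINS[routing[i]], map(value, 0, 1023, 0, 255));
  }
  delay(5);
}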
7 PLANNING
[Figure 7: Planning. Four milestones: proposal, end Q2 M2.1, Q1 M2.2, Q2 M2.2, with deliverables toolkit version 1, toolkit version 2, a set of different co-designs, and an everyday product. Expertise areas mapped onto the phases: aesthetics of interaction, integrating technology, aesthetics "formgiving" and qualitative user research.]
This planning reflects the four main deadlines during M2. The current deadline is the end of M2.1, whose results are discussed in this report. During the proposal development my expertise areas Aesthetics of Interaction and Integrating Technology were used most often, as they assisted me in performing quick explorative sessions. My skills in Qualitative User Research were also used to research the values and qualities of the toolkit. The second deadline lies at the end of this semester, in quartile 2 (Q2). The goal for this block was to develop an extended version of the toolkit (toolkit v2). The visual clearly shows that I focused on developing a toolkit, as it shows a line converging towards the second deadline. During this part of the design process I again used my expertise in Aesthetics of Interaction and Integrating Technology to first develop some explorations; after that I used them, together with Aesthetics of Formgiving, to develop the toolkit in such a way that it can be "used" by other designers to design with. The development of the toolkit is not finished yet: a user-friendly interface should still be developed to link the different modules to each other. The next deadline will be in quartile 1 of M2.2. During that block I want to co-design with different types of designers to see what kind of outcomes can be generated with the Haptic Toolkit. These sessions will probably be explorative workshops based on qualitative user research, such as co-reflection, in which I co-design together with the other designers. During this part of the process I will therefore focus on my expertise areas Aesthetics of Interaction, Integrating Technology and Qualitative User Research (exploring and co-designing). In this part I will again start broadening up.
In the final part I will use all the qualitative feedback and physical prototypes extracted from the workshops and sessions to develop a final product, in my case an audio hi-fi set. During this block all the different expertise areas will come together: exploration, product development and qualitative user research. As a final note I want to clarify that the focus of this project is not on designing a final everyday haptic product as such; rather, I want to show the value of haptics when implemented in an everyday object through this type of design process. I want to show that the Haptic Toolkit is a strategy to stimulate and inspire designers to develop haptic interaction products, which is the goal of this FMP.
8 REFERENCES
1. Abram, D. (1996). The Spell of the Sensuous. New York: Vintage.
2. "Bosch - Start Bosch.us." 2004. Retrieved 27 Mar. 2013 from <http://www.bosch.us/>
3. "Disney Research » Surround Haptics: Immersive Tactile Experiences." 2012. <http://www.disneyresearch.com/project/surround-haptics-immersive-tactile-experiences/>
4. Dourish, P. (2001). Where the Action Is: The Foundations of Embodied Interaction. Boston (MA), USA: MIT Press.
5. Hayward, V. (2008). A brief taxonomy of tactile illusions and demonstrations that can be done in a hardware store. Brain Research Bulletin, 75(6), 742-752.
6. Jones, M., & Marsden, G. (2006). Mobile Interaction Design. John Wiley & Sons.
7. Kern, T. A. (2009). Engineering Haptic Devices. Springer.
8. "MO : Modular Musical Objects on Vimeo." 2011. <http://vimeo.com/23358363>
9. Moussette, C. (2012). Simple Haptics: Sketching Perspectives for the Design of Haptic Interactions. Dissertation, Umeå University, Umeå, Sweden.
10. ".NET Gadgeteer - Microsoft Research." 2010. <http://research.microsoft.com/en-us/projects/gadgeteer/>
11. Norman, D. (1988). The Design of Everyday Things. New York: Basic Books.
12. "Res cogitans - Blackwell Reference Online." 2008. <http://www.blackwellreference.com/public/tocnode?id=g9781405106795_chunk_g978140510679519_ss1-96>
13. Schnell, N., & Bevilacqua, F. (2013). Opening Keynote: Body and Sound - Tangible Interfaces in Music Listening and Performance. In TEI 2013, Barcelona, Spain.
14. Verplank, B. (2013). Closing Keynote: Tangible Interaction Metaphors, Haptics and Celebration. In TEI 2013, Barcelona, Spain.
15. "Vorhanden, Zuhanden, and Dasein, three modes of being..." 2008. <http://dominationandmastery.wordpress.com/2007/09/29/vorhanden-zuhanden-and-dasein-three-modes-of-being/>
16. Weiser, M., & Brown, J. S. (1997). The Coming Age of Calm Technology. In Beyond Calculation: The Next Fifty Years of Computing (pp. 75-85). Springer-Verlag.
17. "WWW Ircam: Research." 2005. Retrieved 27 Mar. 2013 from <http://www.ircam.fr/recherche.html?L=1>