ProtoProbes design report: towards a tool for understanding sensory data in Makerspaces

Written by Pepijn Verburg
Master Graduation Project (M2.2) coached by prof.dr.ir. Loe Feijs, Department of Industrial Design, Eindhoven University of Technology, June 2016
abstract
The purpose of this report is to propose a novel tool for Makerspaces to help understand sensory data in digital-physical prototypes. A total of three iterations were carried out, working towards a final design fitting the workflow of a Maker. Design requirements are put forward as conclusions from focus groups, concurrent probing sessions, contextual inquiries and expert meetings. Research was done in various environments, ranging from universities and colleges of higher education to design studios. The notion of insight in data has been explored using a visualization framework by Patterson et al. (2014). A total of four layers of insight are identified: (1) basic functioning, (2) data across time, (3) alternative representations and (4) mathematical manipulations. These layers are implemented in a fully functional prototype. The prototype consists of (1) a hardware component allowing Makers to instantly integrate the tool in their existing prototypes and (2) a software component collecting all the data measured from multiple prototypes into one coherent web-based interface.
Acknowledgement
First, I would like to thank a few people who have guided, assisted and facilitated me during this final master's project of the Department of Industrial Design in Eindhoven. They might have agreed or disagreed with me. Possibly, we've built and debugged together. Or, we might have had some fruitful discussions. Confusion, misunderstandings and chaos probably had their times too. But still, from the chaos emerged a worthwhile result. And I hope you will agree when reading about it.
Thanks to... Loe Feijs as coach, Janpaul Verburg as craftsman, Jeroen Peeters as expert in interaction design, Ambra Trotto as expert in interaction design, José Carlos Sánchez Romero as expert in architecture, Rickard Åström as expert in interaction design, Marcel Penz as expert in interaction design, Shirley Niemans as expert in making and facilitator, Loes Bogers as expert in making and facilitator, Troy Nachtigall as expert in wearables, Geert van den Boomen as expert in electronics, Frank van Valkenhoef as expert in electronics, Bram Naus as engineer, Thomas Latcham as fellow-student critic, Ronald Helgers as fellow-student critic, Mariano Velamazan as fellow-student critic, Margo Breedijk as sketch artist, Eveline Heesterbeek as critical reviewer, Maryon Widdershoven as critical reviewer, Sylvester Breedijk as critical reviewer, Sophie Pan as critical reviewer, Joep Elderman as debugger, Sander Biesmans as user, Dorothé Smit as facilitator, Yannick Brouwer as expert in related work, Stijn Zoontjens as expert in related work and Roy van den Heuvel as emergency 3D printer.
Introduction

Democratization of technology
The democratization of technology has been a popular subject over the past few decades. Modern technologies are becoming accessible to more and more people. Institutions, educators and companies respond to this by creating shared spaces for people to make things with these now widely available technologies. These spaces are commonly known under the name 'Makerspace'.

Digital information
These technologies contain an increasing amount of digital information, coming from sensors, actuators, signal processing, neural networks, etc. This is great for a technology-driven society like ours. But there is a problem: the increase in complexity creates a bigger gap towards human interpretation. This makes it difficult for designers, engineers or tinkerers to understand the technology they are working with, making the implementation of an envisioned design a tough job. Makerspaces can help make this easier.
Novel tools for understanding
Makerspaces all pursue the same ideal: providing a shared and collaborative environment with high-end equipment where many different people can work on making things out of existing hardware, custom built electronics or handcrafted objects. Still, they lack interactive tools to support the understanding of digital information, because most of the tooling focuses on the manipulation of physical materials (Victor, 2014). This report shows the development of a tool to understand sensory data in digital-physical prototypes. The project answers two questions: (1) how to see what your prototype is seeing of the real world, and (2) how to see what your prototype could be seeing of the real world.
table of contents
1. project definition .......... 10
1.1 makerspaces as context .......... 12
1.2 the maker movement as target group .......... 14
1.3 societal relevance .......... 16
1.4 novel maker tools .......... 18
1.5 sensory data as scope .......... 20
1.6 benchmark .......... 22
1.7 expert meetings .......... 24
1.8 fundamental design requirements .......... 26
2. approach .......... 28
2.1 three iterations .......... 30
2.2 three parts .......... 32
2.3 four stakeholders .......... 33
3. workflow part .......... 34
3.1 focus group: tools in creative environments .......... 36
3.2 contextual inquiry: workflow .......... 38
3.3 experience flow .......... 40
3.4 workflow design requirements .......... 42
4. probe part .......... 44
4.1 background literature .......... 46
4.2 towards a proof of concept .......... 48
4.3 concurrent probing: wearables .......... 50
4.4 concurrent probing: sensor basics .......... 52
4.5 affordances and interactions .......... 56
4.6 embodying data .......... 58
4.7 probe design requirements .......... 60
5. data part .......... 62
5.1 background literature .......... 64
5.2 towards a proof of concept .......... 65
5.3 focus group: data visualization .......... 66
5.4 data in spaces .......... 70
5.5 data design requirements .......... 72
6. synthesis .......... 74
6.1 feature overview .......... 76
6.2 probe prototype .......... 78
6.3 data prototype .......... 92
6.4 discussion .......... 96
7. bibliography .......... 98
7.1 references .......... 100
7.2 table of figures .......... 104
1. project definition
In this chapter the scope of the project is put forward along with background information about Makerspaces and the Maker Movement. Most importantly, this chapter elaborates on why this project is relevant for our society and how it contributes to innovation.
1.1 Makerspaces as Context
Maker Labs, Design Factories, FabLabs, etc., embody the human drive to make and build. They are the result of recent technological developments. The traditional craftsmanship room is being transformed into a room to support the creation of digital-physical products.

Makerspace or Hackerspace?
Over the past decade, an increasing number of collaborative spaces for making and hacking have made an appearance. A recent article by Cavalcanti (2013) charts the growing popularity of these spaces, which are known by a variety of names. Three types are defined for clarification: Hackerspaces, Makerspaces and Makerspace Franchises. According to Cavalcanti, Hackerspaces mainly facilitate the alteration of existing products. Makerspaces have a focus on the creation of new products through proof-of-concept prototypes. Makerspace Franchises are a more professional version of Makerspaces, as they are part of a bigger network around the world and often ask membership fees (Cavalcanti, 2013). Figures 1.1.1 to 1.1.3 show an impression of the differences. This project focuses on Makerspaces, due to their common occurrence in (educational) institutions and increased influence on the development of new products.
figure 1.1.1 - Hackerspaces have a focus on hacking existing products and facilitating workshops to do so.
figure 1.1.2 - Makerspaces are more focused on facilitating crafting of physical shapes and testing interactions.
figure 1.1.3 - Makerspace Franchises have a professional touch to them whereas fees are common.
1.2 The Maker Movement as target group
The Maker Movement is a diverse group of people, described well by Dougherty (2012), founder of Make Magazine. It is a group that has the initiative to independently make products, attempting to cross the boundaries of the available mass-produced products (Dougherty, 2012). Hatch (2013), co-founder of the Makerspace Franchise TechShop, summarises their properties in the book 'The Maker Movement Manifesto' as: they make, share, give, learn, tool up, play, participate, support and change. This description gives a clear impression of the type of people working with digital-physical prototypes. They are willing to use external tools (Hatch, 2013) and are in need of a tool that enables them to understand the digital information they are working with (Victor, 2014). Tools can help them reach their full creative potential.
figure 1.2.1 - working together is one of the key aspects of the Maker Movement
1.3 Societal relevance
The development of Makerspaces is crucial to our modern times as it strives for shared technology for the middle class. The non-profit association Educause (2013) published an article about the societal implications of Makerspaces. They describe that these spaces are used not only for the high-end equipment but also to share ideas and knowledge, to challenge each other and to overcome problems. The informal atmosphere is highly appreciated and it gives the chance for spontaneous ideas and innovation to arise. Most importantly, it enables people to learn on their own and switch quickly between disciplines, providing a wonderful environment for the creation of a self-directed learner. The notion of the self-directed learner is one of the current buzzwords within education. The development of these types of learners is valuable in a technology-driven society (Fischer & Sugimoto, 2006). This project aims to create a powerful tool for training self-directed learners at (educational) institutions to make complex products, systems and related services. To do so they need an in-depth understanding of what they are creating.
Companies and educational institutions are also gaining awareness of the advantages and societal influence of these collaborative spaces in relation to innovation. For example, Loertscher (2012) illustrates how libraries are a context where this shift has started to happen. Libraries will always be a centre of knowledge, a place where one can integrate and synthesize facts and thoughts. In the 1960s, libraries started to use multimedia equipment, such as audio and video recordings. These days a transition can be found towards equipment that facilitates innovation and creativity with a basis in making. There is a fair chance many institutions will implement environments such as Makerspaces into their standard facilities.
figure 1.3.1 - Maker event for youngsters at Willows Community School in Culver City, California
1.4 Novel maker tools
Makerspaces are a relatively new concept still in development, and many people are contributing to that development. The concept 'Seeing Spaces' by Bret Victor (2014) is one of the foundations of this project. He observes that the current infrastructure of tooling in Makerspaces mainly focuses on the manipulation of physical materials and components through, e.g., soldering, welding, 3D printing, sewing, sawing and laser cutting. This feels like a logical choice, as many of our previous workshops have been about altering the physical world. However, crafting the digital information flowing through these physical components is equally important. The digital world defines the behaviour of the digital-physical product and influences many of the interactions with the user. The product is likely to fail when there is no thorough understanding of it.
Common tools for understanding digital information are oscilloscopes, multimeters and specialized debugging applications. Victor (2014) stresses that these tools should be replaced by ones that let Makers see across time and across possibilities when developing interactive prototypes. Also, current tools are not designed to explore interactions between humans and digital-physical products. They are more focused on measuring precise and accurate changes in data. They don't account for the slowness, unpredictability and subjectivity of the human body that many of these products will encounter. This results in a development process that becomes ad hoc and unreliable, such as a process of trial and error to find a specific calibration value for a sensor or actuator for the required interaction.
figure 1.4.1 - Seeing Spaces by Bret Victor (2014)
1.5 Sensory data as scope The concept of ‘Seeing Spaces’ (Victor, 2014) shows a utopia of understanding the digital world. It raised the question of how to make a first step towards such an environment with current technologies. How can Makers be empowered to truly understand the digital systems they are making? How are we able to get the information out of this black-box? These are very broad questions and too much to take on for a graduation project. Three divisions can be identified in digital systems: input, computation and output.
To contain the scope of the project the choice has been made to design for the first step in the cycle: the input. In other words: how can Makers be empowered to understand the sensory data they encounter while making?
1.6 benchmark
This project is not the first of its kind to identify the gap between human understanding and digital processes. When looking at digital-physical products, many innovations can be found in the field of robotics. For example, the latest tools allow seeing links between the real world and the perception of the robot (Annable et al., 2015; Gumbley & MacDonald, 2012). When looking at the Maker Movement, progress is lacking. Existing tools (see figures 1.6.1 to 1.6.6) consist of solutions in Visual Studio (Visual Micro, 2012) or custom built software (Hobye, 2012; Fens, 2012; Plotly, 2015; Involt, 2016; Arduino, 2016).
figure 1.6.1 - Visual Studio (Visual Micro, 2012)
They mainly focus on providing symbolic debugging or graphical data plots. It should be noted they all rely on additional software to integrate the tool into a prototype, which is likely to interfere with the processes being executed. Some need custom software packages, others need a specific development platform. This project would distinguish itself by putting forward a solution that (1) has a low implementation threshold, (2) is platform-independent and (3) does not interfere with the existing system.
figure 1.6.2 - Serial Plotter (Arduino, 2016)
figure 1.6.3 - Arduino Monitor (Fens, 2012)
figure 1.6.5 - Involt (Involt, 2016)
figure 1.6.4 - GUINO (Hobye, 2012)
figure 1.6.6 - Plot.ly (Plotly, 2015)
1.7 Expert meetings A total of four experts have been consulted regarding the focus of this project. This chapter summarizes the questions that were asked and the most important findings for the definition of the project.
Frank Delbressine, engineering expert
Panos Markopoulos, user research expert
Geert van den Boomen, electronics expert
Jean-Bernard Martens, visual interaction expert
Delbressine has a lot of experience within the field of Mechanical Engineering and systems thinking. When discussing digital signal processing, it became apparent that chaining certain filters can give people insight into how they work together (for example a differential equation and autocorrelation). As the focus on sensory data is still broad, he urged us to stick with simple sensors at the beginning. The importance lies more with answering what really creates sensor understanding. In this case, the understanding is about human interpretation. How can we facilitate finding the right interpretation?
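Delbressine's example of chaining filters can be illustrated with a short sketch. This is only an illustration, not part of the ProtoProbes tool itself: a discrete difference stands in for a differential-equation filter, followed by an unnormalized autocorrelation, applied to a made-up periodic "sensor" signal.

```python
def differentiate(signal):
    """Discrete difference: a simple stand-in for a differential-equation filter."""
    return [b - a for a, b in zip(signal, signal[1:])]

def autocorrelation(signal, lag):
    """Unnormalized autocorrelation of the signal at a given lag."""
    n = len(signal)
    mean = sum(signal) / n
    return sum((signal[i] - mean) * (signal[i + lag] - mean) for i in range(n - lag))

# Chaining the two filters on a periodic 'sensor' signal (period 4):
raw = [0, 1, 0, -1] * 8
diff = differentiate(raw)
print(autocorrelation(diff, 4) > 0)  # strong positive peak at the signal's own period
print(autocorrelation(diff, 2) < 0)  # anti-correlated half a period away
```

Seeing the two stages side by side, rather than only the end result, is exactly the kind of insight into "how filters work together" that Delbressine points at.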
Panos Markopoulos is an expert on user research and was asked how to measure understanding of data. It became clear this type of research is worthwhile when done qualitatively. It is crucial to observe how people attach to such a tool. What makes them use the tool again? What makes them teach the process to other people?
Geert van den Boomen is an expert in the field of electrical engineering. He recognizes the challenges for this project from personal experience. He stresses that some sensors are hard to measure, because they are digital sensors using specific protocols. What he often encounters is a lack of reproducibility of measurements. There is no platform allowing Makers to record and share data efficiently with, for example, external experts. How can sensory data be shared quickly and efficiently? How can one 'tag' important moments in the data set?
Jean-Bernard Martens is an expert in the field of visual interaction. He agrees with the problem but stresses not to focus on the visualization, because this has already been done. The problem also lies in the integration into the workflow of the user. How can the tool invite people to use it at times they might not know they need it?
1.8 fundamental design requirements
This project started with several initial design requirements based on the literature, experts and existing work in this field. This page summarizes them.
platform-independent hardware
Question: How can the tool revolve around a platform-independent hardware solution?
Description: The tool should consist of a hardware solution with a platform-independent software component. This doesn't interfere with the existing system. A completely closed-off system can be attached to it, making integration and removal easy to realize. This will distinguish the project from existing software-based solutions.
source: Benchmark (chapter 1.6).
workflow integration
Question: How is the tool properly integrated in the workflow of a Maker?
Description: The tool should be explicitly designed to fit the workflow of a Maker. The project should not get lost in the field of data visualization.
source: Jean-Bernard Martens & Panos Markopoulos (chapter 1.7).
figure 1.8.1 - early exploration of integration of tooling in makerspaces
2. Approach This project is split up into three iterations, each having three major parts. The processes through these iterations are of a converging nature and are driven by prototypes and user evaluations to reflect on. This report presents the parts as the main chapters, each going through the work of all iterations and concluding with implications for the final design.
2.1 Three Iterations

iteration one: exploring boundaries
The first iteration was explorative. It determined what options there are in creating a tool for understanding data in Makerspaces. A prototype was built to make first steps towards a versatile technological infrastructure that can be used in future iterations. This first iteration was not used
in any validations. It consisted of basic UDP transfer processes and existing visualization engines to explore possibilities in terms of technology. This was required in particular to measure to what extent the concept of Seeing Spaces (Victor, 2014) could be pushed to reality.
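The report does not detail the packet format of these UDP transfer processes. As an illustrative sketch only (the JSON layout, field names and loopback addresses are assumptions, not the actual ProtoProbes protocol), a sensor sample could be pushed from a probe process to a receiver process like this:

```python
import json
import socket
import time

def send_sample(sock, value, addr):
    """Encode one sensor reading as a JSON UDP datagram (assumed format)."""
    packet = json.dumps({"sensor": "a0", "value": value, "t": time.time()})
    sock.sendto(packet.encode("utf-8"), addr)

def receive_sample(sock):
    """Block until one datagram arrives and decode it."""
    data, _ = sock.recvfrom(1024)
    return json.loads(data.decode("utf-8"))

# Loopback demo: the visualisation side binds a port, the probe side sends to it.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))  # any free port
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sample(sender, 512, receiver.getsockname())
sample = receive_sample(receiver)
print(sample["value"])  # 512
```

UDP fits this kind of exploratory streaming well: a lost sample is harmless for a live plot, and the connectionless model keeps the probe firmware simple.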
iteration two: collecting needs
The second iteration focused on how tools are integrated into the workflow in Makerspaces. Findings were implemented into a second prototype and validated at the Department of Industrial Design and at external institutes. It was fully functional in measuring analogue sensors and visualizing
the data in real-time. Users could also apply some basic data manipulations. This prototype has been used in several user validations: two concurrent probing sessions (see chapter 4.3 and 4.4) and one focus group (see chapter 5.3).
iteration three: synthesis
The first two iterations provided the project with sufficient technological infrastructure to implement concrete findings from all iterations. Iteration three has been the moment of synthesis and integration towards a final presentable prototype. During this iteration Umeå (Sweden) was visited to develop outside the context of Industrial Design in Eindhoven. With the help of the Interactive Institute in Umeå the project was able to dive into the field of data visualization (see chapter 5.3) and data embodiment (see chapter 5.4).
2.2 Three parts

workflow part
Investigating and defining the workflow of the user is key in this part. It identifies phases where a tool for understanding digital information is needed. Chapter 3 shows an experience flow describing this.

probe part
This part is dedicated to designing the hardware component of the new tool (as defined in chapter 1.8). Chapter 4 shows the development process.

data part
The data part is about the design process of the data visualisations and integration of existing visualization tools. Chapter 5 dives into this.
2.3 Four stakeholders

Industrial Design in Eindhoven
The project has collaborated with two labs within the department. There were regular discussions with the experts responsible for the E-Lab and a user evaluation was done in collaboration with the Wearable Senses Lab.

Interactive Institute Swedish ICT
This design studio in Sweden was visited to develop the data embodiment of the project. The project collaborated with the +Project to explore sensory measurements and data embodiment in novel housing facilities.

Institute of Design in Umeå
Within the Institute of Design in Umeå there was an opportunity to collaborate with the Interaction Lab. The project was invited to give a lecture and a focus group took place about their way of working and data visualization.

MediaLab in Amsterdam
The MediaLab was approached to get a sense of how the project would fit in existing Makerspaces. A user evaluation was organized during one of the courses of the 'Hogeschool van Amsterdam' about the basics of sensors.
3. workflow part This chapter defines an experience flow identified by executing (1) a focus group (Morgan, 2002) and (2) a contextual inquiry (Schuler & Namioka, 1993). Several new design requirements are put forward from this user input.
3.1 focus group: tools in creative environments

Description & Method
The main question for this focus group (Morgan, 2002) is what makes people use tools in creative environments and how those tools are embedded in their workflow. A discussion of 20 minutes took place during a user test session of the EIT toolkit (Smit et al., 2016). A tangible exploration toolkit was available to explore different types of interactions (see figure 3.1.1).

Demographics
Two master graduate students of the Department of Industrial Design in Eindhoven with backgrounds in service design and interaction design participated (see figure 3.1.2). The third participant was the researcher asking questions to guide the session.
Conclusions
Three main conclusions can be drawn. First of all, Makers are not always aware of the moment they need a tool. It is hard to pinpoint when one should switch to a tool to be more effective. Secondly, it became apparent that (physical-)visual stimuli are crucial in triggering people to use tooling. The comparison was made with a traditional craftsman workshop where different types of tools surround the Makers visually and thereby trigger their problem-solving. Finally, the tooling should be flexible and abstract enough to fit a vast range of different projects.

Discussion
The notion of visual stimuli and the comparison with traditional craftsman workshops give the project a clear starting point for the hardware development (chapter 4). It is important to build a tool that is visually present in the room, easy to access and usable for many different projects. The room itself should invite people to grab the tools and use them. Similar processes are described by Victor in his concept 'Seeing Spaces' (Victor, 2014) where he sees the room as "a macro-tool people are embedded inside".
figure 3.1.1 - the EIT Toolkit (Smit et al., 2016) helps act out ideas and interactions
figure 3.1.2 - group discussion with two other master graduate students
3.2 Contextual Inquiry: workflow

Description & Method
The goal of this contextual inquiry (Schuler & Namioka, 1993) is to get a sense of the processes involving decision-making. A standard waterfall design process is chosen to identify the different phases Makers go through (not necessarily in this order): (1) problem definition, (2) background research, (3) specify requirements, (4) generate solutions, (5) prototype solutions, (6) test solutions, (7) reflect on results and (8) communicate results. During the inquiry participants were asked to think out loud about their decision-making. They were questioned about a phase explicitly if it was not included in the session.

Demographics
Two master graduate students of the Department of Industrial Design in Eindhoven with backgrounds in product design, interaction design and mechanical engineering participated. They are experienced with integrating sensory technologies in their prototypes. Both were working on their prototype while performing the contextual inquiry. Participant #1 was working on a custom gesture recognition pad using capacitive sensors. Participant #2 was working on a smart sock with 5 pressure sensors, accurately measuring how the wearer is walking.

Results
Appendix 3 shows a table containing the most relevant observations and quotes about each of the phases.
Conclusions
The results show a couple of phases that either are too subjective or take too much time. First of all, the generate solutions phase revolves around guessing. Both participants try to imagine whether something is going to work or not. Secondly, the prototype solutions and test solutions phases take a lot of time. Apparently, Makers sometimes don't take additional time to explore alternative representations of their data. Finally, the reflect on results phase can be tedious when handling a multitude of data points.

Discussion
Four phases can be explicitly identified as troublesome in the workflow of a Maker: (1) generate solutions, (2) prototype solutions, (3) test solutions and (4) reflect on results. These findings are based on only two participants, but they give a good sense of when a tool should present itself. Prototyping solutions is likely to take place within the Makerspace. However, generating solutions, testing solutions and reflecting on results don't necessarily happen within a Makerspace (as seen with participant #2). This calls for a tool that is mobile and can be carried out of the Makerspace environment. This is the opposite of what Victor shows in Seeing Spaces (Victor, 2014), where the space is presented as a solution for all, where everything can be tested and measured. Still, some prototypes need an entirely different context to be tested optimally (wearables in particular).
figure 3.2.1 - an early-stage prototype of a wearable from one of the inquiries
3.3 Experience flow
The results of the contextual inquiry (see chapter 3.2) were used to set up an experience flow (see figure 3.3.1) with the 8 phases from a standard waterfall process: (1) problem definition, (2) background research, (3) specify requirements, (4) generate solutions, (5) prototype solution, (6) test solution, (7) reflect on results and (8) communicate results. This flow is based on an early version of a workflow description (see appendix 2). Note that during this process the Maker can still jump back and forth between phases as part of an iterative process. With this overview several moments in the process are identified where this project can play a role: (1) with experiments in the generate solutions phase, (2) during micro-testing while prototyping, (3) to provide custom tooling when testing, (4) while collecting data during reflection and (5) while capturing outcomes when in need for communication.

figure 3.3.1 - workflow of two makers throughout the prototyping process (diagram of the 8 phases, annotated with activities such as ideation, micro tests, collecting data and capturing outcomes, and with participant quotes, e.g. "Exploring filtering of my data takes too much time" and "I just view the visual data over and over to look for specific patterns and properties")
3.4 workflow design requirements This part is concluded with a set of design requirements coming from the literature, experts and user tests discussed in the previous subchapters.
Visual Presence
Question: How can the tool be embedded in Makerspaces?
Description: The focus group has shown visual stimuli are important in using tooling in creative environments such as Makerspaces.
source: focus group: tools in creative environments (chapter 3.1).
Integration in Wearables
Question: How can the tool integrate in wearables?
Description: There are many complex and unpredictable shapes in the field of wearables. Allowing integration into these types of prototypes is a challenge.
source: Contextual inquiry: workflow (chapter 3.2)
Mobility
Question: How can the tool be used at external locations?
Description: Seeing Spaces (Victor, 2014) focuses only on the environment within Makerspaces. However, sometimes testing and debugging is needed at external locations.
source: Contextual inquiry: workflow (chapter 3.2)
Wireless Connectivity
Question: How can the tool transfer data wirelessly?
Description: Existing tools depend on wiring to a central device. When integrating in wearables or moving objects this can prove to be difficult.
source: Benchmarking (chapter 1.6)
Data Collection
Question: How can one review past data?
Description: The tool needs to give insight into the data across time for reproduction and communication purposes.
source: Contextual inquiry: workflow (chapter 3.2) and Geert van den Boomen (chapter 1.7).
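A minimal sketch of what such data collection could look like on the software side. The class and storage format are illustrative assumptions, not the actual ProtoProbes implementation: a rolling log keeps timestamped samples so that any past window can be reviewed, tagged or exported later.

```python
import time
from collections import deque

class SampleLog:
    """Rolling log of timestamped sensor samples (illustrative sketch)."""

    def __init__(self, maxlen=10_000):
        # Oldest samples drop off automatically once the buffer is full.
        self.samples = deque(maxlen=maxlen)

    def record(self, value, timestamp=None):
        """Store one reading with its timestamp (defaults to 'now')."""
        t = timestamp if timestamp is not None else time.time()
        self.samples.append((t, value))

    def window(self, t_start, t_end):
        """Return all samples recorded in [t_start, t_end] for review or export."""
        return [(t, v) for t, v in self.samples if t_start <= t <= t_end]

log = SampleLog()
for i, v in enumerate([10, 12, 15, 11]):
    log.record(v, timestamp=float(i))
print(log.window(1.0, 2.0))  # [(1.0, 12), (2.0, 15)]
```

A bounded buffer like this keeps memory use predictable on a small probe while still covering the recent history a Maker typically wants to replay.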
4. probe part This chapter is about the most important activities defining the design requirements for the probe (i.e. the hardware). The prototype of the second iteration is explained and two user validations with this prototype are summarized.
4.1 background literature
There are two important foundations in literature for the development of the hardware: (1) aesthetics and pleasure to invite usage of a product and (2) modularity to ensure a product can be used in a vast range of applications.

Aesthetics and pleasure
Hekkert (2006) provides a set of principles for achieving pleasure in aesthetics, which has proven useful for inviting people to use and appreciate a product. The first principle is about "maximum effect for minimum means", where a comparison is made with a vast number of other fields in which minimal effort resulting in a big effect is appreciated. In terms of design this can be sought in using simple visual patterns to elicit certain interactions. The second principle is "unity through variety": it is considered aesthetically pleasing when a seemingly coherent and simple object has layers of depth in it. Finally, "optimal match" is about multi-modality in products. Hekkert argues that not only form follows function, but "sound/touch/smell/form follows function". Using all modalities to create an interaction is unnecessary for this project, but it can be worthwhile to take into account modalities such as sound and touch to express more complex digital information.
modularity Clark and Baldwin (1998) define three types of modularity in design: modularity-in-production, modularity-in-design and modularity-in-use, of which the latter fits this project. The focus group and contextual inquiry from chapters 3.1 and 3.2 have made it clear there are many differences between projects, making it unpredictable how people will use such a tool. Clark describes this type of modularity as: "A product becomes modular-in-use if consumers can mix and match components to arrive at a functioning whole. A significant degree of control over the design is thereby transferred from the firm to its customers." This is exactly what this project aims to achieve.
4.2 towards a proof of concept System Division The two principles of Hekkert (2006) and Clark and Baldwin (1998) were the starting point for the second iteration prototype created during the first half of the project. This prototype will be referred to as ProtoProbes 2.0 from now on. ProtoProbes 2.0 is the first fully functional version with (1) a data collection unit (see figure 4.2.1), (2) a data receiver unit (see figure 4.2.2) and (3) a visualisation unit. The data collection unit (referred to as ‘probe’) is the most important component for this chapter. The visualisation unit will be discussed in chapter 5.2. The remaining two are supposed to be integrated into the Makerspace on installation or to be used in combination with the computer of the maker. This is in line with the vision of Makerspaces where shared facilities are key (Victor, 2014).
figure 4.2.1 - receives data and sends it to the visualisation unit via a serial connection
User Interactions The probe is based on a cube where each side has a specific functionality. Figure 4.2.3 shows the mapping of the sides and its functionalities: (1) The indicator plane contains a NeoPixel LED to show the status (2) The connector plane is modular and allows different connectors depending on the investigated prototype. (3) The grip plane are two sides with each a Bezier to provide a comfortable grip when moving the probe around. (4) The attachment plane is modular where materials, such as, Velcro and clamps can slide into the slot. (5) The power plane is for charging the 500mAh battery and an on/off switch. Both the connector plane and attachment plane are made modular-in-use (Clark & Baldwin, 1998). They even allow the user to create their own modular components, because of the usage of standardized connectors (e.g. micro-jack plugs).
figure 4.2.2 - reads analog data and sends it to the data receiver unit using Wi-Fi
1. indicator plane
2. connector plane
3. grip plane
4. attachment plane
5. power plane
figure 4.2.3 - division of functionalities for each plane of the shape
4.3 concurrent probing: wearables Description & Method The project was given the opportunity to participate in a workshop at the Department of Industrial Design in Eindhoven. Each participant of the workshop made their own wearable sensors out of foam, conductive thread and Velostat. This created an opportunity for ProtoProbes 2.0 to display how well the sensors were functioning. In the room a small setup was made with a breadboard, computer and one probe where participants could drop by to test their sensors (see figures 4.3.1 and 4.3.2). Participants were asked to connect their sensor and view the plotted voltage readings on the screen. The main goal of this user evaluation was to see how quickly people could hook up ProtoProbes 2.0 to their sensors. The data visualization was not of importance. Demographics A total of 8 people participated. They were all from the Bachelor program of Industrial Design in Eindhoven. The majority had little experience with building and debugging sensors. Results Appendix 4 lists the results of this user evaluation. It is not a transcript, but a summary of the most important observations.
Conclusions The evaluation can be summarised in 4 topics: (1) the physical form of the probe doesn't afford proper use. (2) The probe should be able to power a sensor, so it can be used stand-alone. (3) The problems with a faulty sensor couldn't be determined quickly, although the ability to connect to different parts of the sensor proved to be useful. (4) The overall impression of ProtoProbes 2.0 was positive. The lecturers of the workshop had the opportunity to show in-depth information about the sensors and their invisible differences. Discussion The main problem lies in the affordances of the hardware (Gibson, 1978); it is not obvious how to connect the two wires. Still, different solutions would also require people to have at least some understanding of electronics; building something completely intuitive is impossible. Currently, the connectors for both the cathode and anode are in the same location. They don't communicate that a flow of current is created when connecting them.
figure 4.3.1 - the sensors were made with conductive thread and pressure foam
figure 4.3.2 - they were made in different shapes (e.g. rectangular or circular), influencing the readings
4.4 concurrent probing: sensor basics Description & Method A second user validation is done to see how the affordances work in another context and to find new useful features. During conversations with the MediaLab in Amsterdam an opportunity was presented to test ProtoProbes 2.0 at one of their courses. A total of 300 students follow the course ‘Ubiquitous Computing’. They are in teams of 3 to 4 people. The design case is to develop a concept and prototype for the city of Amsterdam to make people more active in recreational parks. During the course students develop a prototype using a variety of sensors. This is done on a breadboard. They can drop by the researcher to integrate ProtoProbes 2.0 into their prototype. ProtoProbes is briefly explained as: “This is a tool that helps you to understand your sensor data. You can hook it up to your circuit and instantly see the data. You can alter the data you receive with some simple mathematics to see whether you need to use something else rather than the raw data.” After the explanation they are asked to connect the probe to their setup. The researcher will take notes in between and is allowed to ask questions.
Demographics This class consists of 30 to 40 second-year Multimedia and Communication HBO college students. They have little experience with electronics and no experience with sensors. The male to female ratio is about 1:1. Results To start off: the user test didn't go as planned. The students were expected to be at the phase of building their prototypes. However, many were still deciding on the concept or gathering electronic components. There was a very interesting dynamic between the lecturers and students when discussing their projects. For this reason the setup of the user test was altered to a contextual inquiry with some concurrent probing. See figures 4.4.1 to 4.4.4 for an impression. The observations are listed in appendix 5.
figure 4.4.1 - students in the phase of getting to know the different components available
figure 4.4.2 - students figuring out how the sensors are positioned in the prototype
Conclusions First of all, two of the conclusions from the previous user evaluation (see chapter 4.3) were in line with the findings of this evaluation: (1) the probe doesn't afford proper usage and (2) sensors need to be powered directly from the probe. Furthermore, this evaluation gave some new insights: (3) ProtoProbes 2.0 appeared to be useful not only for the choice of sensor but also for how the sensor should be integrated into the prototype. (4) During the explorative/definition phase ProtoProbes should facilitate calibration possibilities by integrating pull-up/pull-down resistors. This is a tedious process students either don't know about or don't take the time for. (5) Many of these projects needed to recognize a threshold in the sensor value. Such a threshold could be identified in either the hardware or software of the probe. Finally, (6) digital sensors were quite popular, because they easily give accurate readings; ProtoProbes 2.0 couldn't do anything with these sensors.
Discussion Unlike the user evaluation from chapter 4.3, this test gave insight into very different projects. This diversity helped to see that some projects need different tools than others. For example, some need to test sensors in context, others need to calibrate them. Furthermore, it was reassuring to see the affordances need to improve for this user group as well.
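The threshold insight can be made concrete with a small sketch (hypothetical, not part of the probe's firmware): using two thresholds with hysteresis turns a noisy analog stream into a stable digital signal, avoiding the rapid toggling a single threshold would cause.

```python
# Illustrative sketch: convert a noisy analog stream into a digital signal.
# The threshold values are invented; a real sensor would need calibration.
def to_digital(samples, high=600, low=400):
    state = False
    out = []
    for s in samples:
        if not state and s > high:
            state = True   # rising crossing: signal switches on
        elif state and s < low:
            state = False  # falling crossing: signal switches off
        out.append(state)
    return out

readings = [300, 650, 590, 450, 380, 390]
print(to_digital(readings))  # one clean on/off despite the noisy values
```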
figure 4.4.3 - a piezo on a flexible surface changes the readings in such a way that they are suitable for the envisioned interaction
figure 4.4.4 - chaotic working tables are a signature of explorative and early prototyping
4.5 affordances and interactions The user evaluations from chapters 4.3 and 4.4 are a good starting point in defining a new shape. The Frogger Framework (Wensveen et al., 2004) contains a useful set of properties to assess the interaction of ProtoProbes 2.0. In particular the Functional Feedforward (Wensveen et al., 2004) is done poorly: the connectors don't reveal enough information about their functions. Also, the probe lacks feedback when a sensor is connected. The notion of Augmented Feedback is useful in conceptualizing a solution for this problem. Djajadiningrat et al. (2004) argue that "in addition to a data-centred view, it is also possible to take a perceptual-motor-centred view on tangible interaction". In the context of this project this means there are possibilities to use the physical state of the artefact to communicate what is possible. For example, changing the physical composition alters whether the software is measuring or not. Shape exploration With the Frogger Framework (Wensveen et al., 2004) in mind, a shape exploration was done. Appendix 10 shows all the shapes and some interactions with them. It became clear the shape should express a way of connecting an external circuit to the prototype. Figure 4.5.1 shows a selection of shapes living up to this.
figure 4.5.1 - overview of shapes and selection of the most interesting ones when it comes to affordances of connecting
4.6 embodying data ProtoProbes 2.0 was shown at the studio of the Interactive Institute in Umeå. Jeroen Peeters, PhD candidate on the subject of aesthetics in interaction, was under the impression this prototype is a weak embodiment of its functionalities. The hardware itself doesn't express any data flow, making it unpredictable for the user what is going on. Are the wires connected correctly? Or is ProtoProbes simply not working? Metaphors Answering these questions started by looking at existing tools in a Makerspace environment (see figure 4.6.1). Two tools stood out: pliers and wires. Pliers are something you can easily grab and manipulate to the right setting for the job. Opening and closing the two handles says a lot about the state of the tool.
Wires are very flexible in usage and invite you to use them in your own way: by wrapping them around something for the purpose of attachment, or placing them in such a way that they do not interfere with your other electronics. Figures 4.6.2 and 4.6.3 show an exploration with foam core of how these objects would work when interacting with them.
Conclusion The foam core mock-ups have been informally tested with several studio members. The plier metaphor seemed to work nicely in its interaction, but proved to be useless when one needs to move it around a prototype. The shape is too fixed and has limited configuration possibilities. The wire metaphor, on the other hand, elicited flexible usage: one can switch easily between locations and attach it easily to other objects. Also, the endings of the wires allow changes in configuration by twisting them or attaching something on top of them. These physical configurations can be translated to the digital settings internally. They can be used as dedicated locations for affordances to appear, as described by Wensveen and Djajadiningrat (Wensveen et al., 2004; Djajadiningrat et al., 2004).
figure 4.6.1 - existing tooling has a visual presence that triggers problem solving
figure 4.6.2 - plier metaphor where two segments can position themselves in different configurations
figure 4.6.3 - wire metaphor with two cylinders at the endings to function as handles to move around
4.7 probe design requirements All previous chapters of this part have implications for the design of the third iteration. They are summarized below.
Aesthetics and Pleasure
question: How do the aesthetics contribute to the workflow integration?
description: The design principles by Hekkert (2006) can play a key role in achieving integration in the workflow of a maker.
source: Background literature (chapter 4.1)
Modularity-in-use
question: How can the tool adapt itself through modularity?
description: The method of integration in a prototype is unpredictable for every project. Modularity-in-use can provide the tool with enough possibilities for integration (Clark & Baldwin, 1998).
source: Background literature (chapter 4.1)
Stand-alone Readings
question: How can the tool directly connect to sensors?
description: There are moments in the process where there is no powered prototype (e.g. when testing different sensors).
source: Concurrent Probing: Wearables (chapter 4.3) and Concurrent Probing: Sensor Basics (chapter 4.4)
Wire Metaphor
question: How can the tool express the affordance of hooking up a wire?
description: Shape explorations showed that the metaphor of making a closed loop is effective in communicating the functionality of measuring data within a circuit.
source: Embodying Data (chapter 4.6)
Adjustable Sensor Protocols
question: How can one switch to other sensor or communication types?
description: The tool needs to support both analog and digital sensors.
source: Concurrent Probing: Sensor Basics (chapter 4.4)
Adjustable Analog Calibration
question: How can one calibrate analog sensors?
description: The process of calibrating analog sensors is tedious and often skipped. Tooling can help in discovering the optimal calibration.
source: Concurrent Probing: Wearables (chapter 4.3) and Concurrent Probing: Sensor Basics (chapter 4.4)
5. data part This part goes into depth on how the data should be presented. Insight and understanding appear to be complex to design for. A visualization framework and a focus group helped to get a grasp on these definitions. Facilitating different ways of representing the data appears to be essential in creating insights.
5.1 background literature The notion of understanding data in information visualization systems is hard to get a grasp on. According to Pretorius and van Wijk (2009) “insight” is a better definition when describing understanding of data. North (2006) describes the process of gaining insight as fuzzy and unpredictable, making it hard to specify clear design requirements. Domain-specific insight For this particular project, insight is about understanding the behaviour of the sensors in relation to events in the real world. Most of the time, the behaviour depends on human interactions. This is important to note, as some other domains try to relate to data which are hard to imagine as a human being (e.g. space research, chip factories or chemistry). The data within Makerspaces are not very abstract. Every person has a starting point to relate to with most sensors (i.e. the perceivable world) which will make it easier to create insights (North, 2006). However, there still exists a problem with less intuitive sensors, such as magnetometers or gas sensors. Alternative representations Patterson et al. (2014) have defined an approach to design for the creation of insights through a data visualization framework. Figure 5.1.1 shows an adapted version. Important in this graph is the transformations block and the iterative process of refinement. Patterson has experimented with allowing the user to switch
between representations of the data (i.e. the same data as a sunburst chart and a tree map chart). He found that certain types of visuals were more effective than others in creating insights for specific sets of data. This argues for allowing switching between different representations. However, North's work warns of the risk of distraction when doing this: "a certain visualization can enhance one task, but can make the execution of another worse" (North, 2006). Shneiderman (1996) also argues for different representations, but his work focused on showing data with optionally less or more detail. Overall, designing for alteration and refinement of different views on the same data set seems worthwhile in creating insights.
(diagram elements: data consumed, transformations produce, visual instance presented, user's cognition, refines, defines, designer guides)
figure 5.1.1 - visual framework adapted from Patterson et al. (2014)
5.2 towards a proof of concept Several existing visualisation applications were explored before creating a prototype: HighCharts (HighCharts, 2016), Sketchify (Obrenovic & Martens, 2011), C3JS (Tanaka, 2016), Plot.ly (Plotly, 2016), Rickshaw (Shutterstock, 2016) and Tableau (Tableau, 2016). It appears they all lack animated transition behaviours, are not optimized for real-time data or rely heavily on additional software.
For these reasons a bare-bones distribution of D3JS (D3JS, 2016) within a Chrome App (Google, 2016) was chosen. D3JS is a highly optimized visualisation framework allowing animated transitions to different data representations (as required by chapter 5.1) and custom optimization. The choice for a Chrome App was made to be platform independent, as required by chapter 1.6.
Figure 5.2.1 shows the result of a first version with a set of standard mathematical manipulations to filter the data stream instantly. Alternative views were also added to let people experience something other than a 2D graph. Performance tests indicate a steady maximum CPU usage of 7% for a single real-time graph, whereas other frameworks would easily use 65%.
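One common technique behind such real-time efficiency, sketched here in Python purely for illustration (the actual app uses D3JS), is downsampling the stream to roughly the pixel width of the chart; keeping each bucket's minimum and maximum preserves spikes that a plain average would hide.

```python
# Illustrative sketch: reduce a dense stream to (min, max) pairs per bucket,
# so a chart only draws about as many points as it has pixels of width.
def downsample(samples, buckets):
    size = max(1, len(samples) // buckets)
    out = []
    for i in range(0, len(samples), size):
        chunk = samples[i:i + size]
        out.append((min(chunk), max(chunk)))  # spikes inside a bucket survive
    return out

# 1000 raw points reduced to 10 (min, max) pairs for drawing
data = list(range(1000))
print(len(downsample(data, 10)))  # 10
```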
figure 5.2.1 - 2D graph of raw data with the possibility of manipulation with several toggles
5.3 focus group: data visualization Description & Method The Interaction Lab of the Institute of Design in Umeå was visited during the internship at the Interactive Institute in Sweden. They were very enthusiastic about the concept and a user test was arranged. It consisted of a focus group on the topic of data embodiment for different sensor types. A set of common sensors was assembled together with Rickard Åström (responsible for the Interaction Lab). ProtoProbes 2.0 was used to show the participants what the sensors are seeing. The sensors were a starting point for discussion. The group was asked to talk about their projects, what kind of problems they encountered and how they would envision the data to be represented. Pretorius and van Wijk (2009) showed that directly asking what the data should look like is impossible, because insight and understanding are created from something yet unknown. Therefore more indirect questions were put forward to understand their way of working and uncover the process of gaining insight from the data on the screen. Demographics The group of participants consisted of 5 second-year master students of Interaction Design, Institute of Design in Umeå, experienced in building digital-physical prototypes. The consent forms can be found in appendix 7.
Results The focus group session of two hours has been fully recorded. Appendix 6 shows a transcription of the most important quotes. Conclusions The results can be summarized into 7 major conclusions: 1) Allow setting a threshold to translate data to a digital signal. A lot of the testing of interactions is about certain threshold values or recognizing crossing the threshold multiple times. 2) Make a difference between simple and complex sensors. Simple sensors, such as a pressure sensor, potentiometer or LDR, are easy to understand through a 2D graph. More complex sensors, such as an accelerometer, compass or magnetometer, require visualizations that match the mental model of the Maker. 3) Show an embodiment of data in the hardware itself. There is currently a big displacement of data: from the probe to the screen. Keeping attention on the screen during an interaction with the sensor becomes confusing. 4) Integrate output functionality after applying the filters. There is an opportunity for ProtoProbes to function as a 'filter box' where the manipulated graph is outputted back into the prototyping circuit.
5) Add functionality to change sensor sensitivity or calibration. Calibrating the sensor is now a very ad hoc and uncontrolled process making it hard to get to the right configurations for the required behaviour. ProtoProbes could support this process by letting the user quickly play with the sensitivity by integrating a variable pull-up or pull-down resistor.
6) Integrate pattern recognition in data analysis. Many Makers are not able to recognize more complex interactions in their software. ProtoProbes could facilitate this process by generating pieces of code recognizing certain interactions.
7) See the relation between manipulated data and actual readings. During the programming workflow the raw (i.e. not manipulated) readings are used. For this reason the tool should show the relation between the manipulated and raw data stream.
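The calibration conclusion rests on voltage-divider behaviour: the chosen pull-up or pull-down value determines how much of the ADC range a sensor actually uses. The sketch below (with illustrative resistor values, not measurements from the session) makes this concrete:

```python
# Illustrative sketch: why the fixed resistor in a divider matters for
# calibration. Values are hypothetical, not taken from the report's sensors.
def divider_reading(r_sensor, r_fixed, vcc=3.3, adc_max=1023):
    """ADC count for a sensor on the high side of a voltage divider."""
    v_out = vcc * r_fixed / (r_sensor + r_fixed)
    return round(v_out / vcc * adc_max)

# An LDR swinging between 1k (bright) and 100k (dark):
for r_fixed in (1_000, 10_000, 100_000):
    span = divider_reading(1_000, r_fixed) - divider_reading(100_000, r_fixed)
    print(r_fixed, span)  # the 10k resistor uses most of the ADC range
```

A software-controlled variable resistor, as proposed in conclusion 5, would let the user sweep `r_fixed` until the reading span is maximized.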
figure 5.3.1 - warm-up session with sketching out data visualizations
Discussion The overall impression of the data visualizations of ProtoProbes 2.0 was positive. The basic features are of great value for Makers and distinguish the tool from existing solutions. However, depth is missing at the moment. For this reason the focus group thought of several new features to raise the complexity to match the level of master graduate students in interaction design. However, some of these features are not in the scope of the project: they are part of the process after the creation of insight and understanding. For example, pattern recognition (conclusion #6) and data outputting (conclusion #4) are needed when the product behaviour needs to be implemented. Matching complex data to the mental model of the Maker is something also found in literature: Pretorius and van Wijk (2009) demonstrate a better understanding of the data when it is in line with the concept people have in their minds.
figure 5.3.2 - the Interaction Lab at the Institute of Design in Umeå
5.4 data in spaces In Seeing Spaces, Victor (2014) shows that surrounding users with data in a room is common in environments where data of complex systems play an important role, for example at CERN, NASA, weather stations and nuclear facilities. This is mostly done on screens, but what happens when the data integrate with the environment?
Relative Data Also, the type of data for this project mainly concerns (relatively) slow processes happening in the real world. What happens if the visuals focus on change and behaviour of the data rather than absolute values? These questions were asked before building a simple visualization installation at the Interactive Institute in Umeå. This was done in combination with an EU project about data embodiment in novel 3D printed housing facilities (the +Project) and together with José Carlos Sánchez, one of the project's architects.
Result & Conclusions Figures 5.4.1 to 5.4.3 show an impression of the installation. A projector is used to beam visuals on more intimate materials, such as fabrics. It was nice to see these materials allow for more complex interactions, such as moving the material, hiding or showing certain parts of the visual or bending it in 3D space.
Occlusion of the projection was still a problem when interacting with the different materials. Furthermore, subtle details were easier to see (in for example gas sensor readings) because of the size. Increasing the size enhanced the immersion into the data; one can fully focus on the visuals and not get distracted by other peripheral information.
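The relative-data idea amounts to visualizing differences between successive readings instead of absolute values; a minimal sketch:

```python
# Minimal sketch of "relative data": show change between readings rather
# than absolute values, so the visual emphasizes behaviour over level.
def deltas(samples):
    return [b - a for a, b in zip(samples, samples[1:])]

temperature = [21, 21, 21, 24, 23]
print(deltas(temperature))  # the sudden rise stands out, the steady level doesn't
```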
figure 5.4.1 - early exploration on a flat surface with the parameters of temperature and humidity in the same visual
figure 5.4.2 - relative data on soft materials creating more immersive interactions
figure 5.4.3 - more abstract representations, such as a particle system on soft materials
5.5 data design requirements All the findings of this part are summarized into the design requirements listed on these pages.
Views-oriented Representations
question: How can one switch between different representations of the data?
description: Playing with different representations is crucial in creating insights in data. Relating these to the mental models people have in their minds is key in doing so.
source: Background literature (chapter 5.1) and Focus Group: Data Visualization (chapter 5.3)
Data Refinement
question: How can the data be manipulated for refinement while seeing the relationship with the original?
description: Certain mathematical manipulations help gaining insight on how to implement the sensor in the programming of the prototype. Showing the refinement in relation to the original data is essential.
source: Background literature (chapter 5.1) and Focus Group: Data Visualization (chapter 5.3)
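A moving average is a typical example of such a manipulation; the sketch below (illustrative, not the app's implementation) computes a smoothed stream that can be overlaid on the raw one to show the relation between the two.

```python
# Illustrative sketch: a trailing moving average that can be plotted
# alongside the raw stream, so the refinement stays relatable to the original.
def moving_average(samples, window=3):
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1):i + 1]  # trailing window
        out.append(sum(chunk) / len(chunk))
    return out

raw = [0, 10, 0, 10, 0, 10]
smooth = moving_average(raw)
# plotting raw and smooth together reveals the jitter the filter removes
```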
Immersive Data
question: How to design an immersive experience when viewing and interacting with the data?
description: Interaction with and immersion in the data appear to be important in various existing contexts, such as CERN and NASA.
source: Data in Spaces (chapter 5.4) and Seeing Spaces (Victor, 2014)
Data Embodiment
question: How is the data embodied in the probe?
description: The displacement of data from the object to the screen is sometimes confusing when one needs to pay attention to the interaction and data at the same time. The hardware itself can embody some of the data for initial insights.
source: Focus Group: Data Visualization (chapter 5.3)
Web Technologies
question: How to fully build a software platform using web technologies?
description: The usage of web technologies showed its potential in ProtoProbes 2.0. They are efficient, platform independent and don't rely on additional software.
source: Benchmark (chapter 1.6) and Towards a Proof-of-concept (chapter 5.2)
6. synthesis This part covers the synthesis: the requirements scattered across the previous parts are merged into one coherent story and translated into a set of features. A final prototype is built based on these features. Finally, several preliminary findings are discussed.
6.1 FEATURE overview The requirements from the previous chapters are translated into an extensive set of features. Not all of them are implemented in the final synthesis, because of lack of time and to prevent feature creep. Appendix 11 shows the full list of features. They were prioritized by looking at which would distinguish this project the most from existing solutions. On these pages one can see the relation between the design requirements and features. Note that some of the decision making has been left out to prevent repeating information from previous chapters.
#1: Platform-independent hardware solution
The tool is an external hardware device integrating into existing prototypes (features #1, #2 and #3). Platform independence is achieved by using a web-based solution (feature #14).
#2: Workflow integration
The tool integrates into different types of projects (features #1, #2, #3). The shape is versatile enough to do so (feature #7). Practical features, such as recharging (feature #18), are also included.
#3: Visual Presence
A docking station designed to hold multiple probes is placed in a Makerspace (feature #23).
#4: Integration in Wearables
The probe is flexible enough to adapt to the unpredictability of the human body by having the shape of a flexible wire (feature #7).
#5: Mobility
The tool can be used anywhere by using Wi-Fi connectivity (feature #11) and web-based services (feature #14).
#6: Wireless Connectivity
The Wi-Fi microcontroller ESP8266 (Espressif, 2016) allows a wireless connection (features #11 and #15).
#7: Data Collection
The data are shown across time (feature #12). Pausing the stream (feature #21), scrolling back in time (feature #29) and adjusting axes (feature #17) are included as basic tools to navigate across time.
#8: Aesthetics and Pleasure
This is not an explicit feature, as it depends on coherence, material usage and material finish of the probes. This will not be discussed in this report.
#9: Modularity-in-use
Different parts are attached to determine how the probe is connected to a prototype (feature #16).
#10: Stand-alone Readings
The probe can output power to a sensor for early experimentation sessions as identified in the experience flow (feature #22).
#11: Wire Metaphor
A wire-like shape affords connecting to the prototype at two locations (features #4 and #7). Also, the endings of the wire can be used to hide and show connectors (features #8 and #9).
#12: Adjustable Sensor Protocols
The probe can switch between analog and digital sensor types (features #5 and #6) through hiding and showing the connectors (feature #8).
#13: Adjustable Analog Calibration
Analog sensors are calibrated with a variable pull-up or pull-down resistor controlled in the software (features #19, #20 and #31).
#14: Views-oriented Representations
Alternative representations of the data are selected to match the mental model of the user (features #26 and #27).
#15: Data Refinement
The data are refined through selecting (multiple) mathematical manipulations (features #10 and #25).
#16: Immersive Data
The data are presented on a big screen where multiple data streams come together in an immersive experience (features #12, #13 and #24). Here one can navigate through the data naturally (features #28, #29 and #30).
#17: Data Embodiment
Data are shown on the hardware itself to give the user a first indication of what is going on (feature #7).
#18: Web Technologies
The data are collected in a web interface on a central location (feature #14). Also, the device is Wi-Fi enabled to possibly connect to existing cloud solutions (feature #11).
6.2 probe prototype Two modules The wire shape exploration from chapter 4.6 is translated into a 3D model (see appendix 13) and milled into two aluminium modules. At the end of each module one can turn a rotary part to hide and show certain connectors (see figure 6.2.1). Below the rotary part are a button and an encoded engraving that identify in the software how the user has positioned the module. A cylinder shape is used with two flat surfaces communicating the two possible states. For the first module this indicates whether there is an analog or digital sensor; for the second module, whether it is turned on or off.
figure 6.2.1 - the endings of the wire shape each have their own functionalities by showing or hiding the ports (labelled: digital sensor, analog sensor, power output, micro USB charger)
figure 6.2.2 - embodiment of the data flow on the hardware: the RGB NeoPixel LEDs show how 'intense' the data are; a higher data value (i.e. voltage) results in a brighter light
data embodiment The cable connecting the two modules is altered to fit two RGB NeoPixel LEDs (see figure 6.2.2). These can show the directionality and intensity of the data, providing a first step in seeing what you are measuring.
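A plausible mapping for such an LED (assumed for illustration, not taken from the actual firmware) scales the ADC reading to the NeoPixel's 0-255 brightness range:

```python
# Illustrative mapping (an assumption, not the ProtoProbes firmware):
# scale an ADC reading linearly onto the LED's 0-255 brightness range.
def brightness(reading, adc_max=1023, led_max=255):
    reading = max(0, min(reading, adc_max))  # clamp out-of-range values
    return reading * led_max // adc_max

print(brightness(0), brightness(1023))  # 0 255
```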
modularity The connector holes allow insertion of various types of wire that will be held in place automatically (see figure 6.2.3). The square hole in the rotary part is meant to hold additional top pieces with specific attachment tooling. For example, they can have hook clips, alligator clips, banana plugs, etc. These parts are not developed yet.
figure 6.2.3 - insertions are firmly held in place by a connector specialized for this; a press on the release button on top is required to free the wire
custom parts Custom manufacturing of all the parts made the probe reasonably compact (see figure 6.2.4). The part that holds all the components in place is 3D printed; the rest was made on a lathe and milling machine out of aluminium or plastic for a clean finish. The next pages briefly show an impression of the manufacturing process.
figure 6.2.4 - the assembly consists of at least 20 different parts (excluding electronics), all held in place by a 3D printed suspension; the aluminium parts are all anodized for insulation purposes
figure 6.2.5 - conventional milling of the aluminium module casings
figure 6.2.6 - smaller aluminium parts and the setup on the lathe
figure 6.2.7 - fitting of smaller aluminium parts onto the cable
figure 6.2.8 - process of assembly onto the 3D printed suspension
custom PCB A custom PCB was developed to fit all the electronics in the modules (see figure 6.2.9). One module holds the PCB for power management, including the external micro USB mounting PCB; the other contains the PCB with all the remaining intelligence (e.g. sensor calibration). The schematic and manufacturing layout can be found in appendix 14.
figure 6.2.9 - two-layer PCBs to fit all the electronics
6.3 data prototype The prototype for the data visualizations is based on the Chrome App structure (Google, 2016). The design, seen in figure 6.3.1, fully revolves around Material Design by Google (2016b), which gives clear rules about colours, buttons, lists, interactions, etc. A fully functional application was built with most of the features from chapter 6.1. Figure 6.3.2 shows a flowchart describing how the app works. One can also refer to appendix 18 for the code (organized per object class and commented throughout). Figures 6.3.3 to 6.3.5 on the next pages summarize the most important interactions.
figure 6.3.1 - graph overview (4 signals) styled with Material Design (Google, 2016b); a graph zooms in when clicked
[figure 6.3.2, flowchart: external input comes from the User (click or key events, handled by the InteractionManager on the interaction thread) and from the ESP8266 (raw serial data at 50 Hz, parsed into packets by the SerialManager on the data thread and collected by the DataManager into DataChannel data point arrays). On the graphic thread, the GraphicManager sends redraw requests to each DataGraphic, which passes the data point array of the visible domain through its DataManipulations and renders the manipulated result via D3JS & ThreeJS, while AngularJS updates the UI as the visual external output; execution frequencies range from 1 Hz to 50 Hz.]
figure 6.3.2 - flow of the sensory data through the Chrome App; execution frequencies are noted when applicable
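As a hedged sketch of the data thread in figure 6.3.2, the parsing step might look as follows. The packet format ("channelId:value" per line) and the class shapes are assumptions for illustration; the actual implementation is in appendix 18.

```javascript
// Hedged sketch of the data thread: raw serial text is parsed into packets
// and appended to per-channel data point arrays. Packet format and class
// shapes are assumptions; the real code is in appendix 18.
class DataChannel {
  constructor(id) {
    this.id = id;
    this.points = []; // [{t, value}, ...]
  }
  add(t, value) {
    this.points.push({ t, value });
  }
}

class DataManager {
  constructor() {
    this.channels = new Map();
  }
  // e.g. "0:512\n1:3.14\n" -> one point on channel 0 and one on channel 1
  ingest(rawSerial, t = Date.now()) {
    for (const line of rawSerial.split("\n")) {
      const [id, value] = line.split(":");
      if (id === undefined || value === undefined || value === "") continue;
      if (!this.channels.has(id)) this.channels.set(id, new DataChannel(id));
      this.channels.get(id).add(t, parseFloat(value));
    }
  }
}
```

Keeping parsing (SerialManager/DataManager) separate from drawing (GraphicManager) is what allows the serial input to run at 50 Hz while redraws happen at their own pace.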
figure 6.3.3 - graph detail view with the main interactions explained: the interface adjusts to the currently selected sensor type; manipulations can be added, edited, deleted and reordered; alternative views on the data stream can be selected; and scaling, scrolling back in time and axis changes can be adjusted
figure 6.3.4 - impression of interaction to reorder the data manipulations
figure 6.3.5 - impression of interaction to select an alternative view
6.4 discussion Four layers of insight The answer to the main question of how to create understanding of sensory data is still scattered across the different chapters. Four layers can be identified when looking at the system as a whole. Each helps in creating understanding and insight in its own way (see figure 6.4.1):
Layer #1. First of all, the data flow in the cable of the probes (see chapter 6.2) shows basic information about whether there is something to measure and how intense it is at that moment.
Layer #2. One can see how the data are changing across time in the real-time graph. Pausing and going back in time to see what happened are vital in making this a richer experience.
Layer #3. An alternative view can be selected. This mainly helps in understanding more complex data sets, such as those from accelerometers or magnetometers.
Layer #4. Finally, one can add basic filters to the data stream to explore ‘what your prototype could be seeing’ instead of only ‘what your prototype is seeing’. The chaining and reordering of manipulations are what give depth to this layer, handing the user control over concepts previously unthinkable (Victor, 2013).
figure 6.4.1 - four layers of insight (#1-#4, ordered by increasing complexity) and their location in the system
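The chaining and reordering of manipulations can be sketched as below. This is a minimal illustration, not the actual app code; the manipulation names (scale, clamp, movingAverage) are assumptions, the real set lives in appendix 18.

```javascript
// Hedged sketch of layer #4: a reorderable chain of manipulations applied
// to a data point array. Manipulation names are illustrative assumptions.
const scale = (factor) => (points) => points.map((p) => p * factor);

const clamp = (min, max) => (points) =>
  points.map((p) => Math.min(Math.max(p, min), max));

const movingAverage = (windowSize) => (points) =>
  points.map((_, i) => {
    // Average over a trailing window ending at index i.
    const slice = points.slice(Math.max(0, i - windowSize + 1), i + 1);
    return slice.reduce((sum, p) => sum + p, 0) / slice.length;
  });

// Manipulations run left to right; reordering the array changes the result.
function applyChain(points, manipulations) {
  return manipulations.reduce((acc, m) => m(acc), points);
}
```

For example, `applyChain([0, 10, 20], [scale(2), clamp(0, 25)])` yields `[0, 20, 25]`, while swapping the two manipulations yields `[0, 20, 40]`; this order dependence is precisely what gives the reordering interaction its depth.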
Prototype Integration There are two major unfinished aspects of the probes. Currently, the insertion holes for wires are not labelled; there is no explanation whatsoever of how to connect them. The idea was to integrate this into the 3D printed suspension part, but this proved impossible during manufacturing. Another way of making this clearer is through the modular connector parts from chapter 6.2. These additional parts are not yet designed and require further research into how people connect the probes to their prototypes. The contextual inquiry (chapter 3.2) and the user evaluation with wearable sensors (chapter 4.3) produced an initial list of possibilities (e.g. hook clips, alligator clips, jumper wires or specific sockets). The current prototype offers too much freedom in its attachment methods, making it confusing for people who hook things up to their prototype in one specific way. Merging Probe & Data One of the most important moments was the realization, at the Interactive Institute in Sweden, that the data should be embodied in the hardware. The first iterations also worked with embodiment of data, but implemented it in the wrong way: it was not part of the intimacy one has with the artefact and its relation to human interaction.
The goal was to create an even more intimate and immersive experience (see chapter 5.4), but this caused the data to become even more abstract and harder to interpret. The problem probably lies in the fact that this tool needs to help with building a prototype, which implies the need to see absolute values in order to translate this understanding into an implementation (i.e. programming code). Future Judging by the first responses, this project has a lot of potential. The current execution of the hardware might be a bit ambitious to realize on a larger scale, but it questions how hardware is designed in relation to the functionalities it has. A first important step is to make the software open-source: it is one of the first web-based real-time data collection applications, and it is written in such a way that the mathematical manipulations and alternative views are easy to expand. The second step is to integrate the system into a Makerspace for a longer period. Two complete sets of parts were made, designed so they can be used for a long period (e.g. charger circuit, strain relief on the cable). Conversations with existing Makerspaces and companies are currently in progress.
7. Bibliography
7.1 References
Annable, B., Budden, D. & Mendes, A. (2015). NUbugger: A Visual Real-Time Robot Debugging System.
Arduino (2016). Retrieved from: https://www.arduino.cc/en/Main/Software. Visited on: 2nd of June 2016.
Djajadiningrat, T., Wensveen, S., Frens, J. & Overbeeke, K. (2004). Tangible products: redressing the balance between appearance and action.
Cavalcanti, G. (2013). Is it a Hackerspace, Makerspace, TechShop, or FabLab? Retrieved from: http://makezine.com/2013/05/22/the-difference-between-hackerspaces-makerspaces-techshops-and-fablabs/. Visited on 13th of September 2015.
Chen, M. & Jäenicke, H. (2010). An Information-Theoretic Framework For Visualization. IEEE Trans. Visual. Comput. Graphics, 16(6), 1206-1215.
Clark, K. & Baldwin, C. Y. (1998). Modularity-In-Design: An Analysis Based On the Theory Of Real Options.
Dougherty, D. (2012). The Maker Movement. Innovations: Technology, Governance, Globalization, 7(3), pp.11-14.
Educause (2013). 7 Things You Should Know About Makerspaces. Retrieved from: https://net.educause.edu/ir/library/pdf/eli7095.pdf. Visited on 13th of September 2015.
Espressif Systems (2016). ESP8266. Retrieved from: http://espressif.com/en/products/esp8266/. Visited on 4th of January 2016.
Espressif Systems (2013). Espressif Smart Connectivity Platform: ESP8266.
D3JS (2016). D3JS - Data Driven Documents. Retrieved from: http://d3js.org/. Visited on 4th of January 2016.
Fens, P. (2012). Arduino Monitor. Retrieved from: http://blog.pepf.nl/2012/08/arduino-monitor-an-easy-visualisation-tool-for-sensor-data/. Visited on 4th of January 2016.
Fischer, G. & Sugimoto, M. (2006). Supporting Self-Directed Learners and Learning Communities with Sociotechnical Environments. Research and Practice in Technology Enhanced Learning, 01(01), pp.31-64.
Gibson, J. J. (1978). The Ecological Approach to the Visual Perception of Pictures. Leonardo, 11(3), 227.
Google (2016). What Are Chrome Apps? Retrieved from: https://developer.chrome.com/apps/about_apps. Visited on 4th of January 2016.
Google (2016b). Material Design. Retrieved from: https://material.google.com/. Visited on 4th of June 2016.
Gumbley, L. & MacDonald, B. A. (2012). Realtime Debugging for Robotics Software.
Hatch, M. (2013). The Maker Movement Manifesto.
Hekkert, P. (2006). Design aesthetics: principles of pleasure in design.
Hobye, M. (2012). GUINO Arduino GUI visualizer/debugger. Retrieved from: http://dangerousprototypes.com/2012/10/17/guino-arduino-gui-visualizerdebugger/. Visited on 13th of September 2015.
Involt (2016). HTML to Arduino prototyping framework for designers. Retrieved from: http://involt.github.io/. Visited on 4th of January 2016.
Kroski, E. (2013). A Librarian's Guide to Makerspaces: 16 Resources. Retrieved from: http://oedb.org/ilibrarian/a-librarians-guide-to-makerspaces/. Visited on 13th of September 2015.
Loertscher, D. V. (2012). Makerspaces and the Learning Commons. Teacher Librarian, 40(1), pp.45-46.
Morgan, D.L. (2002). Focus group interviewing. In J.F. Gubrium & J.A. Holstein (eds.), Handbook of interviewing research: Context & method (pp. 141-159).
North, C. (2006). Toward Measuring Visualization Insight. IEEE Computer Graphics and Applications, 26(3), 6-9.
Obrenovic, Z. & Martens, J.B. (2011). Sketching Interactive Systems with Sketchify. ACM Transactions on Computer-Human Interaction (ToCHI), Vol. 18, No. 1.
Patterson, R. E. et al. (2014). A Human Cognition Framework For Information Visualization. Computers & Graphics, 42, 42-58.
Plotly (2015). Arduino library for real-time logging and streaming data to online plotly graphs. Retrieved from: https://github.com/plotly/arduino-api. Visited on 4th of January 2016.
Plotly (2016). The open source JavaScript graphing library that powers plotly. Retrieved from: https://plot.ly/javascript/. Visited on 4th of January 2016.
Pretorius, A. J. & van Wijk, J. (2009). What Does The User Want To See? What Do The Data Want To Be? Information Visualization, 8(3), 153-166.
Schuler, D. & Namioka, A. (1993). Participatory Design. Hillsdale, N.J.: L. Erlbaum Associates.
Shutterstock (2016). JavaScript toolkit for creating interactive real-time graphs. Retrieved from: http://code.shutterstock.com/rickshaw/. Visited on 4th of January 2016.
Smit, D., Oogjes, D., Goveia de Rocha, B., Trotto, A., Hur, Y. & Hummels, C. (2016). Ideating in Skills: Developing Tools for Embodied CoDesign. Proceedings of TEI '16, Eindhoven, pp.78-85.
Shneiderman, B. (1996). The eyes have it: a task by data type taxonomy for information visualizations.
Tableau (2016). Analytics anyone can use. Retrieved from: http://www.tableau.com/. Visited on the 7th of June 2016.
Tanaka, M. (2016). A D3-based reusable chart library. Retrieved from: https://github.com/masayuki0812/c3. Visited on 4th of January 2016.
Tufte, E. (2001). The Visual Display of Quantitative Information.
Victor, B. (2014). Seeing Spaces. Retrieved from: http://www.worrydream.com/SeeingSpaces/. Visited on 13th of September 2015.
Victor, B. (2013). Media For Thinking The Unthinkable. Retrieved from: http://worrydream.com/MediaForThinkingTheUnthinkable/. Visited on 13th of September 2015.
Visual Micro (2012). Debug Arduino - Overview. Retrieved from: http://www.visualmicro.com/post/2012/05/05/Debug-Arduino-Overview.aspx. Visited on 13th of September 2015.
Wensveen, S. A. G., Djajadiningrat, J.P. & Overbeeke, C.J. (2004). Interaction frogger: a design framework to couple action and function. Proceedings of DIS '04, Cambridge, MA, USA, 1-4 August 2004.
7.2 Table of Figures
Page 4, 5, 6, 7, 9, 13: Makerspace https://www.flickr.com/photos/erasmushogeschool/albums/72157607369750642/
Page 10, 11: The Maker Movement http://www.psfk.com/2014/08/maker-movement-trends.html
Page 12: Hackerspace https://www.flickr.com/photos/animakitty/9588952266
Page 13: Maker Event for Youngsters http://qtechknow.ghost.io/superheroes-of-the-maker-movement-at-the-willows-community-school/
Page 13: FabLab https://www.flickr.com/photos/devonlibraries/15381278900/sizes/k/
Page 19: Seeing Spaces http://www.worrydream.com/SeeingSpaces/
Page 23: GUINO debugger http://dangerousprototypes.com/2012/10/17/guino-arduino-gui-visualizerdebugger/
Page 23: Visual Micro debugger http://playground.arduino.cc/Code/VisualMicro
Page 23: Arduino Monitor http://blog.pepf.nl/2012/08/arduino-monitor-an-easy-visualisation-tool-for-sensor-data/
Page 23: Plotly https://plot.ly/javascript/
Thank you for reading.