Interaction Magazine December


Interaction Magazine Interaction Design - Autumn 2011 - December 7th

Introducing GoodFood: The final concept


This issue is edited and published by:
Meiken Hansen (s050031)
Martin Løkkegaard (s072049)
Maja Hyveled Jakobsen (s071768)
Michael Mansdal Larsen (s082510)
Kamilla Grove Sørensen (s072013)
Christopher Holm-Hansen (s072023)


Contents

Introduction
Understanding the target users
  Getting older
  Elderly and touchscreen technology
The process
Market research
Design specification
  Problem statement
Scope of the concept
  People
  Activity
  Context
  Technology
  Ergonomic/usage considerations
Development of the final concept
  First iteration
  Second iteration
    Scenario 1
    Scenario 2
  Third iteration
  User test
  Outcome
GoodFood
  First time use
  Profile
  Home sweet home
  Planning ahead
  Shopping list
  Cooking with GoodFood
  The remote
Prototype
  Data flow
  Considerations
  Purchasing GoodFood
  Features
Future work
Conclusion & References
Appendix




December Issue

Introduction

This magazine is part of the final delivery in the course 42072 Design for Interaction, DTU Management, autumn semester 2011. It documents the project work of developing an interactive product for the aging society. The project includes conceptualization, cognitive and physical considerations, mock-ups/prototypes of the final product and testing of these.

Products specially designed for the elderly are a growing market and an area in which countless challenges and possibilities exist for creating good interaction design [Lecture week 6: Health Engineering]. The design task that has set the basis for the project is defined as: “To design a product that allows for interaction and has the purpose of supporting elderly people in maintaining a healthy and varied diet”.

This December Issue elaborates and finalizes the development, description and testing of the chosen concept: GoodFood, consisting of an application for tablets and a washable remote control. The GoodFood concept’s primary focus is intuitive interaction for the user, and less on saving the world :-)




Understanding the target users



Understanding the target users

The intention of this section is to give an understanding of the target group: which considerations must be taken into account and which challenges must be met when designing the final concept.

Getting older

It is a well-known and documented fact that the human body is subject to physical deterioration as we get older. The pace at which this happens differs between individuals and is often affected by lifestyle as well as disease. Muscle power and skeletal mass decrease, and some may suffer from rheumatoid arthritis. Also, the central nervous system slowly begins to degenerate, leading to reduced short-term memory, a diminished ability to perform complex tasks and reduced acuity of the five senses: hearing, touch, sight, taste and smell. [Lecture week 6: Health Engineering]

The ability to perform cognitively demanding tasks declines as people age. “Worked examples” can be described as step-by-step demonstrations of how to solve a problem or a specific task, while “conventional problems” may be described as learning by doing. Studying worked examples has proven to be more efficient for the elderly than solving conventional problems. Any design for the elderly could therefore benefit from a guide/tutorial option, which could serve as a worked example. [Gerven]


Elderly people tend to live by a number of habits established throughout their lives. As a result, many have built up a more or less monotonous diet of well-known dishes that are easy to prepare and that they like. Often little consideration has been given to the nutritional value of the meals, which can lead to malnutrition and, in the worst case, disease [DR]. Furthermore, 55% of elderly people find it difficult to simply distinguish healthy food from unhealthy food [Ældresagen]. Malnutrition can also be a result of reduced digestion or ability to chew, which may limit the number of dishes they know how to prepare. Correct nutrition is also of utmost importance when dealing with a variety of diseases such as high blood pressure, obesity or diabetes.

Another dimension of getting older is that the appetite regulation becomes less sensitive. Food consumption may thereby in some cases not correspond to the body’s energy demand. Eating less than the body needs results in weight loss and the absence of important vitamins and proteins. Other causes of reduced appetite can be depression, social isolation, bad teeth as well as effects on the senses of taste and smell [meraadet].

These developments and dimensions are important to be aware of and keep in mind when designing for the elderly generation, as they have a great influence on the possibilities and limitations of the final concept. By making sure that the elderly maintain a healthy and varied diet



it is possible to increase their quality of life and mitigate the risk of different diseases. With a healthy and varied diet there are far greater chances of avoiding disease and increasing their independence from other people and authorities.

Elderly and touchscreen technology

As this project focuses on elders and the development of a new product that requires their interaction, looking into how they understand interfaces and what kind of feedback they need is a must. Studies show that there is not much difference in performance between elderly and younger users when performing conventional tasks (e.g. pressing buttons and understanding icons) on PDAs (personal digital assistants) [Siek]. Since the similarities between PDAs and tablets are many, it can be assumed that the elderly will be able to perform similar tasks on a tablet with a touchscreen. Further backing this assumption is the general spread of tablets and smartphones, both touchscreen devices, in the population. Even though most of these products are found in the hands of younger users (under 65), more and more people, including elders, use either tablets or smartphones on a regular basis.

Another study shows that elderly people without any experience with touchscreens can perform gesture-based operations reasonably well, although hitting smaller targets (30x30 pixels) proved difficult. The study also proved that the elderly experience great joy in using tablets: “I want to do nothing but use this”, a test person stated on the use of a touchscreen. The joy of using touchscreen devices may be of great value for the elderly and encourage them to be active in areas that serve their personal interests. [Kobayashi]



The Process



The process

The following section presents the activities and methodology used during the development process. The actual research and development are presented later.

Brainstorm

A brainstorming session on different topics was held to specify and focus the design task. The session was based on a mind map where four overall topics were presented: Independent Living, Health and Care, Occupation and Recreation. [Lecture week 7: Final Project]

Combination

Many problem areas were identified through the brainstorming session. Most of them could be combined into broader, overall problem areas of interest, cutting the number down to five: social networking, mobility, social awareness, pets and practical reminders.

Selection

One problem area was selected through discussion and voting. The selection was based on engagement, interest and knowledge of the elderly and their needs. The problem area with the biggest general interest was found to be ‘practical reminders’.




Design Task

The area of ‘practical reminders’ was unfolded through further brainstorming on different situations where reminders are needed. Based on the session, the focus was narrowed down to correct nutrition for elderly people.

Validation

In order to validate the focus area, the market for existing nutrition assistance and guidance was investigated, and interviews with possible users within the target group were conducted. [Appendix 2]

Demands and Criteria

Based on the validation and user interviews, demands and criteria for the concept were defined.




Concept Development

In order to determine the concept, a morphology chart was used [Appendix 4]. Seven parameters based on the earlier research were found. A brainstorm over the parameters was used to develop different solutions to the same parameter. The idea for the final concept was based on a combination of elements from the morphology. The concept was named “GoodFood”.

Iterative detailing

The concept was built up, tested, detailed and evaluated through three iterations leading up to the final version of the concept.







Market research



Market research

There are several products on the market that in an interactive way inspire people to eat healthy and keep a varied diet. GoodFood will be a new actor in this market, and it is therefore necessary to be aware of the existing solutions in order to stand out and position itself.

An example of one of these concepts is Epicurious, a website for food lovers where recipes can be found; you can make shopping lists and get advice on how to cook. Furthermore, the recipes are divided into levels of complexity such as “Quick and easy”. [Epi]

A Danish alternative to Epicurious is the webpage www.opskrifter.dk, which contains a collection of recipes with additional functions such as “empty the refrigerator”, which lets users type in the ingredients in their refrigerator, whereupon the system suggests relevant recipes. [opskrifter.dk]

Another concept that inspires people to eat varied food, even though it cannot be considered interactive, is found on the back of the free weekly newspaper Søndagsavisen. On the back there are recipes for 7 days, the upcoming week, giving readers inspiration for meal planning. The recipes often use some of the same ingredients as the recipes for the other days, which gives the reader the opportunity to ‘share’ ingredients among several days. In this way less food is thrown away and the user saves money. Furthermore, the ingredients are arranged in a shopping list that comes in handy when buying groceries. [Søndag]

The solutions described here only represent a fraction of what is available. Many alternatives exist, all with more or less similar functionality. However, a concept that can accommodate special nutritional needs and demands and tailor menus to fit elderly people has not been found. This is why GoodFood has its raison d'être.
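The shared-ingredients idea above, one shopping list aggregated from several days of recipes, can be sketched roughly as follows. The data model (recipes as dicts with an ingredients mapping) is purely illustrative and not taken from any of the products mentioned:

```python
from collections import defaultdict

def build_shopping_list(menu):
    """Aggregate ingredients from several days of recipes into one list.

    `menu` is a list of recipe dicts whose "ingredients" map
    name -> (amount, unit). Ingredients shared between days are
    summed, so they are bought once. Units are assumed to be
    consistent per ingredient in this sketch.
    """
    totals = defaultdict(float)
    units = {}
    for recipe in menu:
        for name, (amount, unit) in recipe["ingredients"].items():
            totals[name] += amount
            units[name] = unit
    return {name: (totals[name], units[name]) for name in totals}

week = [
    {"name": "Spinach pie", "ingredients": {"spinach": (200, "g"), "eggs": (3, "pcs")}},
    {"name": "Omelette",    "ingredients": {"eggs": (4, "pcs"), "butter": (20, "g")}},
]
print(build_shopping_list(week)["eggs"])  # (7.0, 'pcs')
```

The eggs used on two different days end up as a single line on the list, which is exactly the money-saving effect the newspaper's weekly plan exploits.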






Design specification



Design specification

With the information on the target group and the current market, requirements and criteria [Andreassen] for the concept were set up.

Problem statement

The concept shall assist the elderly in maintaining a varied and healthy diet and with all of the activities related to this: giving inspiration and planning the menu for the following days, planning the menu so it fits the user’s nutritional profile, planning the shopping in accordance with the specified menu, providing guidance when cooking the dishes, providing the nutritional information for each course and the individual ingredients, and always giving the user the possibility to quickly access help/assistance. The concept must afford hygienic use in the kitchen environment. The layout must incorporate physical/visual considerations that make it suitable for elderly users.



Requirements and criteria

Varied diet
  Requirement: Must advise the user to eat a varied diet.
  Criteria: A varied diet is achieved through inspiration and motivation.

Customization
  Requirement: Must be able to tailor the menu to personal needs.
  Criteria: It would be preferred if the application could advise users to eat according to their personal condition (e.g. diabetes or arthritis).

Help function
  Requirement: The possibility to get help if needed.
  Criteria: It could be nice if the help function were superfluous due to intuitive design.

Hygienic standard
  Requirement: The solution must afford hygienic use.
  Criteria: A washable remote control for use during cooking in the kitchen.

Weight
  Requirement: Should weigh as little as possible without compromising usage.
  Comment: Relates to the remote control.

Flexibility
  Requirement: Must contain the possibility to choose not to eat the suggested food and get something else.
  Criteria: It would be preferred if the application gives the possibility to plan meals ahead in time.
  Comment: Planning meals ahead in time might make the user feel in control.

Patronising behaviour
  Requirement: The layout must not be patronising towards the users.
  Comment: The elderly users should not feel stupid when using the application.

Navigation
  Requirement: The navigation must be consistent throughout the application.
  Criteria: Should try to use mappings the user already knows.
  Comment: The navigation should be elderly friendly; an effort towards having neither too many buttons nor too few.

Feedback
  Requirement: Must give visual or auditory feedback to the user.
  Comment: Feedback can be customized by the user.



Scope of the concept



Scope of the concept

The PACT analysis, together with ergonomic considerations, is used to scope the design of the chosen concept, GoodFood. The PACT framework [Benyon] consists of four areas: people, activity, context and technology. It is used to understand the people who use the GoodFood application and how the activity of usage proceeds in a defined context.


People

GoodFood is designed for elderly people who are able to cook for themselves but do not necessarily need to be fit for shopping, as some might purchase groceries online. It is assumed that the upcoming generation of elders is far more confident in using technologies like tablets and computers than the current one. To get an overview of the target group, personas were constructed. The personas are mainly based on knowledge gained from talking with elderly people about nutrition, cooking and technology usage, as well as research on the internet and in articles. The personas make it possible to immerse oneself in the concept and understand what properties matter to the users and which aspects give them the motivation to use the concept.


James Johnson
• 67 years old
• Curious about different food sources
• Physically able to cook and buy ingredients
• Currently eating healthy
• Likes to cook and enjoys new taste sensations
• Owns a smartphone

Morgan McKinsey
• 72 years old
• Is lactose intolerant and therefore has special nutritional needs
• Physically able to cook and buy ingredients
• Has four favorite dishes
• Capable of using a computer and cell phone
• Has never used a touchscreen device

Loretta Lowell
• 75 years old
• Suffers from arthritis and has trouble using the oven
• Able to cook for herself
• Poor sight; prefers audiobooks and audio instead of reading
• Orders her groceries online through a smartphone application

Sheila Stanton
• 84 years old
• Normally uses cookbooks to discover new recipes
• Physically able to cook and buy ingredients
• Enjoys watching the cooking channel and cooking along with the TV chefs
• Uses her computer every now and then and her cell phone on a daily basis


Activity

There are several activities that the user will be exposed to when using the GoodFood application. The first activity related to the application is the purchase. The purchase requires internet access and an acceptance from the user that he/she wants to buy and download the application. This activity must take place on the tablet on which the user intends to use GoodFood. The regular interaction with the application is accomplished with the tablet. The touchscreen serves as a tangible interface and lets users interact with the application with their fingers. When using the application in the kitchen, the touchscreen is no longer the tangible interface. Instead the washable remote is used. The remote has buttons that afford pressing and are consistent with the buttons and necessary steps of action in the cooking scenario.

Context

It is assumed that the elderly will use GoodFood primarily at home in their living room and kitchen. The application has different characteristics that afford different use contexts. When planning the weekly menu, creating the shopping list or editing/viewing their profile, users are expected to be sitting in their living room in a comfortable chair with the tablet. Another dimension is when the application is used in the kitchen during the cooking session. In this case the physical interaction with the application becomes a challenge, as kitchen hygiene is an unavoidable and important issue. In the kitchen environment the tablet is exposed to hazardous elements such as water and bacteria when the user has to interact with it. But turning the scenario around, other issues appear: bearing in mind that the tablet can be used for many things, e.g. gaming on the go or in the bathroom, the problem of the tablet bringing outside bacteria into the kitchen must also be kept in mind when designing the physical interaction. The solution to these hygienic challenges is realized as a washable remote control that can interact with the application.

Technology

The medium of the interactive system GoodFood is a tablet. The tablet receives input from the user through either the touchscreen or the washable remote control. The user touches the tablet screen with a finger to give input. When the user is cooking, the remote is used to give input to the device instead: the user navigates by pushing buttons on the remote where he/she otherwise would have tapped directly on the screen. The remote is battery-powered and uses Bluetooth for transferring data. The output is formed as text, audio, video and graphic visualization. The GoodFood concept requires an internet connection for application updates and when creating new menus.
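One way to picture this dual-input setup is that both the touchscreen and the remote feed the same abstract navigation events into the application, so the screens behave identically regardless of input channel. The sketch below is an assumption for illustration; the event and screen names are not part of the actual GoodFood design:

```python
class Navigator:
    """Toy navigation model; screen names are illustrative only."""

    def __init__(self):
        self.screen = "home"
        self._targets = {
            "next": "next step",
            "back": "previous step",
            "ok": "selected item",
        }

    def handle(self, event):
        # The same handler runs whether the event came from a screen tap
        # or a button press on the washable Bluetooth remote.
        self.screen = self._targets[event]
        return self.screen


def touch_tap(nav, button_name):
    """Touchscreen path: a tap on an on-screen button."""
    return nav.handle(button_name)


def remote_press(nav, key):
    """Remote path: a physical key press forwarded over Bluetooth."""
    return nav.handle(key)


nav = Navigator()
print(touch_tap(nav, "next"))     # next step
print(remote_press(nav, "back"))  # previous step
```

Keeping one shared event vocabulary is what makes the remote's buttons "consistent with the necessary steps of action" without duplicating any screen logic.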


Ergonomic/usage considerations

When designing an application for use on a tablet there are certain patterns that should be considered. Patterns are a combination of a gesture, for example the touch of a button, and a system response. The combination of the two can be used in several situations and in a variety of devices. Using patterns that are similar to successful patterns currently used in products may make the product seem familiar and more intuitive to use [Saffer]. One pattern that has been used in a vast number of interactive devices, and has been implemented in the GoodFood concept, is the tap-to-open/activate pattern. When an object is to be opened or activated, the tap is used. The tap is a natural replacement for the click of a mouse and should therefore be a familiar gesture when using a touchscreen [Saffer]. Furthermore, the tap-to-select pattern is used in the GoodFood concept. The two types, tap-to-open/activate and tap-to-select, are used in an identical manner, both having buttons similar in construction. In order to give the user the possibility to change his/her mind after the finger is placed on the button, they can simply slide their finger off the button, as it is only activated on direct release. This gives the user the possibility to avoid unintended actions. [Saffer] Gestural interfaces, such as touchscreens, have some limitations. Compared to a keyboard with actual buttons (and/or a mouse) the touchscreen does not contain physical buttons. The physical feeling of when a button is pressed is sometimes on a touchscreen

replaced by a purely visual indication, which gives little feedback to the user. Especially for the visually impaired this might be a problem [Saffer]. Since many elderly are visually impaired, clear feedback should be given when a gesture has been made. Simulating a movement of the button and adding a clicking sound could be means of achieving such feedback.

Another physical consideration concerning the usage of the touchscreen, which is of interest to the target group, is the keyboard. Using a keyboard for typing may be a challenge to the elderly, especially if the keys are small. When typing, some use the fingertips while others use the finger pads. Finger pads are larger than fingertips; the average size of a finger pad is 10-14 mm [Saffer]. To make writing on the keyboard as easy as possible for the elderly, the keys (letters) should be large enough for using the finger pads. When a tablet is held in landscape orientation, the keys of the on-screen keyboard are larger than 14 mm, and the keyboard can therefore be considered elder-friendly.

When designing an interactive device on a touchscreen, one should consider the placement of buttons and other interactive elements. The layout should make sure that no important information is covered when interacting with the application. Therefore, placing these elements in the same layout as on a computer screen could most likely induce complications. [Saffer]

A study has shown that elderly users prefer icons of a larger size (20 mm) whereas younger users preferred icons of a smaller size (5 mm or 10 mm). Furthermore, the study showed that elderly users had problems viewing the screen because of glare and had a tendency to tilt it for a better view [Siek]. The icons used in the GoodFood concept are all of an appropriate size for the elderly according to the previously mentioned studies.

Touchscreen devices have proven to be less cognitively demanding than other interactive devices when the interface is designed appropriately. Therefore, when designing an application to be used by the elderly, one should consider what level of cognitive demand it should require [Hippler]. An interactive design that is natural to operate is not cognitively demanding and is thus in favor of the target group of this project. When cognitive tasks are not complex, the performance of young and elderly users is similar. [Siek]
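The activate-on-release behaviour described earlier, where a button only fires if the finger is lifted while still over it and sliding off cancels the press, can be sketched as a small state machine. The coordinates, sizes and callback below are illustrative, not GoodFood's actual layout:

```python
class Button:
    """A touch button that activates on release, not on press."""

    def __init__(self, x, y, w, h, on_tap):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.on_tap = on_tap
        self.pressed = False

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

    def touch_down(self, px, py):
        self.pressed = self.contains(px, py)

    def touch_up(self, px, py):
        # Activate only on release inside the button: the user can cancel
        # an unintended press by sliding the finger off before lifting it.
        if self.pressed and self.contains(px, py):
            self.on_tap()
        self.pressed = False


fired = []
btn = Button(0, 0, 100, 60, on_tap=lambda: fired.append("cook"))
btn.touch_down(50, 30); btn.touch_up(150, 30)  # slid off: cancelled
btn.touch_down(50, 30); btn.touch_up(50, 30)   # released on button: fires
print(fired)  # ['cook']
```

Deferring activation to the release event is what gives the "slide off to change your mind" escape hatch without needing any extra cancel gesture.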







Development of the final concept



Development of the final concept

First iteration: Creating outline

The first mock-up of GoodFood was superficially built, only giving an overview and idea of the main functions and layout. The mockup was constructed as static images and only built to show the initial thoughts and considerations for GoodFood. The concept was evaluated through feedback from supervisors.

2

Second iteration Keep it simple

For the next iterations, functions and operations were defined using function diagrams [Appendix 3]. This gave a good basis for discussing important functions and prioritizing the design effort. The second iteration was built as an interactive interface, enabling trial-and-error testing. The concept was evaluated based on heuristics, personas and user input. Simplicity was in focus, while still ensuring that the concept would not be too limited. Test persons from the target group were contacted to evaluate the concept and its content. The general feedback was positive. They liked the possibility to plan meals for a whole week. One user expressed that she often found it hard to get inspiration for new meals and would really like an assistant proposing meals [Appendix 2]. Another positive point was the feature for avoiding certain ingredients, e.g. in relation to medicine use. One user was taking anticoagulant medication, not allowing him to eat food with a high content of vitamin K. His wife, who primarily does




the cooking in the household, said it would be nice to get inspiration on how to make new recipes. Especially when it came to vegetables she was interested in assistance. They used to eat a lot of spinach, but with its high level of vitamin K this was suddenly no longer an option. [Appendix 2]

To help define and evaluate the main functions of GoodFood, scenarios were created based on the seven stages of action [Benyon]. These scenarios helped investigate the gulf of execution, describing how a user of GoodFood translates goals and intentions into actions. Two scenarios were set up, concerning generation of a week menu and using GoodFood for everyday cooking. The scenarios were not based on user tests but were intended to give an overview and understanding of how interaction with GoodFood is imagined.

Scenario 1: Generate a week menu

1. Forming the goal: The user wants to plan the menu of the week.
2. Forming the intention: The user wants to plan the week menu by using GoodFood on her tablet.
3. Specifying an action sequence: She has to buy and open GoodFood.
4. Executing the action: She buys GoodFood and opens it. She adjusts the user settings in GoodFood and presses the week menu button. She follows the steps shown on the screen.
5. Perceiving the state of the world: When the week plan comes up, she can see pictures of the planned meals. If something does not look nice, she can change the dish to something else.
6. Interpreting the state of the world: By using GoodFood she now has an exact plan for what to eat the whole week without finding the recipes herself.
7. Evaluating the outcome: Her goal of planning a week menu is satisfied. She achieved her main goal, and she could change the suggested menu’s recipes if she wanted. So at some point she did have to decide whether she wanted to make a specific meal or choose another, but without coming up with solutions herself.


Scenario 2: Cooking today’s meal

1. Forming the goal: The user wants a nice and healthy home-cooked meal.
2. Forming the intention: She wants to use the planned recipe from GoodFood to cook the meal.
3. Specifying an action sequence: She has to open GoodFood and find today’s recipe. After that, she has to go into the kitchen and find the ingredients, which she (hopefully) has bought earlier this week.
4. Executing the action: She opens GoodFood and chooses the feature ‘Cook’. Today’s recipe opens and she finds the ingredients. She prepares them and cooks them into a meal.
5. Perceiving the state of the world: While using the recipe, she can hear it read aloud. She can also see how the meal is prepared by viewing a video. Her senses of sight, smell and taste can be used during the preparation to make sure the meal tastes nice and to figure out when it is finished.
6. Interpreting the state of the world: GoodFood helps her choose a healthy recipe and gives her help with how to cook the meal.
7. Evaluating the outcome: Hopefully she has been able to follow the recipe successfully, so the meal is a nice meal. But maybe she did not follow the recipe completely, or believed she knew what a certain procedure was but in reality did not, and did not view the video. She might also have been interrupted during

the cooking and forgotten to check the food. Then there is a possibility that the meal did not become a success, but there is a good chance that it did.

The scenarios were used to anticipate and optimize interaction with GoodFood.



Third iteration: Testing

The scenarios, constraints and consistency, together with knowledge of natural mappings, created the basis for the third iteration. This iteration was created with focus on user testing, and effort was put into making it look appealing and complete. A user test was conducted and, together with continuous heuristic evaluation, led to the final concept: the fourth iteration of GoodFood.

User test

The user test was performed with focus on the user’s ability to navigate through the GoodFood application. The evaluation of the application is built upon the intuitiveness of use and the level of ease with which the user navigated through the application for a given scenario. It should be noted that the test user had never used a touchscreen device. This choice of user was made to have the least experienced of the intended users navigate the system, which would catch most of the breakdowns. The chosen user had previously expressed interest in learning the technology of touchscreen devices and in getting to know the system, even though he had no related skills whatsoever. The user lacked English skills, which might have been a barrier at some points. The evaluator tried to compensate for this by translating the text.

It should be clear that the application tested was a prototype which had some differences compared to the final application. Examples of this are non-functioning buttons and certain information, which the user is supposed to type in, already being filled in. This might have confused the user at some points.

The full user test and all findings can be found in Appendix 1. The following contains some of the important findings and changes based on the user test.

Outcome

From the user test we became aware of the need to communicate what happens when tapping forward and backwards. In the profile settings the user tried to navigate by tapping the numbers instead of the arrows. The arrows have now been changed to a back arrow including the text “Back” and a square with a tick including the text “Accept”.

Not all users know the expressions “vegetarian” and “vegan”. Instead of explaining the words, the “None of the two” button was changed to “I eat meat”. This button was furthermore moved to be the first button from the left.



The user found it difficult to change his profile settings. He tried to tap directly on the information on the profile overview. After this we made it possible to see all the profile settings on the profile overview: tapping Edit makes each setting appear as a button, and tapping a specific button takes the user directly to that setting, e.g. Allergy. In this way the user does not need to go through all screens to change one setting.

As the user did not know that the Info button would give information about the specific screen, it was changed to a Help button.

In the cooking guide the user had difficulties understanding that it contained more than one step and how to go to the next steps. We have changed the screens to show the number of steps and the current step. As it did not seem obvious to the user how to finish the cooking guide, it should be clarified when the guide is over and how to get to the Home screen.

The user misunderstood the Shop button and tapped it when he wanted to create a new menu, as “he had to do shopping for more meals”. After this session the “Shop” button on the Home screen was changed to “Shop List”.

The user did not realize that he could press the Cook button from the My menu screen. He tried to tap the specific meal to see the recipe. As nothing happened he went to the Home screen and tapped the Cook button there instead. It could be possible to tap the specific meal to see or check the recipe for all the chosen meals in the menu; this has not been implemented in the prototype yet.

When using the remote control the user once moved the cursor too far to the right, pressed OK and went to the Home screen. In this way he left the cooking instructions, which was not the intention. This breakdown should be considered, and it should be clear to the user where the cursor is when using the remote control.

From the user test we agreed on having an introduction video for the application. The video should introduce the user to the main functions and show how the general navigation is done. This will make sure that the user is informed about the application’s different functions and can benefit from all of them.

The user test, together with the previous iterations of GoodFood, created the basis for the final concept.





Goodfood



GoodFood helps elderly people to a healthy and varied diet. The application is a source of inspiration, providing alternatives and help to cook new dishes. GoodFood allows the user to clearly define preferences and special needs and, based on this information, creates a varied 7-day menu covering all nutritional needs of the user. The application offers the elderly a helping hand to break food habits and will improve overall health and quality of life. GoodFood is an application for a tablet device, including a specially designed remote control for use in the kitchen. The application is intended for a family household with one or more members. Each member creates his/her own profile, and GoodFood will take all personal needs into account when planning the menu.


The following is a run-through of the most essential features of GoodFood. The interfaces presented are taken from the GoodFood prototype.


First time use

The very first time GoodFood is opened, the user has to set up his/her own personal profile. The application guides the user through seven interfaces with the possibility to define special needs and preferences. It is not possible for the user to skip this step, as it is essential for the usefulness of GoodFood. The Welcome screen is the first to appear. It allows the user to select gender by tapping the male or female icon and to type in a profile name. At the bottom of the screen seven circles are visible, clearly stating how far into the setup process the user is and how far there is to go. The circle indicating the current step is highlighted with a thin line around it and shown at 100 percent visibility. As the setup process continues, completed steps stay 100 percent visible, whereas steps not yet defined are only 80 percent visible. When the user has selected gender and entered a name, he/she needs to press accept.



The next six steps let the user define: date of birth, height, weight, food preferences (e.g. vegetarian), allergies, special conditions (e.g. arthritis) and physical limitations. In the case of allergies, for example, the user is asked: "Do you have any food allergies?" If the answer is no, the application skips ahead and lets the user define special conditions. However, if the user taps yes, he/she is guided to an interface allowing him/her to specify the type of allergy. There is no accept button in this interface; as soon as the user taps either yes or no, GoodFood instantly continues the setup sequence. In the case of a mistake, it is always possible for the user to tap the "back" button, always located in the lower left corner.

The setup sequence is simple and straightforward, and does not require the user to perform difficult tasks. Each of the seven interfaces is presented in a similar manner to minimize confusion, and it is always possible to recover from mistakes either by using the back button or by simply picking another choice (this is however not possible in the case of allergies, where the back button must be used).
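The forward/back behaviour of the setup sequence, including the skip past the allergy-details screen, can be sketched as a simple state list. This is an illustrative sketch only: the step names and the single skip rule are assumptions derived from the description above, not the actual GoodFood implementation.

```python
# Hypothetical step names, assumed for illustration.
STEPS = ["welcome", "birth_date", "height_weight", "food_preferences",
         "allergy_question", "allergy_details", "special_conditions",
         "physical_limitations"]

def next_step(current, answers):
    """Forward arrow: advance one screen, but skip the allergy-details
    screen when the user answered 'no' to having allergies."""
    i = STEPS.index(current)
    if current == "allergy_question" and not answers.get("has_allergies"):
        return "special_conditions"   # skip ahead, as described above
    return STEPS[i + 1] if i + 1 < len(STEPS) else None

def prev_step(current):
    """Back button: always exactly one screen backwards."""
    i = STEPS.index(current)
    return STEPS[i - 1] if i > 0 else None
```

Keeping navigation in one place like this also makes the constraint from the user test (only arrows move between steps, never the numbered circles) easy to enforce.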



Profile

When all seven steps of the profile setup are completed, the user is presented with a profile overview. Here it is possible to see all data entered in the previous steps and, by tapping Edit, to correct mistakes. The profile overview also presents nutritional information; here the user can see how the personal settings affect the choices GoodFood makes when generating a personal menu (e.g. avoided ingredients or daily calorie intake). In this interface the user is also presented with the Help button for the first time. This option follows on every interface from here on, providing guidance in case of need. When the user is satisfied with the profile settings, he/she taps accept and is guided to the Home interface.


GoodFood gives the possibility to take several profiles into consideration when later generating a menu. When the first profile is finished, it is possible to start over and enter specifications for the next. (This is however not implemented in the prototype.)



Home sweet home

When the user(s) have finished setting up profile(s), the Home menu is presented. From Home it is possible to go to Cook, Menu, Profile or Shop List.
• The Profile option allows going back to the profile settings, either to redefine them or just to view them.
• Tapping Menu, GoodFood generates a menu specially adjusted to the user profile(s).
• Cook guides the user to the cooking interface (provided a menu has been created), showing the menu of the day and providing guidance and cooking assistance.
• Shop List shows which groceries are needed for the defined menu (also provided that a menu has been created).
As this is the first time GoodFood is opened, tapping Cook or Shop List guides the user to the Menu planner.



If the user needs help understanding the different buttons in the Home menu, it is just a matter of tapping the Help icon, as usual centered at the bottom of the interface. Tapping Help gives a pop-up window with a description of the different elements on the original interface. It is possible to tap the Audio or Video buttons in order to get further help. If Audio is tapped, GoodFood reads the help aloud. If Video is pressed, visual help appears, explaining the different elements. Tapping anywhere outside the help box, or tapping the cross in the upper right corner, will at any time stop the help functions.



Planning ahead

As described, when using GoodFood for the first time the user is guided to Menu. As this is the very first time of use and no menu has been created yet, only Create new menu is available. At any other time it is possible to choose between the two options: either to generate next week's menu or to show the menu for the current week (My menu).



When tapping Create new menu, GoodFood will, based on the profile settings, create a new varied menu specially designed for the user. The profile settings ensure that special demands (e.g. allergies) and nutritional needs of the user are taken into account when generating the menu. The user is presented with a menu showing different suggestions. It is possible to change dishes if something pops up that the user does not want; GoodFood will then suggest alternatives with the same nutritional content, which the user can choose from. (Not implemented in the prototype.) Information on each dish is also available to the user, showing detailed information on ingredients and nutritional content. (Not implemented in the prototype.) From My Menu it is possible to go directly to Cook, Home or Shop List.
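As a rough sketch, the selection logic described here can be seen as filtering recipes against the profile and offering alternatives with similar energy content. The recipe and profile fields below (ingredients, allergies, kcal) and the kcal tolerance are illustrative assumptions, not the actual GoodFood algorithm.

```python
def generate_menu(recipes, profile, days=7):
    """Build a menu of `days` dishes, excluding recipes that contain
    any ingredient the profile lists as an allergy."""
    suitable = [r for r in recipes
                if not set(r["ingredients"]) & set(profile["allergies"])]
    # cycle through suitable dishes so every day gets a suggestion
    return [suitable[i % len(suitable)] for i in range(days)]

def alternatives(recipes, dish, profile, tolerance=50):
    """Suggest replacement dishes with roughly the same energy content."""
    return [r for r in recipes
            if r is not dish
            and not set(r["ingredients"]) & set(profile["allergies"])
            and abs(r["kcal"] - dish["kcal"]) <= tolerance]
```

A real implementation would balance several nutrients across the week rather than matching a single kcal value, but the shape of the computation is the same.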



Shopping list

Tapping Shop List presents an overview of the ingredients needed to cook the dishes from the menu. The shopping list categorizes ingredients in a simple and clear way for easy understanding. The two buttons, SMS and Print, centered at the bottom of the interface, give the user the possibility to send the list to his/her mobile phone or to a printer connected to the tablet device on which GoodFood is used. This allows the user, in an easy way, to transfer the shopping list to something that can be brought to the local grocery store.
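A minimal sketch of how such a categorized list could be aggregated from the menu. The ingredient fields (name, amount, category) are assumptions for illustration only.

```python
from collections import defaultdict

def shopping_list(menu):
    """Sum ingredient amounts across all dishes in the menu and group
    them by category, as on the Shop List screen."""
    totals = defaultdict(float)
    category_of = {}
    for dish in menu:
        for ing in dish["ingredients"]:
            totals[ing["name"]] += ing["amount"]
            category_of[ing["name"]] = ing["category"]
    grouped = defaultdict(list)
    for name, amount in sorted(totals.items()):
        grouped[category_of[name]].append((name, amount))
    return dict(grouped)
```

Summing amounts across dishes means an ingredient used in several recipes appears only once on the list, which keeps the printed or texted list short.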



Cooking with GoodFood

When the menu is created and the groceries are bought, it is time to cook. When the user taps Cook in the Home menu, the dish of the day appears. As shown in the menu interface, Chicken heaven is up for Monday. The dish is presented with ingredients, date and expected preparation time.



It is also possible to get detailed directions for the cooking process. It is clearly stated how many steps the current recipe contains and how far into the process the user is. Each time the user is ready to continue to the next step, the arrow is tapped. Cook also presents audio and video possibilities. Tapping Audio will read the recipe aloud, and when going through the directions this is done step by step. The Video option visually guides the user through the cooking process with small video sequences, e.g. showing how to prepare the chicken.



When the user is done cooking and reaches the last step of the directions, a Done button appears. This has the exact same function as the Home button and guides the user to the Home menu. However, the extra button is added to give the cooking process a natural flow.
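The step logic of the cooking guide can be sketched as a small state object; the names here are illustrative, not taken from the actual implementation.

```python
class CookingGuide:
    """Tracks progress through a recipe's directions: a 'step X of N'
    indicator, a forward arrow, and a Done state on the last step."""

    def __init__(self, steps):
        self.steps = steps
        self.index = 0

    def indicator(self):
        return f"Step {self.index + 1} of {len(self.steps)}"

    def forward(self):
        # the forward arrow never moves past the last direction
        if self.index + 1 < len(self.steps):
            self.index += 1

    def show_done_button(self):
        # Done only appears when the last direction has been reached
        return self.index == len(self.steps) - 1
```

Exposing the current step and total count directly addresses the user-test finding that it was unclear how many steps the guide contained and when it was over.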



The remote

The intended use of the GoodFood remote control is to be able to interact with GoodFood while cooking without having to touch the tablet device. This ensures that hygiene is not compromised while using the application. The remote control has three buttons with appearance and icons similar to the ones used in the GoodFood application, making it intuitive for the user to see the connection between the application and the physical remote control. The remote has big buttons that give tangible feedback when pressed, and it can be put directly in the dishwasher or be cleaned with soap and water after use.



The remote control navigates around the GoodFood application by means of a selection circle. The circle jumps from icon to icon, only positioning itself where a function can be triggered. When the circle has reached the desired location, the accept button is pressed. (Not included in the prototype.) While cooking it is still possible to interact with GoodFood using the touchscreen.
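The jump-between-icons behaviour could be sketched as follows. The icon list and the clamping at the ends of the row are assumptions; the sketch only illustrates the kind of focus model the circle implies.

```python
def move_circle(icons, position, direction):
    """Move the selection circle one tappable icon left or right,
    stopping at the ends of the row so it never leaves the screen."""
    if direction == "right":
        return min(position + 1, len(icons) - 1)
    if direction == "left":
        return max(position - 1, 0)
    return position

def press_ok(icons, position):
    """The accept button triggers whatever icon the circle is on."""
    return icons[position]
```

Because OK always triggers the icon under the circle, a clearly visible circle is essential; the user test showed that overshooting to the Home icon and pressing OK silently left the cooking instructions.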




Prototype



A prototype of GoodFood is available for test and review. The prototype is created in Tumult Hype and exported as HTML5 to Dropbox. It is possible to access the application from all platforms and browsers; however, the prototype has shown the best results in Google Chrome and Safari.

To view the two sections of GoodFood, follow the links below, then download and unzip the two .zip files. Make sure that the HTML5 files and the resource folders are placed together. Click the HTML5 file and GoodFood will pop up in the standard browser window, ready for interaction.

The prototype is split into two sections. Section 1 concerns the settings function of GoodFood and gives the viewer the possibility to experience how a potential user would define personal settings in order to create a profile. Opening Section 1 in a browser presents the GoodFood interface as if it were the very first time the application was opened. This section ends with the Home menu.


Section 2 includes the rest of the main functions of GoodFood and starts with the Home menu. The section will present the viewer with the everyday interface of GoodFood. It is possible to explore Cooking, Menu, Profile and Shop List.

Enjoy!

It must be noted that the prototype has some limitations. Tumult Hype does not support the creation of a fully functional application: no audio and only limited video functions are available, and it is not possible to enter text in text boxes. The prototype has been created to give the viewer an idea of how to interact with GoodFood.


Section 1: http://dl.dropbox.com/u/3103478/GoodFood%20Settings.zip
Section 2: http://dl.dropbox.com/u/3103478/GoodFood%20Main.zip





Data flow



Going behind GoodFood, a data flow diagram is a good tool to show how the functions of the application are connected. Four elements are illustrated: data stores, process boxes, data flows and external entities. A data store is a database supporting the system, e.g. a recipe database. A process box represents an event or process within the system performed as part of an interaction, e.g. changing a recipe. A data flow, symbolized by an arrow, shows how data flows between the different processes. The external entity is, in the case of GoodFood, the user.
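To make the four elements concrete, here is a toy rendering of one interaction ("change a recipe"). The store contents and process names are illustrative assumptions, not the project's actual diagram.

```python
# Data store: the current week's menu, keyed by day (illustrative).
menu_store = {"Monday": "Chicken heaven", "Tuesday": "Baked cod"}

def change_recipe(menu, day, new_dish):
    """Process box: an event within the system, here swapping a dish.
    Data flows in (the menu plus the request) and out (the updated menu)."""
    updated = dict(menu)
    updated[day] = new_dish
    return updated

# External entity: the user triggers the process; the resulting
# data flow writes back into the menu store.
menu_store = change_recipe(menu_store, "Monday", "Fish of the day")
```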

 62





Considerations



Purchasing GoodFood

Since this concept consists of an application for a tablet and a remote control, the payment method should be considered. Traditional applications can only be bought on the Internet through an account that is established when buying the tablet; this account includes personal credit card information. The GoodFood system could be purchased in a similar manner to traditional applications, with the remote sent to the purchaser by mail. Another possibility for purchasing the GoodFood system could be through various stores, where the remote control and a code for installing the software can be bought. This last solution might appeal to elderly people who are not comfortable entering credit card and personal information online.

Features

One feature that was initially planned to be included in the concept was the possibility for users to add personal settings regarding prescription medicines. This would allow GoodFood to take special precautions regarding the user's medication. Research in this area, however, showed that it was difficult to standardize guidelines for medicines and for which food ingredients might affect their efficacy. Information was found showing that certain types of fruit juice can neutralize the effect of certain drugs, but since this effect differs from user to user, it was impossible to predict which users would experience neutralization of the medicine's effect. Research also showed that no database of guidelines for medication and food recommendations exists. The feature was therefore not considered further.

An "empty the fridge" feature was also discussed at one point. This feature would give the user the possibility to enter the ingredients in the fridge to be used in a meal, assuring that the user could use all the food from within his/her fridge and thereby save money. However, after internal discussion and evaluation the feature was dismissed, mainly because it would interfere with the scope of this project. When a user plans the next week's menu, the program has a certain set of nutrition levels to fulfill and runs through a number of recipes to find a fit. This approach is not very compatible with the idea that the user should be able to use random ingredients from the fridge.





Future work



To further validate and finally realize the GoodFood concept, several areas could be interesting to look into. First of all, more user testing should be conducted to ensure the optimal layout of the program aimed at the target group. Some alternative features could also be considered to improve the functionality of the program, and there are some practical issues that should be settled before launching the concept.

One area of the concept that needs further detailing is the shopping list feature. There are today good possibilities to shop for groceries online, and since the elderly are a group of people who could certainly benefit from this, it should be included in the GoodFood system. Shopping for groceries within the GoodFood system would make good sense, since it is a device for cooking and meal planning.

One alternative feature that could be considered is the possibility to define a food budget, which might serve as an add-on feature in the program. This feature could find appropriate meals for the user according to a specific weekly or monthly budget. Another alternative feature could be the possibility to add additional recipes from celebrity chefs such as Rene Redzepi, Claus Meyer, Nigella Lawson or the brothers Price. These recipes could have video instructions attached, performed by the celebrity chef him-/herself. A subscription could in this case be set up, e.g. to one of the mentioned chefs. This would add value to both the user and the GoodFood concept.





Conclusion & References



Conclusion

Several iterations, including user testing and evaluation, have created the basis for the final concept of GoodFood. In the development process good interaction design has been the key focus area, and concepts and ideas have continuously been evaluated against theory. GoodFood has been developed specially to fit the demands and criteria of the elderly, striving to map and constrain functions so as to have minimum uncertainty. GoodFood generally touches on some interesting aspects, looking to help the elderly to a healthy and varied diet through motivation and inspiration. GoodFood stands out in a market with many similar alternatives by allowing personalized nutritional settings aimed at adjusting the diet to the user.


Economic validation through a thorough business plan could be an interesting aspect for the next iteration of the concept. However, it is concluded that it is a realistic scenario that GoodFood, using known and available technology, would have a competitive advantage and could theoretically position itself on the market. The process of developing GoodFood has been educational and has proven rather difficult. The final concept is still considered a prototype, but it holds interesting potential and is generally considered successful in providing user-friendly interaction design accessible to elderly people.



References

Cross, N. (2000). Engineering Design Methods: Strategies for Product Design. 3rd edition. Wiley.

Andreassen, M.M. & Hein, L. (1985). Integreret Produktudvikling. 2nd printing. Instituttet for Produktudvikling, Sektionen for Konstruktionsteknik.

Benyon, D. (2010). Designing Interactive Systems: A Comprehensive Guide to HCI and Interaction Design. 2nd edition. Pearson.

Hippler, R.K. et al. "More than Speed? An Empirical Study of Touchscreens and Body Awareness on an Object Manipulation Task". Bowling Green State University, Bowling Green, OH 43403.

Siek, K.A. "Fat Finger Worries: How Older and Younger Users Physically Interact with PDAs". Indiana University, Bloomington, IN 47405, USA.

Saffer, D. (2008). Designing Gestural Interfaces. O'Reilly Media, Inc.

Van Gerven, P.W.M. et al. (2002). "Cognitive Load Theory and Aging: Effects of Worked Examples on Training Efficiency". Department of Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht, The Netherlands.

Kobayashi, M. et al. "Elderly User Evaluation of Mobile Touchscreen Interactions". Hongo, Bunkyo-ku, Tokyo, 113-8656, Japan.

[Lecture week 6: Health Engineering]
[Lecture week 7: Final Project]




Appendix



Appendix 1 - User test

The user test was performed with focus on the user's ability to navigate through the GoodFood application. The evaluation of the application is built upon the intuitiveness of use and the ease with which the user navigates through the application for a given scenario. It should be noted that the test user had never used a touch-screen device. This user was chosen to let the least experienced of the intended users navigate the system, which should catch most of the breakdowns. The chosen user had previously expressed interest in learning touch-screen technology and getting to know the system, even though he had no related skills whatsoever. The user had limited English skills, which might have been a barrier at some points; the evaluator tried to compensate for this by translating the text. It should be clear that the application tested was a prototype with some differences compared to the final application, for example non-functioning buttons and certain information, which the user is supposed to type in, being filled in already. This might have confused the user at some points. From the user test we became aware of different aspects which have now been changed in the final prototype.

The test setup

The actual test of the system was conducted

on a regular computer display due to the absence of a touch-screen device. To compensate, the user was asked to use the screen as a touch-screen; the evaluator then moved the cursor around the screen with a mouse according to the user's tapping on the screen. As the user does not speak or read fluent English, the language barrier is something that should be taken into account. To make up for the limited English skills, the evaluator translated the text on the display that the user did not understand. The test scenario was as close to the actual use scenario as possible, as the test was carried out in the home of the user. To start the test, the application was opened in advance. This was done because the test was performed on a prototype and the startup does not show any information except the name of the application.

Profile settings

The following sections explain how the user went through the profile settings.

The welcome screen (screen 1)

The first step was to write the name of the profile owner. Since the user had never used a touch device prior to the test, he did not know how to get his name into the text box. The evaluator explained how an ordinary keyboard pops up when tapping the text box used to write the name, which seemed obvious to the user once confronted with the information. We assume that a user who owns a tablet would already know this.



To continue the process of setting up the profile, the user had to navigate to the next screen. This proved difficult at the beginning. The user tried to tap the "1 button", but it did not react. Then he tried to tap the "2 button", as he realised he had to go there, but that button did not react either. After realising that the buttons did not work as he thought, he indicated that he did not know what to do. The evaluator explained how to use the arrow and the user completed the welcome screen, continuing to the next screen. From this we became aware of the need to communicate what happens when you tap forwards and backwards. The arrows were changed to an arrow including the text "back" and a square with a tick including the text "accept". We have decided that navigating the steps should only be done by tapping the arrows and not by tapping the numbers. This constraint makes sure that the user completes all steps when setting up the profile the first time.

Age settings (screen 2)

The prototype was only made to work when setting the year, not the day and month. The user quickly tapped the "day" square, which in the prototype did not respond. This showed that the user immediately knew how to set the birth options and was becoming familiar with the way of navigating on a touchscreen device. The user got the function right away and quickly navigated to the next screen by tapping the forward arrow.

Height and weight settings (screen 3)

Like the age screen, the user quickly got how

the drop-down menu worked and tapped it right away. He found the numbers corresponding to his weight and height and continued to the next screen using the arrow on the screen.

Food preferences (screen 4)

The user now knew the functions but did not know the meaning of vegetarian or vegan food. The evaluator explained the meaning and the user tapped "None of the two" and continued to the next screen. This made us aware that not all users know the expressions "vegetarian" and "vegan". Instead of explaining the words, the "None of the two" button was changed to "I eat meat" and was moved to be the first button.

Food allergy (screen 5)

No problem at all. The user answered "no" and continued to the next screen.

Health settings (screen 6)

The user was presented with different options regarding his health status and tapped several of the options without hesitating. When finished, he navigated to the next screen.

Physical ability settings (screen 7) and completing profile settings

During the test this 7th screen was not made, but the evaluator explained the content and asked the user to continue the process by pretending he had already filled in the forms like he did on the previous screens. This proved difficult. After finishing the settings the user did not know where to push and tried to tap the "1 icon"



at the bottom of the screen. He then tapped the backward arrow in the lower left corner, resulting in one step backwards to the "Health settings" screen. The user realised what happened and tapped the forward button, returning to screen 7. It did not seem clear to him that he had to continue by pressing the forward button, and the evaluator explained that he had to tap the forward arrow to continue. As we have changed the back and forward arrows, it should now be clear how to complete the settings.

Editing the profile settings

Later the user was asked to change his settings according to news from the doctor that a specific ingredient did not agree with him any longer. The user had difficulties knowing what to do from the home menu. "Profile... [thinking for a while], no I tap the info button." After entering the profile he tried to tap the text about allergy, but nothing happened. As the user was not familiar with English, the evaluator read the different buttons aloud. The Edit button was tapped and the user came to the start of the profile settings. Initially the user had difficulties getting to the right screen, as all screens before Allergy had to be gone through. After a tap forward, back and forward again, the user quickly ran through all the screens until the allergy screen, changed the settings and finished the profile settings.

After this we made it possible to see all the profile settings in the profile overview. Tapping Edit makes each setting appear as a button, and tapping the specific button jumps to the specific setting, e.g. Allergy. In this way the user does not need to go through all screens to change one setting.

Navigating the system from the Home screen

Entering the Home screen, the user did not intuitively know what the different buttons meant. The evaluator started explaining and translating the different functions of the screen, trying to give the same information as if the user had tapped the info button. The user was interested in the first button, the one to the left, even though he did not know what to use it for. Afterwards we discussed the order of the buttons: whether they should be in the order of first-time use ((Profile), Meal, Shop, Cook) or in order of how often the different buttons are used. We agreed on having them in order of how often they are used; after the first-time use, it makes sense to have the most used buttons at the beginning, to the left, according to the reading direction.

As the user did not know that the info button would give information on the specific screen, it was changed to a Help button.

We agreed on having some kind of an introduction video in the final application. The video should introduce the user to the application's main functions and show how the general navigation is done. This would also make sure that the user is informed about the application's different functions and benefits from all of them.

Cook

The user tapped the Cook button from the Home screen, redirecting him to the Cook screen. On the Cook screen he read the ingredient list and stopped. After a while he tapped the "info" button and the evaluator explained what the screen could do. After that the user tapped the forward arrow and the first instruction was given on the screen. He quickly ran through all the instruction screens, translated by the evaluator, imagining that he was cooking. When he had finished the instructions he stopped and wondered what to do. The evaluator asked him what he thought he should do; the user considered pressing the backwards arrow but decided to tap the info button. The user was informed that this was the end of the cooking guide. He did not know how to act, and the evaluator informed him that he could continue by tapping the "home button" in the lower right corner of the screen. As the meals were composed to give a varied diet, the user said he would cook the whole meal even though there was a part he knew he would not like: "I would eat a little bit of it and try to taste it!"

From the user test it became clear to us that the next button should communicate more clearly that the cooking guide contains more than one step. This was done partly by changing the button, partly by showing the number of steps and the current step. As it did not seem obvious to the user how to finish the cooking guide, it should be clarified that the guide is over and how to return to Home.

Shop

The user understood the different options for sending the list to another device after tapping the info button. Afterwards he understood the different possibilities by himself. He found the ability to get the shopping list sent by text message to his mobile phone the smartest option. Entering the home screen, the user quickly read the text on the buttons. He wanted to do another dish and tapped the Shop button. From that screen he realised that he could not choose a new meal, so he went back to Home and tapped the Meal button. After this session the "Shop" button on the Home screen was changed to "Shop List".

Menu

On the Menu screen two buttons appeared: Create new menu and My menu. As the user was testing a prototype, My menu was already filled in, and not blank as it would be in the real application if the user had not planned a menu yet. The user correctly tapped Create new menu, as he had not generated a menu



yet. He accepted the menu without realising that it could be changed. The reason for this could partly be that he was pleased with the chosen meals, partly that it was not communicated well enough that the user could edit the menu. He found the 4-day period suitable, considering the size of the refrigerator and the ingredients' best-before dates. The user did not realize that he could press the Cook button from the My menu screen. He tried to tap the specific meal to see the recipe; as nothing happened, he went to the Home screen and tapped the Cook button there instead. It could be made possible to tap a specific meal to see or check the recipe for any of the chosen meals in the menu, even if it is not the right day. This has not been implemented in the prototype yet, but you can tap Cook directly from the My menu screen.

Editing the meals in the menu

The evaluator explained to the user that the meals in the menu could be changed. The user and evaluator discussed whether it should be possible to change the ingredients in a menu. The user was a bit concerned that alternative ingredients would spoil the totality of a meal; if he did not like a specific ingredient, such as garlic, he would just skip it himself in the recipe. Later the user proposed changing a whole meal or changing the order of the different meals. This confirmed to us that it should be possible to do this, which we had earlier just assumed the user would like. This feature has not been implemented in the prototype yet. The user was aware of getting a varied diet and himself proposed having a day with fish instead of all meat.

The remote control

In the beginning the user found it hard to use the remote control instead of the touchscreen. The arrows did not mean back and next but moved the cursor to the left or right. Once he understood this, the user managed the navigation well and pressed OK when he wanted to activate the button under the cursor. At one point the user moved the cursor too far to the right and pressed the Home button, thereby leaving the cooking instructions, which was not the intention. This breakdown should be considered, and it should be clear to the user where the cursor is when using the remote control. The user found the remote control good, but he had difficulties remembering to use it all the way through and several times tapped the touch-screen 'with sticky fingers'.

The icons

The user found the edit, info and accept icons good and mentioned that it was good that there was consistency in which icons and choices he could choose between in the different screens. The user found the icons in the Home screen good and decoded them well, he thought. He



did not decode the calendar, meal and ingredient icons on the Menu screens; instead he read the text and decoded the system that way. The language was at times a barrier. For example, the user misunderstood the meaning of the Edit button, which caused difficulties on the different screens when he had to edit a setting; he thought that Edit meant Finish or Stop. He liked that an icon was often supplemented with text; then, he said, he would not be in doubt about the meaning of the button. The user thought the size of the buttons was good and that they could even be smaller; a button the same size as the keys on his laptop's keyboard would be fine to tap.

Finishing a sequence
The user found it difficult to see whether he had finished all the steps of a sequence and should go back to Home. By changing the text and icons at the bottom of the screen we hope to help the user realise his options on the last screen of a sequence.

Conclusion
The user test gave us a lot of feedback on the program and revealed less intuitive mappings that the elderly user had difficulty understanding. The language was at times a barrier: for example, the user misunderstood the meaning of the Edit button, which caused difficulties on the different screens when he had to edit a setting. The user found the four main functions, Cook, Menu, Profile and Shop, well chosen, and they covered all his requests for the application: "These functions are what I need!"

Preparation for the user test
Questions to be answered during the test:
● Is the user familiar with touchscreens/devices?
● Can the user navigate the system with the remote control?
● Recovery during use
○ Change of personal settings (e.g. adding an intolerance to a food source)
● The user's understanding of buttons and icons
● Is a 7-day period appropriate?
● Are the personal settings understandable, and are they too long or too short? Is it okay that the user MUST complete all questions?
● How does the user navigate between different pages? Can they go back to the last page?
● Can they go in and change the user settings?
● Do the icons make sense?
● Are the buttons too small or too big?
● Can they make a week menu without problems? Is it obvious that they can change one dish to another?
● Is the remote control easy to use?
● Would they prefer the stand remote control, so the tablet does not lie on the table, or



do they prefer the other solution?
● Would they prefer to shop only once a week or several times?
● How long before dinner have they planned what to make, and do they plan for more than one day?
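One breakdown observed during the test was the cursor sliding past the last on-screen button and onto Home, which unintentionally closed the cooking instructions. A simple remedy is to treat the cursor as an index into the current screen's buttons and clamp it at the ends, keeping Home out of the cycle. This is only an illustrative sketch; the class and button names are hypothetical and not taken from the GoodFood prototype:

```python
class RemoteCursor:
    """Left/right arrows move a highlight between the buttons of the
    current screen; the index is clamped so the cursor cannot slide
    off the screen and onto an unintended button such as Home."""

    def __init__(self, buttons):
        self.buttons = buttons
        self.index = 0  # highlight starts on the first button

    def left(self):
        self.index = max(self.index - 1, 0)

    def right(self):
        self.index = min(self.index + 1, len(self.buttons) - 1)

    def ok(self):
        # OK activates whichever button is currently highlighted
        return self.buttons[self.index]

# Cooking-instruction screen: Home is deliberately not in the cursor cycle.
cursor = RemoteCursor(["Previous step", "Next step", "Timer"])
cursor.right(); cursor.right(); cursor.right()  # pressing right too often...
print(cursor.ok())                              # ...still lands on "Timer", not Home
```

With this model, leaving the cooking instructions would require a deliberate, separate Home action rather than one arrow press too many.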




Appendix 2 – User interviews

User couple 1
Profile: A couple (Tove, 78 years, and Ove, 82 years) were asked about their cooking. It is mostly Tove who does the cooking; her husband helps with things like stirring the sauce or taking the food out of the oven. They both shop – sometimes separately, but mostly together. Ove takes anticoagulant medication, which means he should not eat food with a high content of vitamin K (because it reduces the medication's effect). His wife finds this annoying. They used to grow a lot of spinach in the kitchen garden, but now Ove cannot eat it. There are many green vegetables he cannot eat, and that limits the cooking: many of the recipes she usually cooks contain vegetables with a high content of vitamin K. She therefore thinks it would be a good idea if she could get inspiration for dishes other than the ones she usually makes, to replace those she can no longer make. When they buy things (not just food) they are very price-conscious; they like to buy food cheaply, and she hates to throw away food that has gotten too old. Tove does not believe she could use a tablet, but Ove thinks it would be easy to learn. They can both use a computer, though Ove is better at it than Tove.

User 2

Profile: Woman, Anni, 59 years. Anni will be part of the intended user group within a few years. She does not have a tablet, "but I would probably have one very soon".
• Makes a week menu
o It is cheaper
o I don't have to think about it during the week
o I only have to shop once (at least one big shopping trip, and then possibly smaller ones for milk and so on)
• I use the meal proposals on the back page of Søndagsavisen [a free Sunday newspaper]
o The ingredients are shared among the different meals – this makes it cheaper
• I love to have alternatives for the dishes – to try new food and to get inspired. Not always having the same 5 different salads
• I would love to get information about the nutrition of the food. This is knowledge I do not have myself...
• A week menu should contain fish twice a week
o It should be different types of fish
• It would be nice also to get proposals for lunch. Right now I am at work at lunchtime, but in a couple of years I will be home and would like to have exciting food at lunch as well
• I would like to have the shopping list on my phone. I would not bring my iPad – it is too impractical. Printing is a waste of paper...
• It should be an application – not a tablet with the programme pre-installed. A user who would like to use the application on a tablet would most likely have a tablet already. If the concept were a dedicated tablet, it would be annoying that it could only be used for this application.
o The application should be aimed at the kind of elderly who have the energy to own a tablet
• The menus should be customized
o To fit my health conditions
o Not to be eating the same meal as my neighbour every day!
• During cooking I am not worried about the tablet becoming dirty/sticky. A remote control could be OK, but it does not have to be there
• About payment, I think I would like a monthly subscription – I would pay 20–25 DKK a month. Or I could just buy it once; then I would pay a maximum of 450 DKK, as for a cookbook. But I would not know in advance whether I would be fond of the application – so to start with I would prefer the subscription. There could then be a lunch add-on at a price of 5 DKK...
• I would not pay more if I had a special disease/condition. It should be the same price for all
• It should be specially made for the elderly – so the meals would provide the right nutrition
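The requirements surfaced in these interviews – avoiding vitamin-K-rich ingredients, fish at least twice a week, and reusing ingredients across meals to keep shopping cheap – suggest a simple filter-and-score approach to generating a week menu. The sketch below is a minimal illustration with entirely hypothetical recipe data and function names; the GoodFood prototype's internals are not described in this report:

```python
from itertools import combinations

# Hypothetical recipe records: name, ingredients, and whether it is a fish dish.
RECIPES = [
    {"name": "Spinach pie",  "ingredients": {"spinach", "eggs", "flour"},       "fish": False},
    {"name": "Baked cod",    "ingredients": {"cod", "potatoes", "lemon"},       "fish": True},
    {"name": "Salmon salad", "ingredients": {"salmon", "lettuce", "lemon"},     "fish": True},
    {"name": "Meatballs",    "ingredients": {"pork", "eggs", "potatoes"},       "fish": False},
    {"name": "Chicken stew", "ingredients": {"chicken", "carrots", "potatoes"}, "fish": False},
]

def plan_menu(recipes, days, avoid=frozenset(), min_fish=0):
    """Pick `days` meals that avoid restricted ingredients, include at
    least `min_fish` fish dishes, and maximise ingredient reuse
    (cheaper shopping, as user 2 points out)."""
    allowed = [r for r in recipes if not (r["ingredients"] & avoid)]
    best, best_score = None, -1
    for combo in combinations(allowed, days):
        if sum(r["fish"] for r in combo) < min_fish:
            continue
        all_ings = [i for r in combo for i in r["ingredients"]]
        score = len(all_ings) - len(set(all_ings))  # number of reused ingredients
        if score > best_score:
            best, best_score = combo, score
    return [r["name"] for r in best] if best else []

# Ove's restriction rules out the spinach dish; Anni's wish requires fish.
print(plan_menu(RECIPES, days=3, avoid={"spinach"}, min_fish=1))
```

A real planner would also rotate suggestions between weeks to keep the menu varied, but the filtering idea is the same.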

86



Appendix 3 - Sketches To decide which possibilities the user should have in GoodFood, the group made various suggestions that were merged into one solution. Some of these suggestions are shown in the following:








Appendix 4 - Conceptualization Through discussion, the group identified several parameters relevant to a concept that could solve the problem statement. Based on these parameters a morphology chart was constructed [Cross]. The parameters are stated in the left column. This was followed by a brainstorming session in which ideas for each parameter were entered in the chart along the x-direction. The number of ideas was deliberately limited in order to avoid an immense chart leading to a vast number of concepts.

GoodFood: Other considerations Based on a combination of elements from the morphology chart, GoodFood was chosen as the final concept. The combination of solutions forming the basis for GoodFood was found superior on several criteria. One criterion is customization: GoodFood can be customized to the health and limitations of the user. Another is hygienic standard.

Some of the parameters in the chart were chosen due to the scope of the course 42072 Design for Interaction. The parameter "how to operate", for instance, was chosen because the device is to be interactive with the user.

Parameter: Varied diet
1. Week plan maker that suggests a week plan according to the user's needs
2. Cook club where elderly join a virtual community with whom they cook in company
3. A device on the refrigerator keeping an eye on which groceries are put in; if there is not enough variety it complains and posts recipes

Parameter: Help to cook
1. Press a button and a film appears
2. Call a phone number
3. Press a button and be linked to a webpage with cooking guidelines

Parameter: Hygienic standard
1. Cover the product with a thin layer of film
2. Make the product washable
3. Washable remote control

Parameter: How to operate
1. Physical buttons linked to the screen
2. Touchscreen
3. Voice

Parameter: Physical components/layout
1. Screen mounted on the fridge/in the kitchen
2. Mobile device, with a screen to be carried around
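A morphology chart generates candidate concepts as combinations, one option per parameter row, and this enumeration can be done mechanically. The sketch below transcribes the chart into a dictionary (with shortened option labels) and enumerates the combinations; the data structure and names are our own illustration, not part of the report's method beyond the standard morphology technique [Cross]:

```python
from itertools import product

# Options per parameter, transcribed (abbreviated) from the morphology chart.
MORPHOLOGY = {
    "Varied diet": ["Week plan maker", "Cook club", "Fridge watcher"],
    "Help to cook": ["Button shows film", "Call a phone number", "Button opens webpage"],
    "Hygienic standard": ["Thin film cover", "Washable product", "Washable remote control"],
    "How to operate": ["Physical buttons", "Touchscreen", "Voice"],
    "Physical components/layout": ["Screen on fridge/in kitchen", "Mobile device"],
}

# Every concept is one choice from each row: 3 * 3 * 3 * 3 * 2 = 162 candidates.
concepts = list(product(*MORPHOLOGY.values()))
print(len(concepts))

# GoodFood corresponds to one such combination of chart elements.
goodfood = ("Week plan maker", "Button shows film", "Washable remote control",
            "Touchscreen", "Mobile device")
print(goodfood in concepts)
```

The deliberate limit on ideas per row keeps this combinatorial space manageable, which is exactly the concern raised in the appendix text.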



Appendix 5 - Division of work All group members have contributed equally to the project, and the report/prototyping work has been divided equally. All members have read, rewritten and edited parts of the report. The main responsibilities have been:

Introduction – written by Meiken and Kamilla
Process – text written by Maja, illustration by Martin
Understanding the problem – written by Michael
Market research – written by Michael
Elderly people and their limitations – written by Meiken
Design specification for a new design – made by all group members
Problem statement – made by all group members
PACT framework – written by Maja
Data flow chart – written and illustrated by Maja
Intended use of GoodFood (seven steps of action) – written by Maja, edited by Kamilla
Purchasing GoodFood – written by Meiken
Ergonomic considerations/user considerations – written by Meiken
User test – written by Kamilla
Description of the final application – written by Martin and Christopher
Various features – written by Meiken
Further work – written by Meiken

Mock-ups:
Remote control (physical) – Meiken
Remote control (CAD) – Michael
GoodFood – Martin and Christopher

User test – performed and videotaped by Michael
Report layout – Christopher and Martin


