ROCHE ACCU-CHEK SUGARVIEW USER EXPERIENCE REPORT
Venkatesh Gerahalli, Solutions Delivery Manager
Francis Mwangi & Arnabi Konar, UX Experts
July 2019
Table Of Contents
Section 1 – About Us
Section 2 – Overview
Section 3 – Methodology
Section 4 – Participants
Section 5 – Study Results
Section 6 – Recommendations
Section 7 – Conclusion & Next Steps
Section 8 – Appendix: Session Video Recordings
About us
About Applause
Applause is a Digital Experience company. We help our 3,000 customers improve THEIR customers' Digital Experience – whether those customers are interacting with them on a website, a mobile device, a wearable or on location. Applause helps improve the quality and usability of applications that support the complete customer digital experience.
Our UX Expert Team
Our global team of 30 UX researchers consists of industry professionals who bring deep knowledge of a wide range of industries to the table, and includes members with advanced degrees as well as published authors.
Your UX Expert
Francis Mwangi is a UX professional with more than 4 years of UX research experience in fields including education, entertainment, crowdsourcing, finance and media. Francis enjoys helping to create products and services that users identify with and that serve as an extension of their daily lives.
OVERVIEW
Overview
This document presents the findings and design recommendations for the Roche Accu-Chek SugarView mobile application UX study. The study was completed in July 2019. Fourteen (14) participants took part in a remote moderated usability study. The APK under review was version 1.1.0.
Study Goals
The aim of the UX study was to better understand the overall usability of the mobile application and uncover any potential usability issues, especially those related to the experience of testing blood glucose levels. Participants freely explored the mobile application and shared their feedback throughout the testing experience while also answering questions asked by the moderator.
Top Findings

Positives (Highlights)
✔ Participants liked the historical data showing the blood glucose range that the application stored over time.
✔ Participants liked that they would not need to carry another gadget to monitor and test their blood glucose levels.

Negatives (Lowlights)
✗ The instructions and feedback during scanning were insufficient, not followed, or not clear enough for most participants to complete a successful scan.
✗ Most participants experienced major issues when scanning strips, primarily caused by unsteady hands.
✗ The time allocated for each scan before timeout was not sufficient for most users.
✗ Many users preferred a more precise reading, i.e. a number showing the exact level of their blood sugar, as opposed to the range provided in the application.
✗ Many participants expected and wanted the goals to aid them in achieving better health and blood sugar regulation.
Top Recommendations

High
• Provide users with a short tutorial video after the initial onboarding process that instructs them on the correct way to carry out the testing.
• Account for unsteady hands while scanning, or users will not be able to easily align the camera and strips.

Mid
• If possible, increase the time allocated for scanning the strips before timeout.
• Give more adequate feedback during scanning, especially regarding the colors displayed, the time required to successfully scan a strip, and the reason for the timeouts.
• Incorporate custom goals, exercise regimens, lifestyle changes and meal suggestions into the goals section to help participants improve their blood glucose levels.

Low
• Provide users with reminders to take their tests.
METHODOLOGY
Methodology
Remote moderated testing
Participants complete tasks and answer questions about the ease of use, confidence, likelihood of future use, etc. They do so in their own environment, using their own devices. Participants follow a carefully crafted script that has been agreed upon by Roche and Applause.
Participant recruitment
Participants were recruited to mirror the target user groups, in accordance with the recruitment profile details agreed upon by Roche and Applause. Applause has an extensive community that is used to recruit participants who fit the user profile.
Product under testing
Participants used a prototype of the Roche Accu-Chek SugarView mobile app (APK version 1.1.0) on compatible devices, most of which were provided to participants by Roche. Participants also used provided test strips and control solution.
PARTICIPANTS
Participants
14 participants took part in the UX study. Their names have been removed and replaced with numbers.
[Charts: participant age ranges and interpreter data]
STUDY RESULTS
History
Dashboard
Works Well
6 out of 14 participants liked the historical data showing the blood glucose levels the app stored over time.
• They liked having the ability to follow their "progress" even after completing their tests, and could use this information to make informed decisions regarding their eating habits or as a record of sorts when they visited their doctor.
Medium Issue
The scrolling feature is not intuitive, resulting in participants rarely scrolling down to the second fold of the screen to check for more details.
Instructions
Scan 1
High Issue
9 out of 14 participants failed to read and understand the instructions provided in the application.
• 8 participants immediately started scanning the wrong side of the strip (the side with the green and black square instead of the correct side with the black square).
• This side stood out the most, and participants assumed it was the green side mentioned in the app instructions.
• 4 out of 14 participants could not follow how to use the overlay.
• One participant placed the test strip on top of the mobile phone screen.
• Some did not start the camera before or during scanning of the test strips.
• Participants also struggled with the instruction to move to darker surroundings when there was too much ambient light; they were not sure how to regulate the light and to what degree.
Camera sensitivity
Scan 1
High Issue
All 14 participants had issues scanning the first test strip. Problematic areas included:
• Aligning strips with the camera overlay, as a result of unsteady hands
• Once aligned correctly, the camera took more time than expected to scan the strips
• Ambient light and camera position issues
Timeout
Scan 1
High Issue
13 out of 14 participants experienced application timeouts before completing a successful scan.
• Participants became frustrated, feeling the time provided was too short for an accurate reading.
• Most missed the instruction to 'repeat the measurement with a new strip' and tested with the same strip.
Test results
Results Page
Medium Issue
8 out of 14 participants preferred actual values for their blood glucose levels as opposed to the ranges provided by the app.
• Reason: they could then monitor whether they were improving or deteriorating, and they could take the figures to their doctors to get their medicine or diets adjusted based on these results.
• Some also wanted the range to be expressed as plus/minus 20 of the actual number.

The color codes depicting the different test results did not match the severity of the results.
• An example was the color blue indicating a high blood glucose level, while blue is generally associated with tranquility and calmness.
Goals
Goals
Medium Issue
7 out of 14 participants expected the goals to be a way for them to monitor what they eat and to help them regulate their blood glucose levels. They expected the goals to also include options such as regular exercise, systematic diet plans, and notifications for water and food intake times. 3 participants mentioned they would additionally like to test and track their HbA1c count, BMI and cholesterol.
• They thought the doctor's recommendation was advice from a doctor on how to eat and live healthily with their condition. They wanted the doctor's recommendation to be linked to a real doctor whom they could talk to and get advice from if need be.
• Many wanted the app to incorporate custom goals including diet and health plans and physical exercise.
Call to Action
On-boarding screen
Medium Issue
The call to action (OK, I UNDERSTAND), specifically on Motorola mobile devices, hovered above the test instructions, blocking the instructions from the participants' view.
RECOMMENDATIONS
Recommendations – heuristics checklist
In this section, violations of existing usability principles are listed, as determined by a UX expert. Each issue is briefly described, listing which of the 10 Nielsen heuristics (https://www.nngroup.com/articles/ten-usability-heuristics/) it is in violation of, as well as a recommendation for improvement. Each issue is assigned a severity level.

1. Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2. Match between system and the real world: The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
3. User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
4. Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
5. Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
6. Recognition rather than recall: Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
7. Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
8. Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
9. Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
10. Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
Recommendations – severity levels
Each issue in this section is assigned one of the severity levels described below.
Overview of Severity Levels
• High – User cannot complete the task or is severely impacted.
• Medium – User is impacted but can complete the task.
• Low – User is minimally impacted.
• Positive – Things done well; great implementation that should be kept.
Recommendations
High
Difficulty scanning test strips with unsteady hands
Error prevention
Issue: All participants had unsteady hands, yet the application required users to hold a stable position for a period of time while scanning the strip to get successful readings in both steps.
Recommendations: Consider a different method of scanning the strip that doesn't require the user to hold the device so steadily when reading the strips. An anti-motion-blur feature that compensates for camera shake, such as when shooting a slightly dark scene or when using telephoto, could also be considered.
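As one illustrative direction (not a feature of the app under test), the hedged Kotlin sketch below shows how an Android app could use the platform accelerometer to hold off on capturing a strip until the phone is reasonably steady. The class name, callback and threshold value are assumptions for this example and would need tuning on real devices.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs

// Hypothetical helper: reports whether the phone is currently steady enough to attempt a scan.
class SteadinessMonitor(context: Context, private val onSteady: (Boolean) -> Unit) : SensorEventListener {

    private val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val accelerometer: Sensor? = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
    private var lastMagnitude = 0f

    fun start() {
        accelerometer?.let { sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI) }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // Squared acceleration magnitude; large frame-to-frame changes indicate hand shake.
        val x = event.values[0]
        val y = event.values[1]
        val z = event.values[2]
        val magnitude = x * x + y * y + z * z
        val delta = abs(magnitude - lastMagnitude)
        lastMagnitude = magnitude
        // Signal "steady" only while movement stays below an assumed threshold.
        onSteady(delta < SHAKE_THRESHOLD)
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit

    companion object {
        private const val SHAKE_THRESHOLD = 2.0f // illustrative value, not calibrated
    }
}
```

The scanning screen could then enable the capture step only while the callback reports a steady device, instead of relying on users to keep perfectly still.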
Recommendations
High
App instructions were overlooked or not followed
Help and documentation
Issue: As a result of not reading or following the instructions, users would either not start the camera, scan the wrong side of the strip, rescan the same strip twice, or place the strip on the on-screen overlay, i.e. on the mobile screen itself.
Recommendations: Consider a tutorial video that walks users through the proper way of using the application, allowing them to follow along visually. The video could be optional for expert users but accessible to novice users who want to rewatch it before conducting any tests. It's important to note that 4 out of the 14 participants preferred that any help on using the application be available from the home page.
Recommendations
High
Application feedback is unintuitive
Visibility of system status
Issue: The different color changes the application goes through during scanning don't explain to the user what is going on. Due to the movement of the hand, the colors change at a rapid pace, and only a user who has successfully completed one scan understands what the colors signify. The colors also sometimes linger, and users did not understand whether the app was scanning and they should hold the position, or what was happening.
Recommendations: Keep users informed of what each color means and what they should aim for to complete a scan. Also inform users how long they should hold on a particular color for a successful scan. For example, text reading 'Scanning' could appear above the box in green, with a countdown indicated by 3, 2, 1, success!
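A minimal Kotlin sketch of the countdown idea above, assuming a status TextView on the scanning screen; the durations, wording and callback are illustrative assumptions rather than the app's actual implementation.

```kotlin
import android.os.CountDownTimer
import android.widget.TextView

// Hypothetical helper: shows a "hold steady: 3, 2, 1... Success!" countdown while the strip
// is aligned, so users know how long to keep the phone still before the scan completes.
fun startScanCountdown(statusLabel: TextView, onScanComplete: () -> Unit) {
    object : CountDownTimer(3_000L, 1_000L) {
        override fun onTick(millisUntilFinished: Long) {
            val secondsLeft = (millisUntilFinished / 1_000L) + 1
            statusLabel.text = "Scanning... hold steady: $secondsLeft"
        }

        override fun onFinish() {
            statusLabel.text = "Success!"
            onScanComplete()
        }
    }.start()
}
```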
Recommendations
Medium
The application times out too fast for a reading
Visibility of system status
Issue: The app times out too quickly for the user to complete a successful scan, mainly because of the issues described on the previous slides.
Recommendations: Consider informing users of how much time they have and why there's a time limit on this particular aspect of the application. A possible solution is a visible timer that counts down as soon as they start the scan. An example would be to provide text such as "Timeout... strips may provide inaccurate readings after being exposed to light and/or air for more than 45 seconds."
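One possible way to make the remaining scan time visible, sketched in Kotlin under the assumption of a ProgressBar and a status TextView on the scanning screen; the 45-second window echoes the example wording above, and the view names and message text are assumptions.

```kotlin
import android.animation.Animator
import android.animation.AnimatorListenerAdapter
import android.animation.ObjectAnimator
import android.widget.ProgressBar
import android.widget.TextView

// Hypothetical sketch: a progress bar that visibly drains over the scan window, plus a
// plain-language explanation when the window expires.
fun showScanTimeWindow(progressBar: ProgressBar, statusLabel: TextView, windowMillis: Long = 45_000L) {
    progressBar.max = 100
    ObjectAnimator.ofInt(progressBar, "progress", 100, 0).apply {
        setDuration(windowMillis)
        addListener(object : AnimatorListenerAdapter() {
            override fun onAnimationEnd(animation: Animator) {
                statusLabel.text = "Timeout: strips may provide inaccurate readings after being " +
                    "exposed to light and/or air for more than 45 seconds. " +
                    "Please repeat the measurement with a new strip."
            }
        })
        start()
    }
}
```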
Recommendations
Medium
The application gives results in the form of a range
Match between system and real world
Issue: The results after successful completion of a test range from very low to very high, but some users prefer a more precise reading.
Recommendations: Consider:
a. Giving the exact figure for the user's blood glucose level, if possible.
b. Giving figures within a range that the application can achieve; a range specified as +/- 20 seemed agreeable to some participants.
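A minimal Kotlin sketch of how a reading could be presented either as an exact figure or as the +/- 20 band mentioned above; the function name and the mg/dL unit are assumptions for illustration only.

```kotlin
// Hypothetical formatting helper for a blood glucose reading.
fun formatGlucoseReading(valueMgDl: Int, showExact: Boolean): String =
    if (showExact) {
        "$valueMgDl mg/dL"                              // exact figure, if the app can achieve it
    } else {
        "${valueMgDl - 20}-${valueMgDl + 20} mg/dL"     // +/- 20 band around the measured value
    }

fun main() {
    println(formatGlucoseReading(132, showExact = true))   // 132 mg/dL
    println(formatGlucoseReading(132, showExact = false))  // 112-152 mg/dL
}
```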
Recommendations
Medium
Color coding does not match severity of the results. Match between system and real world
Issue: The results after a successful completion of a test are depicted using colors that don’t match the severity of the test results. Recommendations: Consider using colors that are already symbolically established with meaning among participants. An example would be red for high levels and blue for optimum levels. Another consideration could be to use green/yellow/orange/red where green represents optimal levels and red equally represents the very high and very low ends of the scale. Each color does not necessarily need to be unique. In the case of very high and very low, each represents a concern and thus could be fairly represented using red.
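A minimal Kotlin sketch of the color scheme suggested above, with green for optimal levels and a shared red for both extremes; the category names and exact color values are assumptions.

```kotlin
import android.graphics.Color

// Hypothetical mapping of result categories to colors: green for optimal, yellow/orange for
// levels drifting away from optimal, and red shared by both extremes since each is a concern.
enum class GlucoseCategory { VERY_LOW, LOW, OPTIMAL, HIGH, VERY_HIGH }

fun colorFor(category: GlucoseCategory): Int = when (category) {
    GlucoseCategory.OPTIMAL -> Color.parseColor("#2E7D32")                              // green
    GlucoseCategory.LOW, GlucoseCategory.HIGH -> Color.parseColor("#F9A825")            // yellow/orange
    GlucoseCategory.VERY_LOW, GlucoseCategory.VERY_HIGH -> Color.parseColor("#C62828")  // red
}
```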
Recommendations
Medium
The Call to Action
Error prevention
Issue: The call to action (OK, I UNDERSTAND), specifically on Motorola mobile devices, hovered above the test instructions, blocking the instructions from the participants' view.
Recommendations: Consider a responsive design for the application so that it renders well on all devices and screen sizes it may potentially be used on.
CONCLUSION AND NEXT STEPS
Conclusion and next steps
Conclusion
Overall, participants were open to using the application, but only once the scanning section has been made more user-friendly. It is important to note that this is the only section that all users experienced issues with.
Next Steps Recommendation Consider including a tutorial video of the application during the onboarding process and reworking the scanning section of the application by implementing the recommendations from this report. After revisions to the app are implemented, consider additional usability testing to observe comparative differences from this study and note areas of improvement and/or additional opportunities for improvement.
APPENDIX
Participant Details

# | Location | Age | Diabetic for... | Device | Occupation | # of scans (initial attempt to first success) | Duration (initial attempt to first success) | Link to Session Recording
1 | India | 40 | 3.5 years | Samsung Galaxy Note8 (SM-N950F) + Android 8.0.0 | Software Professional | 10 fails, 1 success | 45 mins | http://bit.ly/2XQPIt1
2 | India | 66 | 12 years | Samsung Galaxy S8 (SM-G950F) + Android 9.0.0 | Retired | 6 fails, 1 success | 29 mins | http://bit.ly/2LYcUhH
3 | India | 49 | 7 years | Samsung Galaxy Note8 (SM-N950F) + Android 8.0.0 | QA Tester | 7 fails, 1 success | 40 mins | http://bit.ly/2OetDjI
4 | India | 25 | 4-5 years | Samsung Galaxy Note8 (SM-N950F) + Android 8.0.0 | Part Time Teacher / Tester | 5 fails, 1 success | 33 mins | http://bit.ly/2XTL90Y
5 | India | 32 | 3.5 years | Moto G 5S Plus + Android 8.1 | IT Professional | 8 fails, 1 success | 37 mins | http://bit.ly/2OesI2m
6 | India | 24 | 1 year | Samsung Galaxy Note 8 (SM-N950F) + Android 8.0.0 | Working Professional | 1 fail, 1 success | 30 mins | http://bit.ly/2XYaF0g
7 | India | 37 | 2-3 years | Moto G 5S Plus + Android 7.0 | Service | 3 fails, 2 successes | 25 mins | http://bit.ly/2Y1NJNm
8 | India | 42 | 5 years | Samsung Galaxy S8 (SM-G950FD) + Android 8.0.0 | IT Professional | 9 fails, 1 success | 55 mins | http://bit.ly/2O2lCOp
9 | India | 44 | 2 years | Motorola G5s Plus + Android 8.1.0 | Service | 11 fails, 0 successes | 75 mins | http://bit.ly/2Z1hFL4
10 | India | 37 | 5 years | Samsung Note8 (SM-N950F) + Android 8.0 | Senior QA Consultant | 2 fails, 1 success | 10 mins | http://bit.ly/2JEklc8
11 | South Africa | 43 | 11 years | Moto Z2 Force (XT1789-06) + Android 7.1.1 | Process Controller | 8 fails, 1 success | 50 mins | http://bit.ly/2O5S9U2
12 | South Africa | 38 | 2 years | Nokia 8 (TA-1012) + Android 9.0 | Testing Lead | 3 fails, 1 success | 26 mins | http://bit.ly/2JEqHZd
13 | South Africa | 30 | < 6 months | HTC U Ultra + Android 8.0.0 | Test Analyst | 11 fails, 0 successes | 88 mins | http://bit.ly/2O10RTk
14 | Nigeria | 27 | 1 year | LG V30+ + Android 8.0 | Student | 7 fails, 0 successes | 72 mins | http://bit.ly/2XYcAa5
Session Recordings
Link to folder with all video recordings: http://bit.ly/2LPcL05