Usability Testing Report


USABILITY TESTING REPORT ISM 360 | P4

PREPARED BY:

Hope Winchell, Sam Fulwider, Andres Benites



TABLE OF CONTENTS:

Executive Summary
Procedure
Testing Environment
Recruiting
Observation and Recommendations
Recommendations Illustrations
System Usability Scale Results
Appendices


EXECUTIVE SUMMARY: Usability Testing

ELEVATE APPLICATION

APP DESCRIPTION

Version 5.22.0

Elevate is a mobile app that allows any user to "Train their Brain" through games and activities designed to hone different areas of brain development: Writing, Speaking, Reading, and Math. Completing activities adds to the scores accumulated in these areas, called proficiency quotients. The proficiency quotients track performance (based on game performance, training consistency, and game variety) on a 0-5000 scale divided into five proficiency levels: Novice [0-1250], Intermediate [1250-2500], Advanced [2500-3750], Expert [3750-4750], and Master [4750-5000].
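For reference, the score-to-level mapping described above can be sketched as a small helper. This is an illustrative function, not Elevate's actual code; in particular, the handling of scores that fall exactly on a boundary is an assumption, since the report lists the level endpoints as overlapping.

```python
def epq_level(score: float) -> str:
    """Map an EPQ score (0-5000) to its proficiency level.

    Assumption: each level includes its lower bound and excludes its
    upper bound, except that 5000 itself counts as Master.
    """
    if not 0 <= score <= 5000:
        raise ValueError("EPQ scores range from 0 to 5000")
    # Upper cutoffs for the first four levels, per the report.
    levels = [
        (1250, "Novice"),
        (2500, "Intermediate"),
        (3750, "Advanced"),
        (4750, "Expert"),
    ]
    for upper, name in levels:
        if score < upper:
            return name
    return "Master"  # 4750-5000
```

For example, `epq_level(3000)` returns `"Advanced"` under these assumptions.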

PURPOSE

The purpose of this usability testing process is to review Elevate's user interface through the scope of everyday tasks that are vital to the app's success among key demographic users. Testing allows a thorough investigation of the application's UI, highlighting the areas that require adjustment to better align with user needs. Our procedure, refined through pilot testing, gives the moderator a standardized methodology in which repeated questioning and contextual "probe questions" allow for exhaustive analysis. This in-depth exploration yields key findings concerning the strengths and weaknesses of the app's usability.



TEST CONTENT

The usability test includes user navigation to, and completion of, several tasks that together provide a holistic assessment of the Elevate application's essential features. Processes under review: the Onboarding Process, the EPQ System, Alteration of Training Goals, and App Navigation. Development Team ASH will use participant responses to provide general recommendations for changes to Elevate's UI, navigation, and performance tracking. Analysis is derived explicitly from user feedback.

SUMMARIZED FINDINGS

THE USER DISLIKED (Major Takeaways)
- The lack of credibility in Elevate's performance tracking system (EPQ, or Elevate Proficiency Quotient)
- Redundancies and seemingly unnecessary sets of questions in Elevate's onboarding process
- Improper organization of the application's key features (personalization of training goals)

THE USER LIKED (Major Takeaways)
- Colorful, clean, animated UI that holds user attention, enhances game interactivity, and complements game objectives
- A gradual learning curve eases users into site navigation, and key information in individual activity instructions is emphasized
- A majority of users came away with a positive experience; some even expressed a desire to work Elevate into their everyday routine



PROCEDURE

The usability test consisted of an introduction, a pre-testing questionnaire, 5 tasks, a short interview regarding overall experience, and lastly a post-test questionnaire (SUS).

TEST OBJECTIVES

01 - Instructions and controls
Firstly, we want to know whether users can easily learn the instructions and controls of new exercises.

02 - Understanding of the EPQ
We want to explore users' understanding of the EPQ and how they analyze these results.

03 - Gauge user interest
We want to gauge users' interest in "brain-training" activities, to understand more clearly what attracts users to productivity-based applications and whether they would consider incorporating Elevate into their daily routines.

04 - Task-oriented objectives
By creating a task selection table, we were able to prioritize features from most important to least important. Completing this step as a team was essential for determining overall importance.

TOOLS

Testing Script + Task List

Task Log

Pretest + SUS Quiz

iPhones + Elevate App



TESTING

Sessions began with a brief introduction explaining the purpose of the study, its goals, and its duration. Participants were then asked for permission to record the session. The session continued with a pre-testing questionnaire covering demographics and general mobile device usage. Testing of the application itself comprised 5 task scenarios illustrative of the major functions within Elevate: the onboarding process, training sessions, activities, the user profile, and performance. Participants ran through each task and were reminded to think out loud and share any thoughts. Following the task series, we gathered overall impressions through a round of interview questions and a System Usability Scale that measured the overall usability of Elevate.

WHO

All Elevate testing sessions were run with one participant who met the recruiting criteria, an observer, and a test moderator who walked the participant through the session. All sessions lasted approximately 30 minutes and followed the testing script developed by our team.

PROBLEMS

Pilot testing identified several problems that required slight alterations to the final testing script. These changes included implementation of the System Usability Scale, additional pre-testing questions, and a distinct procedure for swapping between the two iPhones needed for the tasks. Task 4 originally asked participants to complete a speed diagnostic study, but participants had a difficult time critiquing this exercise because it presented little to no usability issues. To solve this, we adjusted the task to use the eye tracking study session, which created a more challenging and engaging step for the participants.

DATA

Following the testing sessions, all data collected from questionnaires and task logs was condensed and organized in a document tracking metrics, quotes, errors, and observations.



TESTING ENVIRONMENT: SOFTWARE OBSERVATION + USABILITY LAB

Testing was conducted at DePaul University's Software Observation and Usability Lab. The average session length was 30 minutes. Required equipment included two iPhones with version 5.22.0 of the Elevate application installed. The first phone was used for onboarding and training sessions (tasks 1-3); the app on this phone needed to be reset to a fresh install for each testing session, as many tasks focused on first-time users. The second iPhone used an Elevate account with data pre-loaded into it.

TESTING ROOM

- 2 cameras [Morae software]: 1 recording the phone screen, 1 recording the participant
- Microphone [Morae software]
- Testing script and a task list for the participant
- 1 computer with the Microsoft Form for pre- and post-testing
- 2 iPhones with Elevate installed

OBSERVATION ROOM

1 computer with Morae Observer software, through which the observer logged task times and watched screen recordings, plus a task log and task list. One test was conducted using Loom recording software with MacBook and iPhone cameras; this required no pressing changes to setup or testing.


RECRUITING

RECRUITING CRITERIA
- 18-35 years old (college student or working professional)
- Interested in improving cognitive abilities
- Uses a mobile device daily

We recruited 6 participants who aligned closely with the User Personas we had previously developed.

PROFESSION | HRS. ON MOBILE DEVICE DAILY
Illustrator / UX Designer | 5
Actuary | 4
Film Student | 2.5
Data Analyst | 3
Strategy Consultant | 4


PARTICIPANT DATA
- Education: 83% bachelor's, 17% master's
- Age: 18-22 and 23-25 brackets (2 and 4 participants)
- Previously used a brain-training app: 33% yes, 67% no


OBSERVATIONS + RECOMMENDATIONS

TASK 1: Create an account on Elevate after going through the onboarding process.
100% COMPLETION RATE | 3 ERRORS TOTAL | 3.08 MINUTES ON TASK

SEVERE: The first quiz, pertaining to goals, confused a majority of participants; they found the questions redundant and did not see their point.

MINOR: The EPQ quiz's timer was found to be distracting, not allowing participants to fully focus on the question at hand.

POSITIVE: The animations and bolded words on each screen allowed the user to quickly gauge what was going on, helping keep their attention.

QUOTES: "I think the bolded prompts helped me a lot" | "Was not expecting a test right away"

TASK 2: Complete 1 training session.
83% COMPLETION RATE | 4 ERRORS TOTAL | 2.04 MINUTES ON TASK

SEVERE: (1) Participants quickly ran through the game's instructions, often not reading them. To navigate to the instructions, the user must tap the very small "?" in the corner. Because this is not visible, most participants began the game and relied on assumptions to guide them.

MINOR: Participants confused the Activities and Training sections.

POSITIVE: The game was very easy to learn for most participants. The colorful and clean UI complemented the game objectives.

QUOTES: "I would only read the report at the end of the training if I had answers that were wrong" | "Start of game should just be a play button, no metrics are necessary (distraction to the user)"

RECOMMENDATIONS: If it is the user's first time playing, make reading the instructions mandatory before the game begins. This can be achieved by showing the instruction screens after the "play" button is tapped.



TASK 3: Adjust your training goals so that you are focused on improving your focus while reading and mental vocabulary.
75% COMPLETION RATE | 8 ERRORS TOTAL | 2.40 MINUTES ON TASK

SEVERE

(1) Training goals are too deeply hidden in the app, no user would realistically spend several minutes trying to locate this - they would most likely come across it by mistake.

MINOR

(2) Once training goals are modified, there is no system confirmation that changes have been applied. Additionally, the goals' clarity can be improved.

QUOTES

"There is no way I would know to find training goals in settings" | “Settings is a weird spot to put personal goals in” | "Settings and Layout was pretty good/understandable"

RECOMMENDATIONS

(1) “Training goals” can be moved to the main profile page along with “Session Length” above achievements so that users can easily adjust these settings. (2) Add a save button to both training goals and session length pages.

TASK 4: Complete the eye tracking study session.
75% COMPLETION RATE | 14 ERRORS TOTAL | 2.98 MINUTES ON TASK

SEVERE: The Study section can only be accessed by toggling a bar under Activities; this bar was easily overlooked by most users, making the organization of the app very confusing.

MINOR: The swipe functions were complex compared to other activities in the app, though they are second nature to users with significant experience with mainstream applications. One user said that a short quiz accompanying the session would be a productive feature.

QUOTES: "The game started right away. I didn't get a chance to explore options or click start" | "The exercise was very underwhelming. There needs to be a quiz to accompany the study session - it seems pointless otherwise"

RECOMMENDATIONS: See the Study Sessions recommendation.



TASK 5: Analyze your reading EPQ. Share with the moderator your EPQ and proficiency level pertaining to reading.
75% COMPLETION RATE | 2 ERRORS TOTAL | 0.95 MINUTES ON TASK

MINOR: Users didn't seem to understand the scale on which the point values of each activity were based.

POSITIVES: Overall, this task was straightforward.

RECOMMENDATIONS: See the EPQ recommendation.

KEY FINDINGS

EPQ METRICS
PROBLEM: Users didn't seem to understand the scale on which the point values of each activity were based. The entire EPQ system of gauging performance did not appear credible to users.
RECOMMENDATIONS: Establish greater credibility for the EPQ system. Give users access to links to studies and detailed analyses supporting the system, showing that users legitimately benefit from these activities.

TRAINING GOALS
PROBLEM: Users must navigate through the settings section of the application to adjust personalized training goals.
RECOMMENDATIONS: Create a subsection in the Profile tab of the UI that allows users to adjust their training goals however they see fit.



TIMER
PROBLEM: In both the onboarding process and the activities themselves, the timers within Elevate give users inadequate, almost unrecognizable notice of the amount of time left in any given game.
RECOMMENDATIONS: Consider a numbered timer, a darker shade for the "wipe-down" timer, or more space in the UI for enhanced visibility.

STUDY SESSIONS
PROBLEM: Users had the most trouble navigating to the "Eye-Tracking" study section of the application; this task accumulated the most errors of any tested task.
RECOMMENDATIONS: (1) Enlarge the tabs that toggle between the Study and Game sections, or (2) remove the Notifications page from the nav bar and replace it with the Study section. With over 25 study activities, this is important content that participants easily overlooked.

ONBOARDING PROCESS
PROBLEM: The onboarding process presents the user with seemingly useless or redundant questions that unnecessarily lengthen first-time users' profile creation.
RECOMMENDATIONS: (1) Do away with onboarding altogether and let users go directly to creating their profile; if the process is kept, ask more specific questions that relate to the different learning domains. (2) Omit the majority of the current onboarding process. When users first download the app, prompt them to select goals. [The goal selection process needs to be clear and concise; more than half of participants spoke poorly of the 5 initial onboarding questions and did not understand their end goal.] Allow users to play two activities without creating an account, after which they are guided through profile creation. This method lets users select their training goals more accurately while also gauging an EPQ level.

RECOMMENDATIONS ILLUSTRATIONS

Screenshots in reference to key findings.

[Screenshot annotations:]
- TIMER: current timer > use a shaded timer when appropriate.
- STUDY: current study location (toggle) > replace Notifications with "Study".
- TRAINING GOALS: add a button here so the user can access goals through the Profile.


SYSTEM USABILITY SCALE (SUS) SCORE: 78/100

OVERVIEW

The System Usability Scale was administered at the end of testing to determine the overall usability of Elevate. The scale includes 10 statements; participants rated how much they agreed with each, with 5 meaning strongly agree and 1 meaning strongly disagree. Responses were converted to a score out of 100. Elevate's average SUS score was 78.3 out of 100, against an industry average of 68. Scoring roughly 10 points above the standard, Elevate's overall usability falls within a percentile ranking of about 70%. On average, few usability issues negatively affect the user's experience.
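The conversion from 1-5 ratings to a 0-100 score follows the standard SUS formula: odd-numbered (positively worded) items contribute rating minus 1, even-numbered (negatively worded) items contribute 5 minus rating, and the sum is multiplied by 2.5. A minimal sketch of that calculation:

```python
def sus_score(ratings):
    """Compute a System Usability Scale score (0-100) from ten
    Likert ratings (1-5), ordered as on the standard questionnaire.

    Odd-numbered items are positively worded and even-numbered items
    negatively worded, per the standard SUS scoring rules.
    """
    if len(ratings) != 10 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("SUS needs ten ratings between 1 and 5")
    positive = sum(r - 1 for r in ratings[0::2])  # items 1, 3, 5, 7, 9
    negative = sum(5 - r for r in ratings[1::2])  # items 2, 4, 6, 8, 10
    return (positive + negative) * 2.5

# A neutral respondent (all 3s) scores exactly 50.
print(sus_score([3] * 10))  # 50.0
```

Because the formula is linear, it can also be applied to per-question averages; plugging in the rounded averages from the table below gives roughly 79, consistent with the reported 78.3.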

QUESTION | AVG. SCORE (OF 6)
I think I would use this app frequently. | 3.6
I found the app unnecessarily complex. | 2.1
I thought the app was easy to use. | 4
I think that I would need the support of a technical person to be able to use this app. | 1.1
I found the various functions in this app were well integrated. | 4
I thought there was too much inconsistency in this app. | 2
I would imagine that most people would learn to use this app very quickly. | 4.4
I found the system very inefficient to use. | 1.6
I felt very confident using the app. | 4
I needed to learn a lot of things before I could get going with this app. | 1.5


APPENDICES

APPENDIX A

PARTICIPANT GRID



APPENDIX B

SUPPLEMENTAL MATERIALS
1. Testing Script
2. Task List
3. Task Log
4. Observation Videos + Annotated Task Log
5. SUS and Demographic Data Sets
(All on Box.com: https://app.box.com/s/b7mgrto0k6gi3bd6969ytirwc1bmsics)




APPENDIX C

TEAM CONTRIBUTIONS
As a group, we separated the sections of our P4 work as follows:

SAM: Observations and Recommendations, Moderator, Observer, Pilot Testing, Executive Summary, Team Contributions, Future Work, Task Log Analysis

ANDRES: Observations and Recommendations, Moderator, Observer, Pilot Testing, Testing Environment and Recruiting Criteria

HOPE: Observations and Recommendations (tasks), SUS, Moderator, Observer, Pilot Testing, Procedure, Testing Environment and Recruiting Criteria, Formatting/Report Design



APPENDIX D

FUTURE WORK

In further usability testing, the hope would be, first and foremost, to find more reliable video technology for observing how subjects work through the UI. Morae was useful, but almost every interview surfaced faults in the system, whether in downloading the session, in the recordings, or, more specifically, in the video mount above the phone that tracks user navigation. It is crucial that in future instances of Elevate testing the development team has access to that video feed: it allows easier tracking of user errors and can provide concrete evidence of the areas in which an application falls short of meeting users' needs.

Our development team also hopes that additional testing broadens the recruitment criteria for test subjects. Through the six interviews we performed, Team ASH recognized that many younger users, in the 18-25 demographic, look to their mobile devices to "shut off their brains". Elevate made less of an impact because the young users we tested had greater access to resources (classrooms, libraries, online learning services) with greater credibility. People further from academia, such as retirees or seniors with a more apparent need for an application like Elevate, may have higher potential for everyday usage. This would be an interesting demographic to approach in future Elevate usability tests; seniors, with a generally steeper learning curve, may shine a brighter light on existing usability problems and increase attention on interactivity for a wider range of users.

In looking back on how our team executed P4, we believe the process was collaborative, comprehensive, and ultimately fun. We thoroughly enjoyed our time working together. The only change we unanimously agree on is pacing ourselves through the reports we completed this quarter. We never felt behind other groups, but for the sake of our collective sanity we hope to space out our work better in the future.


