Wolverine Access Interactive Advisement Usability Testing & Analysis
Kumud Bihani, Mohammad Hadrawi, Andrea McVittie, Ning Wang
Wolverine Access

The Wolverine Access (WA) system is an online web application for accessing and managing an individual's information related to the University of Michigan. Accounts are given to students and their families, alumni, faculty, and staff. Depending on a user's role, different types of account information can be managed through the system, including course schedules, contact information, housing information, and payroll options. A new version of Wolverine Access will launch within the next six months for use by the University of Michigan community. The upgrade includes a significant change to the interface as well as many new features, among them the "Interactive Advisement" module. This report focuses specifically on this new feature.
Wolverine Access Interactive Advisement

Interactive Advisement will allow students to view their degree progress by displaying their program's academic requirements. The system will indicate which requirements have been fulfilled and which have not. Below each requirement, the system will list the courses that count toward fulfilling it, along with how many credits each course is worth and when it is offered. The advisement module will also eventually allow students to examine the impact on their academic progress of switching to a new major: a report detailing how their earned credits would transfer between degree programs will be generated and sent to the student. Users will access advisement through a new section called "Self Service" (Figure 1).

Figure 1.
The student will then need to select "My Academic Requirements" under the "Degree Progress/Graduation" section. (Figure 2)
Figure 2.
Once the student has selected "My Academic Requirements," they will be presented with an automatically generated report detailing their academic progress. (Figure 3)

Figure 3.
Target Audience

Wolverine Access is used by and intended to serve all enrolled University of Michigan students. For the Winter 2008 semester, 37,851 students are registered: 24,786 undergraduate students and 13,065 graduate students. [1]
Figure 4. Wolverine Access System Users – Winter 2008
The undergraduate population consists of students enrolled in over 200 different degree programs. Incoming freshmen arrive from over 1,650 different high schools. Undergraduates come from all 50 states, and out-of-state students make up over one-third of the undergraduate population. The undergraduate population also includes students from over 80 foreign countries; four percent of undergraduates are international students. One quarter of the undergraduate population is African American, Hispanic American, Native American, or Asian American. The gender distribution of undergraduate students is approximately 50/50. [2]

In addition to enrolled students, 180,000 alumni are granted access to portions of the system in order to access their transcripts. [1]

Teaching staff are also supported by Wolverine Access. Currently, 6,610 faculty members are registered users. Of those, 5,025 are teaching during the Winter 2008 semester, while the rest have taught courses within the last year. [1]

The Wolverine Access system is also used by 6,800 users with administrative access who enter data, run reports, and view information. These users belong to a variety of departments, including Payroll, Benefits, Human Resources, Recruiting, Admissions, Curriculum, Academic Advising, Student Services, Student Financial, and the Financial Aid office. The Wolverine Access support staff is also included in this group. [1]
Figure 5. Wolverine Access System User Types – Winter 2008
The Interactive Advisement portion of Wolverine Access will be used mostly by the enrolled student population, including undergraduate, graduate, and professional students.
User Test Goals

We conducted user tests to get feedback from a representative sample of student users about usability, functionality, and aesthetic problems in the system. We were particularly looking for indications that users were having difficulty with the navigation or the system's vocabulary. Tasks were designed to require users to navigate the system and identify where they could find commonly needed information. Tasks were also designed to expose users to some of the new features, in order to gather reactions to these additions.
Limitations of User Testing

While the user tests produced valuable information, our test did have some limitations. The system provided for user tests was populated with test data. This data did not necessarily reflect an accurate degree program, which may have caused confusion that would not occur if the data represented the user's own program. In addition, many user information fields were missing, such as the user's address, so the system did not function as it would with a complete user record.

The system provided was also not always consistent between users. As it was a development copy of Wolverine Access, changes were made to the system between user tests, so some users had a different experience than others. The development status of the system also added complications to the user interface that are not representative of the final user experience. Additional menu items for administering the system were visible. As these options are intended for "expert" users with administrative access, not ordinary users of Wolverine Access, the experience of the test users differed from the experience users will have after the public launch. Expert terms and options for controlling the database and other back-end features confused users and added extra visual information to sift through before finding the navigation item they were searching for.
Methodology

The team designed a user test around our goals, recruited a sample population of users, and administered the tests as described below.
Recruiting

During the creation of the user survey, we included an optional question where users of the current Wolverine Access system could leave their email addresses if they were interested in a "sneak preview" of the new system. Through this form we collected about 40 responses from people who were interested. We contacted all of these respondents via email asking for their availability and received ten replies.

In selecting respondents for user testing, we made diversity a priority and sought to match our users with our personas as much as possible. We wanted students from different departments at the university, as well as a mix of graduate and undergraduate students. Several other factors were taken into consideration as well, such as the availability of at least three group members during the user tests and the availability of user testing equipment. Working within time and equipment constraints, we successfully completed five user tests. Sampled users were a mix of international and American students, graduates and undergraduates, and a fairly good representation of the various schools of the University of Michigan. We had an equal distribution of male and female users. All users said they had medium proficiency with technology. (Table 1)
Table 1. Sampled User Demographics

| User | Affiliation | Year | Department | Technology Proficiency | Usage of WA | Comments | Gender |
|------|-------------|------|------------|------------------------|-------------|----------|--------|
| U01 (Pilot) | Student | Graduate | School of Information | Medium | Biweekly | N/A | F |
| U02 | Student | Graduate | Industrial and Operations Engineering | Medium | Weekly | International Student | F |
| U03 | Student | Graduate | School of Information | Medium | Monthly | | M |
| U04 | Student / Staff | Graduate | Recreational Sports, School of Education | Medium | Daily | International Student; did undergraduate degree at the University of Michigan | M |
| U05 | Student | Sophomore | Business Administration, Ross School of Business | Medium | Biweekly | N/A | M |
| U06 | Student | Freshman | LS&A, School of Public Health | Medium | Weekly | Major Undecided | F |
Test Setting and Equipment

The first three tests, including the pilot test, were conducted in the usability lab located in the Duderstadt Center at the University of Michigan. The tests were performed on a Windows computer using the Firefox browser. The team used Camtasia to record the user's on-screen movements and a web camera to capture facial expressions. We also recorded the user's voice. We used a provided test ID to log in to Wolverine Access so that users did not have to use their own IDs and could feel free to make changes to the fake profile without worrying about the consequences. The remaining three user tests were conducted at the Shapiro Library and at West Hall, using a Dell laptop running Windows with Camtasia pre-installed and a built-in microphone. We attached a web cam to the laptop to capture the facial expressions of our users.
Pilot Tests

We conducted two pilot tests. The first was conducted on a group member. The goal of this test was to gain familiarity with the equipment and to reveal other issues that needed to be corrected before going forward with a valid test. This first pilot also helped determine where the note takers could be positioned so that they would not obstruct the user. The second pilot test was done by a user who was familiar with the current Wolverine Access system but had not seen the new one. The main goal of this pilot was to evaluate the quality of the tasks, as well as the amount of time required to complete each one. We asked our pilot user to complete the tasks without giving her a chance to get familiar with the system. We recorded her on-screen movements, voice, and facial expressions, and asked her a series of post-test questions.

After the pilot tests, we made several changes. The first was to re-order the tasks so that the user would not be intimidated by them: harder tasks were moved later in the test, in order not to immediately frustrate the user. The second was to allow the user a few minutes to gain familiarity with the system before giving them tasks. Since the users had never seen the new system, they would feel more comfortable if they had a chance to explore on their own before being given a task to complete. The follow-up questions and the introductory statement read to the user were also adjusted after these initial tests.
User Tests

Each of the five users was first asked to fill out a pre-test questionnaire, which collected basic demographic information and included a consent form allowing us to use their information and videos in our final presentation. (See Appendix: Pre-Test User Survey) The user was then read a brief statement describing the test. (See Appendix: User Test Moderator Script) We emphasized that we were testing the system, not the user, and that there were no wrong answers. Users were instructed to think out loud as they worked through the system and were informed that they could excuse themselves from the test at any time. Before the test began, users were asked if they had any questions and given the choice to explore the system briefly before beginning the tasks. Three users took the option to familiarize themselves with the system prior to the test, while two did not.

During each test, the team assigned one moderator, one technical assistant, and one or two note takers. We recorded the timing of the tasks, took notes on the user's train of thought, and captured their keystrokes. Any point that seemed particularly confusing to the user was noted. We also recorded the user's on-screen movements, facial expressions, and voice for later analysis.

The moderator guided each user through six tasks designed around brief scenarios. The first task asked the user to locate a copy of their transcript. This task was selected because it is a common use of Wolverine Access and something users are likely to do at least once during their time at the University of Michigan. It was asked first because it is relatively simple and would allow the user to feel comfortable with the system before being asked to complete more frustrating tasks.

The second task asked the user to look into which degree requirements they had not yet fulfilled. This led the user into exploring the main feature of the new Interactive Advisement portion of Wolverine Access. The team could assess the usability of the navigation to this portion of the site, how the information is presented, and user reactions to the new feature.

The third task expanded on the second: users were asked to determine a course they could take in order to fulfill an unfulfilled requirement. The combination of these tasks was designed to assess the presentation of the academic requirements information. The team wanted to explore whether the visual cues and symbols successfully communicated with the user.

The fourth task asked the user to find their advisor's contact information. This task was selected because it is a new feature in the system. The team wanted feedback on the inclusion of this information, as well as to examine how easily users located it.

The fifth task asked users to find information about how their credits would transfer between majors. The task was selected to evaluate whether users could find this information using the system's terminology: a "what-if" report. The team also wanted to gather the students' reactions to the inclusion of this feature.

The final task asked students to investigate how their credits had transferred from a previous university. This task was designed as a further test of the navigation and terminology.

Throughout these tasks, the moderator reminded the users to think out loud and express their thought process and feelings toward the system. After the tasks were completed, users were asked follow-up questions about their experience. The team asked users to identify things they liked or disliked about the system, as well as to provide any other feedback they wanted to offer. The other group members were then given a chance to ask any follow-up questions they had for the user.
Users who were unable to complete a task often asked the team to show them how to complete it at the conclusion of the test. Once the user tests were complete, we analyzed the videos and notes and arrived at several findings.
Results & Findings

Overall, the usability testing of Wolverine Access Interactive Advisement revealed a fairly usable system. The team identified several positive usability features, as well as several areas where improvements can be made. Many of the findings from the tests had already been identified by the team in the heuristic evaluation, though for some of them the severity rankings are not consistent with the heuristic evaluation results. The tests also revealed many unexpected findings.

The majority of the issues revealed by our user tests were functionality and usability issues. Overall, users found some of the site's vocabulary confusing. The navigation system also presented a challenge: along with the terminology employed in the navigation, the system gave the user little information about their location within the site or their other options.
Table 2: Summary of System Issues

| Category Number | Description |
|-----------------|-------------|
| 1 | Confusing terminology or vocabulary |
| 2 | Navigation difficulty |
| 3 | System status unclear |
| 4 | Repetitive labels for different content |
| 5 | Visual layout not understandable |
The user reaction to the aesthetics was largely positive. Most users felt the new look and feel was an improvement over the existing system and preferred the newer version despite their frustration with the functionality. Many users expressed that the additional features and new interface were great improvements. In some instances, however, the visual symbols used did not successfully communicate with the users.

In each test, the team recorded how long each user took on each task. (Table 3) Users who did not take the offered extra time to familiarize themselves with the system before starting had some difficulty accomplishing the first task. Users who did take that time took longer to accomplish the other tasks.
Table 3: Time Spent Per Task (S = Start Time, E = End Time)
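The start and end times in Table 3 were taken from the session clock of each recording, noted in "[mm:ss]" form (the same notation used in the user citations later in this report). The TypeScript sketch below shows one way such marks convert into per-task durations; the TaskLog shape and the sample entry are illustrative assumptions, not the team's actual log format.

```typescript
// Rough sketch: turn "[mm:ss]" session-clock marks into per-task durations.
// TaskLog and the sample values are illustrative, not the team's actual logs.
type TaskLog = { task: string; start: string; end: string };

function toSeconds(mark: string): number {
  // Strip the square brackets, then split "mm:ss" into numbers.
  const [min, sec] = mark.replace(/[\[\]]/g, "").split(":").map(Number);
  return min * 60 + sec;
}

function durationSeconds(log: TaskLog): number {
  return toSeconds(log.end) - toSeconds(log.start);
}

// Hypothetical entry using marks of the kind cited later in this report.
const logs: TaskLog[] = [
  { task: "Task 4: find advisor contact information", start: "[13:19]", end: "[14:40]" },
];

for (const log of logs) {
  console.log(`${log.task}: ${durationSeconds(log)} s`); // "81 s"
}
```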
Tables 4 and 6 were created from the information gathered during user testing, including timing logs, notes, and video of the subjects and their screens. Each table includes a column labeled "User(s)," listing the user or users associated with each point. Table 5 explains the severity ratings assigned to the problems, following Nielsen's Severity Ratings for Usability Problems. [3]
Table 4: Usability Strengths

| # | Description | User(s) |
|---|-------------|---------|
| 1 | The new system offers more features and functions that would be of great help to users, such as the extracurricular activities and honors and awards sections, the new service showing the credits required for transferring to a new major, and the dictionary service | U4, U5 |
| 2 | The new interface of Wolverine Access is better than the current one | U3, U4, U6 |
| 3 | The menu in the top region is helpful, since it provides many functions and is easy to access | U6 |
| 4 | The "Self Service" page is the most visited page and provides a starting point for accomplishing most tasks | All |
| 5 | Having a side bar to navigate the website is a good design | U1, U5, U6 |
| 6 | The dictionary service is a nice feature, but the user was not sure when it would be used | U4 |
Table 5: Nielsen's Severity Ratings for Usability Problems

| Rating | Description |
|--------|-------------|
| 0 | I don't agree that this is a usability problem at all |
| 1 | Cosmetic problem only: need not be fixed unless extra time is available on project |
| 2 | Minor usability problem: fixing this should be given low priority |
| 3 | Major usability problem: important to fix, so should be given high priority |
| 4 | Usability catastrophe: imperative to fix this before product can be released |
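Once each issue carries a Nielsen rating and a count of affected users, the findings can be triaged mechanically. The TypeScript sketch below shows one plausible ordering, by severity first and breadth second; the data shapes and sample rows are illustrative only, not the process the team used to build Table 6.

```typescript
// Sketch: order usability issues by Nielsen severity, breaking ties by how
// many of the six test users hit the problem. Sample rows are illustrative.
interface Issue {
  id: number;
  description: string;
  severity: 0 | 1 | 2 | 3 | 4; // Nielsen's 0-4 scale (Table 5)
  usersAffected: number;       // out of 6 participants
}

const issues: Issue[] = [
  { id: 1, description: "Duplicate 'My Academic Requirements' links", severity: 4, usersAffected: 4 },
  { id: 10, description: "'Processing' signal missing for some actions", severity: 2, usersAffected: 6 },
  { id: 15, description: "Large blank space on main page", severity: 1, usersAffected: 1 },
];

const triaged = [...issues].sort(
  (a, b) => b.severity - a.severity || b.usersAffected - a.usersAffected
);

triaged.forEach((i) =>
  console.log(`#${i.id} (severity ${i.severity}, ${i.usersAffected}/6 users): ${i.description}`)
);
```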
Table 6: Identified Usability Issues

| # | Category | Description | User(s) | Ranking |
|---|----------|-------------|---------|---------|
| 1 | 4 | There are two identical "My Academic Requirements" links shown together, but they lead to two different places, which is confusing | U1, U2, U5, U6 | 4 |
| 2 | 2, 3 | Navigation through the system is very hard; using the left navigation bar to locate information is difficult for some of the tasks | All | 4 |
| 3 | 1 | The system doesn't "speak the users' language"; the words, phrases, and concepts are unfamiliar. For example, "what-if report" is so confusing a term that no user could complete Task 5 | All | 4 |
| 4 | 1 | Some of the terms used in the new system are not consistent with the current system that users are already familiar with, which confuses users | U4, U6 | 4 |
| 5 | 4 | There are two "order official transcripts" buttons, which is confusing | U4 | 4 |
| 6 | N/A | There are some broken pages in the beta version of the system | All | 4 |
| 7 | 1 | Categories don't always make sense | All | 3 |
| 8 | 5 | Users have difficulty understanding the planner legend | U1 | 3 |
| 9 | 1, 5 | Difficulty understanding which items (courses) have been fulfilled and which have not | U1, U4, U6 | 3 |
| 10 | 3 | The "Processing" signal showing the status of the system is not shown for some actions | All | 2 |
| 11 | 2, 5 | The navigation bar sometimes doesn't correctly indicate the current location of the user | U4 | 2 |
| 12 | 2 | Users find it hard to get back to the main page, where they have more choices, because the term "Self Service" doesn't indicate that it is the main page | All | 2 |
| 13 | 2 | Users cannot tell which page is considered the home page (Self Service, Student Center, Main Menu, or the login screen) | U3, U4, U6 | 2 |
| 14 | 3 | No descriptions under each section about the information and function of the page | U3, U4, U6 | 2 |
| 15 | 5 | The big blank space on the main page is a bad design | U1 | 1 |
| 16 | 2 | Some navigation is hidden under the "more" link on the Self Service page | U3, U4, U6 | 1 |
| 17 | N/A | Linking Wolverine Access to other sites such as CTools and Mirlyn would be great | U2, U4, U6 | 0 |
Specific Findings and Proposed Solutions

The following details the findings from Table 6 and offers possible solutions.

Category 1. Confusing Terminology or Vocabulary

Some of the terms used in the system are unfamiliar to users because they are not consistent with the terms users most often use.

Example – Issue #3: Labels applied by the system are not ones users understand. Most users had difficulty knowing to click on "Self Service" to reach the system functions. (Figure 1) The term "What-If Report" (Figure 6) describes the feature that gives a student information about how their credits would transfer to a new major. This term is not familiar to users; as a result, when asked to find this information during the user tests, users did not select this function to accomplish the task.

Figure 6: "What-If" Report
Example – Issue #4: Users are familiar with the term "backpack" in the current system, but in the new system it has been replaced with "shopping cart." (Figure 7) Users were unsure whether these were equivalent.

Figure 7: Shopping Cart
Category terminology was also a problem. Users had difficulty locating their advisor information, partly because the labels of other navigation categories seemed more appropriate than the one it is actually found under. Users expected to find it under "Academic Planning," "Enrollment," "Degree Progress/Graduation," or "Personal Data Summary" under "Campus Personal Information." ( U04 @ [13:19] & [14:40], U05 @ [16:05], U06 @ [09:48] & [10:40] )

Proposed solution: Speak the users' language. When choosing terms for the system, try to use those already in users' vocabulary. For example, "credits" is the term students use, while the system uses "units." Also, reuse terms from the current system to minimize the learning curve. For genuinely new terms, a description or legend showing the meaning is helpful. A survey could also be conducted to determine appropriate terms; involving users in selecting the terminology may produce the most successful results.
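One way to apply this is to centralize label text so that a term like "units" can be swapped for "credits" everywhere at once. The TypeScript sketch below is an illustrative assumption about how labels might be managed, not a description of the actual Wolverine Access implementation.

```typescript
// Sketch: centralize UI labels so system jargon can be swapped for student
// vocabulary in one place. The keys and strings are illustrative assumptions.
const labels: Record<string, string> = {
  units: "Credits",         // students say "credits", not "units"
  shoppingCart: "Backpack", // term carried over from the current system
  whatIfReport: "See how your credits transfer to a new major",
};

function label(key: string): string {
  // Fall back to the raw key so missing entries are easy to spot in testing.
  return labels[key] ?? key;
}

console.log(label("units")); // "Credits"
```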
Category 2. Navigation Difficulty

The new system adds a navigation bar on the left of the page, intended to give users an easy way to move through the system and locate information. However, during the tests most of the users said the navigation was confusing and difficult to use.

Example – Issue #2: Overall, most of the users had difficulty using the navigation bar. One of the big problems here relates to Category 1: the terms used in the navigation bar are hard to understand and confusing. (Figure 8)
Figure 8: Navigation Bar
Example – Issue #11: Users sometimes cannot tell exactly where they are in the system. The system lacks information indicating the user's location, and page titles do not always reflect the navigation item used to reach them.

Example – Issue #12: In the tests, when asked to perform a new task after completing one, users tried to go back to the main page of the system, as they often do in the current system. In the new system, however, they did not know how to get back to the main page.

Example – Issue #16: To save space, some of the links are hidden under "more" links. (Figure 9) One user disliked this design and wanted to see all the choices when entering the system.
Figure 9: Self Service
Proposed solution: Once the terminology problem is solved, the navigation problem becomes easier to deal with. Designing the hierarchy of the navigation system is important: a good strategy is to keep individual functions separate and group closely related functions together, all based on users' needs. A card-sorting exercise with several sample users may be helpful in refining the categories. In addition, the link in the navigation bar corresponding to the page the user is visiting should be highlighted.

Category 3. System Status Unclear

While overlapping with the previous category, this issue is important enough to warrant its own listing. In the tests, users sometimes asked the moderator where they were because they could not determine their current status by observing the page.

Example – Issue #2: Based on the information we collected, users want the visual look of the navigation bar to indicate their location, through highlighting or some other means.

Example – Issue #10: The word "Processing" appears in the right corner of the screen when the system is working to retrieve information for the user. (Figure 10) This alerts the user that the system is functioning while they wait for the results of an action. This is good practice, but users were not aware of the signal because it is not obvious. Also, some pages lack the label while retrieving information, which led users to think there was a problem with that particular service and to try another one.
Figure 10.
Example – Issue #14: The system offers little to no description of the new navigation categories. Some users wanted descriptions, especially for new functions of the system. Descriptions would not only help users locate themselves but also give them a better understanding of the system.

Another problem arises when the system opens another browser tab to show requested information. (Figure 11) In that case, the user cannot see that a new tab has been opened.

Figure 11: View Transfer Credit Report in a new tab
Proposed solution: For #2, it is important to redesign the navigation bar to make it consistent with the content of the system. One good solution is to design drop-down links in the navigation bar and highlight the current link to match the main content.
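A minimal sketch of this highlighting suggestion follows, assuming invented markup; the "#left-nav" selector and "current-page" class are hypothetical and not part of the real Wolverine Access front end.

```typescript
// Sketch: highlight the navigation link that matches the current page so the
// sidebar reflects the user's location. Selectors and class names are
// hypothetical; the real Wolverine Access markup may differ.
function highlightCurrentNav(): void {
  const links = document.querySelectorAll<HTMLAnchorElement>("#left-nav a");
  links.forEach((link) => {
    // Compare each link's target against the page the user is viewing.
    const isCurrent = link.pathname === window.location.pathname;
    link.classList.toggle("current-page", isCurrent);
    if (isCurrent) link.setAttribute("aria-current", "page");
  });
}

document.addEventListener("DOMContentLoaded", highlightCurrentNav);
```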
For #10, repositioning and resizing the signal can solve this: making it bigger and showing it around the left part of the page would be much more obvious to users. A progress bar in the center of the page would also work. (A rough sketch follows below.)

For #14, necessary details or descriptions can be added by first researching users' needs for each page. In addition, there should be an indication to users that an external page (new page) has been opened. A better option may be to let the user choose whether a new window or tab is opened.
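The sketch below illustrates the #10 suggestion, assuming a hypothetical overlay element and service URL; the real Wolverine Access front end may be structured quite differently.

```typescript
// Sketch: wrap data fetches so a visible "Processing..." overlay is always
// shown while the system is working. The element ID and fetch URL are
// hypothetical, not part of the real Wolverine Access front end.
async function withProcessingOverlay<T>(work: () => Promise<T>): Promise<T> {
  const overlay = document.getElementById("processing-overlay");
  overlay?.removeAttribute("hidden"); // show the centered indicator
  try {
    return await work();
  } finally {
    overlay?.setAttribute("hidden", ""); // always hide it again
  }
}

// Usage: every retrieval goes through the wrapper, so no page can forget
// to signal that the system is busy.
withProcessingOverlay(() =>
  fetch("/self-service/requirements").then((r) => r.json())
).then((data) => console.log(data));
```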
Category 4. Repetitive Labels for Different Content

Repetitive labels applied to different content confuse the user. There are two cases of this problem, each of which could be catastrophic for the system. During the tests, all the users encountered this problem and wanted it solved.

Example – Issue #1: Two identical "My Academic Requirements" links are shown together in the left navigation bar and on the "Self Service" page, but the two links lead to different places. (Figures 12 & 13) The first link leads to a broken page that returns an error when clicked. All the users thought this was a very big problem that should be fixed quickly. Users often bypassed the second, working "My Academics" link, assuming it was the same as the broken one.

Example – Issue #5: When performing the task of downloading a transcript, users found two "order official transcripts" buttons on the transcript page. (Figure 14) There is no sign of any difference between the two, and in fact both buttons take the user to the same place. While multiple access points to information can be a good feature, this repetition confused the users.
Figure 12: Two “My Academic Requirements” in the navigation bar
Figure 13: Two “My Academic Requirements”
Figure 14: Two “order official transcripts”
Proposed solution: For #1, label different functions or buttons with different terms. For #5, remove the unnecessary link.

Category 5. Visual Layout Not Understandable

One of the problems that came out in the usability tests concerns the visual layout: it sometimes fails to give users a clear signal of how the page works. One reason for this is related to Issue #14: there is little or no description in the system of the current page.

Example – Issue #8: There is a legend in the planner section to show users what the different signs mean. (Figure 15) Users had difficulty both finding and understanding it. On long pages, the legend stays at the top, where it was not noticed by our test users. (Figure 16)

Figure 15: The Legends
Figure 16: Status bar
Example – Issue #9: When checking their academic requirement information, users had to compare their academic progress against their requirements. Some users had difficulty understanding which courses had been fulfilled and which had not, because the page contains a lot of information that is not organized to show the differences between courses. ( U01 @ [07:50] & [24:18], U04 @ [12:38], U06 @ [06:26] )

Example – Issue #11: The navigation bar sometimes doesn't correctly indicate the current location of the user. The highlighting in the left navigation bar is sometimes confusing because it remains unchanged when the user performs an action on the page. (Described previously under Category 2.)

Example – Issue #15: The large white space on the main page, paired with a small menu, feels unbalanced. (Figure 17)
Figure 17: A Large Blank (White Space) on the Main Page
Proposed solution: For #8, the legend needs to be redesigned and made bigger, with more description to make it clear. For #9, two separate tables showing the fulfilled courses and the unfulfilled courses respectively would be easier to understand (see the sketch below). For #11, see Category 2. For #15, an alternative is to show the main categories directly after the user logs into the system.
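A minimal sketch of the #9 suggestion, assuming a hypothetical course-record shape; the real requirement data in Wolverine Access is certainly richer than this.

```typescript
// Sketch: split a requirement's course list into "fulfilled" and "not yet
// fulfilled" groups so each can be rendered as its own table. The Course
// shape and sample rows are hypothetical stand-ins for the real data.
interface Course {
  code: string;
  credits: number;
  fulfilled: boolean;
}

function partition(courses: Course[]): { done: Course[]; remaining: Course[] } {
  const done = courses.filter((c) => c.fulfilled);
  const remaining = courses.filter((c) => !c.fulfilled);
  return { done, remaining };
}

const sample: Course[] = [
  { code: "SI 682", credits: 3, fulfilled: true },
  { code: "SI 689", credits: 3, fulfilled: false },
];

const { done, remaining } = partition(sample);
console.log(`${done.length} fulfilled, ${remaining.length} remaining`); // "1 fulfilled, 1 remaining"
```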
Conclusion

Overall, the new Wolverine Access offers several advantages over the previous system, and users are interested in many of the new features. Our assessment revealed that the weaknesses in the system's usability lie mainly in its navigation. We suggest that the development team concentrate on re-labeling navigation items using the users' vocabulary, refining the navigation groups on the Self Service page, and eliminating repeated labels that lead to different locations. Attention to these areas would resolve the most serious usability issues revealed by our user tests.

We also suggest that the development team repeat user testing closer to launch, when a more realistic user account is available. The removal of administrative functions from users' navigation options, along with the additional functions that will be working properly by then, will allow a more accurate test of the whole system.
Appendix

References

[1] Information provided by the client.

[2] University of Michigan, Office of Undergraduate Admissions. (2007). Fast Facts. Retrieved February 2008, from http://www.admissions.umich.edu/fastfacts.html

[3] Nielsen, Jakob. Severity Ratings for Usability Problems. Retrieved March 2008, from http://www.useit.com/papers/heuristic/severityrating.html
User Test Moderator Script

"Thank you for taking the time to participate in this trial of the new Wolverine Access. We're going to be asking you to complete a few tasks using the new system. For the purposes of this preview, you will be using a test account and not your real account information."

"The system you're going to be using today is NOT the final version of the new Wolverine Access. Features you see today may or may not be available on the final version."

"Please keep in mind that we are testing the system, not you, and that there are NO wrong answers. While you are using the system we'll be asking you to 'think out loud'. Let us know what's going through your mind as you use the system. If something is frustrating or easy, let us know. You decide when a task is 'completed' and you may move on to the next task at any time by asking us for the next task. You may opt out of this test at any time. If you don't understand what we are asking of you, please ask for clarification."

"At the end of this test we'd like to get your feedback about what you liked or didn't like about the system."

"Do you have any questions before we begin?"

"Would you like to take 5 minutes to look around the system and familiarize yourself with it before we begin?"
User Task List

Task 1 – Scenario: Moderator: "You're applying for an internship. The company you want to work for has asked for a copy of your transcript. Find where you can request one."

Task 2 – Scenario: Moderator: "You are preparing to make your schedule for the next year and want to find out which degree requirements you still have to fulfill. Find this information."

Task 3 – Scenario: Moderator: "Find a requirement you have NOT fulfilled. Determine a course that you can take in order to fulfill this requirement."

Task 4 – Scenario: Moderator: "You want to talk to your advisor about your schedule. Find their contact information."

Task 5 – Scenario: Moderator: "You are considering changing your major and want to know how your credits will transfer to the new degree program. Get information about this."

Task 6 – Scenario: Moderator: "You recently transferred from another university. Find information about how your credits transferred from your old school to U of M."
User Test Follow-Up Questions

Moderator:

1. "What did you like about the system?"
2. "What did you not like about the system?"
3. "Is there anything else you'd like to tell us about your experience with this system?"
Pre-Test User Survey