A Comparative Assessment of Two Online Video Players
Maura Robinson
HF751: Final Project Report
May 4, 2015

This report seeks to identify the features that characterize a best-in-class video player, and to combine these aspects into a recommended design for the market.
Table of Contents

Section 1: Executive Summary
    Project Description
    Approach
    Key Findings
    Recommendations
Section 2: Background and Methods
    Methodology
Section 3: Findings
    Volume Control
    Timeline Scrubber
    Play Button
    Open Ended Suggestions
    System Usability Scores
Section 4: Limitations and Extensions
    Limitations
    Extensions
Section 5: Key Takeaways
    Current User Preferences
    Design Recommendations
References
Section 1: Executive Summary

Project Description
The goal of this study was to determine which elements embody a best-in-class video player and to combine these aspects into a recommended design for the market. The method selected to accomplish this goal was a comparative assessment of the Brightcove video player against a leading industry peer: YouTube.

Approach
Remote unmoderated usability testing was selected as the approach for gathering user insights. The testing centered on three main components: the volume control, the timeline scrubber, and the play button. Thirty participants took part in this study. It was a within-subjects test, meaning that each participant viewed and responded to both video players.

Key Findings
Volume Control:
    Ease of Use: YouTube
    Visual Appeal: Brightcove
    Overall preference: 56% YouTube
Timeline Scrubber:
    Ease of Use: YouTube
    Visual Appeal: YouTube
    Overall preference: 83% YouTube
Play Button:
    Overall preference: 86% YouTube
SUS Score:
    YouTube: average 84
    Brightcove: average 70.67
Recommendations
Design recommendations derived from these findings are presented in Section 5: Key Takeaways.
Section 2: Background and Methods

Methodology
The primary focus of this study was to identify usability issues associated with the Brightcove video player in order to determine where Brightcove excels, and where it lags behind, on the basis of user perception. Through the following approach I sought to identify the features that characterize a best-in-class video player and to combine these aspects into a recommended design for the market. The method selected to accomplish this goal was a comparative assessment of the Brightcove video player against a leading industry peer: YouTube.

Remote unmoderated usability testing was selected as the approach for gathering user insights. An account with Userzoom.com was leveraged for administering the usability tests. This is a web-based service that allows everyday users to test products without a moderator present, using a self-reported feedback technique. Unmoderated usability testing was selected because it allows users to evaluate products on their own time while reducing the amount of time required to perform usability testing. Userzoom captures self-reported data from the user's interaction with the product and creates metrics and visualizations to represent this exchange. Qualitative feedback was also collected to deepen insight into the user's interaction. This data was then exported from Userzoom and analyzed using several statistical techniques:

    Mean calculation
    T-tests
    Confidence intervals
    SUS analysis
    Pareto charts

The testing centered on three main components: the volume control, the timeline scrubber, and the play button. The test script was designed to gather comparative feedback by feature, in order to gain the most direct feedback on user preferences across players. Users were asked to describe their experience throughout the study using a 5-point Likert scale, with 1 being least appealing/easy and 5 being most appealing/easiest to use.

User feedback was requested on the basis of ease of use, visual appeal, and preference of features between players. Through gathering open-ended user feedback, the test also yielded rich information on various challenges associated with the usability of a product attempting to attain mass appeal, including: varying personal preferences of users, diverse technical issues impacted by browsers and the internet, and human limitations and accessibility issues. This qualitative feedback also provided rich insight into what the features of an ideal video player might look like.
Thirty participants took part in this study. It was a within-subjects test, meaning that each participant viewed and responded to both video players; this affected the T-test results most significantly. No demographic data on the users was collected.

Key Data and Metrics:
1. "Ease of Use" rating on a 5-point scale, with higher ratings meaning easier.
    a. Volume Control
    b. Timeline Scrubber
2. "Visual Appeal" rating on a 5-point scale, with higher ratings meaning more visually appealing.
    a. Volume Control
    b. Timeline Scrubber
3. Open-ended comments: for each feature, which player did you like better, and why?
    a. Volume Control
    b. Timeline Scrubber
    c. Play Button
4. Open-ended question asking if there are any improvements that would make the player more effective or easier to use.
5. Ratings on the ten individual items making up the System Usability Scale (SUS).
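Because the design was within-subjects, each participant contributes a pair of ratings, so the appropriate T-test compares per-participant differences rather than two independent groups. A minimal sketch of the paired t statistic; the rating values are illustrative, not the study's raw data:

```python
import math
import statistics

def paired_t_statistic(a, b):
    """t statistic for a within-subjects (paired) comparison:
    t = mean(d) / (sd(d) / sqrt(n)), where d holds each
    participant's difference between the two conditions."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Illustrative "Ease of Use" ratings: one pair per participant.
player_y = [3, 4, 3, 2, 4, 3, 3, 4, 2, 3]
player_b = [2, 3, 2, 2, 3, 2, 1, 3, 2, 2]
print(round(paired_t_statistic(player_y, player_b), 3))
```

The pairing matters: an independent-samples test on the same numbers would ignore that each participant rated both players, typically yielding a less sensitive comparison.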
Deliverables:
Means for the "Ease of Use" and "Visual Appeal" ratings, and T-tests to see if they are significantly different. Graphs showing these means for the two sites, including a 95% confidence interval for each.
Summary of comments, and identification of patterns associated with these two ratings.
Summary of the responses to the three open-ended questions about any challenging/frustrating aspects and any effective/intuitive aspects, and identification of patterns.
Mean SUS score for each video player. T-test to measure whether they are significantly different from each other. Graph showing the means for the two sites, including a 95% confidence interval for each. Frequency distribution graph showing the distribution of SUS scores for both sites.
Section 3: Findings

Volume Control
Volume control ratings were evaluated on the basis of Visual Appeal and Ease of Use. The ratings were collected using a five-point Likert scale for each area of measurement, after the completion of each task. The figure below shows the mean rating for each area of measurement by video player. In Visual Appeal, Brightcove surpassed YouTube slightly with a mean rating of 3.30 to YouTube's 3.27. For Ease of Use, YouTube received a higher mean rating at 3.07 to Brightcove's 2.20.
[Figure: Mean Volume Control Ratings — bar chart of mean Visual Appeal and Ease of Use ratings (1-5 scale) for Player Y and Player B]
Volume Control Confidence Intervals:

Player B         Mean   Std. Dev.  n    Conf. Coeff.  Margin of Error  Upper Bound  Lower Bound
  Ease of Use    2.20   0.81       30   1.96          0.29             2.49         1.91
  Visual Appeal  3.30   0.79       30   1.96          0.28             3.58         3.02
Player Y
  Ease of Use    3.07   0.78       30   1.96          0.28             3.35         2.79
  Visual Appeal  3.27   0.74       30   1.96          0.26             3.53         3.00
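The bounds in the table above follow the standard normal-approximation formula, mean ± 1.96 · SD / √n. A small sketch that reproduces the Player B "Ease of Use" row:

```python
import math

def ci_95(mean, sd, n, z=1.96):
    """95% confidence interval for a mean, using the normal
    approximation (z = 1.96), matching the table's Conf. Coeff. column."""
    margin = z * sd / math.sqrt(n)
    return mean - margin, mean + margin

# Player B "Ease of Use" figures from the confidence interval table.
low, high = ci_95(mean=2.20, sd=0.81, n=30)
print(f"[{low:.2f}, {high:.2f}]")  # reproduces the 1.91 / 2.49 bounds
```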
Volume Control T-Test: Are they significantly different?

Volume Control T-Test   Ease of Use   Visual Appeal
p-value                 0.000127046   0.845380347

According to the results of the T-test, the difference in Ease of Use between the two video players is statistically significant at the .05 level, while the difference in Visual Appeal is not.

Volume Control Open-Ended Question Feedback: "Of players B and Y, which volume control did you like most? Comment on why you like it best."
[Figure: Volume Control Preference pie chart — Player Y 56%, Player B 44%]
56% of users tested preferred Player Y's volume control, while 44% preferred Player B's.

Volume Control Summarization of Comments:
Volume Control Comments (Pareto analysis):

Issue/Comment                                Frequency   Cum. Percentage
Prefer Positioning Left (B)                  5           22%
Prefer Positioning Next to Play Button (Y)   3           35%
Larger Controls                              3           48%
Smoother Sliding (Y)                         3           61%
Prefer Vertical Bar                          2           70%
Unable to find Player B's controls           2           78%
Prefer Larger Range of Volume                2           87%
Unable to find Player Y's controls           2           96%
Prefer Horizontal Bar                        1           100%
Total                                        23          100%
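The cumulative percentages behind a Pareto analysis like the one above come from sorting issues by frequency and accumulating each issue's share of the total. A sketch that reproduces the table's figures:

```python
def pareto(rows):
    """Sort (issue, frequency) pairs in descending order of frequency
    and attach a running cumulative percentage of the total."""
    rows = sorted(rows, key=lambda r: r[1], reverse=True)
    total = sum(freq for _, freq in rows)
    running, out = 0, []
    for issue, freq in rows:
        running += freq
        out.append((issue, freq, round(100 * running / total)))
    return out

comments = [
    ("Prefer Positioning Left (B)", 5),
    ("Prefer Positioning Next to Play Button (Y)", 3),
    ("Larger Controls", 3),
    ("Smoother Sliding (Y)", 3),
    ("Prefer Vertical Bar", 2),
    ("Unable to find Player B's controls", 2),
    ("Prefer Larger Range of Volume", 2),
    ("Unable to find Player Y's controls", 2),
    ("Prefer Horizontal Bar", 1),
]
for issue, freq, cum in pareto(comments):
    print(f"{issue}: {freq} ({cum}%)")
```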
The comments revealed several insights into user preferences for the volume control. One interesting topic is the orientation of the volume control: vertical vs. horizontal. Multiple users expressed a preference for the vertical bar over the horizontal bar, while another user preferred the horizontal bar. One user noted, "Vertical bar gives the sense of how loud the volume is, it is just more comfortable than the horizontal one." This presents an interesting dilemma in making a design decision: match the user's prior experience and mental model, or align with the rules of interaction design. On one hand, you have the user's mental model, which favors a vertical control due to past experience and the perception of increased/decreased sound. On the other hand, you have established design principles, such as the left-to-right prevalence effect, which proposes that in general, movement on a horizontal axis is more efficient and takes less time than movement on a vertical axis (Weeks, Proctor, & Beyak, 1995). A preference for larger controls was also deduced from the comments, along with preferences for placing the control on the left-hand side and in close proximity to the play button.
Timeline Scrubber
Timeline scrubber ratings were evaluated on the basis of Visual Appeal and Ease of Use. The ratings were collected using a five-point Likert scale for each area of measurement, after the completion of each task. The figure below shows the mean rating for each area of measurement by video player. YouTube's mean ratings exceed Brightcove's in both areas of measurement for this feature. In terms of Visual Appeal, YouTube received a mean rating of 3.43 to Brightcove's 2.47. For Ease of Use, YouTube came out on top, earning a mean rating of 3.73 with Brightcove at 1.93.
[Figure: Mean Timeline Scrubber Ratings — bar chart of mean Visual Appeal and Ease of Use ratings (1-5 scale) for Player Y and Player B]
Timeline Scrubber Confidence Intervals:
Timeline Scrubber T-Test: Are they significantly different?

Timeline Scrubber T-Test   Ease of Use   Visual Appeal
p-value                    3.71048E-11   0.000729053

According to the results of the T-test, the differences between the two video players in both Ease of Use and Visual Appeal are statistically significant at the .05 level.

Timeline Scrubber Open-Ended Question Feedback: "Of players B and Y, which controls for moving forward and backward did you like most? Comment on why you like it best."
[Figure: Timeline Scrubber Preference pie chart — Player Y 83%, Player B 17%]
83% of users tested preferred Player Y’s Timeline scrubber to Player B’s.
Timeline Scrubber Summarization of Comments:

Issue/Comment                        Frequency   Cum. Percentage
Prefer Video Thumbnail               13          30%
Prefer Scrubber Dot                  12          58%
Prefer More Visible Elapsed Time     8           77%
Prefer Larger Controls               4           86%
Ability to hover over to see time    3           93%
Prefer to click through to scroll    2           98%
Dislike Video Thumbnail              1           100%
Total                                43          100%
The results of qualitative feedback yielded a deeper understanding of what the ideal online video player’s timeline scrubber would entail. A strong preference for a video thumbnail was gleaned from the feedback, followed closely by a need for the scrubber dot to enable smoother navigation through the timeline. A more visible view of elapsed time was also desired, along with larger controls. Several users also expressed a preference for the ability to hover over the timeline to see the “second count”, instead of having to drag the scrubber to navigate.
Play Button
Feedback on the play button was evaluated on the basis of an open-ended question which asked users to look at each player's play button and decide which they preferred. Many users simply noted which player they preferred, and did not comment further.

Play Button Open-Ended Question Feedback: "Of Players B and Y, which Play Button do you like best? Comment on what you like and dislike."
[Figure: Play Button Preference pie chart — Player Y 86%, Player B 14%]
Of the two players, 86% of users preferred Player Y's controls.

Play Button Top Comments: The majority of users preferred the location of Player Y's initial play button at the center of the video rather than in the top left corner, as Player B's had been designed. A strong preference was also shown for a larger size, and for closer proximity to the volume controls while the video played.
Open Ended Suggestions:
A final question was included in the study to gather feedback about the users' overall experience and any suggestions that would improve the usability of the players.

Open-Ended Suggestion Feedback: "Do you have any suggestions that would improve the experience of playing videos with Player B?"
Most Common User Suggestions:

Issue/Comment                           Frequency   Cum. Percentage
Prefer Y Overall                        9           38%
Add Circle to Timeline Scrubber         8           71%
Improve smoothness of VC interaction    3           83%
Improve image quality                   2           92%
Prefer B Overall                        1           96%
Automatic Play Button                   1           100%
Total                                   24          100%
From the responses to this final open-ended question, we can see there was a strong preference for Player Y overall. Another frequently listed improvement is adding a dot to the timeline scrubber, which is consistent with findings from that section. An interesting suggestion was to improve the smoothness of the interaction with the volume control. Two users also called for an improvement in image quality.
System Usability Scores:
Data for the System Usability Scale (SUS) was collected from participants to gauge which system was characterized by better usability overall. The SUS score comparison was calculated using a within-subjects analysis, meaning that each participant rated both video players. As shown in the table below, Player Y achieved a higher mean SUS rating at 84 to Brightcove's 70.67. The T-test determined that the difference between the two scores is statistically significant at the .05 level.
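The report does not show the SUS arithmetic, but the standard SUS scoring procedure (odd-numbered items contribute rating − 1, even-numbered items contribute 5 − rating, and the sum is scaled by 2.5) can be sketched as follows; the response set below is illustrative, not a participant's actual ratings:

```python
def sus_score(responses):
    """Standard SUS scoring: 10 items rated 1-5.
    Odd items (positively worded) contribute (rating - 1);
    even items (negatively worded) contribute (5 - rating);
    the total is multiplied by 2.5 to give a 0-100 score."""
    assert len(responses) == 10, "SUS requires exactly ten item ratings"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Illustrative ten-item response set for one participant.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 5, 3]))
```

Each participant's ten ratings collapse to one 0-100 score this way; the per-player means (84 and 70.67) are then averages of thirty such scores.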
[Figure: SUS Score Mean by Video Player — bar chart of average SUS scores, Player Y 84.00, Player B 70.67]
CI for SUS Scores   Mean    Std. Dev.  n    Conf. Coeff.  Margin of Error  Upper Bound  Lower Bound
Player B            70.67   13.52      30   1.96          1.77             72.43        68.90
Player Y            84.00   8.37       30   1.96          1.09             85.09        82.91
SUS Score T-Test: p = 2.68354E-07
The frequency distributions of the SUS scores for each player are depicted in the charts and tables below:
[Figure: SUS Score Frequency Distribution, Player B — histogram of SUS scores by ten-point bucket]

SUS Frequency Distribution, Player B:
31-40     2
41-50     2
61-70     9
71-80     11
81-90     6
Total     30
[Figure: SUS Score Frequency Distribution, Player Y — histogram of SUS scores by ten-point bucket]

SUS Frequency Distribution, Player Y:
61-70     2
71-80     7
81-90     15
91-100    6
Total     30
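Grouping individual SUS scores into the ten-point buckets used in the tables above can be sketched as follows, assuming each bucket's upper edge is inclusive (e.g. a score of exactly 80 falls in 71-80); the scores below are illustrative, not the study's raw data:

```python
from collections import Counter

def bin_label(score):
    """Map a 0-100 SUS score to a ten-point bucket label such as '71-80',
    with the upper edge of each bucket inclusive."""
    low = ((int(score) - 1) // 10) * 10 + 1
    return f"{low}-{low + 9}"

# Illustrative SUS scores (SUS scores come in 2.5-point increments).
scores = [65, 72.5, 75, 85, 87.5, 90, 95]
print(Counter(bin_label(s) for s in scores))
```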
Section 4: Limitations and Extensions

Limitations
Despite the focused approach to collecting user feedback through Userzoom.com, there were certain risks involved with conducting remote usability testing. One of the largest areas of concern was ensuring that participants understood the test script and provided the information that each task was designed to capture. In moderated usability testing, the moderator has the opportunity to steer participants back on track in the event that they misunderstand the objective of a task. With unmoderated usability testing, users are left to act on the script based on their own interpretation, which does not always yield the intended results and can impact the quality of the feedback. Several additional limitations arose after the test launched, including: the varying personal preferences of users, diverse technical issues impacted by browsers and the internet, and human limitations and accessibility issues.
Extensions
One extension to this study would be to add additional video players such as Hulu, Vimeo, or Netflix to develop richer comparisons. This would allow Brightcove to evaluate its features against other leading options in the market. Collecting demographic information on users and testing different user groups would also serve as an excellent extension of this study. This would help to identify accessibility issues, as well as any noticeable differences in the preferences of different user groups.
Section 5: Key Takeaways

Current User Preferences
Volume Control:
    Ease of Use: YouTube
    Visual Appeal: Brightcove
    Overall preference: 56% YouTube
Timeline Scrubber:
    Ease of Use: YouTube
    Visual Appeal: YouTube
    Overall preference: 83% YouTube
Play Button:
    Overall preference: 86% YouTube
SUS Score:
    YouTube: average 84
    Brightcove: average 70.67
Design Recommendations:
Based on user feedback uncovered in the study, the following design recommendations have been derived in order to deliver a best-in-class solution to the online video player market.

Volume Control:
    Use a horizontal volume control, to align with interaction design principles
    Place it in close proximity to the play button
    Increase the size of the control
    Provide a larger end point for easy dragging
    Optimize the smoothness of dragging along the control

Timeline Scrubber:
    Add thumbnail images to the scrubber control
    Increase the size of the scrubbing dot
    Display the total elapsed time on the left, in close proximity to the play button, and increase its size
    Display the time when hovering over the timeline

Play Button:
    Show a large center play button initially
    Increase the size of the button
    Place it on the left side, in closer proximity to the volume controls
Low Fidelity Prototype:

[Figure: low-fidelity prototype of the recommended player design]
References

Weeks, Proctor, & Beyak. (1995). Stimulus-response compatibility for vertically oriented stimuli and horizontally oriented responses: Evidence for spatial coding. Experimental Psychology, 367-383.