The Ultimate Digital Test Coverage Guide
How to Build and Implement a Test Strategy for the Multi-Platform Digital Age
A Guide by Eran Kinsbruner
A New Approach to Achieving Digital Quality There’s no doubt the world is becoming more digital. Today we interact with a screen for most of our daily activities. In fact, it’s become so common that recent research by Google concludes that 90% of users transition between screens on a regular basis to accomplish a single task. Users today are simply less tolerant of buggy performance on the websites and apps they love. As such, test coverage strategies can no longer focus on functional tests alone. App responsiveness and quality are critical to the digital experience and add additional layers to testing strategies, such as performing tests under real user conditions. For app developers, this shift to a digital-first world requires a comprehensive digital test strategy. It demands more rigor and accuracy in how digital test labs are built and maintained, which platforms to include, and how often to test on those platforms, all while dealing with rapid market changes. As a market leader with over 1,500 customers in industries such as retail, travel, telecom and banking, Perfecto is uniquely positioned to provide advice on digital quality testing. For this guide, we talked to our most-valued customers and analyzed exclusive enterprise cloud usage data. The result is a complete guide on how to approach, implement and maintain a winning digital test strategy based on the experiences of Fortune 500 companies. You can use this guide as a companion to our:
• Digital Test Coverage Index, which provides an overview of the key mobile devices, operating systems and browsers you should be testing on in 2016
• Digital Test Coverage Optimizer, an interactive web tool that will help your specific organization find out which devices are important to include in your test strategy
Contents
Chapter 1: Assessing the Digital Test Coverage Landscape
Chapter 2: Putting the Digital Test Coverage Pieces Together
Chapter 3: Implementing Your Digital Test Coverage Plan
Chapter 4: Real World Challenges of Digital Testing
“Users today are less tolerant of buggy performance on the websites and apps they love.”
Chapter 1: Assessing the Digital Test Coverage Landscape What Is ‘Test Coverage’? When trying to assess and plan for test coverage in a lab, there are many considerations. The digital market has become incredibly complex and fragmented, making the synchronization between the various platforms (web, mobile, IoT) extremely difficult. Fig 1 and Fig 2 below demonstrate just how quickly new devices and browsers are introduced in a given year. It’s a seemingly endless introduction of new device types, Android, iOS and Windows Phone OS updates, and web browser updates – two to four different browser releases each month.
Fig 1: Mobile Landscape Calendar 2016 – a month-by-month view of 2016 device launches (Galaxy S7/S7 Edge, iPhone SE, LG G5, OnePlus 3, iPhone 7/7 Plus and others), iOS releases from 9.2.1 through 10.2, Android 6.x adoption milestones and the Android 7 release, plus the projected typical adoption of major OS versions. One callout: Apple shipped an iOS bug-fix release roughly every 1.3 months. Be sure to plan ahead to get maximum test coverage.
Fig 2: Web Landscape Calendar 2016 – a month-by-month view of 2016 browser releases: Chrome 48 through 55 (with betas up to 56), Firefox 44 through 52 (with betas up to 53), and Safari 9.0.3 through 10.0.2. Be sure to plan ahead to get maximum test coverage.
With all these new releases, it becomes clear that organizations cannot test all the browser and OS versions manually and need the help of an automated test lab, either handled internally or by a third-party service.
Coverage as a Company-Wide Priority The process of planning your test coverage should be based on a set of rules and considerations that align with your organization’s business needs and target markets. Many companies think of test coverage as solely the testing team’s responsibility. But building a strategy that ensures a seamless digital experience for the user is very much in the joint interest of the entire company, including marketing, line of business, product, QA and development. The marketing and business groups think about test coverage from a broad, user-experience perspective as they build the brand through marketing activities, messaging and partnership efforts. The dev and test teams keep an eye on a more focused set of platforms to develop and test against, aiming to reduce the number of production defects and quickly identify problems with new release versions by testing early and often. Every one of these groups loses when development and testing are done with the wrong test lab setup, so this is a great opportunity to work cross-functionally and make sure everyone in your company has an eye on the digital experience.
Fig 3: Digital test coverage as an organizational strategy – collaboration between Business and IT
Building the Test Coverage Mix The best way to define and plan your digital test coverage is to combine the most relevant data points from within your organization (such as which devices and browsers people use to visit your app and website, and who your competitors are) with market share data and the calendar of device releases and browser updates (see Fig 1 and Fig 2). We recommend structuring your test coverage by combining the following sources into one Venn diagram. As Fig 4 shows, this structure includes market share data (which you can get from research firms like IDC and comScore) as well as customer analytics and insights about your direct competitors, both of which you can get from your marketing groups.
Fig 4: Our recommendation for the right test coverage mix – the overlap of Market Share, My Customers and Competitive Data
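To make this mix concrete, here is a minimal sketch of how the three sources in Fig 4 could be merged into one prioritized device list. The data, the weights and the scoring helper are illustrative assumptions rather than any Perfecto tooling; substitute your own analytics export and market-share figures.

```python
# Minimal sketch: merge market share, customer analytics and competitive data
# into one prioritized device list. All numbers and weights are illustrative.

market_share = {"Galaxy S7": 0.12, "iPhone 6s": 0.18, "Nexus 5X": 0.04}    # e.g. from IDC/comScore
my_customers = {"iPhone 6s": 0.35, "Galaxy S7": 0.20, "iPad Air 2": 0.10}  # from your web/app analytics
competitive  = {"Galaxy S7", "iPhone SE"}                                  # devices your competitors target

WEIGHTS = {"customers": 0.5, "market": 0.3, "competitive": 0.2}  # tune to your business priorities

def score_devices():
    devices = set(market_share) | set(my_customers) | competitive
    scores = {}
    for device in devices:
        scores[device] = (
            WEIGHTS["customers"] * my_customers.get(device, 0.0)
            + WEIGHTS["market"] * market_share.get(device, 0.0)
            + WEIGHTS["competitive"] * (1.0 if device in competitive else 0.0)
        )
    # Highest score first: these devices go into the lab ahead of the rest.
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

for device, score in score_devices():
    print(f"{device}: {score:.2f}")
```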
Once you have gathered the above insights and made an informed decision on the test lab setup, you’re ready to think about the size of your lab. The DevTest team should understand how many test lab devices and platforms they actually need to cover all manual, automation, performance and other test cases, and whether those platforms are sufficient for the entire team based on head count. Sizing the lab correctly improves team efficiency by eliminating cases where developers or testers wait for a device or desktop to be released by another user. It also reduces your overall testing cycle time by allowing more tests to run in parallel on more platforms.
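As a rough, back-of-the-envelope illustration of lab sizing (every number below is an assumption to replace with your own suite, release-cadence and team figures), you can estimate the device slots needed for nightly automation and daytime manual use separately and add them up:

```python
import math

# Rough lab-sizing estimate; every input below is an assumption to replace
# with your own suite and team numbers.
suite_hours_per_platform = 3      # total automated test hours per device/browser
platforms_in_scope = 25           # e.g. the "Enhanced" coverage group
nightly_window_hours = 8          # how long the nightly run may take
manual_testers = 6                # people who need a device during the day
devices_per_tester = 1

# Devices needed so the nightly automation finishes inside its window.
automation_slots = math.ceil(suite_hours_per_platform * platforms_in_scope / nightly_window_hours)
manual_slots = manual_testers * devices_per_tester

print(f"Automation slots needed: {automation_slots}")
print(f"Manual slots needed:     {manual_slots}")
print(f"Suggested lab size:      {automation_slots + manual_slots}")
```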
Chapter 2: Putting the Digital Test Coverage Pieces Together Collecting the Data Now that you have an understanding of how to strategically approach your test coverage effort, let’s dive into how to actually go about it. Dev and test teams should work with marketing and business departments to obtain the most recent digital traffic analysis. This provides focused data about the locations, browsers and mobile OS/device combinations that were recently used to access your digital services. This information alone is not enough to build the lab, because it doesn’t take into account the larger market and your vertical competitors, but it tells you exactly what current users are doing so you can start testing and optimizing the digital experience for them. After you have internal traffic data, work with the product team to understand requirements and considerations based on product research and planned features, and incorporate your competitive analysis into the mix. Then gather industry benchmarks and a forecast of what’s coming down the road so you have your market data covered.
Device and Platform Criteria With the internal data in hand for your test strategy, let’s start defining the right mobile device considerations. Perfecto recommends teams include in their test labs one or more platforms (smartphone, tablet, desktop browser) from each of the four groups below, no matter which test coverage goal they’re trying to meet.
1. Reference Devices: This is a key group because it includes devices such as Google’s Nexus. These devices are important because they will always be the first to get beta and GA versions of the Android OS, which gives dev and test teams enough time to test their apps on the upcoming platform. These devices should be part of the test lab regardless of their market adoption or share.
2. Popular Devices: This group is a no-brainer to include in your test lab and can consist of devices drawn from both your customer data and the greater mobile market.
3. Legacy Devices: In this group we find older iPads, Samsung devices, browsers and mobile OS versions. These devices remain popular in various markets and, as such, require testing. However, they’re often slow to receive the latest OS updates, and running on older hardware with less CPU and memory can be challenging for modern applications that support newer features.
4. Emerging Devices: It’s imperative to treat your digital platform as an ongoing effort, so keep an eye on new devices, operating systems and other trends and be prepared to test them. These can be new devices or major beta versions of iOS or Chrome. Including them in the mix can save R&D time later on and position your brand as ahead of the curve. A small classification sketch follows this list.
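This classification sketch tags hypothetical candidate devices with the four groups above so you can verify that each group is represented in the lab. The device records and thresholds are illustrative assumptions, not tied to any Perfecto tooling.

```python
from datetime import date

# Hypothetical candidate devices; replace with your own analytics/market data.
candidates = [
    {"name": "Nexus 6P",  "reference": True,  "usage": 0.03, "launched": date(2015, 9, 29)},
    {"name": "Galaxy S7", "reference": False, "usage": 0.15, "launched": date(2016, 3, 11)},
    {"name": "iPad 2",    "reference": False, "usage": 0.04, "launched": date(2011, 3, 11)},
    {"name": "iPhone SE", "reference": False, "usage": 0.00, "launched": date(2016, 3, 31)},
]

POPULAR_THRESHOLD = 0.05   # assumed share of your traffic
LEGACY_AGE_YEARS = 3       # assumed cut-off for "older" hardware

def classify(device, today=date(2016, 6, 1)):
    age_years = (today - device["launched"]).days / 365
    if device["reference"]:
        return "Reference"           # first to receive beta/GA OS versions
    if device["usage"] >= POPULAR_THRESHOLD:
        return "Popular"             # high share in your traffic or the market
    if age_years >= LEGACY_AGE_YEARS:
        return "Legacy"              # older hardware, slower to get OS updates
    return "Emerging"                # new or upcoming, not yet widely adopted

for device in candidates:
    print(f'{device["name"]}: {classify(device)}')
```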
Including User Conditions in Your Test Coverage After you have the right mix of devices for your lab, you’ll need to think about the real-life conditions your users experience each day. What networks do they use? Are they on Wi-Fi? How many apps do they run at once? The digital platforms that consumers use are complex, and you’ll want to re-create specific user environments as much as you can in your test lab. In Fig 5, we refer to user environments such as network conditions, specific devices and OSes in various locations, and apps running in the background. In addition, you should test these conditions in both screen orientations (portrait and landscape), whether for a responsive website or a native app, as screen orientation is one of the most common problem areas across devices and platforms.
Fig 5: User environment considerations
By taking a set of tests that run against a given number of devices and desktop browsers at a functional level and factoring in real user conditions, you enhance your test coverage (Fig 6). You add more depth to your testing and increase the likelihood of meeting your users’ expectations. To advance your overall test environment, you should define target personas whose behavior is characterized by their platforms, locations and background apps.
Fig 6: Adding user conditions enhances your test coverage
Most Common User Conditions to Test Against:
1. Switching Networks
2. Apps Running in the Background
3. Phone Call Interruptions
4. Memory Consumption
5. Sudden Spikes in Mobile Traffic
6. Low Battery
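As one way to exercise a few of these conditions (network switching, backgrounding, plus the screen-orientation concern noted above) inside an automated test, here is a minimal sketch using the Appium Python client against an Android device. The server URL, capabilities and app identifiers are placeholders for your own lab, the example assumes the older desired-capabilities calling style, and Perfecto-specific APIs for conditions such as call interruptions or low battery are not shown.

```python
# Minimal sketch, assuming the Appium Python client (desired-capabilities style)
# and an Android device; all capabilities below are placeholders.
from appium import webdriver
from appium.webdriver.connectiontype import ConnectionType

caps = {
    "platformName": "Android",
    "deviceName": "Galaxy S7",           # placeholder device from your lab
    "appPackage": "com.example.myapp",   # hypothetical app under test
    "appActivity": ".MainActivity",
}
driver = webdriver.Remote("http://localhost:4723/wd/hub", caps)

try:
    # 1. Switching networks: start on Wi-Fi, then move the device to cellular data.
    driver.set_network_connection(ConnectionType.WIFI_ONLY)
    # ... exercise a data-heavy flow here ...
    driver.set_network_connection(ConnectionType.DATA_ONLY)

    # 2. Apps running in the background: send the app to the background for
    #    10 seconds, then verify it resumes in a valid state.
    driver.background_app(10)

    # Screen orientation: repeat key layout checks in both portrait and landscape.
    driver.orientation = "LANDSCAPE"
    # ... assert the layout still renders correctly ...
    driver.orientation = "PORTRAIT"
finally:
    driver.quit()
```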
Chapter 3: Implementing Your Digital Test Coverage Plan The Steps for Defining the Right Coverage As an initial step in building a mobile device lab, identify the following testing considerations:
1. Supported locations (countries)
2. Supported mobile platforms (iOS, Android, Windows 10)
3. Target test coverage level (Essential, Enhanced, Extended)
The recommended practice for using these considerations along with real app or web traffic analytics is to give a higher score and priority to the analytics and complement the lab coverage with niche platforms, new and upcoming devices, and older OS versions. Based on market research and customer validations, Perfecto recommends building your lab with the following guidelines (see Fig 7) to achieve the optimal test coverage with the least risk:
1. Test on the top 10 different device/tablet models on various OS versions, in what we call the “Essential” group.
2. Follow up by expanding to a list of the top 25 total devices, including the top 10. This second group, the “Enhanced” group, represents an optimized list from the market’s long tail of popular, legacy and new device/OS combinations.
3. From there, you can move to the third group, “Extended” test coverage, which can be met by testing on 32 different device/OS combinations. In this group we see devices that are either older but still need to be tested against, or very new but not yet popular.
Fig 7: Test coverage groups – Essential coverage: the top 10 “must test” devices based on usage; Enhanced coverage: the top 25 devices (Essential + 15), adding legacy and trending devices and different screen sizes; Extended coverage: the top 32 devices (Enhanced + 7), adding niche, legacy and brand-new devices to represent the “long tail.” The chart plots usage against the number of unique device/OS combinations.
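To relate the three groups in code, a minimal sketch that slices a usage-ranked device/OS list into the Essential, Enhanced and Extended tiers could look like this; the ranked list itself is an illustrative placeholder for the output of your analytics and market-data scoring.

```python
# Minimal sketch: slice a usage-ranked device/OS list into the three coverage tiers.
# The list is illustrative; in practice it comes from your analytics and market data.
ranked_devices = [f"device-{i:02d}" for i in range(1, 40)]  # sorted by usage, highest first

ESSENTIAL_SIZE = 10   # top 10 "must test" devices
ENHANCED_SIZE = 25    # top 25, adds legacy/trending devices and screen sizes
EXTENDED_SIZE = 32    # top 32, adds niche, legacy and brand-new devices

coverage = {
    "Essential": ranked_devices[:ESSENTIAL_SIZE],
    "Enhanced":  ranked_devices[:ENHANCED_SIZE],
    "Extended":  ranked_devices[:EXTENDED_SIZE],
}

for tier, devices in coverage.items():
    print(f"{tier}: {len(devices)} device/OS combinations")
```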
As part of each group mix, the following attributes should be considered:
• Device and OS popularity (market share)
• Screen sizes, resolution and other screen attributes such as pixels per inch (PPI)
• Device age (launch date)
• New and trending devices and platforms
• Operating system version update rate (e.g. reference devices like Android Nexus get a higher score)
• Unique device properties important for testing purposes – chipset, CPU, memory
• Audience demographics
The second and third groups offer the highest test coverage and the least risk to digital teams. You should also have an ongoing process of refreshing your test lab, usually on a quarterly basis, to make sure nothing major has changed in your user base or in the market that would require a new device or OS. If you would like help creating the right mix of recommended devices for your organization, check out our Digital Test Coverage Optimizer (Fig 8). This interactive tool will guide you in defining the right mix of OS and device platforms based on your needs.
Fig 8: Example of how to define test coverage
Web Testing Methodology For a web testing lab, the method is a bit different. Because major releases are predictable for Chrome, Firefox and Safari, and because there is a clear forecast for beta version releases, we recommend the following algorithm for defining your web test lab (Fig 9 below describes a recommended web testing model):
1. For Firefox, test the browser against the following combinations:
• Latest version against three major Windows OS versions
• Previous version against three major Windows OS versions
• Both the latest and previous versions on the latest two Mac OS versions
• Latest beta version of Firefox against the latest Windows OS and the latest and legacy versions of Mac OS
2. For Chrome, the leading browser in the market, the coverage requirements are a bit wider than for the other browsers, so you should test on:
• Latest version against three major Windows OS versions
• Previous version against three major Windows OS versions
• Both the latest and previous versions on the latest two Mac OS versions
• Latest beta version of Chrome against the latest Windows OS and the latest and legacy versions of Mac OS
Web Test Coverage – Global
Browser versions in scope:
• Firefox: 45 (latest), 44 (previous), 46 (beta)
• Chrome: 49 (latest), 48 (previous), 50 (beta)
• Safari: 9 (El Capitan, 2015), 8 (Yosemite, 2014), 7 (Mavericks, 2013)
• IE: 11 (Win 7, 8.1, 10), 9 (Win 7), 8 (Win 7)
• Edge: 13 (Nov 2015)
Browser version/OS combinations:
• Firefox 44 and 45: Win 7, Win 8.1, Win 10, Mac El Capitan, Mac Yosemite; Firefox 46 Beta: Win 7, Win 8.1, Win 10, Mac El Capitan
• Chrome 48 and 49: Win 7, Win 8.1, Win 10, Mac El Capitan, Mac Yosemite; Chrome 50 Beta: Win 7, Win 10, Mac El Capitan
• Safari 9: El Capitan, Yosemite, Mavericks; Safari 8: Yosemite; Safari 7: Mavericks
• IE 11: Win 7, Win 8.1, Win 10; IE 9: Win 7; IE 8: Win 7
• Edge 13: Win 10
Fig 9: Web Coverage Index
3. Safari releases are announced annually at Apple’s WWDC event. The recommended practice for testing that browser platform is:
• Latest Safari version on the three latest major Mac OS versions
• The two previous Safari versions against their reference OSes (e.g. Safari 8 on Mac OS Yosemite)
4. Internet Explorer (IE) will be replaced in the near future by Edge, but many users are still on IE 11 across Windows 7, 8.1 and 10, and some remain on IE 9 and even IE 8. We recommend testing on:
• Latest IE (currently version 11) against the three major Windows OS versions
• The previous two IE versions (IE 8 and IE 9), which are unsupported by Microsoft as of early 2016 but are still being used on Windows 7. We recommend continuing to do sanity testing against these combinations; if your customer analytics show zero usage for these versions, you can remove them from your test coverage matrix.
• Latest Microsoft Edge browser on Windows 10
A sketch that generates this coverage matrix from these rules follows.
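To keep this methodology reproducible, one option is to generate the combination matrix from the rules above instead of maintaining it by hand. The sketch below encodes the written guidance (it may differ from the Fig 9 table in a few beta/OS pairs) using the 2016 version numbers, which should be refreshed every quarter.

```python
# Minimal sketch: generate the web coverage matrix from the rules above.
# Version numbers reflect the 2016 snapshot; refresh them each quarter.
WINDOWS = ["Win 7", "Win 8.1", "Win 10"]
MAC = ["El Capitan", "Yosemite"]

def evergreen_browser(name, latest, previous, beta):
    """Firefox/Chrome rule: latest + previous on all OSes, beta on latest Windows and both Mac versions."""
    combos = []
    for version in (latest, previous):
        combos += [(name, version, os) for os in WINDOWS + MAC]
    combos += [(name, beta, "Win 10")] + [(name, beta, mac) for mac in MAC]
    return combos

matrix = []
matrix += evergreen_browser("Firefox", "45", "44", "46 Beta")
matrix += evergreen_browser("Chrome", "49", "48", "50 Beta")
# Safari: latest version on the three latest Mac OS versions, previous versions on their reference OS.
matrix += [("Safari", "9", os) for os in ("El Capitan", "Yosemite", "Mavericks")]
matrix += [("Safari", "8", "Yosemite"), ("Safari", "7", "Mavericks")]
# IE: latest on the three major Windows versions, older versions sanity-tested on Win 7.
matrix += [("IE", "11", os) for os in WINDOWS]
matrix += [("IE", "9", "Win 7"), ("IE", "8", "Win 7")]
# Edge: latest version on Windows 10 only.
matrix += [("Edge", "13", "Win 10")]

for browser, version, os in matrix:
    print(f"{browser} {version} x {os}")
```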
Chapter 4: Real World Challenges of Digital Testing While writing this report, we engaged with dozens of enterprise customers at various stages in their digital transformations – from organizations just moving to a multi-channel strategy to ones that are already fully digital. Here’s what we learned about the challenges and pitfalls of testing for digital quality: The IT-Business Relationship Test coverage requirements vary based on a company’s digital maturity. Less advanced organizations have no collaboration with marketing teams, and therefore no access to customer analytics and no real insight into app or website performance. In such cases, DevTest teams are disconnected from the devices and platforms their customers actually use, and their strategy depends mostly on market share data, which is not an accurate enough indicator of customer expectations. To deliver a true digital experience, technical and business teams should work from the same set of customer analytics to determine the best personas (demographic info, mobile traits) and user conditions (types of apps running in the background, incoming calls and alerts, etc.) to build and test against. With shared goals, the IT-business disconnect diminishes and app and website quality becomes more targeted and consistent. Don’t Take Your Eye Off the Market Too many teams have blind spots because they’re not observing market trends on a regular basis. Watch the market to keep track of newly released devices and OS versions, but more importantly, keep test labs configured against older OS versions that are still in use and that may be exposed to bugs and poor performance.
Chart from a Perfecto customer showing growth in iOS usage after the release of a new version.
Who’s in Charge? We learned that test coverage responsibility varies between organizations. Sometimes the “Director of QA” is the leader, and in other companies, it’s a “program manager”, “digital consulting and analytics manager”, “director of technology” or a “VP of digital services.” Because digital quality is “everyone’s business”, there should be a defined owner within the organization with this responsibility. This owner needs to stay in sync with all the digital quality stakeholders to make sure the lab that digital teams are using for development and testing is always up to date. Test Lab Refresh Rate We also found that many organizations are not sure when to update test labs for new devices and OS versions. A majority of organizations do a lab refresh every 1-2 quarters. We recommend doing a lab refresh every quarter as an ongoing practice to keep pace with all the new devices, OS versions and browser updates hitting the market.
Conclusion In this guide, we outlined the key challenges in defining and building a digital test coverage strategy and test lab. We focused specifically on aligning business and technical goals, leveraging the best data sources, and selecting the right devices. We provided practical guidelines for both mobile and web test coverage, taking into account market usage, customer analytics, platform characteristics, user conditions, and more. It’s important to note that test coverage definitions will vary based on a company’s industry, its organizational maturity in the digital space and its application types (mobile web, responsive web and/or native apps). But whatever level your company is at, using this guide, along with our Digital Test Coverage Index and Digital Test Coverage Optimizer, will help you get closer to providing a consistent and memorable digital experience for your customers.
See which devices should be covered in your test plan by visiting: www.perfectomobile.com/testoptimizer
About Perfecto Perfecto enables exceptional digital experiences. We help you transform your business and strengthen every digital interaction with a quality-first approach to creating web and native apps, through a cloud-based test environment called the Continuous Quality Lab. The CQ Lab comprises real devices and real end-user conditions, giving you the truest test environment available. More than 1,500 customers, including 50% of the Fortune 500 across the banking, insurance, retail, telecommunications and media industries, rely on Perfecto Mobile to deliver optimal mobile app functionality and end-user experiences, ensuring their brand’s reputation, establishing loyal customers, and continually attracting new users. For more information about Perfecto Mobile, visit perfectomobile.com, join our community, or follow us on Twitter at @PerfectoMobile.
www.perfectomobile.com 781.205.4111
© 2016 Perfecto Mobile Ltd.