2016 P3 CommsDay Mobile Benchmark


A special report analysing the 2016 data


This year's benchmark

Welcome to the 2016 P3 CommsDay Mobile Benchmark: an in-depth, objective comparison of the user experience for voice and data on Australia's three mobile networks, operated by Vodafone, Optus and Telstra.

In 2016, the third year that we've run the benchmark, mobile network performance is of increasing interest to more and more Australians. Just a few weeks ago, Deloitte's Mobile Consumer Survey found that 84% of the country's population now own a smartphone – a 5% increase through the last year, and ahead of this year's global average of 81%. For the three months to June this year, the Australian Bureau of Statistics found that mobile handset users downloaded an average of 1.8GB of data per user per month. And earlier in the year, CommsDay's own analysis showed that Australia's mobile carriers had one of the highest ratios of 4G subscribers to total mobile base in the world, indicating that Australians are particularly hungry for the latest and fastest mobile technology.

More than ever, customers are demanding the best possible performance not just from the latest smartphones but from their mobile networks. Vodafone, Telstra and Optus are continuing to spend billions of dollars on upgrading and expanding their networks to meet those demands. This benchmark gives an idea of how that investment is translating into an improved experience in practice for smartphone users.

In the following pages, you'll find a detailed description of how we ran the tests for this year's benchmark and how we've updated our methodology to capture the end-user experience more accurately than ever. You'll find an in-depth look at performance on voice and data in large cities, smaller cities and towns and connecting roads, as well as the all-important master score table, showing how each of the three operators compares to the others on key metrics. And there's a comparison of network performance for each operator in 2014, 2015 and 2016, showing you just what's changed in the last year. Finally, if you'd really like to drill down into the details, you'll find a complete breakdown of the raw test data, drawn from a huge set of around 195,000 individual samples taken over weeks of scientific testing – significantly more extensive even than last year's 150,000 measurements.

Petroc Wilton, CommsDay Group Editorial Director

Go straight to the final verdict section for the full master score table.

P3
P3 is a leading international consulting, engineering and testing services company with a rapidly growing team of more than 3,000 consultants and engineers working to develop and implement innovative solutions to today's complex technology challenges. In 2015, P3 posted a turnover of more than 300 million EUR. Offering a broad portfolio of services and proprietary tools to the automotive, aerospace, telecommunications and energy industries, P3 adds tangible value that helps clients succeed at every stage, from innovation to implementation. In the telecommunications sector, P3 communications provides independent technical and management consulting services including network planning, end-to-end optimisation, security, QoS and QoE testing, international benchmarking, device testing and acceptance services. P3's clients include network operators, equipment vendors, device manufacturers, public safety organisations and regulatory authorities around the world. P3 has been conducting public mobile benchmarks in Germany since 2002 and in Austria and Switzerland since 2009; it introduced public benchmarks in Australia and the UK in 2014, and in the US and the Netherlands in 2015. In 2016 alone, P3 communications compiled about 10,000 measurement days in 48 countries across five continents, with its test vehicles covering more than 1 million kilometres.

CommsDay
Communications Day is the primary source of news, information and analysis for the telecommunications industry in Australia and New Zealand. Launched in 1994 as a daily journal, CommsDay is now read daily (and frequently cited as a reference) by thousands of senior executives, ministers, policymakers and regulators across the sector. The brand has also expanded to include a digital magazine and an annual program of major conferences in Sydney and Melbourne.

Written by: Petroc Wilton
Principal benchmark responsibilities, P3 communications: Marcus Brunner, Maziar Kianzad, Ralf Pabst
Benchmark and testing coordination, P3 communications: Nebojsa Tomasevic, Thomas Meier
Production design: Grahame Lynch
Photography: Munir Kotadia
This publication is copyright to Decisive Publishing. All rights reserved.

Methodology: updating the benchmark for 2016

First published in 2014, the annual P3 CommsDay Mobile Benchmark was intended to provide a unique resource for consumers: a totally independent and impartial comparison of Australia's three mobile networks, unprecedented in the scope, detail, and scientific rigour of the testing that underpinned it. We launched the benchmark because smartphones are now an essential part of life for many Australians, and it should be as easy to compare mobile networks as it is to read reviews of new phones – after all, the latest and most advanced handsets need to be supported by equally high quality mobile networks.

Since launch, the benchmark has become an established annual event that shows how the experience on each network is changing over time. It has received widespread coverage from the press and consumer organisations, and is now keenly anticipated by the operators themselves. We work closely each year with Optus, Telstra and Vodafone on the design of the tests, and the results provide them with an objective view of their respective strengths, the effects of all their network investment, and areas for improvement.

As with previous years, 2016 has seen all three operators make significant investments in improving the reach and performance of their mobile networks, not least in terms of mobile spectrum. Radiofrequency spectrum is the specific part of the electromagnetic frequency range that is used to carry signals for mobile phones, along with other things such as radio and TV. Operators pay significant licence fees to the Australian government for exclusive use of certain frequencies, ensuring nothing else can interfere with their mobile phone signals.



Spectrum is a key resource for operators; generally speaking, more spectrum means more bandwidth for data and faster mobile internet, and certain frequencies have other advantages as well. This year, both Telstra and Optus have continued to increase the number of locations that use their 700MHz mobile spectrum – powering what Telstra calls its '4GX' service and Optus refers to as '4G Plus', both promising faster internet and an improved experience for users with compatible devices. They spent almost two billion dollars between them for the 700MHz licences in a 2013 auction, with Telstra buying twice as much as Optus; 700MHz is a relatively low frequency, and low frequency spectrum is important because it can transmit better over long distances and penetrate walls more effectively, providing better indoor service.

While Vodafone didn't buy any spectrum in the 2013 auction, 2016 saw it finish a crucial 'refarming' process for its existing low-band spectrum. Refarming is a well-established process by which operators shift more of their spectrum to newer mobile access technologies; in March, Vodafone completed the move of its entire 2x10MHz holding in the 850MHz band to 4G, double the amount originally planned, again giving it stronger indoor penetration and range. It has also started to refarm 2100MHz spectrum from 3G to 4G in some areas.

2016 has seen an extra boost to regional mobile network coverage thanks to the Australian government's Mobile Black Spot Program. This initiative matches Commonwealth, state and local government funding with specific investments from participating operators to establish mobile coverage and competition in previously underserved regional and remote areas. The program is building 499 new and upgraded mobile base stations around the country; when completed, these are intended to provide new handheld coverage to 68,600 square kilometres, and dozens have already been finished and activated.

The operators have also announced plenty of other network investments and upgrades through the year. Optus, for instance, has been targeting better coverage at major sports stadiums, trialling a centralised radio access network to better manage interference. And Telstra is prioritising A$250 million of its capital program to boost network resilience and performance, with A$50 million of that specifically on its mobile network and A$100 million on its core network.



As in previous years, we've updated our methodology for 2016 to capture as many of the latest network developments as possible. In particular, we've brought in a new walk test, using proven P3 technology to facilitate measurements indoors at shopping malls, on public transport, and at major tourist hotspots – some of the use cases where low-band spectrum becomes really important.

SCALE AND SCOPE

Australia's sheer size, geography, and population distribution create a major challenge in measuring network quality around the country. Its current population of over 24 million people is distributed across a landmass of 7.7 million square kilometres, with most people concentrated in the large cities on the east, west and south coastlines but a significant proportion living in more remote areas. It's also worth noting that the operators have focused to differing extents on providing coverage to regional Australia; Telstra, for example, claims the broadest network coverage and the most coverage of regional areas.

To provide the most representative results possible, however, our drive test vehicles covered some 17,000 test kilometres across several weeks of dedicated testing. While it's simply not possible to test across the entire country, the routes and test areas were carefully selected to cover a significant share of the Australian population across different types of coverage areas – urban, suburban and regional, as well as connecting routes – to enable conclusions to be drawn on the overall state of the market. We tested in Busselton, Bunbury, Perth, Brisbane, Tweed Heads, Canberra, Byron Bay, Geelong, Coffs Harbour, Port Macquarie, Newcastle, Sydney, Wollongong, Nowra, Queanbeyan, Albury, Melbourne, Warrnambool, Ballarat and Adelaide – nine cities with a population of over 100,000, as well as eleven towns and smaller cities, and connecting roads running through more remote regional areas.

Our new walk tests also opened up a new level of granular and detailed testing in the five largest cities, giving us deep insight into network performance in central business districts, selected tourist hotspots and even major shopping malls in Sydney, Melbourne, Brisbane, Adelaide and Perth. Altogether, the areas selected for the 2016 test accounted for well over 14 million people – more than 60% of the Australian population.

THE DRIVE TESTS

As usual, for our drive testing, we used paired test vehicles equipped with P3's measurement hardware to collect voice and data samples; they tested 3G to 4G and 4G to 4G voice calling, as well as data transfer in 4G-preferred mode.



The paired cars took the same overall routes around the country in parallel but were never in the exact same place at the same time, to avoid the risk of one distorting the measurements of the other; they also took different routes around cities to maximise the area covered.

One big change for 2016 was that we actually deployed two pairs of test vehicles, both starting in Sydney but then taking separate routes around the country. This enabled us to take an even more comprehensive volume of test samples than in previous years, while reducing overall measurement time – and thus also reducing any disruption caused as operators made sure their networks were in peak working condition for the tests.

Using some of the new and compact testing technology that P3 developed for last year's benchmark, we were again able to use a 'fly-in, fly-out' approach to drive testing. Our test personnel could store their equipment in a small suitcase, fly it into any location with an airport, install it into a fresh pair of hire cars within a few minutes and immediately begin testing. This meant that we could easily test in locations such as Perth, which would have been hard to reach over land.

THE WALK TESTS

New for 2016, we introduced walk tests into the benchmark. Just as with our drive tests, these included both 3G to 4G and 4G to 4G voice calling, as well as data measurement in 4G-preferred mode. But where our drive voice tests ran voice calls between our paired measurement vehicles, our walk testers in Sydney, Melbourne, Adelaide, Brisbane and Perth were calling a second party located in Sydney, in an area with strong signal coverage from all three operators.

The big advantage of walk testing was that it allowed us to gather comprehensive measurements in locations like shopping malls (for example Sydney's Pitt St Mall, Westfield Marion in Adelaide or the Galleria shopping centre in Perth), central business districts, and tourist hotspots like Melbourne's Southbank or The Rocks in Sydney. We were even able to test on public transport, such as buses or metro train lines, and used several public transport options in each city.

In addition, walk testing allowed us to specifically test indoor coverage, an important part of the smartphone end-user experience. That in turn allowed us to phase out attenuation in our benchmark measurements. Attenuation is a reduction in the strength of mobile phone signals; in the real world, it's caused by factors such as the signal passing through walls or other obstacles. In previous years, we'd set up artificial attenuation to reflect these real-world conditions – first by using rather bulky hardware and arrays of rooftop antennas for our drive test cars, and later by using P3's own compact attenuator technology. However, the notion of using additional attenuation has sometimes been contentious in some of P3's key markets, as well as introducing extra complexity into the testing process; walk testing solves this problem by making it possible to measure indoor performance in a dedicated way.



THE SMARTPHONES

For 2016, in both drive and walk tests, we used arrays of Samsung Galaxy S5 Cat 4 smartphones for voice testing and Samsung Galaxy S7 Cat 9 smartphones for data measurements. We used commercially available firmware for the handsets, corresponding to the network operators' original branded versions, and the most comprehensive mobile tariff plan available from each operator. Telstra, Optus and Vodafone provided most of the SIM cards for the test, and made sure that fair use policies – sometimes used to limit download speeds for high-usage customers – would not interfere with testing. Our staff also anonymously purchased SIM cards for some of the testing as a control, ensuring the supplied SIMs matched those available in stores.

DATA TESTING

Just as last year, one smartphone per operator in all drive testing (and, for 2016, walk testing as well) was dedicated to data measurements – and set to 4G-preferred mode to represent the typical Australian end-user experience. Telstra, Vodafone and Optus all still maintain 3G networks, but 4G is a very well-established technology now; operator investments in both radio spectrum and infrastructure are mainly focused on 4G, and almost all smartphones available in stores are 4G compatible. However, smartphones will still fall back to the older 3G networks in areas where 4G is not available; since we tested in smaller towns and on connecting roads as well as in major metropolitan areas, this was naturally reflected in our measurements.

Also as in previous years, we ran a battery of different data tests: accessing some of the most popular web pages as well as static reference pages, downloading 3MB files and uploading 1MB files to simulate the transfer of photos, smartphone apps and other day-to-day data loads, and playing YouTube videos. We also repeated our 'peak tests' on both upload and download, pushing as much data as possible across the network over a continuous 10-second window and measuring average throughput; this really demonstrated the maximum speeds possible on each network.

The most important factors of data performance for end users are still stability and reliability.



We all appreciate high speeds but, if emails fail to send or web pages time out without loading, we become frustrated very quickly. Therefore, as with the 2014 and 2015 benchmarks, our data scoring put a lot of emphasis on success ratios: the proportion of web pages that loaded completely, file up- and downloads that finished without a hitch, and so on. Other metrics included the average time taken to transfer files, startup times for YouTube videos, and of course the average throughput for peak testing. This year, for video tests, we also used YouTube's adaptive bitrate feature to adapt the resolution to available network conditions – and measured the resulting resolution for all test samples.
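To make these metrics concrete, the sketch below shows how success ratios and averages of this kind could be computed from raw test samples. It is purely illustrative: the field names and the sample values are assumptions made for the example, not P3's actual measurement schema or tooling.

```python
# Minimal sketch of data-test KPIs: success ratio, mean transfer time, mean throughput.
# Field names (success, duration_s, bytes_transferred) are assumed for illustration.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Sample:
    success: bool           # did the page load / file transfer complete without a hitch?
    duration_s: float       # time taken for the transfer
    bytes_transferred: int  # payload size, e.g. a 3MB download or 1MB upload

def success_ratio(samples: list[Sample]) -> float:
    """Share of samples that completed successfully (the heavily weighted reliability figure)."""
    return sum(s.success for s in samples) / len(samples)

def mean_transfer_time(samples: list[Sample]) -> float:
    """Average time taken, considering only successful transfers."""
    return mean(s.duration_s for s in samples if s.success)

def mean_throughput_mbps(samples: list[Sample]) -> float:
    """Average throughput in Mbit/s across successful samples, as in a peak-test style average."""
    return mean(s.bytes_transferred * 8 / 1e6 / s.duration_s for s in samples if s.success)

# Example: three small-file download attempts, one of which failed
downloads = [Sample(True, 1.9, 3_000_000), Sample(True, 2.4, 3_000_000), Sample(False, 10.0, 0)]
print(f"Success ratio: {success_ratio(downloads):.1%}")            # 66.7%
print(f"Mean transfer time: {mean_transfer_time(downloads):.2f} s")
print(f"Mean throughput: {mean_throughput_mbps(downloads):.2f} Mbps")
```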

VOICE TESTING

Voice calling might be amongst the most basic features of a modern smartphone, but end users tend to have quite demanding expectations of their voice service. They want to be able to make calls from anywhere, whether mobile or stationary, without dropouts or interruptions – and often while they're downloading emails or other data in the background at the same time. Therefore, we set fairly challenging voice test parameters.

Reliability is the single most important aspect of a voice call, with end users having a very low tolerance for calls that drop out or fail to connect. So the heaviest score weighting was on call success ratios – the percentage of all voice call tests that were set up and completed successfully, with a minimum speech quality throughout. Other metrics included the time taken to set up the call, and speech quality – scored with the Perceptual Objective Listening Quality Analysis (POLQA) wideband algorithm, a standardised five-point quality scale.

All voice testing was run alongside continuous background data traffic to reflect a typical environment for today's smartphone user. While you're making a voice call, applications on your phone will often be sending data to and from the internet in the background, whether it's from social media, instant messaging platforms or email clients. We simulated this in a controlled way with the random injection of small amounts of HTTP background traffic on our test phones.

As with the previous year, our voice testing needed to allow for the effects of a technology called Circuit Switched Fallback (CSFB). Because the 4G standard was not originally designed to natively support voice calling, making a call on a phone in 4G-preferred mode will actually result in a switch back to the 3G network to set up the connection. But CSFB has some side-effects; the switchover can result in paging problems and call setup delays, and if users are downloading data while making a call, their data transfer rate will drop to 3G speeds until the call is finished. To account for this, we used two voice channels for all our testing. The first channel had the smartphones at both ends of the call set to 4G-preferred; the second had one phone set to 3G-preferred and the other to 4G-preferred. The test results were drawn from a combination of calls across the various configurations, with the test scores reflecting aggregate performance.
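As an illustration of how a single call attempt might be judged against the criteria described above (successful setup, no drop, and a minimum speech quality), here is a minimal sketch. The quality threshold and field names are assumptions for the example, not P3's actual scoring parameters.

```python
# Illustrative classification of call attempts against the voice reliability criteria.
from dataclasses import dataclass

MIN_POLQA_MOS = 1.6  # assumed minimum speech-quality threshold on the five-point scale

@dataclass
class CallAttempt:
    setup_ok: bool        # call connected at all
    dropped: bool         # call ended before the planned completion
    setup_time_s: float   # time from dialling to connection
    min_polqa_mos: float  # worst POLQA speech-quality score observed during the call

def call_successful(call: CallAttempt) -> bool:
    """A call counts only if it was set up, held to completion and met the quality floor."""
    return call.setup_ok and not call.dropped and call.min_polqa_mos >= MIN_POLQA_MOS

def call_success_ratio(calls: list[CallAttempt]) -> float:
    """The heavily weighted reliability metric: share of calls set up and completed."""
    return sum(call_successful(c) for c in calls) / len(calls)
```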



New for 2016, we also ran a separate case study on Voice over LTE – a protocol which natively supports voice on 4G, and thus promises to tackle the issues around CSFB – outside the scope of the main benchmark.

CONSULTATION AND INDEPENDENCE

For 2016, as with previous benchmarks, we spent months consulting with Vodafone, Telstra and Optus before the start of testing, starting back in January – and kept communications open during the test period itself. The operators know their own networks better than anybody, and it was critically important that we be as transparent as possible in designing the benchmark parameters and selecting the latest mobile technology to test.

We did not disclose the exact test routes to the operators, and in fact the final routes were chosen by random lottery from a number of alternatives immediately before testing; this avoided any possibility of operators bringing additional capacity online in specific areas to influence the results. However, we provided all three operators with an overview of the scope and timeline – giving each of them the opportunity to factor the data into their own network planning processes and, for instance, to avoid scheduling any planned outages (for network upgrades, additional mobile base station deployments and the like). Maintaining communications throughout the testing ensured that operators could provide any relevant information about unplanned outages should they happen to coincide with the test period. In such cases, P3 is able to evaluate the impact on measurement results and potentially consider the effects of these outages on the final results.

In other countries where P3 mobile network benchmarks are well established, P3 has observed that a side-effect of frequent network testing is that it helps motivate operators to invest in constant performance improvements – meaning that customers eventually benefit.

The raw results in depth

VOICE

This year's voice scores were strong overall, with all operators very close to each other. On the heavily weighted call success ratio – a key measurement of reliability and very important for the end-user experience – all operators performed very well. Vodafone pulled ahead in drive testing in major metropolitan areas at 99.2%, almost an entire percentage point ahead of Optus, which was closely followed by Telstra. Major metro walk testing had Vodafone and Optus tied at 99.3%, with Telstra just behind at 99%. Vodafone had even better success ratios in drive testing around smaller cities and towns, at 99.7% to Optus' 99.3% and Telstra's 99.1%. However, with Telstra's network covering the largest area of the country, it dominated call success rates in the more regional areas represented by drive testing on connecting roads.



Here, Telstra scored a 95.7% call success ratio, against 93.6% for Optus and 86% for Vodafone.

Optus and Vodafone shared the laurels for fastest call setup times; both managed 5.5 seconds on average in metro drive testing, with Vodafone faster at 5.4 seconds in metro walk testing to Optus' 5.6 seconds, but Optus averaging a very quick 4.8 seconds in drive testing in towns and smaller cities to Vodafone's 5.8 seconds. Optus also did best for setup times on connecting roads, at 5 seconds on average to Vodafone's 5.8. Telstra was fairly consistent across the board, at 6.2 seconds on average for metro drive testing, 6 seconds for metro walk testing, 6.3 seconds in towns and smaller cities and 6.3 seconds on connecting routes.

Average speech quality was consistently high across all three operators, and markedly improved on last year's results. All three scored 3.8 in all metro testing and 3.7 in drive tests along connecting roads; Telstra scored 3.7 in drive tests in towns and smaller cities, where Optus and Vodafone scored 3.8 apiece.

Ultimately, Optus' consistently strong performance across major metropolitan areas, smaller cities and regional connecting routes gave it the highest overall score in the voice testing.

MOBILE DATA: MAJOR CITIES

Much of Australia's population is concentrated in its major metropolitan areas, putting the heaviest load on mobile networks in the big cities; these areas are, therefore, a major investment focus for all three operators.

All the networks were shown to be extremely reliable in metro areas in both drive and walk testing; again, reliability is critical for a good end-user experience. Vodafone achieved success ratios of 99% or over in all metro data testing, with a perfect 100% success rate for small file downloads in metro drive tests; Telstra managed 99% or over in all major metro testing apart from small file uploads in walk tests, and hit a perfect 100% ratio in static webpage downloads in metro drive testing. Optus achieved 99% success ratios or higher in all metro data testing except live web page downloads in drive testing and small file uploads in walk testing, where it was slightly lower.

A lot of network investments and technology upgrades are dedicated to increasing mobile data speeds; even averaged out between all users sharing a network, higher speeds mean that everyday activities like downloading emails and browsing image-rich web pages go more smoothly, significantly enhancing the end-user experience.



The 2016 test results showed some huge speed improvements across the board and some key highlights for individual operators. Nowhere was this more apparent than in peak testing. Telstra hit a staggering 59.87Mbps average throughput on the download and 21.84Mbps upload in major metro drive testing, and 73.36Mbps download/23.34Mbps upload in major metro walk testing. Optus hit 56.99Mbps download and 16.02Mbps upload in major metro drive testing and 62.71Mbps download/19.14Mbps upload in metro walk testing. Vodafone peaked at 55.33Mbps average download in major metro drive testing but came in highest with 23.44Mbps on the upload; in metro walk testing it hit peaks of 52.98Mbps average download and again scored highest with 23.56Mbps upload.

Telstra and Vodafone split the highest scores between them on speed consistency indicators in major metro testing. Telstra, for example, had 90% of samples faster than 12.06Mbps on small file downloads and faster than 4.9Mbps on small file uploads in metro drive testing, showing the strongest consistent speeds in these categories; Vodafone, meanwhile, had 90% of small file upload samples faster than 5.21Mbps and 90% of peak file upload samples faster than 6.44Mbps in metro walk testing, coming out ahead in those categories. On the other hand, Vodafone did better in many major metro categories at the top end of the samples; it had the highest speeds for the top 10% of samples for small file downloads and uploads, as well as peak upload testing, in both metro walk and drive tests. Telstra, however, scored highest for the top 10% of samples in both metro walk and drive tests.

YouTube video performance was very good for all operators in all metro testing, with quick start times of around two seconds in all tests (Optus being the fastest at 1.8 seconds), and 99.9% or better video playouts without interruptions. Telstra achieved the highest video resolution in metro drive testing, tying with Vodafone in metro walk testing, but all operators managed 629p resolution or higher on average.
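For readers interpreting the consistency figures quoted above: "90% of samples faster than X" is simply the 10th percentile of the measured speed distribution, while the "top 10% of samples" figures correspond to the 90th percentile. The sketch below, using invented numbers, shows how such thresholds could be derived; it is an interpretation aid only, not P3's evaluation code.

```python
# Deriving the two consistency thresholds from a set of throughput samples.
from statistics import quantiles

def consistency_thresholds(throughputs_mbps: list[float]) -> tuple[float, float]:
    cuts = quantiles(throughputs_mbps, n=10)  # nine cut points splitting the data into deciles
    p10, p90 = cuts[0], cuts[-1]              # 10th and 90th percentiles
    return p10, p90

# Example with invented small-file download speeds (Mbit/s)
speeds = [8.5, 9.9, 12.3, 15.0, 18.2, 22.7, 25.4, 30.1, 35.6, 41.8]
p10, p90 = consistency_thresholds(speeds)
print(f"90% of samples were faster than {p10:.2f} Mbps; the top 10% exceeded {p90:.2f} Mbps")
```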

MOBILE DATA: SMALLER CITIES AND TOWNS

Here, with measurements taken solely through drive testing, the data scores separated somewhat.



In terms of reliability, Telstra had the most consistently excellent success ratios, above 99% for all tests and hitting a perfect 100% for small file downloads, peak test uploads and YouTube playback. Optus did well, with a 100% success ratio of its own for small file uploads, but dipped substantially on live webpage downloads and YouTube videos, with 93.4% and 93.8% success ratios respectively. Vodafone, similarly, had generally good success ratios but dropped to 96.2% on live web page downloads, 97.2% on small file uploads, and 95.7% on YouTube videos.

Telstra also racked up the highest speeds in peak throughput testing in smaller city and town drive testing, with 45.91Mbps average download and 17.16Mbps average upload. It also did the best on speed consistency, with the highest speeds for 90% of samples in all categories. However, Optus scored some wins at the high end; it had the highest speed for the top 10% of samples in small file downloads, small file uploads and peak throughput upload testing.

Telstra dominated YouTube testing in smaller cities and towns, with a perfect 100% of video playouts without interruptions, the quickest start times, and the highest adaptive video resolution of 652p.

MOBILE DATA: CONNECTING ROADS

Again, with the largest network coverage area in Australia, Telstra performed very well in the more regional areas represented by drive testing on connecting roads and highways. It had the best reliability here, with the highest success ratios in all test categories apart from peak throughput file downloads, where Optus edged it out with the top score of 97.4%.

Telstra was also comfortably ahead on peak throughput speeds in these regional areas, with 40.1Mbps average throughput on the download and 14.73Mbps average throughput on the upload. It should be noted, though, that even Vodafone's third-place peak download throughput of 20.35Mbps represented almost double its equivalent speed in 2015! And while Telstra also performed well in terms of consistent speeds, with the highest rates in 90% of samples in all but one category, Optus again pulled away at the high end in some tests.


Optus had the highest speeds for the top 10% of samples in the small file download, small file upload and peak download throughput categories. It also managed the best ratio of YouTube video playouts without interruption, at 99.5%, and tied with Telstra on setup time at 2.1 seconds – although Telstra had a higher overall success ratio and a considerably higher average video resolution, at 625p.

THE FINAL VERDICT

The final master score table aggregates all of the different metrics in each category and applies the weighting developed by P3. The percentage figures shown in the subcategories indicate the share of the maximum points obtained, which avoids rounding display issues.

Telstra, the winner of the 2014 and 2015 benchmarks, succeeded in improving its overall score this year from an already high base to win the 2016 benchmark, with its total score 30 points ahead of Optus and 55 points ahead of Vodafone. Its overall gain was driven by a hefty improvement in data testing; on average, Telstra did best on data performance across the board. And, reflecting the larger reach of its network into regional areas, Telstra secured its win in smaller cities and towns and along connecting roads and highways – with a particularly dominant showing in data testing in these areas.
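As a rough illustration of how a weighted aggregation to a 1,000-point total works, here is a minimal sketch. The category names and weights below are invented placeholders for the example; they are not P3's actual weighting model.

```python
# Minimal sketch of a weighted master score out of 1,000 points.
CATEGORY_WEIGHTS = {            # hypothetical share of the 1,000 available points
    "voice_major_cities": 150,
    "voice_towns_roads": 150,
    "data_major_cities": 400,
    "data_towns_roads": 300,
}

def master_score(pct_of_max: dict[str, float]) -> float:
    """pct_of_max maps each category to the percentage of maximum points obtained there."""
    return sum(CATEGORY_WEIGHTS[cat] * pct_of_max[cat] / 100 for cat in CATEGORY_WEIGHTS)

# Example: an operator earning 85% of available voice points and 80% of data points
print(master_score({
    "voice_major_cities": 85.0,
    "voice_towns_roads": 85.0,
    "data_major_cities": 80.0,
    "data_towns_roads": 80.0,
}))  # -> 815.0
```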



Optus came in a much closer second than in the previous two years, reflecting a very strong improvement in both voice and data and a consistent performance across the various test categories. In particular, Optus excelled in voice testing for 2016, with the highest voice scores overall. Vodafone, which put in a huge improvement between 2014 and 2015, again achieved the biggest score increase for 2016. Notably, Vodafone – which has historically concentrated much of its network investment in major metropolitan areas – had the best aggregate voice and data performance this year in cities with a population over 100,000.

Benchmark comparisons: improvements all round

Mobile technology evolves very rapidly, with all operators testing and deploying a range of network improvements each year and smartphone manufacturers regularly releasing new devices that support these latest advances. As a result, users' expectations of their mobile networks change from year to year. To reflect this, P3 is constantly monitoring network performance in all of the countries it works in around the world. It uses all this data to adjust the scoring models each year for all of its benchmarks globally, making sure they keep pace with the current state of the art. (For instance, the data speeds required to get the highest scores were a lot higher in 2016 than they were in 2015.)

That said, one of the reasons for publishing the benchmark is to show how each of Australia's networks is improving over time, reflecting new technologies and the operators' ongoing investments. And by comparing the performance of each operator against the maximum achievable score each year, we can get a clear sense of how each has not just kept pace with increasingly demanding expectations, but exceeded them. Considering the very solid performance and significant score increases achieved by all of the Australian operators last year, it was particularly impressive to see each of them continue to improve in 2016.



Vodafone again had the biggest year-on-year improvement, gaining in both voice and data testing and pushing up its total score for 2016 by just under 13% to 812 out of a possible 1,000 points. Driven by its benchmark-leading performance in Australia's major cities, that uplift came on top of Vodafone's massive gain of almost 40% between 2014 and 2015. Optus also saw a very significant 7% improvement in 2016 to 837 points, again building further on its huge 27% upswing last year. Underpinning Optus' score increase was its consistently competitive performance across major cities, smaller towns and connecting roads through regional areas – as well as its dominant overall performance in voice, ahead of both its competitors.

Even with its rivals working hard to catch up, however, Telstra still achieved a significant feat in improving its own score from a very high base for the third year running. The formidable resources that Telstra dedicates to maintaining its network advantages were reflected both in its benchmark-leading performance on roads and highways, reflecting more regional areas – where it has network reach beyond both competitors – and in data testing, where it also outperformed rivals to claim the highest overall score.

Finally, the improvements for each operator this year have made all of them stand out at a global level.



P3 uses the same 1,000-point system in benchmarks around the world, and for all of Australia's operators to score over 800 points this year marks out the country as having an extremely consistent level of high-quality mobile performance. Indeed, in 2016 testing, Optus, Telstra and Vodafone all scored higher than each of the four mobile operators in the UK, where benchmark winners EE and Vodafone UK scored 803 points apiece.

APPENDIX 1: VOLTE CASE STUDY

Offering native voice support on 4G networks, Voice over LTE (VoLTE) promises to be a game-changer for voice performance on Australia's mobile networks. It avoids a lot of the issues with using circuit-switched fallback to make calls on 3G, which in practice can mean faster call setup, better data speeds while on a call, and even higher quality voice calls in some instances.

However, when we started the planning process for this year's benchmark, VoLTE was still limited to certain network areas in the case of some of the operators, and was also restricted to specific device types for each network. For example, not all operators had launched VoLTE on the Samsung Galaxy S5 handset that we used for voice tests. This meant that we couldn't ensure that any VoLTE testing included in the benchmark itself would meet our rigorous standards for a fair and standardised performance comparison reflecting real end-user experience; as a result, we decided to exclude VoLTE from the benchmark scoring for 2016.

Nevertheless, we did carry out some limited VoLTE testing in parallel to the public benchmark on the same routes, making over 30,000 calls on the P3 VoLTE test client. This enabled us to prepare a VoLTE case study, comparing the performance of VoLTE – aggregated from our VoLTE-only samples across all operator networks – to the aggregated conventional voice test results from this year's benchmark. The results, organised to show minimum, maximum and average performance, give some indication of the performance improvements that Australians can expect from VoLTE.

The comparable success ratios between traditional voice and VoLTE – slightly better for VoLTE in major cities, slightly lower in smaller towns and cities and on the connecting roads and highways in more regional areas – indicate that the newer voice technology already has significant coverage in general. Speech quality on VoLTE was also slightly higher on average in all areas, though it should be noted that all Australian carriers have already deployed HD Voice on their traditional voice networks, which provides very good speech quality. Of particular note, however, is the extremely pronounced improvement in call setup time for VoLTE over traditional circuit-switched voice. In practical terms, this means that when you make a call on VoLTE, you can expect to be connected to the other party several seconds faster than you might be used to.

Telstra began rolling out VoLTE in September last year, with Vodafone following suit and announcing its own launch in October. Optus started its own VoLTE deployment in May this year, initially in Sydney, Melbourne, Brisbane, Adelaide, Perth and the Canberra CBD.



With all three operators having continued to extend the technology across more devices and network areas, we look forward to measuring the full impact of VoLTE and how it affects the Australian operators' relative voice scores in the 2017 benchmark.


