Mobile Network Performance in the US

The carriers' LTE rollouts are expanding and consumers are benefiting from faster speeds, better reliability, and a tighter race between the networks.

08-18-15 | Dave Andersen

Introduction


The mobile race is heating up—and consumers are the winners

Mobile performance has changed dramatically over the last few years. The introduction and expansion of LTE service has altered not just what we do with our phones but also our expectations for consistent speed and reliability. The constantly changing mobile landscape means that performance that might have been considered fast enough or good enough a few short years ago would be seen as woefully inadequate now.

Mobile network performance is not a singular event. You deserve clear performance information that is relevant to all the spaces of your life, for everywhere that you work, play, and live. Think about it: you live in a neighborhood, perhaps commute through a city, are part of a state, and count yourself as a resident of the United States as well. You aren’t just one of these; you’re all of them simultaneously.

This is our fourth time offering a summary report of this magnitude, and our latest data shows important changes to the mobile race. The improvements we’ve seen in speed and reliability don’t just impact which network earns bragging rights; these upgrades have a direct impact on how, when, and where you can use your smartphone. Read on to see how the networks performed and what the changes mean for your daily mobile life.


Performance across the United States

Providing strong coverage across the entirety of the US is a tall order. To earn our United States RootScore Award, a network needs to offer outstanding performance across all of the different places where consumers use their smartphones, from cities and towns of all sizes, to highways, rural areas, and all the spaces in between. Beyond excelling in all the places where consumers use their smartphones, strong mobile performance across the US also means offering network coverage for all of the ways in which consumers use their phones. Whether waiting for a bus or subway, attending a game, or driving between a major metro area and their hometown, consumers want to be able to make calls, send texts, and easily access the web.

Our United States RootScore Report compares network performance across both the variety of places and the variety of ways that consumers use their smartphones. It’s a one-of-a-kind performance summary that shows which networks are best answering consumer mobile expectations.

To determine which network is leading the performance race in the first half of 2015, we drove over 237,000 miles while testing performance on highways and in big cities, small towns, and rural areas across the US. To put that in perspective, consider that the distance from New York City to Los Angeles is approximately 2,800 miles, the circumference of the earth is 24,901 miles, and the moon is about 239,000 miles away. While collecting samples for this national report, our professional testers could have driven from NYC to LA about 85 times, circled the earth nearly 10 times, or made it almost all the way to the moon. All told, we collected approximately 6.1 million samples while testing performance during driving, at stationary outdoor locations, and at more than 7,300 indoor locations.
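The distance comparisons above are simple ratios; a quick sketch makes the arithmetic explicit (all figures below are taken from this report):

```python
# Putting ~237,000 miles of drive testing in perspective.
# All distance figures below come from the report itself.
MILES_DRIVEN = 237_000
NYC_TO_LA = 2_800              # approximate miles, New York City to Los Angeles
EARTH_CIRCUMFERENCE = 24_901   # miles
EARTH_TO_MOON = 239_000        # approximate miles

print(f"NYC-to-LA trips: {MILES_DRIVEN / NYC_TO_LA:.0f}")                    # 85
print(f"Trips around the earth: {MILES_DRIVEN / EARTH_CIRCUMFERENCE:.1f}")   # 9.5
print(f"Of the way to the moon: {MILES_DRIVEN / EARTH_TO_MOON:.0%}")         # 99%
```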

In short, these aren’t anecdotal results or hasty generalizations. Unlike subjective surveys or summaries that rely exclusively on crowdsourced data, we follow a scientific methodology. In simple terms, that means you can trust that our results are accurate.

For those of you with a more scientific bent, we structure our approach using fundamental tools such as hypothesis testing, experiment design, and statistics. Our methodology is meant to provide objective, accurate, and unbiased assessments of performance from a consumer’s point of view. Unlike crowdsourced studies, our methodology is designed to characterize network performance and illuminate meaningful differences with a high level of statistical significance. We never settle for collecting a small number of samples or draw conclusions that are not backed by scientific rigor. We use randomized spatial sampling techniques to collect data in an unbiased manner, and we conduct testing across all hours of the day and all days of the week.

Our United States RootScore Report is built from multiple layers of data. In 2015, we expanded our testing to include even more areas where you use your smartphone. After all, your smartphone is often a fundamental part of daily life, whether you’re commuting to work, at the airport, spending time in a busy city center, at the ballpark, or walking across campus. All of our new testing areas also contribute to each network’s United States performance scoring. In addition to testing unique to the United States RootScore study, we also factor in results from all underlying RootScore Reports and studies (State, Metro, Venue, Transit, Campus, Airport, and additional testing in dense urban centers). These are comprehensive, unbiased results that you can trust to give you a complete and accurate picture of mobile network performance across the entirety of the US. Results from more populous states like California carry far more weight in our national results than those from less populous states like Rhode Island, and within a state, large metro areas carry more weight than small towns or connecting highways. For more on how we test, see our FAQ section within this report.
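The population weighting described above can be illustrated with a minimal sketch. To be clear, the scoring function, state scores, and population figures below are illustrative assumptions, not the actual RootMetrics formula:

```python
# A sketch of population-weighted scoring: more populous states pull the
# national result toward their score. Scores and populations here are
# made-up illustrations, not report data.
def weighted_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Average the per-state scores, weighting each by its population."""
    total_weight = sum(weights.values())
    return sum(scores[s] * weights[s] for s in scores) / total_weight

scores = {"California": 92.0, "Rhode Island": 88.0}       # hypothetical scores
population_millions = {"California": 39.0, "Rhode Island": 1.1}

# The result sits close to California's score because of its larger weight.
print(round(weighted_score(scores, population_millions), 2))   # 91.89
```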

The United States winner


AT&T a strong second on the national stage

Don't overlook AT&T. AT&T was the only carrier other than Verizon to win a United States RootScore Award, doing so in the text category for the second consecutive report. AT&T also finished a close second to Verizon in five of six categories, including the more holistic areas of overall performance, network reliability, and network speed. While AT&T again finished second in the call category, it shared the rank with Sprint in this test period.

In short, although Verizon led the way in the majority of categories at the national level, AT&T wasn’t far behind.

Sprint staying in the mix

Although Sprint did not win any of our United States RootScore Awards, the network finished third in overall performance, network reliability, and text performance. In most categories, Sprint’s scores showed progress in terms of closing the gap with the leaders from prior testing.

Sprint made noise by moving into a two-way tie with AT&T for the second position in the call category. Sprint climbed into second place thanks to an improved performance in blocked call testing, which also provided a boost to Sprint's network reliability and overall performance.

In our previous report, we noted that Sprint’s call results in metro areas had improved considerably compared to the first half of 2014. Halfway through 2015, it appears that Sprint’s improvements are starting to benefit consumers beyond just metropolitan markets.

T-Mobile trailing outside of metro areas

T-Mobile was again shut out of United States RootScore Awards, but the network finished a strong third for network speed and data performance. We’ve noted before that T-Mobile typically performs much better in metro areas than it does at state or national levels, and this was indeed the case in the first half of 2015.

As our Metro RootScore overview below shows, T-Mobile’s performance within metro areas was strong in multiple test categories, with improved data reliability and fast speeds. If you primarily use your smartphone in a major urban environment, T-Mobile remains a solid choice. Even though urban areas carry more weight in our results, T-Mobile currently lacks the broad coverage to excel in our National or State RootScore studies.

Verizon excellent at the national level

Verizon’s performance across our testing of the United States was excellent. Verizon earned bragging rights for the fourth consecutive time as winner of the United States RootScore Award for Overall Performance. Verizon also finished atop the podium in five out of six categories for the second straight testing period, winning United States RootScore Awards for Overall Performance, Data Performance, and Call Performance, while also ranking first on both our Network Reliability Index and Network Speed Index. While Verizon didn’t win the Text RootScore Award on the national stage, its rank in the category improved in this test period, moving into second behind AT&T.

Verizon’s margin of victory in five of the six categories, however, was relatively small. AT&T finished a strong second in the majority of categories, Sprint narrowed the gap in call performance, and T-Mobile made strides in our data and speed categories.

National results in perspective

As we noted in the introduction, to give you a comprehensive view of mobile network performance, we test much more than just performance at the national level. We also test to see how the networks compare within the 125 most populous metro areas across the US and within each of the 50 states.

Our testing methodology is unique to the different spaces and challenges found within each of these different areas (metro, state, and nation). Just because a network performed well in our United States testing doesn’t necessarily mean that it will also be the strongest performer when looking at a particular metro or state. For more on our testing methodology, please see the FAQ section within this report or the methodology section on our website.

Performance across the 50 states

Expectations for mobile performance have shifted.

Just a few years ago, consumers might have accepted that data speeds would lag or that finding a signal would be difficult in areas beyond major metros. That’s changed. With our smartphones playing such a large part in our daily lives, we want to be able to access our network at all times and be able to call, text, post to social media, and much more no matter where we are.

Providing coverage across an entire state isn’t an easy task for the carriers. Excelling in large, dense urban areas doesn’t guarantee coverage will also be strong in other areas of the state. These various spaces often have vastly different network demands and require a variety of approaches to provision adequately.

Rather than simply summarize results from the biggest metros, our State RootScore Reports consider performance across all of these wildly divergent mobile areas and balance results from dense urban areas, highways, smaller towns, and more rural spaces to paint a complete picture. Higher population areas carry more weight in the scoring, but all of these various spaces play a part in our final results. It’s a comprehensive testing methodology that gives you a one-of-a-kind look at mobile network performance across each of the 50 states.

AT&T shows consistency as the number-two performer

AT&T finished behind Verizon at the state level in terms of award count, but consistent with what we’ve seen at the national and metro levels, AT&T remains a strong number-two performer in state-level testing. AT&T’s state award total far surpassed those of Sprint and T-Mobile.

While AT&T held steady as the second-best performer at the state level, its tally of State RootScore Awards fell slightly since the second half of 2014, dropping from 104 awards (won or shared) to 95 in this test period.

The primary driver of this change was in network speed, where AT&T’s tally of first-place rankings (outright or shared) on the Network Speed Index fell from 13 to 7 at the state level. However, AT&T showed consistency in the more holistic areas of overall performance and network reliability, categories in which AT&T’s tallies of first-place rankings remained identical across both test periods.

Sprint improves reliability in state-level testing

We noted in our previous report that Sprint had improved its network reliability at the state level, and this trend continued in the first half of 2015. Sprint ranked first (outright or shared) on the Network Reliability Index in three states, a jump from one state in second-half testing. And as you’ll see below, Sprint also showed improved reliability results in metro testing.

While Sprint’s tally of State RootScore Awards decreased since the previous test period, falling from 32 to 25, the network again outperformed T-Mobile in state-level testing. Sprint earned 15 State Text RootScore Awards (all shared) and seven State Call RootScore Awards (one outright), but the network still didn’t manage to win State RootScore Awards for overall, speed, or data performance. However, its improved reliability at both the state and metro levels suggests that Sprint is heading in a positive direction.

T-Mobile falls behind

As you’ll see below, T-Mobile made a strong showing in metro testing. However, at the state level, the network experienced difficulties. This is a trend we’ve witnessed before: T-Mobile continues to improve within the metros we test, but its network upgrades have yet to translate into significant performance gains at state or national levels.

In the second half of 2014, T-Mobile earned a share of three State RootScore Awards: one Call RootScore Award and two Text RootScore Awards, as well as a shared first-place finish on the Network Speed Index. In this test period, though, T-Mobile was ultimately shut out at the state level. However, with T-Mobile’s LTE footprint gaining maturity and its reliability improving in metro testing, it will be interesting to see if T-Mobile’s improvements extend beyond metro areas heading into 2016.

Verizon leading at the state level

Verizon excelled in state-level testing, leading our tally of State RootScore Awards by a wide margin. Similar to our previous report, Verizon won or shared the Overall RootScore Award in 47 of 50 states. Verizon’s next-closest competitor, AT&T, won or shared 12 state-level Overall RootScore Awards. Neither Sprint nor T-Mobile won or shared any Overall RootScore Awards at the state level. While Verizon’s tally of Text RootScore Awards at the state level ranked second to that of AT&T, Verizon did improve its total in the category compared to second-half testing.

Looking across all six performance categories shows just how impressive Verizon was at the state level. Evaluating six categories of performance across 50 states equals 300 total network comparisons. Verizon won or tied for first in an incredible 253 out of 300 opportunities. Though Verizon’s tally of awards actually fell by four compared to second-half testing, consider that the second-highest tally of State RootScore Awards was 95, earned by AT&T.
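The 300-comparison figure above is straightforward arithmetic, and Verizon's tally translates into a striking win rate (all numbers here come from the report):

```python
# Verizon's state-level dominance expressed as a simple win rate.
# Category count and award tallies are taken straight from the report.
categories = 6
states = 50
total_comparisons = categories * states   # 300 network comparisons
verizon_firsts = 253                      # won or tied for first

print(total_comparisons)                            # 300
print(f"{verizon_firsts / total_comparisons:.0%}")  # 84%
```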

State results in perspective

As we noted in the introduction, to give you a comprehensive view of mobile network performance, we test much more than just performance at the state level. We also test to see how the networks compare within the 125 most populous metros and across the breadth of the US itself.

Our testing methodology is unique to the different spaces and challenges found within each of these different areas (metro, state, and nation). Just because a network performed well in our state testing doesn’t necessarily mean that it will also be the strongest performer when looking at metros or for the entirety of the US itself. For more on our testing methodology, please see the FAQ section within this report or the methodology section on our website.

Metro overview


Major metropolitan areas are much more than just city centers. They also include residential suburbs, business districts, recreational areas, and the roadways that connect them. Whether they’re relaxing at home, heading to work, or walking across town, consumers expect strong coverage and reliable mobile performance.

Mobile performance at the metro level most likely has the greatest impact on your daily mobile life. Although you are certainly part of a state and the US itself, the majority of your time is probably spent within a particular metro. Our Metro RootScore Reports give you a detailed look at how the networks compare across six categories that are important to the consumer mobile experience: Overall performance, Network Speed performance, Network Reliability performance, Data performance, Call performance, and Text performance.

Keep in mind that while call performance shows some differentiation among the networks, call reliability is generally outstanding: a call failure rate of 2% still only represents problems in 1 out of every 50 calls. Because we understand that speed is much more than just the maximum speed that a network might hit temporarily, we always test download and upload speed as well as how quickly a network would let you check email, perform typical web or app tasks, and send or receive texts.
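The failure-rate point above is worth making concrete; the monthly call volume in this sketch is an illustrative assumption, not report data:

```python
# A 2% call failure rate still means only 1 problem call in every 50.
FAILURE_RATE = 0.02
print(1 / FAILURE_RATE)     # 50.0 -> one failure per 50 calls, on average

# Over 200 calls a month (an illustrative figure), that would be:
print(200 * FAILURE_RATE)   # 4.0 failed calls
```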

Our scientific testing of the four major carriers is well established, with multiple years and tens of millions of tests completed. We’ve brought together individual carrier highlights, detailed performance information for the first half of 2015, and comparisons to the second half of 2014 to give you a sense of potential performance trends.

We’ve provided network-by-network summaries of performance, and within each network section, we’ll show how long it would take you to download an episode of your favorite television show. Read on for in-depth performance information for each carrier, including trend information and individual highlights for each network.
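For a rough sense of what those download-time comparisons involve, the underlying arithmetic is simple; note that the 250 MB episode size here is an assumed figure for illustration, not one from the report:

```python
# Time to download a TV episode at different median download speeds.
# The 250 MB episode size is an assumption for illustration only.
EPISODE_MB = 250
EPISODE_MEGABITS = EPISODE_MB * 8   # 1 byte = 8 bits

for mbps in (5, 10, 20):            # median speed tiers discussed in this report
    minutes = EPISODE_MEGABITS / mbps / 60
    print(f"{mbps:>2} Mbps -> {minutes:.1f} minutes")
```

At these speeds the same episode takes roughly 6.7, 3.3, and 1.7 minutes, which is why the difference between speed tiers is so noticeable in daily use.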

AT&T highlights

In our 2nd Half 2014 Mobile Network Performance Review, AT&T earned a RootScore Award tally that trailed only that of Verizon. Yet with Sprint and T-Mobile garnering well-deserved attention for their improvements and Verizon stealing headlines as the top performer, AT&T’s strong 2014 metro results were perhaps unjustifiably squeezed out of the fanfare. The reason AT&T couldn’t match the improvements of Sprint and T-Mobile was simple: AT&T was already a much stronger performer, leaving far less room to improve.

At the midway point of 2015, AT&T's network is once again a strong number two, but it appears that AT&T isn’t content to rest on its laurels and stay in this position. Indeed, AT&T made significant improvements during first-half testing, earning an additional 53 RootScore Awards compared to the second half of 2014, which was the largest gain in number of awards for any carrier. As another mark of excellence in the first half of 2015, it’s worth noting that AT&T also earned distinction as the only carrier to improve its award tally in every single test category.

The primary drivers behind AT&T’s increased award total were its strong performances in our Overall Performance and Network Speed categories. AT&T ranked first (outright or tied) on the Network Speed Index in 53 metros in this round of testing, a jump from 39 in the second half of 2014. More importantly for AT&T subscribers, AT&T won or shared 68 Overall RootScore Awards in first-half testing, a substantial lift from AT&T’s 51 Overall RootScore Awards in the previous test cycle, and a sign of AT&T’s continued progress across all test categories.

Reliability recap

Continuing a trend we’ve seen in past reports, AT&T showed strong reliability across our data, call, and text testing in the first half of 2015. AT&T finished first (outright or shared) on the Network Reliability Index in 81 metros in this round of testing, far surpassing the totals earned by Sprint (30) and T-Mobile (31), and trailing only that of Verizon (117). Historically, AT&T has delivered strong reliability results in metro areas, and the song remained the same in the first half of 2015, where AT&T’s call and data reliability results were outstanding. In fact, AT&T achieved our mark of excellent call reliability in every metro we visited; AT&T recorded both blocked and dropped call rates below 2% in all 125 markets.

Our data reliability testing looks at the two hallmarks of your experience: 1) can you connect to the network and 2) can you then stay connected until you’re done with what you want to do? We use our web/app testing as a proxy for overall data reliability; we use a high bar in our reliability testing and look for networks to offer at least a 97% success rate in our web/app testing as a mark of excellent data reliability. This 97% threshold reflects performance that would pose little to no noticeable disruptions in your everyday mobile life.
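The 97% threshold described above is just a success rate applied to test samples. As a minimal sketch (the sample counts below are illustrative, not report data):

```python
# How the 97% web/app success threshold might be applied to raw samples.
# The sample counts below are illustrative, not figures from the report.
def success_rate(successes: int, attempts: int) -> float:
    """Fraction of web/app tasks that completed successfully."""
    return successes / attempts

THRESHOLD = 0.97   # the report's mark of excellent data reliability

rate = success_rate(successes=4_880, attempts=5_000)
print(f"{rate:.1%}")       # 97.6%
print(rate >= THRESHOLD)   # True -> meets the mark of excellence
```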

AT&T’s data reliability results were excellent, with AT&T establishing an initial web/app connection at rates of at least 97% in 123 of the 125 markets we tested. Although Sprint and T-Mobile have each made significant strides in data reliability testing since 2014, the numbers from this test period show that AT&T remains on another level. AT&T’s results were outstanding compared to those of Sprint and T-Mobile and on a relative par with those of Verizon.

Establishing an initial connection is paramount to a good consumer experience, but the ability to stay connected until tasks are finished is perhaps even more important. In that respect, AT&T subscribers shouldn’t be disappointed, as AT&T stayed connected at rates of 97% or higher in all 125 metro areas we tested.


Speed specifics

Fast speeds in metro areas have become a hallmark of AT&T’s success. In our 2nd Half 2014 Mobile Network Performance Review, we noted AT&T’s improved speed performances in metro testing, and the good news for AT&T subscribers continued in this test period. AT&T improved its tally of first-place (outright or shared) finishes on the Network Speed Index significantly, jumping from 39 in the second half of 2014 to 53 in this round of testing, a total trailing only that of Verizon (70).

What’s more, AT&T’s median download speeds were fast and consistent with the speeds we found in the second half of 2014. AT&T reached the 10-20 Mbps median range in 79 markets once again, while recording median download speeds faster than 20 Mbps in 11 markets.

Though AT&T has shown improved—and fast—speed results, it’s worth noting that AT&T’s topline speeds (those exceeding 20 Mbps) weren’t found in nearly as many markets as those of T-Mobile or Verizon; T-Mobile achieved median download speeds in excess of 20 Mbps in 45 markets and Verizon did so in 51. For more information on what speeds mean in real-world terms, check out our speed chart.

Takeaway
With strong reliability, fast speeds, and an improved total of Overall RootScore Awards in metro testing, AT&T could narrow the gap with Verizon even further as we move into the second half of 2015 and beyond.

Sprint highlights

Sprint continues to improve and expand its LTE and Spark™ services across metro markets nationwide, and it’s clear that Sprint’s efforts are paying dividends for consumers. In our last report, we noted that Sprint made significant strides in several areas, notably to its reliability and speed performances in metro areas. This theme continued in 2015: Sprint showed considerable improvement in data reliability testing and increased its number of markets in the faster speed tiers.

With these improvements to reliability and speed, Sprint improved its tally of RootScore Awards by 45 (outright or shared) since the second half of 2014, jumping significantly from 135 to 180. The primary driver of Sprint’s increase in awards was the text category, in which Sprint won or shared 77 Text RootScore Awards, a considerable bump from 41 in second-half testing. Sprint also earned significantly more Overall RootScore Awards in this test period, improving from two in the second half of 2014 to seven metros (all shared). And in one market—Dayton, OH—Sprint won outright or shared awards in every single test category, a first for the network.

While Sprint’s metro award tally ranked fourth among all carriers, Sprint’s increased total of awards (outright or shared) was second only to that of AT&T, which improved by 53 RootScore Awards.

Reliability recap

Sprint’s improved award tally is impressive and great news for its subscribers, but it doesn’t tell the whole story of the network’s improvements since the second half of 2014. As we noted before, getting connected and staying connected are the two hallmarks of a positive mobile experience; in this regard, Sprint showed significant progress. While Sprint’s total of first-place finishes on the Network Reliability Index remained similar to that from our previous test period, Sprint’s data reliability results improved considerably.

Indeed, Sprint surpassed our 97% threshold of excellence for connecting to the network in 106 metros, a sizable leap from 77 in the previous test period. Perhaps more importantly for Sprint subscribers, Sprint exceeded our 97% threshold of excellence for staying connected in 119 markets, an increase from 108 in second-half testing.

Sprint’s call reliability results were even more impressive. Sprint recorded blocked call rates below 2% in each of the 125 metro markets we tested, while achieving dropped call rates below 2% in 122 markets. Sprint’s call reliability results surpassed those of T-Mobile and were on par with those of AT&T and Verizon.

Although the data reliability performances from AT&T and Verizon were on another level from those of Sprint and T-Mobile, Sprint’s results in web/app testing far surpassed T-Mobile’s and were perhaps a sign of even further progress to follow.


Speed specifics

In this round of testing, Sprint’s speeds at both the slower and faster ends of the spectrum improved. Specifically, Sprint decreased the number of markets in which it recorded median download speeds between 0-5 Mbps, while registering improvement in the 5-10 and 10-20 Mbps ranges. Putting the two speed tiers together, Sprint recorded median download speeds faster than 5 Mbps in 94 out of 125 metros in this test period, an increase from 66 in second-half testing.

These improved speeds helped Sprint earn four first-place finishes on the Network Speed Index (one outright) in the first half of 2015, compared to zero in the last test period. Although Sprint didn’t record median download speeds above 20 Mbps in any market in this test period, the improved reliability and speed results are good news for consumers, and perhaps a harbinger of even better things to come.

Takeaway
Sprint continues to improve in both speed and reliability, showing considerable progress in data reliability in the first half of 2015, while improving its performance in the faster speed tiers.

T-Mobile highlights

We’ve noted before that T-Mobile has shown fast speeds in metro areas, and that was again the case in the first half of 2015. T-Mobile recorded median download speeds of 20 Mbps or faster in 45 out of the 125 markets we tested, a close second to Verizon’s total of 51. For perspective, consider that AT&T reached topline speeds (those exceeding 20 Mbps) in 11 markets and Sprint did so in none.

T-Mobile improved more than just its speed results, though. T-Mobile’s total of 221 RootScore Awards (won outright or shared) was an increase of 20 compared to the second half of 2014, and T-Mobile’s gains came in categories perhaps most relevant to consumers: network reliability, network speed, and call performance. T-Mobile’s award total in other categories remained relatively consistent with that from our previous test period.

At the end of the day, the story for T-Mobile has revolved around the fast speeds we've found in metro areas, and the network didn’t disappoint in this test period. But as you’ll see below, it might be time to look beyond T-Mobile’s speeds, as the network made strides in data reliability testing since our last test period.

Reliability recap

In the first half of 2015, T-Mobile made improvements to its reliability results. T-Mobile’s tally of first-place finishes (outright or shared) on the Network Reliability Index increased from 23 to 31. T-Mobile also improved its blocked call reliability since second-half testing, recording blocked call rates above 2% in only eight of the 125 markets we visited, an improvement from 15 in second-half testing. Though it’s clear that T-Mobile is gaining momentum in our call testing, its blocked call rates weren't quite as strong as what we saw from the other three networks.

The starkest change we found since the second half of 2014 came in data reliability testing. T-Mobile surpassed our 97% threshold of excellence for making a connection in 62 markets, a notable increase from 43 in the previous test period. T-Mobile also improved its ability to stay connected during web/app testing, staying connected at rates of at least 97% in 111 metros, a nice improvement from 92 markets in the second half of 2014.

T-Mobile has a long way to go to catch up with AT&T and Verizon in terms of metro award tallies. However, if T-Mobile can continue to improve its data reliability while maintaining fast data speeds, the network could narrow the gap sooner rather than later.


Speed specifics

In our 2nd Half 2014 Mobile Network Performance Review, we noted that T-Mobile is fast and getting even faster. The story halfway through 2015 remains the same: T-Mobile earned 42 first-place finishes (outright or shared) on the Network Speed Index, an increase from 32 in the previous test period, and a total that is getting closer to that of AT&T (53).

If you’re interested in fast speeds, consider this: T-Mobile recorded median download speeds faster than 10 Mbps in 102 of the 125 metro areas we tested, with speeds faster than 20 Mbps in 45 of those markets. T-Mobile’s tally of topline speeds was second only to that of Verizon (51), but far surpassed those of AT&T (11) and Sprint (0).

Takeaway
In addition to the fast speeds we found from T-Mobile, the network improved its results in data reliability testing. Getting connected and staying connected are at the core of a positive consumer experience, and when you add fast speeds to the mix, the results could be a game changer. We'll definitely have our eye on T-Mobile’s data reliability results in the coming months.

Verizon highlights

Verizon’s performance across metro areas in the first half of 2015 was similar to what we noted in our previous report: Verizon’s numbers tell a story of consistency, with the network again winning the most RootScore Awards among all carriers in metro testing, while also achieving excellent reliability and speed results.

While Verizon’s tally of RootScore Awards dropped slightly since second-half testing, moving from 537 to 512 (outright or shared), its total was still far higher than that of its closest competitor, AT&T. The majority of Verizon’s award decline was in the text category, the only category in which Verizon didn’t win the most RootScore Awards in metro testing.

Verizon’s results in all other categories, however, were remarkable. Consider this: Of the 125 markets we tested in the first half of 2015, Verizon won or shared 110 Overall RootScore Awards and 92 Data RootScore Awards, while earning 117 first-place finishes on the Network Reliability Index and 70 first-place finishes on the Network Speed Index. To put this in context, the next closest competitor in the Overall Performance category was AT&T, which won outright or shared 68 Overall RootScore Awards.

Reliability recap

Verizon’s reliability performance across metros in the first half of 2015 was outstanding. Not only did the network earn a staggering 117 first-place finishes (outright or shared) on the Network Reliability Index, but a deeper dive into the data also shows that Verizon’s performance across call and data reliability testing was excellent. Indeed, Verizon recorded blocked call rates below 2% in all 125 metros and dropped call rates below 2% in 124 metros. In fact, Verizon's blocked call rates were below 1% in all 125 metros we visited.

Verizon’s data reliability results were also stellar. Verizon surpassed the 97% threshold of excellence for both making a connection and staying connected in each of the 125 metros we tested. The only network with results similar to those of Verizon was AT&T, which reached the 97% threshold of excellence for making a connection in 123 metros, while staying connected at rates of 97% or better in all 125 metros. While Sprint and T-Mobile each showed improvement in data reliability testing, their results fell short of those of AT&T or Verizon.
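The "97% threshold of excellence" comparison above amounts to counting the metros in which a network's success rate meets the cutoff. A minimal Python sketch, using hypothetical per-metro rates rather than real test data:

```python
# Count metros whose success rate meets a reliability threshold.
# The rates below are hypothetical, not actual carrier results.
def metros_meeting_threshold(rates, threshold=0.97):
    """Return how many per-metro success rates are at or above the threshold."""
    return sum(1 for rate in rates if rate >= threshold)

# Hypothetical per-metro rates for getting connected on one network:
rates = [0.995, 0.981, 0.962, 0.978]
print(metros_meeting_threshold(rates))  # 3 metros at or above 97%
```

The same counting works for either reliability measure, getting connected or staying connected, across all 125 metros.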

verizon-call

Speed specifics

In addition to Verizon’s excellent reliability, Verizon’s network was extremely fast in metro testing. While AT&T, Sprint, and T-Mobile each had at least two markets with median download speeds between 0 and 5 Mbps (Sprint did so in 31 markets), Verizon didn’t record median download speeds in this lower-speed range in a single market. Even more impressive, Verizon recorded median download speeds below 10 Mbps in only four markets. To put this in perspective, Sprint recorded median download speeds below 10 Mbps in 92 metros, AT&T did so in 35 markets, and T-Mobile did so in 23 markets.

We mentioned before that T-Mobile was fast, having recorded median download speeds faster than 10 Mbps in 102 metros; Verizon was even faster, achieving speeds of 10 Mbps or faster in 121 markets. Verizon also stepped up its topline speed results: Verizon increased its number of markets with median download speeds of at least 20 Mbps to 51 in this round of testing, an increase from 40 in the last test period. T-Mobile, meanwhile, had 45 markets with median download speeds of 20 Mbps or faster.

In short, Verizon was both very fast and very reliable across our metro testing. The question remains: Will this continue to be the case in the second half of 2015 and into 2016, or will the other networks catch up?

verizon-speed-buckets
Takeaway
Considering how dominant Verizon was in the second half of 2014 and into 2015, it’s fair to wonder how much room the network has to improve. But consider this: While AT&T, Sprint, and T-Mobile have all made progress compared to second-half testing, Verizon still led the pack in five out of six categories, many by a considerable margin.

Metros in perspective

As we noted in the introduction, to give you a comprehensive view of mobile network performance, we test much more than just performance at the metro level. We also test to see how the networks compare within each state and across the breadth of the US itself.

Our testing methodology is unique to the different spaces and challenges found within each of these different areas (metro, state, and nation). Just because a network performed well in our metro testing doesn’t necessarily mean that it will also be the strongest performer when looking at a state or for the entirety of the US. For more on our testing methodology, please see the FAQ section within this report or the methodology section on our website.

Methodology FAQ

We’ve included below some of the most common questions about what we test, how we test, and why we test. For even more information, please visit our methodology page.

What’s a RootScore Report?

RootScore Reports are part of a broad suite of free tools that RootMetrics offers to help you make more informed mobile decisions and improve the quality of your mobile experience. RootScore Reports provide an in-depth, independent, and consumer-focused look at network performance in the United States, Canada, and the United Kingdom.

Why do you create RootScore Reports?

We create RootScore Reports to improve mobile networks for you, the consumer. We’re a consumer-first company that believes better mobile decisions and improved network performance are built from accurate, unbiased, and consumer-focused measurements of how you actually experience a mobile network on a daily basis. To learn more about how we’ve set the standard for mobile performance testing, visit our standards page.

What exactly is a RootScore?

We rely on our smartphones just as much as you do, and we like easy-to-understand marks of performance. That’s why we do all the heavy, in-depth testing and then distill everything down in the simplest way possible. RootScores translate thousands or even millions of complex data points into clear marks of performance that are designed to reflect a consumer’s experience of a network. It’s simple: the higher the score, the better the performance.

A good Overall RootScore means a good user experience. It’s that simple. Using an educational analogy, think of RootScores like you would a final grade in a semester-long course: scores approaching the upper limit (100) indicate extraordinary performance, like receiving an “A” grade at the end of the semester. Scores approaching the lower limit (0) represent network performance that would be clearly unacceptable for everyday consumer usage, similar to receiving a poor grade at the end of the semester.

Just as a final grade in a semester-long course is a function of performance across multiple exams, no single test determines RootScore results for any performance category; RootScores are calculated from multiple tests that are weighted according to their impact on a user’s experience. RootScore Reports give you a detailed look at how the networks compare across six categories that are important to the consumer mobile experience.

Keep in mind that not all mobile users are created equally: Do you use your smartphone mainly for uploading pictures or streaming music? Our Data RootScore might be more important to you than the other categories. After all, everyone is different, with different mobile needs. That’s why we provide test summaries across multiple categories and across multiple areas of your daily life.
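As described above, a RootScore is built from multiple weighted tests rather than any single result. The Python sketch below shows the general shape of such a weighted average; the scores and weights are illustrative assumptions, not RootMetrics' actual categories or formula.

```python
# A weighted-average score in the spirit of the description above.
# The (score, weight) pairs are hypothetical, not RootMetrics' real inputs.
def weighted_score(results):
    """Combine per-test scores (0-100) using weights that reflect
    each test's impact on the user experience."""
    total_weight = sum(weight for _, weight in results)
    return sum(score * weight for score, weight in results) / total_weight

# Hypothetical test results: (score out of 100, relative weight)
tests = [(95.0, 0.5), (88.0, 0.3), (99.0, 0.2)]
print(round(weighted_score(tests), 1))  # a single 0-100 summary score
```

Because the result stays on the same 0-100 scale as its inputs, it reads like the "final grade" in the course analogy: one number summarizing many weighted exams.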

How do you measure network reliability and speed?

Reliability and speed are the most important aspects of your mobile experience and have always been the two fundamental components of our RootScore. To show you how the networks perform in these key areas of mobile usage, our Network Reliability and Network Speed Indices offer a clear, summary view of network performance across all test categories (data, call, and text). These indices illuminate network performance across the entirety of the mobile experience. To learn more about how we’ve set the standard for mobile performance testing, visit our standards page.

Can I just roll up results from your metro testing to determine which network is best at the state or nation level?

That’s a good, important question. Each type of RootScore Report requires its own unique test sampling scheme. After all, California is made up of much more than just Los Angeles or San Francisco. Washington State is more than Seattle. You get the idea.

To paint a complete picture of mobile network performance at the state level, we test much more than just the major urban areas. We also collect test samples from non-urban locations to provide a complete look at performance for the broader area. The same theory applies to our national results: each area of testing (Metro, State, and Nation) must include samples representative of that entire level.

What does this mean? Although a network might excel in our Metro RootScore testing, it does not necessarily mean that the same network will do well at the state or national levels. Think about it this way: It’s possible for one network to win the Overall RootScore Award in every Metro RootScore Report within a particular state, yet not win the State Overall RootScore Award due to substantially weak performance outside the large metro areas.

The bottom line: The different areas we test provide complementary—but not identical—looks at mobile network performance. We believe that rather than just give you a look at mobile network performance across one of these levels, you deserve the full picture. After all, you are part of a metro, a state, and the nation. You should know how the networks perform across each of these areas. Using our Metro, State, and National RootScore Reports together gives you a comprehensive view of mobile network performance. For more information about how and where we test network performance, visit our methodology page.

What do you test?

We test the activities that you use your smartphone for on a daily basis, like making calls, sending email, browsing webpages, using apps, and sending texts. The most important aspects of your mobile experience are reliability and speed. Our tests look at how reliably you can connect to a network and how reliably you can stay connected to the network once a connection is established. We also test how quickly you can connect to a network and how quickly you can complete your tasks once that connection is established. We apply this testing framework to benchmark network performance across thousands of data, call, and text test samples.

Our testing encompasses a wide array of real-world situations that people might experience while using their mobile devices. Examples include high and low network load situations, variations in speed from stationary to freeway, poor to excellent coverage, and indoor to open-air signal situations. We test competing networks head-to-head in these situations to remove bias. For more information about how we test network performance, visit our methodology page.

How does your testing actually reflect a consumer’s mobile experience?

“Consumer experience” is a hot topic in mobile performance reporting. We think that’s great. Other reports like to claim they reflect the “consumer experience.” But for us, consumer experience is more than a buzzword; it’s the guiding star for everything we do. Among other things, truly capturing the consumer experience means that we:

Test with the same smartphones you use:
We only use unmodified, off-the-shelf smartphones purchased from mobile network operator stores.

Test in the same places you use your smartphone:
We test indoors, outdoors, while driving, in small towns and in major metro areas.

Test the same ways you use your smartphone:
We test network reliability and speed during data, call, and text performance.

Test at the same times you use your smartphone:
We test 24/7, weighting results for periods when mobile usage is typically at its highest.

Wouldn’t test results from actual consumers (or “real people”) be a better representation of an actual consumer’s experience?

Test results from actual users are certainly “real,” but they are simply not a good or fair way to compare performance among the national carriers. Through our scientific testing methodology, we test all the networks at the same time and place using actual consumer devices that are benchmarked and quality controlled to show each network consistently in the best light possible. Anecdotal “crowd-sourced” approaches that rely on a grab bag of random user tests taken at different times and places for different carriers are neither fair nor a sound basis for network comparisons.

Imagine if all user test results from one network were submitted late at night during excellent network conditions outdoors near a cell tower, while another network’s user results were all submitted indoors during peak-usage times in sub-optimal network conditions. It would not be fair or reasonable to compare the networks based on these two sets of results. Therefore, we ensure that all RootMetrics testing is conducted in a rigorous, scientific manner. It is also worth noting that the challenges with crowd-sourced data are not solved by simply collecting more crowd-sourced data; poor data sources for comparing networks are poor no matter how much data is collected.

How many tests do you perform?

We’re thorough with our testing. For the 2015 First Half Mobile Network Performance Review, we collected over 6 million total test samples.

Where do you test and how do you decide where to test?

Everything we do is based on objectivity. The boundaries of the areas we test are defined by governments and official agencies—not by RootMetrics. For our State and National RootScore studies, we test data, call, and text performance indoors and while driving in United States Census Places in all 50 states; census places are used to designate areas within each state where people live. Samples are collected in the 125 most populous metropolitan markets across the United States, as defined by the US Census Bureau, and we also test network data performance within the 50 busiest US Airports, as designated by the FAA. It’s important to note that our State and National RootScore results are weighted by population size, so the larger, more populous cities do contribute quite a bit more to our calculations than less populated, more rural towns.

Is it fair to test networks in rural areas where the networks have limited coverage?

Yes. We test the national networks with roaming available to mirror a consumer's experience. We believe consumers expect and deserve good mobile service as they drive on highways across states or when they’re at home or visiting medium-size or small towns. Since the national networks market to consumers based on the strength of their networks at metro, state, and national levels, it is certainly fair to give consumers a view of how the networks perform across all these areas of testing. Since our State and National tests are population weighted, we don’t test “too much rural.” Rather, we aim to test and speak to network performance for the entire population of each state and for the United States, not just for those who live and work in large metro areas.
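Population weighting, as described above, simply gives each tested place influence proportional to the number of people who live there. A minimal Python sketch with hypothetical places and scores:

```python
# Population-weighted aggregation, as described above.
# The places, scores, and populations are hypothetical examples.
def population_weighted(place_scores):
    """place_scores: list of (score, population) pairs.
    Returns a score where populous places count proportionally more."""
    total_population = sum(pop for _, pop in place_scores)
    return sum(score * pop for score, pop in place_scores) / total_population

# Hypothetical: a large metro scores 90, a small town scores 60.
places = [(90.0, 900_000), (60.0, 100_000)]
print(population_weighted(places))  # the metro dominates the result: 87.0
```

This is why rural samples inform state and national results without overwhelming them: a town of 100,000 contributes one ninth the weight of a metro of 900,000.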

When do you test?

Since mobile networks rapidly evolve, we conduct tests nearly every week of the year in order to provide you with the most current view of mobile network performance. Our data collection periods span all hours of the day and night (weighted more heavily toward typical consumer usage hours), and all days of the week, to establish a comprehensive, temporal view of performance.

We also test in each location for long enough periods to capture important but rare events, so that we accurately characterize a consumer’s experience.

Again, everything we do is designed with objectivity in mind. To prevent bias in our sample collection, RootMetrics utilizes a sampling methodology that randomly selects the indoor locations used for testing; drive testing takes place during travel between these random indoor locations. To measure network performance at the State and National levels, test locations are randomly selected within each state.

Which mobile network operators do you test?

We test AT&T, Sprint, T-Mobile, and Verizon.

What “test equipment” do you use to test the mobile network operators?

That’s easy: we use only unmodified smartphones that are purchased off the shelf at mobile network operator stores, just as you would. We never alter the phones with external antennas or use any other non-standard equipment. You don’t use those things, and neither do we. We also never “root,” jailbreak, or modify the phone’s software in any way. We regularly select from leading Android-based smartphones currently available to consumers. Why Android? It offers the most flexible platform for testing. For more information about how we test network performance, visit our methodology page.

How do you decide which phones to use and how often do you update devices?

We select leading Android-based smartphones for each network available at the time of selection. We generally update the smartphones used for testing twice per calendar year. To ensure consistency of results and to make sure we have time to comprehensively evaluate any potential device, we do not change the phones we use in the middle of a testing period. During the selection process, RootMetrics benchmarks device models to determine the best commercially available phone model from each network in order to capture the best possible user experience on each particular network. Benchmarking models before testing helps remove limitations that can be caused by specific model/network interactions.

Our device selection process mirrors our RootScore Report testing: We use off-the-shelf handsets obtained directly from mobile network operators’ stores; we test in multiple geographic locations covering indoor, outdoor, and driving; and we test data, call, and text performance. We analyze our benchmark results in the same fashion as that of our RootScore analysis in order to select the model for each network that will be used for our next round of reports. For more information about how we test network performance, visit our methodology page.

Do you use the most advanced smartphones available for testing?

We select phones that support the most advanced and widely available network technology for each mobile network operator at the time of selection. During the selection process, we benchmark device models to determine the best commercially available phone model from each network in order to capture the best possible, widely available consumer experience on each particular network.

We follow a semi-annual testing schedule, visiting locations once during the first half of the year and once during the second half of the year. To ensure consistency of results, we do not change test devices once a testing period has started. Therefore, it is possible for a carrier to release new technology in the middle of a testing period that we are unable to include until our benchmarking procedures have been completed while we select phones for the next test period.

For example, in the first half of 2015, one carrier was in the process of rolling out the 700 MHz spectrum band in certain markets and releasing phones that supported this additional spectrum. The additional spectrum was not widely available to consumers at that time because it had been released in limited markets and the devices were new. With the expansion of this rollout and support on additional devices, our testing in the second half of 2015 will cover this new 700 MHz deployment.

Do you use different phones to test a network’s data, call, or text performance?

Absolutely not. We test data, call, and text performance with the same device. That’s what happens in real life, so that’s how we evaluate performance. In other words, we don’t use one phone to test a network’s data performance and another phone to test the same network’s call or text performance. That might seem like a strange thing to point out, but other companies test performance with separate devices for each category. We know that’s a bad idea: our testing has found reliability problems that appear only when a phone must move directly between data, call, and text services, problems that isolated, per-category testing misses. Again, this is how you use your smartphone in real life, so that’s how we test. For more information about how we test network performance, visit our methodology page.

A flexible, evolutionary framework

RootMetrics regularly re-examines our testing and scoring methodologies to assure that they continue to reflect your experience as accurately as possible. When advances in mobile technology alter the landscape or consumer behavior changes markedly, we adjust our methodologies and scoring accordingly. Changes are made so that we continue to capture the true consumer experience.

The RootMetrics First Half 2015 US Mobile Network Performance Review © 2015 RootMetrics. All rights reserved. RootScore® is a trademark of RootMetrics. Root Wireless, Inc. (aka RootMetrics) owns this RootScore® report, including the associated intellectual property rights and performance data, scoring and similar information about wireless networks, carriers, products, and services in the report. RootScore® reports, including all performance data, scoring and similar information in the reports, may not be reproduced, distributed, published, linked to or otherwise referenced for any advertising, promotional or commercial purpose without RootMetrics’ prior written consent. If RootMetrics provides such written consent, the approved use must prominently incorporate a statement that uses information and terms similar to the following (as approved by RootMetrics):

Source: The RootMetrics First Half 2015 US Mobile Network Performance Review ©2015 RootMetrics. All rights reserved. RootScore® content and performance data, scoring and similar information is owned by RootMetrics and may not be reproduced, distributed, published, linked to or otherwise used or referenced for any advertising, promotional or commercial purpose without RootMetrics’ prior written consent.

*August 20, 2015 - In a previous version of the report, Verizon’s time to download a 45-minute TV show was erroneously stated to be 2.8 minutes. We regret the error.

speed-chart