Broadband picture may not be so bleak

A new study disputes the claim that Internet data rates in the U.S. are only half as high as advertised; study’s authors call for better data.


In March, the Federal Communications Commission released its National Broadband Plan, in which it reported that “the actual download speed experienced on broadband connections in American households is approximately 40-50% of the advertised ‘up to’ speed to which they subscribe.” That finding, which the FCC had previously cited, caused some consternation among bloggers and op-ed writers, to say nothing of broadband subscribers.

But a new study by MIT researchers calls it into question. Most of the common methods for measuring Internet data rates, the researchers conclude, underestimate the speed of the so-called access network — the part of the Internet that Internet service providers control. The number of devices accessing a home wireless network, the internal settings of a home computer, and the location of the test servers sending the computer data can all affect measurements of broadband speed.

The researchers don’t cast their findings as supporting any particular policy positions. But they do argue that everyone with an interest in the quality of broadband access — governments, service providers, subscribers, and market analysts — should be more precise about what they’re measuring and how. “If you are doing measurements, and you want to look at data to support whatever your policy position is, these are the things that you need to be careful of,” says Steve Bauer, the technical lead on the MIT Internet Traffic Analysis Study (MITAS). “For me, the point of the paper is to improve the understanding of the data that’s informing those processes.”

In addition to Bauer, the MITAS team includes William Lehr, an economist, and David Clark, a senior research scientist at the Computer Science and Artificial Intelligence Laboratory who from 1981 to 1989 was the Internet’s chief protocol architect. The researchers analyzed a half-dozen different systems for measuring the speed of Internet connections, from free applications on popular websites to commercial software licensed by most major Internet service providers (ISPs). Both MITAS and MIT’s Communications Futures Program, which also supported the study, receive funding from several major telecommunications companies.

In each case that the study examined, the underestimation of the access networks’ speed had a different cause. The study that the FCC relied upon, for instance, analyzed data for broadband subscribers with different “tiers of service”: Subscribers paid differing fees for differing data rates. But the analysts didn’t know which data corresponded to which tier of service, so they assumed that the subscription tier could be inferred from the maximum measured rate. The MITAS researchers show that, in fact, the subscribers in lower tiers sometimes ended up getting higher data rates than they had paid for. In the study cited by the FCC, exceptionally good service for a low tier may have been misclassified as exceptionally bad service for a higher tier.
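To make the failure mode concrete, here is a minimal sketch of that inference step in Python, with made-up tiers and rates; none of these numbers come from the study itself:

```python
# Hypothetical tier inference, mimicking the method of the FCC-cited study:
# assume a line's subscription tier is the smallest advertised "up to"
# rate that covers the highest rate ever measured on it.
ADVERTISED_TIERS_MBPS = [3, 6, 12]  # made-up tiers

def infer_tier(peak_measured_mbps):
    for tier in ADVERTISED_TIERS_MBPS:
        if peak_measured_mbps <= tier:
            return tier
    return ADVERTISED_TIERS_MBPS[-1]

# A 3 Mbit/s subscriber whose line briefly burst to 7 Mbit/s:
subscribed, peak, typical = 3, 7.0, 3.1
inferred = infer_tier(peak)  # misclassified into the 12 Mbit/s tier
print(f"apparent: {typical / inferred:.0%} of advertised")    # ~26% -- looks dismal
print(f"actual:   {typical / subscribed:.0%} of advertised")  # ~103% -- better than paid for
```

The subscriber who got more than they paid for drags down the apparent performance of the tier they were misfiled into, which is exactly the bias the MITAS researchers describe.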

In other tests, inaccurately low measurements were the result of an idiosyncrasy of the Transmission Control Protocol (TCP), which governs how Internet-connected computers exchange data. With TCP, the receiving computer advertises how much data it is willing to accept at any point in time, a threshold known as the receive window; the sending computer won’t exceed it. For some common computer operating systems, however, the default receive window is simply too small.
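The arithmetic behind this is simple: a sender can have at most one receive window’s worth of data in flight per round trip, so a single connection’s throughput is capped at roughly the window size divided by the round-trip time, no matter how fast the access link is. A quick sketch (the 64 KB window is an illustrative, historically common default, not a figure from the paper):

```python
# Per-connection TCP throughput ceiling: receive window / round-trip time.
def tcp_throughput_cap_mbps(window_bytes, rtt_ms):
    return window_bytes * 8 / (rtt_ms / 1000) / 1e6

for rtt in (20, 50, 100):
    cap = tcp_throughput_cap_mbps(64 * 1024, rtt)  # 64 KB receive window
    print(f"RTT {rtt:3d} ms -> at most {cap:4.1f} Mbit/s per connection")
# RTT  20 ms -> at most 26.2 Mbit/s
# RTT  50 ms -> at most 10.5 Mbit/s
# RTT 100 ms -> at most  5.2 Mbit/s
```

On a 20 Mbit/s access link, a single connection with that default window and a 100 ms round trip would measure barely a quarter of the line’s true capacity.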

In practice, many applications get around this constraint by opening multiple TCP connections at once. But if an Internet speed test is designed to open only one TCP connection between two computers, the computers can’t exchange nearly as much data as they would if they opened multiple connections. Their data rates end up looking artificially low.
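A rough sketch of why multi-connection tests report higher numbers, fetching parallel HTTP byte ranges over separate connections; the URL is hypothetical, and a real speed test would be more careful about warm-up, timing, and server support for range requests:

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen

URL = "http://example.com/100MB.bin"  # hypothetical large test file
CHUNK = 10 * 1024 * 1024              # 10 MB per connection

def fetch_range(i):
    # Each worker opens its own TCP connection for one byte range.
    req = Request(URL, headers={"Range": f"bytes={i * CHUNK}-{(i + 1) * CHUNK - 1}"})
    return len(urlopen(req).read())

def measure_mbps(n_connections):
    start = time.time()
    with ThreadPoolExecutor(max_workers=n_connections) as pool:
        total_bytes = sum(pool.map(fetch_range, range(n_connections)))
    return total_bytes * 8 / (time.time() - start) / 1e6

print(f"1 connection:  {measure_mbps(1):.1f} Mbit/s")
print(f"4 connections: {measure_mbps(4):.1f} Mbit/s")  # typically much closer to the line rate
```

Each connection is subject to its own window limit, so four of them can keep roughly four times as much data in flight.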

In yet another case, Bauer was running a popular speed test on his own computer. Much of the time, he was getting rates close to those advertised by his ISP; but one afternoon, the rate fell precipitously. For days, the test had been pairing Bauer’s computer in Cambridge with a test server in New York. But on the afternoon in question, the New York server was overburdened with other requests, so it redirected Bauer to the nearest free server it could find — in Amsterdam. The long sequence of links, including a transatlantic link, between his computer and the test server probably explains the difference in data rates, Bauer says. His ISP’s access network may not have been any more congested than it had been during the previous tests.
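A sketch of the sanity check that would catch this: before trusting a throughput number, look at the round-trip time to the test server the tool actually selected. The hostnames here are hypothetical:

```python
import socket
import time

SERVERS = ["ny.speedtest.example.net", "ams.speedtest.example.net"]  # hypothetical

def rtt_ms(host, port=80, timeout=2.0):
    # Rough RTT estimate: time to complete a TCP handshake.
    start = time.time()
    with socket.create_connection((host, port), timeout=timeout):
        return (time.time() - start) * 1000

for host in SERVERS:
    try:
        print(f"{host}: {rtt_ms(host):.0f} ms")
    except OSError:
        print(f"{host}: unreachable")
```

A test quietly redirected from a roughly 10 ms server to a roughly 90 ms one will report a much lower rate even if nothing about the access network has changed, for exactly the window-per-round-trip reason described above.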

This points to the difficulty of using a single data rate to characterize a broadband network’s performance, another topic the MITAS researchers address in their paper. “What is it that people care about if they want to compare a metric of merit?” Lehr asks. “If you’re watching lots of movies, you’re concerned about how much data you can transfer in a month and that your connection goes fast enough to keep up with the movie for a couple hours. If you’re playing a game, you care about transferring small amounts of traffic very quickly. Those two kinds of users need different ways of measuring a network.”
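Lehr’s two users can be expressed as two different pass/fail tests on the same connection; the bitrate and latency budget below are illustrative assumptions, not numbers from the paper:

```python
# Two figures of merit for the same connection (illustrative thresholds).
def suits_streaming(sustained_mbps, movie_bitrate_mbps=5.0):
    # A movie watcher needs sustained throughput above the stream's bitrate.
    return sustained_mbps >= movie_bitrate_mbps

def suits_gaming(rtt_ms, latency_budget_ms=100):
    # A gamer needs small updates to arrive within a latency budget.
    return rtt_ms <= latency_budget_ms

link = {"sustained_mbps": 8.0, "rtt_ms": 150}
print("good for movies:", suits_streaming(link["sustained_mbps"]))  # True
print("good for games: ", suits_gaming(link["rtt_ms"]))             # False
```

The same link passes one test and fails the other, which is why no single headline number can characterize it.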

The researchers have submitted their report to both the FCC and the Federal Trade Commission and will present a version of it at the Telecommunications Policy Research Conference in Arlington, Va., in October. “This report from Dave, Steve Bauer, and Bill Lehr is the first comparative study that I’ve seen,” says FCC spokesman Walter Johnson. As Johnson points out — and the MITAS researchers acknowledge in their paper — the FCC is currently in the early stages of a new study that will measure broadband speeds in 10,000 homes, using dedicated hardware that bypasses problems like TCP settings or the limited capacity of home wireless networks. “What we’re doing right now,” Johnson says, “is a follow-up to the broadband plan, recognizing that we need better data.”


Topics: Broadband, Computer Science and Artificial Intelligence Laboratory (CSAIL), Federal Communications Commission (FCC), Internet, Research Laboratory of Electronics

Comments

In my view, if the U.S. broadband network were where it needs to be, you would not have major carrier ISPs offering TV via direct-broadcast satellite. Phone (VoIP), TV, and Internet access all need to be available on one medium, not several. For me, Internet costs approximately $40/mo., satellite TV $55/mo., and phone $30/mo. In Utah (Utopia) I can buy all three for around $100/mo. with 60 Mbit symmetrical Internet access. In Hong Kong I could buy a 1-gigabit broadband feed for $26/mo. U.S. http://www.convergedigest.com/DSL/lastmilearticle.asp?ID=30837&ctgy= Tell me again this country does not have a serious problem.
I subscribe to a 4M line with two other roommates, and the highest download speed is 400K. It usually gets congested. Maybe the growth of data rates depends on the growth of GDP.
tomblanford, the reason 1-gig broadband is only $26 U.S./mo. is that Hong Kong is a dense city and it costs relatively little to bring connections to each customer. Run one mile of fibre in Hong Kong and you have a million customers served and paying for it. If you live in the middle of nowhere in Utah, you've chosen cheap land and sprawl over smart development, and thus it is very expensive to bring high speed to your area via fibre (satellite or wireless is a bit more practical but unfortunately has higher latency and reduced performance, which may be okay for an end user). Your utility bill may sound high, but I can imagine what you paid in mortgage or rent on your home in Utah is tiny compared to that of Hong Kong (which hovers around $1,000/sq ft to buy).
Getting back to my previous concern: in the U.S. we've sprawled out too much, while other countries have focused development around urban centers and transit-oriented development. Thus it is more economical to wire up countries where people live in cities. In the U.S. we've spread ourselves across the entire country and opted for cheap exurbs and a car-based society rather than urban renewal, public transit, and sustainable utility design.