
Measuring and Understanding Broadband:

Speed, Quality and Application


June 8th, 2010
There are a variety of efforts underway that should help the FCC understand the state of
broadband, and the FCC/SamKnows program is merely one of the latest. Although unsolicited,
Ookla recently released Net Index, a free compilation of results broken down by geographic
region, derived from the more than 1.5 billion test results performed using our technology. Tests
performed at our free sites, Speedtest.net and Pingtest.net, are directly utilized, and combined,
these sites provide a free and accurate way to quickly measure every important technical aspect of
a broadband connection: download and upload speeds (throughput), packet loss, jitter, and latency
(ping), the attributes widely accepted as fundamental to a quality Internet
experience.
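As a rough illustration of how the latter two metrics relate, latency and jitter can be derived from a series of ping round-trip times. The sketch below uses mean absolute deviation between consecutive samples as the jitter definition; this is one common convention, not necessarily the exact formula the sites above apply:

```python
from statistics import mean

def ping_stats(rtts_ms):
    """Derive latency (mean round-trip time) and jitter (mean absolute
    difference between consecutive round-trip times) from ping samples."""
    latency = mean(rtts_ms)
    jitter = mean(abs(b - a) for a, b in zip(rtts_ms, rtts_ms[1:]))
    return latency, jitter

# Four hypothetical ping samples, in milliseconds
latency, jitter = ping_stats([30.0, 32.0, 31.0, 35.0])
```

A steady connection shows jitter near zero even when latency is high; it is the variation, not the absolute delay, that disrupts VoIP and gaming.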

That Ookla looks beyond the concept of the speed test and towards true service quality makes us
a unique and important voice in the ongoing education of residential broadband customers, academic
thought leaders, and industry stakeholders.

Before diving into the methodologies of our testing and why measurement matters, let's examine
Ookla's background and address any concern that our free services and data are anything less than
serious technical tools and resources from a profitable, thriving, and sustainable business. Our paid licensing
division counts almost every major Internet Service Provider (ISP) in the world as a client, including
more than 80 percent of the top ISPs in the U.S. plus hundreds internationally. A wide variety of
interested parties all around the globe use our solutions, including the FCC. Our applications and
data already serve a wide range of paying clients in the academic (Harvard, MIT, Stanford), service
(AT&T, Comcast, Cox, Verizon, Vonage), hardware (Nokia, Siemens, Sun, Linksys), and media (Reuters,
Doubleclick, CNN, ESPN) markets.

So how did we become the single largest and most trusted source for broadband measurement?
As founder and CEO, I built Ookla on a foundation that includes 10 years of personal leadership
at Speakeasy, a broadband service provider sold to Best Buy in 2007, and the collective team
experience of 30 years working at a national independent broadband provider. We pioneered a
focus on faster upload speeds, an emphasis on lower latency connections to enable voice-over-IP
(VoIP) and were recognized for our tireless effort to deliver a richer broadband experience for our
customers. This background provides team Ookla with a unique and deep set of experiences with
which to build the applications that allow millions of daily visitors to determine if their providers are
measuring up. We deeply understand that, in the presence of competition, there is tremendous
incentive to deliver more, for less.

With this unique perspective and compounded experience, it's safe to say that we know better than
most that to understand the quality of a connection, you must go beyond simply measuring speed.

Broadband measurement is a highly variable process. For example, testing to and from a server
far away introduces latency that, in turn, means a slower download speed. Testing to a completely
different server demonstrates results that vary based on network distance, all while reflecting
the real-world fact that the Internet itself, beyond an ISP's network, is
unpredictable and mostly unmanageable. For this reason, our Net Index utilizes only those test
results where the server is within 300 miles of the end-user, largely eliminating the significant
latency introduced when transiting multiple hops (i.e., routers) and significantly minimizing the
Internet "middle."
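As a sketch of how such a distance filter can work (the coordinates and field names below are illustrative, not Ookla's actual schema), the great-circle distance between client and server can be computed with the haversine formula and compared against the 300-mile threshold:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def within_index_radius(client, server, max_miles=300):
    """Keep a test result only if client and server are within max_miles."""
    return haversine_miles(client["lat"], client["lon"],
                           server["lat"], server["lon"]) <= max_miles

# Illustrative coordinates: a Seattle client vs. a nearby and a distant server
seattle = {"lat": 47.61, "lon": -122.33}
portland = {"lat": 45.52, "lon": -122.68}
new_york = {"lat": 40.71, "lon": -74.01}
```

A Seattle-to-Portland test (roughly 145 miles) would pass this filter; a Seattle-to-New York test would not.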

Still, people want to test their service quality not just across town, but also to servers around the
world, and quite understandably so, for this is ultimately how a connection is used. We have shifted from
a hyper-local to a daily, global Internet experience, accessing information from servers scattered
all over the world. We understand this basic fact and allow people to select any of 600 servers at
Speedtest.net, knowing that, when they do, those particular results are naturally going to vary
widely. The real-world network context must be considered when evaluating the accuracy of any test
being applied to it.

So with this in mind, it's obvious you must first understand what you're measuring before interpreting
the results. It's complicated: measuring the various aspects of a connection isn't a perfect science,
nor is it as simple as many people would like. Frustration and confusion are the unfortunate side
effects of complex problems and variable measurements, and this is no exception. However, when
this confusion bleeds into and muddies the conversation about the National Broadband Plan, the
risk to the evolution of broadband in the United States is great. What follows is an honest and open
exploration of the methodology, data, and analysis that will provide a framework for next steps of the
broadband revolution happening in America today.
Guiding Principles Of Ookla Broadband Measurement
1. What We Measure

2. How We Measure What We Measure

3. Why We Measure What We Measure

What We Measure
Reference: http://blog.ookla.com/2010/05/14/testing-speed-tests/

We measure the maximum sustainable throughput of a connection at a given location, to and from a
server of the user's choosing. We believe we do this better than anyone else for two reasons:

1. In some cases, other speed tests are not trying to measure this but rather, as an
example, the speed you might see when streaming a video clip or downloading a
PDF via a web browser. Measuring a single file transfer is entirely different from
measuring the maximum sustainable capacity of the connection.

2. Our testing engine is superior for this task due to specific technical reasons we
discuss in our blog.

Remember that, with many variables in play, even two tests run back to back to the same server
can and do vary somewhat. Whether it's an application updating itself in the background or your
DVR downloading an HD trailer for a VoD (Video on Demand) service, occasionally a test result is
wrong, and it's for this reason that we encourage people to test their connection multiple times
and at different times of the day. The results are always correct, but only insofar as they reflect
the potential of the connection at that moment in time. With Net Index, and in all of the broadband
reports we produce, we critically analyze results from the same computer (using a combination of IP
address and cookie to identify each uniquely) in order to mitigate these types of issues.
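A minimal sketch of that per-machine grouping follows, assuming hypothetical record fields and using a median to damp one-off outliers such as a background download skewing a single run; Ookla's actual aggregation rules are more sophisticated than this:

```python
from collections import defaultdict
from statistics import median

def dedupe_results(results):
    """Group raw test results by (IP, cookie) and reduce each machine's
    repeated tests to a single median throughput, damping anomalies."""
    by_machine = defaultdict(list)
    for r in results:
        by_machine[(r["ip"], r["cookie"])].append(r["kbps"])
    return {machine: median(speeds) for machine, speeds in by_machine.items()}

# Hypothetical raw records: one machine tested three times, another once
raw = [
    {"ip": "198.51.100.7", "cookie": "abc", "kbps": 9800},
    {"ip": "198.51.100.7", "cookie": "abc", "kbps": 10100},
    {"ip": "198.51.100.7", "cookie": "abc", "kbps": 2300},  # anomalous run
    {"ip": "203.0.113.9",  "cookie": "xyz", "kbps": 4900},
]
```

In this example the anomalous 2,300 kbps run does not drag down the first machine's reported speed; the median of its three tests is 9,800 kbps.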

How We Measure What We Measure


Reference (again): http://blog.ookla.com/2010/05/14/testing-speed-tests/

For the sake of avoiding significant technical details that can make this process unnecessarily difficult
to comprehend, here is a brief outline of our download testing methodology, once initiated by a client:

1. Small binary files are downloaded from the web server to the client to estimate the
connection speed

2. Based on this result, one of several file sizes is selected to use for the real download
test

3. Up to 8 parallel HTTP threads are employed for the test, based on the estimated
speed
4. Throughput samples are received at up to 30 times per second

5. These samples are then aggregated into 20 slices (each being 5 percent of the
samples)

6. The fastest 10 percent and slowest 30 percent of the slices are then discarded

7. The remaining slices are averaged together to determine the final result

Cache prevention and other technical fail-safes, as well as environmental corrections and remedies,
are employed throughout the testing process. Portions of the test results are discarded to
eliminate ramp-up and other anomalies; this method is the result of intensive testing and provides
optimum accuracy.
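The slicing and trimming in steps 4 through 7 can be sketched as follows. This is a simplified illustration that assumes the sample count divides evenly into 20 slices; the production engine handles edge cases this sketch ignores:

```python
def final_speed(samples):
    """Aggregate raw throughput samples per the outline above: form 20
    slices of 5% each, discard the fastest 10% (2 slices) and slowest
    30% (6 slices), and average the remaining 12 for the final result."""
    size = len(samples) // 20  # samples per 5% slice (remainder ignored)
    slices = sorted(sum(samples[i * size:(i + 1) * size]) / size
                    for i in range(20))
    kept = slices[6:-2]  # drop the slowest 6 and fastest 2 slices
    return sum(kept) / len(kept)
```

Trimming asymmetrically, discarding more of the slow tail than the fast one, biases the result toward sustained capacity rather than momentary dips caused by ramp-up or cross-traffic.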

The test is a true measurement of HTTP throughput between the server and your client. The
server controls packet size and all other TCP parameters, so tuning is performed to maximize
throughput potential.

No method or analysis is left unexplored when it comes to the pursuit of the most accurate result. One
glaring problem with many "state of the Internet" reports is that businesses, schools and other non-
residential locations are included in their results, which creates rankings that are highly misleading
when attempting to determine, as is usually the case, the state of household broadband. Currently
at Net Index, Ookla includes and ranks only those tests that, after multiple sophisticated filters,
are determined to be from a residence. For many years some reports have cited Sandy City, Utah
and Iowa City, Iowa as two of the fastest cities in America. These are the homes of the University
of Utah and the University of Iowa, two campuses that have, as is true at most major universities
now, implemented not only extremely fast Internet connections that aren't realistic choices for most
people at home, but also web caching servers that deliver a lot of content over an Ethernet or fiber
LAN, the speed of which appears to be included in these historical reports.

What you measure is key, and Ookla is careful to ensure it's strictly that measurement that can and
should be used to compare against the headline or promised speeds for residential users that are the
subject of current discussion and debate in the U.S.

Why We Measure What We Measure


Reference: http://blog.ookla.com/2010/05/10/why-take-a-speed-test/

With a firm understanding of what should be measured and why, read more about why you should
join the 40 million people who use our speed and quality tests every month. Our approach also
allows you to understand the true potential of your broadband connection rather than only showing
you what a single application on a single computer can do. This reflects the practical reality of the
connected home where, if it's not multiple applications, it's multiple computers and, increasingly,
a wide variety and number of devices that require Internet connectivity to function. In the popular
plumbing analogy, we measure the equivalent of the maximum flow of water feeding your home
rather than the flow of a single faucet. In terms of measuring and managing broadband for the
purposes of understanding what an ISP is providing you, this is the only method that accurately
reflects the critical components of download and upload speed.

While alternative testing methods do exist and loosely use the term "speed test" to describe their
application, those results are naturally going to vary from ours. This is part of what creates
confusion in the marketplace, leading some analysts and technical professionals to conclude all
speed tests vary widely and, to paraphrase, produce results that are "all over the place" and "can't
be counted on." Ookla firmly believes that, once you understand what we're measuring, you will also
understand why we're measuring it and can be confident that the results you see are absolutely
accurate1 and actionable.

With this better understanding, we again invite you to join the 40 million people who use our speed
and quality tests every month, 40 percent of them new users.

1 In rare circumstances some security software can act as a proxy, delivering the speed test payload from cache,
meaning from the local memory on the client machine. In such cases, users see a result that is obviously highly
unrealistic. We eliminate such bogus records from our reports and over the years have eliminated almost all of these
problems. We don't believe that asking someone to turn off their anti-virus software is a reasonable solution and,
fortunately, the next iteration of our application makes web proxies like these an entirely moot issue.
A Closer Look
1. Inconsistencies

2. Misconceptions

3. Key Considerations

Inconsistencies
As discussed earlier, variance in the questions being asked, not to mention the methods employed
in attempting to answer them, leads to significant inconsistencies. These inconsistencies in turn
lead to misunderstanding and completely false conclusions by trusted authorities on the subject
of broadband performance. Most often, these false or damaging statements are completely
accidental. In other cases, they are convenient straw-man arguments necessary to support foregone
conclusions. In still others, they impact the very nature of the discussion, throw people off the trail of
what is so critical to pursue and, too often, lead to subject fatigue, where people simply begin to lose
interest due to ongoing confusion and heated, dead-end arguments.

It is the responsibility of stakeholders to consider areas where they can uniquely add value, even
if merely to corroborate results. Any contribution has merit and should be aggressively explored
for the value it contains when the very future of a country's broadband infrastructure is at stake.
The Internet represents the single greatest technical achievement in the history of mankind. At the
same time, in relative terms, we are just beginning to discover what a ubiquitous, reliable, high-speed
communications and digital delivery system can do for people, businesses, institutions, and states. The
stakes are very high and critically impactful to our nation's future GDP, to say nothing of the countless
benefits to society as a whole.

Ookla believes that the FCC should be using every resource at its disposal to achieve the aggressive
and worthwhile goals that have been set out. To miss opportunities to make use of a massive data
store that promises untold depth of information regarding the state of the Internet in the United
States is unacceptable, much more so when the alternative carries a multi-million dollar cost and a
questionable track-record when previously deployed in other countries.

Misconceptions
Some may question the validity of the data made freely available by Ookla, going so far as to label it
unscientific (due primarily to self-selection) and therefore useless to policymakers. This vocal minority
goes on to suggest that asking some 10,000 individuals, just 200 per state, to allow a third-party,
foreign-owned company to deploy mysterious devices on their home networks is somehow much
more scientific. Let us explore that claim.

We feel compelled to point out that asking for volunteers willing to add a device to their home
network that is going to monitor everything they do on the Internet will, without a doubt, create a
common demographic and psychographic profile, not a broad-based representation of the Internet.
Here we see self-selection in play in a more pronounced and impactful way than at Speedtest.net,
and with a sample set that is a mere 0.0158 percent of the 65 million unique
U.S. visits to our Speedtest and Pingtest sites in just the past 12 months.

Of great concern, participants will be required to meet certain criteria, which will likely eliminate
key participants in a study such as this. Consider for a moment that one of the
requirements to be a volunteer and accept this device into your home is that you must not be a
heavy user (downloading more than 30 GB of traffic per month2). First-hand experience dictates that
it is precisely these heavy users who compound the challenges of any broadband ISP's network
management, but we would encourage anyone doubting the claim to speak with the operators
at other major carriers. One could argue that finding out exactly what it is these people are up to
should be one of the primary goals in determining whether the infrastructure, packages, and policies
(for starters) are geared toward a workable future.

Finally, we understand the study is to go on for 25 months and, so, if you are likely to move or
change providers in that time, you are also undesirable. This all but disqualifies residential users in
the younger, more mobile demographic of Gen-Y, who are more prone to changes of residence and
service due to economic and personal upheaval. It is these users who realize the heaviest and most
forward-thinking use of broadband service in the U.S. and who will be the key audience served by any
improvements stemming from the FCC's current efforts3.

The concept of science is being liberally applied here. On face value, those claiming this method
vastly superior to the data freely available around both quality and speed, plus the data the FCC
is collecting using the M-Lab and Ookla testing engines, are either well outside their range of
expertise or severely misinformed.

Key Considerations
There are a number of key elements involved in creating and operating a global broadband testing
solution; what follows are a few that set us apart:

1. Volume of Data

Ookla compiles more than a million speed and line quality test results every
day, dating back to September of 2007, with a history of 5 percent compound
growth each month since.

Data collected includes the entire range of technologies: wireline (cable, DSL,
fiber), wireless (WiMAX, microwave), and major mobile devices (iPhone,
Android) across the entire globe.

2 A single Blu-ray movie is between 8 and 12 GB, and a standard DVD is at least 4 GB. Apparently, if you watch
more than one or two movies per week over the Internet, you aren't a material part of the future of broadband in the
United States. If so, the streaming and VoD industry, not to mention the future of television, just took a big hit.

3 Gen-Y moves residence and employer every 12-18 months, on average. http://www.census.gov/population/
www/socdemo/migrate/cal-mig-exp.html & http://www.businessweek.com/investor/content/jun2007/
pi20070624_294649.htm

2. Testing Server Infrastructure

Proposed alternatives for speed and quality testing are limited to just eight
servers, in some cases encouraging ISPs to also host nodes within their
networks, resulting in measurements of strictly the last mile that cannot
account for Internet backbone capacity and congestion.

A severely limited test server infrastructure results in significant latency due
to distance that will dramatically impact the results.

The Ookla server infrastructure has been in place for many years, and new
servers are added almost daily. More than 600 servers are in
service, including a presence in every country recognized by the U.N. Ookla
offers more than 140 servers in the United States alone.

3. Fully Managed, Dedicated Host Infrastructure

Ookla uses rigorous QA measurements to verify servers are performing
accurately, based on daily server statistics, reports from users, and geographic
comparisons to other servers in the area.

4. Compatible, Immediate and Free

Our web-based applications work across all operating systems and all major
browsers.

There are no special requirements for user data collection, such as
mandatory hardware or user profiles.

Pingtest.net and Speedtest.net are free tools that anyone can use.

5. Globally Accepted and Comparable Solution

Because Ookla collects and compiles data from around the world using the
same technology employed by thousands of ISPs, the data is consistent, making
it accurately comparable to a huge number of other data sources.
6. Ookla Is An Industry-Adopted Standard

Serving thousands of ISPs, ASPs, and MSOs, we are in a unique position to
understand and measure their networks; our tools are constantly evolving in
accordance with changes in the broadband industry.

The FCC and several other U.S. government agencies, along with
independent non-profits and other broadband initiatives, use our solution.

Approximately 257 million unique individuals have used Speedtest.net or
Pingtest.net to test their broadband performance.

7. User Privacy

All tests are completed voluntarily by users, and any private information is
removed from public data and reports.

No installation of software or hardware is required, so there is no opportunity
for a third-party device to monitor traffic or activity on the user's computer or
network.

Why You Should Care


To be clear, it's going to take more than one study or data resource to get a clear understanding
of residential broadband service in the United States. It is easy to talk about the limitations or
issues involved in one method or another without bothering to recognize that, with a little effort and
thought, many of these can be largely mitigated or effectively eliminated, and extremely valuable
data derived. It is in this spirit that we explore some of the common-sense issues that ought to be a
part of the conversation.

A recent survey (PDF) reporting that 80 percent of people don't know what their speed is supposed
to be could actually be quite misleading. For one thing, the question was asked via a telephone
survey, where you are expecting the person to know the answer off the top of their head. It is easy
to imagine most people would need a minute to look at their bill or check their online statement.
Many people you ask don't know exactly how much they are paying each month for cable. How
many people know the total amps of power being delivered to their home, the number of BTUs their
furnace creates, or the horsepower of their car?

For most people, most of the time, their broadband service works perfectly well for what they use
it for, and even if they experience problems or slowdowns, the answer doesn't lie in memorizing
their broadband speed; most don't really care and shouldn't have to. Since when do we expect
consumers, on a broad basis, to know the answers to the technical attributes of even fundamental
home services and products they use every day? It's as if the Internet is so new we believe there's a
unique way of measuring and managing it. Why are we making this so difficult?

Consider this series of uncontroversial, actionable questions that exemplify the sorts of things we
really need to know as we look to our broadband future:

[Sample survey questions shown alongside the text:]
- What do you do with your broadband connection in the mornings?
  (email/web, stream video, work, upload media, download media)
- Does that differ between what you do with it in the evenings or on weekends, your spare time?
  (email/web, stream video, work, upload media, download media)
- What are the biggest problems you have with your connection?
  (usually slow; only certain times of the day; complete disconnects, "freezes", all services stop
  working; each rated Not Important / Somewhat Important / Important / Very Important)
- What do you pay for your broadband?
- Do you get everything you want for that price? If not, what is it lacking? (Yes / No)
- Would you pay more for that? If so, how much? (Yes / No)
- Rank (1-5) the aspects of your broadband that are the most important to you:
  Price, Speed, Line Quality, Services, Value
- Please rank, from 1-7, your complaints about your broadband service currently (1 being the
  biggest complaint): Slow Periods, Price, Support, Download Speed, Upload Speed, Services,
  Line Quality

A survey recently carried out at Speedtest.net resulted in more than 100,000 unique completed
entries in less than one week. Importantly, this approach eliminates the variable of self-selection in
terms of its impact on the data: it does not matter whether a person is technically inclined or not
when we already know their connection speed and the results of speed tests performed on that
connection. Indeed, even the geekiest among us are limited to choosing from the packages provided
by the ISPs in our area.

And so it's important not to miss the point of these exercises. People want to know if they are
getting what they pay for. Governments want to know if carriers are sufficiently competitive such
that a healthy market is in play, and just about everyone, carriers included, is eager to successfully
deploy next-generation technologies and perform the infrastructure upgrades necessary to keep
customers happy.

Considering all of the variables in play, measuring the connections of just 10,000 Americans, whether
you do that occasionally or all day long for a year, seems like a flimsy way to exclusively shape
broadband policy. So what is a more valuable way to measure, and ultimately to address, the
questions the FCC is struggling with in the ongoing debate?

For starters, Ookla has made over 800 million speed and quality test records available to the FCC,
for free. As of today, Ookla politely asks Speedtest.net visitors to enter the speed they see
on their bill and how much they pay their service provider for this promised speed. This
information will allow us to compare specific package offerings from an ISP in a given area and report
not only on what people are paying per megabit of speed but also on how closely providers come
to meeting or beating expectations. This is merely the beginning of a continuing series of indexes
backed by a massive data set, in collaboration with major academic institutions and our clients, all
aimed at delivering a faster and higher-quality Internet for the community we serve.
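The per-package comparison described above reduces to simple arithmetic. A sketch, with made-up package numbers purely for illustration:

```python
def value_metrics(price_usd, promised_mbps, measured_mbps):
    """For one broadband package: cost per promised megabit, and the
    fraction of the promised speed actually delivered in testing."""
    return price_usd / promised_mbps, measured_mbps / promised_mbps

# A hypothetical $50/month package promising 20 Mbps but measuring 17 Mbps
cost_per_mbit, delivered = value_metrics(50.0, 20.0, 17.0)
```

Here the subscriber pays $2.50 per promised megabit and receives 85 percent of the headline speed; computed across many users of the same package, the ratio shows how close a provider comes to meeting expectations.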

Our goal is not to shape government policy4, but to frame the debate surrounding these questions
with the help of facts and a free exchange of information. After all, isn't that the very spirit and
promise of the Internet itself?

© Copyright 2010 Ookla

4 Full disclosure: we do not want, nor are we applying for, any future government work in this area, but some of the
data we have collected historically and several of our applications are already deployed and serving the FCC and other
federal government agencies today.
