May 2010 Archives

Yesterday, Ookla released the Net Index, a ranking of average broadband speeds by city, state, and country.

As background, Ookla is the company that powers the bandwidth tests at Speedtest.net and Pingtest.net. They also license their bandwidth testing engine to most of the major ISPs as well as the FCC, which used it along with M-Lab to run bandwidth tests on broadband.gov.

This release of data is significant for a number of reasons.

Arguably the biggest reason is the sheer amount of data. While the FCC was able to collect about 100,000 bandwidth tests on their own, Ookla conducts more than a million tests a day, with more than 1.5 billion completed to date. So this dataset is orders of magnitude larger than anything else that's come out so far.

This data is also significant because it shows actual speeds being realized by users. Too much of our broadband speed data to date has been based on guesses about theoretical performance, with estimates generated through looking at maps and relying on network engineering principles, whereas this data comes from actual users on real-life broadband networks conducting bandwidth tests.

Ookla cites their bandwidth testing methodology as another significant strength of this dataset. "We right-size the payload," says Mike Apgar, Ookla's co-founder and managing partner. "In our engine on the server side, we have nine or ten different file sizes. What we do is a brief test that happens at the beginning that sends a single file that gives some idea of the connection we're talking about and whether we ought to use the 100MB payload or the 10MB."
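
To illustrate what that right-sizing might look like, here's a minimal sketch in Python. The payload list, the target test duration, and the function are all my own illustrative assumptions, not Ookla's actual implementation.

```python
# Hypothetical payload sizes (in MB) a test server might keep on hand;
# Ookla reportedly maintains nine or ten on the server side.
PAYLOAD_SIZES_MB = [0.25, 0.5, 1, 2, 4, 8, 10, 25, 50, 100]

def pick_payload(pretest_bytes: int, pretest_seconds: float,
                 target_seconds: float = 10.0) -> float:
    """Pick the payload whose download should take roughly
    target_seconds at the speed observed in the brief pre-test."""
    observed_mbps = (pretest_bytes * 8) / (pretest_seconds * 1_000_000)
    target_mb = observed_mbps * target_seconds / 8  # megabits -> megabytes
    # Use the largest payload that still fits in the target window,
    # so fast connections get the 100MB file and slow ones get less.
    candidates = [size for size in PAYLOAD_SIZES_MB if size <= target_mb]
    return candidates[-1] if candidates else PAYLOAD_SIZES_MB[0]
```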

Ookla also has the ability to run multiple threads, using more than one web server, domain name, and IP address to provide more avenues for truly filling the pipe. Apgar shares the following analogy to explain the importance of doing this: "If you have a connection to your city's water, if you turn on a single faucet you're not measuring the system's full capacity. By turning on multiple faucets you can."
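
To make the faucet analogy concrete, here's a minimal sketch of a multi-connection test, again using hypothetical server URLs rather than anything Ookla actually runs. Opening several connections in parallel keeps any single TCP session from becoming the bottleneck.

```python
import concurrent.futures
import time
import urllib.request

# Hypothetical test servers; a real engine would spread these across
# multiple domains and IP addresses.
TEST_URLS = [
    "http://test1.example.com/payload.bin",
    "http://test2.example.com/payload.bin",
    "http://test3.example.com/payload.bin",
    "http://test4.example.com/payload.bin",
]

def download(url: str) -> int:
    """Download one payload and return the number of bytes received."""
    with urllib.request.urlopen(url) as response:
        return len(response.read())

def aggregate_speed_mbps(urls=TEST_URLS) -> float:
    """Open several 'faucets' at once and measure total throughput."""
    start = time.time()
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(urls)) as pool:
        total_bytes = sum(pool.map(download, urls))
    elapsed = time.time() - start
    return (total_bytes * 8) / (elapsed * 1_000_000)
```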

For these reasons, Ookla's bandwidth testing engine is able to accurately gauge the throughput of high-capacity broadband networks.

But that doesn't mean this data is without its limitations.

For example, in the ranking of cities by average broadband speed, Ookla only includes cities with at least 75,000 unique IP addresses. While this makes sense as a way to keep smaller cities from skewing the numbers, an unintended side effect is that it effectively rules out cities with full fiber networks, since most fiber communities don't have 75,000 homes, let alone 75,000 customers connected to fiber.

I mentioned this to Mike and he offered to pull data from smaller cities to help rectify this, as he's a big supporter of fiber. And that's one of the great things about the Net Index: the index data is available for free for everyone to use and analyze. They're also making the raw source data available to academics for research purposes. So we now have a resource that we can pull from for all sorts of reasons.

For example, Apgar shared that they have future plans to expand upon this, like creating a Value Index, which will collect and rank data about broadband networks' promised vs. actual speeds relative to the cost of service.

Another potential weak spot of this data is that it doesn't answer granular questions about where bandwidth bottlenecks might be occurring. While users can choose to test bandwidth against servers in different geographic locations to help pinpoint slowdowns, this data doesn't reflect that.

A final limitation is one that's potentially unavoidable, namely that the data is skewed because participants self-select. Average Joes likely aren't going out of their way to test their bandwidth; Ookla's primary users are techies. Techies tend to value bandwidth more, so their results will likely skew higher, as they're more likely to sign up for higher capacity service.

But that being said, in talking it through with Apgar I came to realize that the full truth is a bit more nuanced than that.

One major trend is that with broadband debates reaching the mainstream media, there's more public emphasis on bandwidth and broadband performance than ever, which is likely making people aware of bandwidth testing who might not have been before.

It also dawned on me that all of those non-techies need techies to help them when their computers break, so that could be another way that non-techie homes are getting their bandwidth tested.

And Mike also shared with me that he's hearing from many people that installation techs for major broadband providers are using SpeedTest.net as the final step of setting up a home with broadband, as an easy way to show customers the kinds of speeds they're realizing. In other words, there are a number of ways these tests are reaching a more mainstream audience.

All in all this is an exciting development for broadband policy debates. More data should help us make better broadband policy decisions. And the Net Index represents one of the largest such datasets to date.

I'm going to be digging through their data and will report on what interesting tidbits I find.

But for now, if you're interested in starting to learn more for yourself, go check out the Net Index at www.netindex.com.

With each passing day America's national broadband plan looks more and more pathetic.

Yesterday I was reminded of Tasmania's broadband ambitions. They've set the goal of reaching 200,000 homes and businesses with 100Mbps within the next four years, which is estimated to cost $700 million in total.

The FCC set the goal for the US of reaching 100 million homes with 100Mbps by the year 2020.

Comparing the two, Tasmania's goal of 200,000 homes is roughly comparable in scale to the US's goal of 100 million, relative to each area's population (500,000 vs. 300,000,000).

And yet Tasmania's goal will see its people achieving their 100Mbps future six years sooner than we will in America.

As another comparison, if the US were to step up financially the way Tasmania is, then relative to GDP we should be finding some way to invest more than $400 billion in building our country's 100Mbps broadband infrastructure.
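
For what it's worth, a quick back-of-the-envelope check using just the population figures cited above lands in the same ballpark:

```python
# Scaling Tasmania's commitment by population alone, using the
# figures cited above (500,000 Tasmanians vs. 300,000,000 Americans).
tasmania_homes, tasmania_pop, tasmania_cost = 200_000, 500_000, 700e6
us_goal_homes, us_pop = 100e6, 300e6

print(tasmania_homes / tasmania_pop)   # 0.4 homes per resident
print(us_goal_homes / us_pop)          # ~0.33 homes per resident

print(tasmania_cost * us_pop / tasmania_pop)  # $420 billion
```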

Instead, while Tasmania is actively executing on a plan to achieve this goal, our national broadband plan was silent on a specific strategy for realizing this 100Mbps future.

And Tasmania's example can't be swept under the rug with the excuse that their demographics or geography are more conducive to deployment than the US's. Tasmania's density is 19 people per square mile, whereas the US's is 83 people per square mile. Plus it's not as if Tasmania's known as a global tech leader like South Korea or Japan.

So what this all means is that the US isn't just getting beat by the usual suspects, we're getting lapped by small island states like Tasmania.

How is this acceptable? How can the FCC say with a straight face that their national broadband plan sets our country on a path to being globally competitive when it aspires to achieve goals that will be six years behind Tasmania? What does it say about our country that despite having invented the Internet we don't have the gumption to come up with a plan that can keep pace with Tasmania?

When will we wake up as a country to the fact that the trajectory we're on threatens to leave us laggards rather than leaders in the digital economy?

For my money, that realization can't come soon enough, especially when we can't even keep up with Tasmania.

On Friday I found a link to this UK article on DSLReports.com with the title "Government should concentrate on 2Mbps broadband."

In it is the statement from Simon Piper of ConsumerChoices, "Focussing on national superfast rollout is great for government sound bites but it detracts from the need of simply getting everywhere connected with 2Mbps."

This argument is made in the context of the UK's push to establish a national broadband plan of its own. So this isn't a theoretical or philosophical point, but rather is an argument intended to shape national policy.

And what is that point? That a national broadband plan should focus all of its energy on getting everyone connected at a baseline speed instead of worrying about getting anyone connected at a superfast speed.

To date, the trajectory of the UK's broadband policy debates seems largely headed in this direction. Yet what no one seems to realize is that the debate over whether to focus on universal baseline broadband vs. next generation broadband is a canard.

Any comprehensive national broadband policy must focus on BOTH universal baseline and next generation broadband deployment.

Is achieving universal baseline broadband extremely important? Yes! Every last citizen must have access to the Internet. At least with today's Internet, those first couple of Mbps are more important than the next hundred. And the Internet does become more powerful if everyone's on it.

But to suggest that pushing to achieve universal baseline broadband should come at the expense of paying any attention to also pushing for the establishment of a next generation Internet infrastructure is misleading at best and a dangerous obfuscation at worst.

For starters, there are common elements to the solutions for achieving universal baseline and next-generation broadband. For example, the need to deploy more fiber deeper into networks. Wireless towers need more fiber, as do the central offices of telephone companies that are upgrading legacy telephone networks to DSL. So no matter your ultimate goal, you need more fiber to achieve it.

But this is really only just the beginning. Where this either/or proposition gets dangerous is in how it limits the scope of your country's ambitions.

Take statements like this, also from Piper: "Access to reliable 2Mbps connections all the time would be a great help to keep smaller businesses online and running at a level which will in turn support Britain's economy."

Now on the surface, this statement is true. If businesses don't have access to broadband today, then 2Mbps will allow them to start benefitting from the Internet, which will ultimately help the country's overall economy. But let's pick out a couple of key phrases and analyze them a bit further.

"reliable 2Mbps connections all the time" - It's great that they're keying in on the need for networks that can actually deliver the bandwidth they're selling. What they fail to mention, though, is that to get reliable networks like this you need a lot more fiber. Also, no word on affordability. Just because reliable 2Mbps connections are available, does that mean they're economical?

"keep smaller businesses...running at a level which will in turn support Britain's economy." - This is where things get dangerously obfuscated. This statement seems to suggest that 2Mbps is good enough for the UK's businesses to be able to use the Internet to grow both themselves and therefore the economy as a whole. But is that really true?

What if a small business has ten employees? If they're stuck with a 2Mbps connection then that means if everyone's online at the same time they each only get 200Kbps, which isn't enough to do a lot of what the Internet makes possible, like any video-based applications.

What if that small business has large files to send? Like an engineering firm sending architectural drawings to their general contractor, or a doctor's office sending MRI results to be reviewed by experts, or an independent production house sending videos to be proofed. These files can be many megabytes or even gigabytes in size, and may take hours to send over a 2Mbps pipe.
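
To put rough numbers on both scenarios, here's a quick sketch using the figures above:

```python
link_mbps = 2  # the proposed baseline connection

# Ten employees sharing the pipe at the same time:
employees = 10
print(link_mbps * 1000 / employees)  # 200 Kbps each

# Sending a large file, e.g. 1 GB of drawings, scans, or video:
file_gb = 1
seconds = file_gb * 8e9 / (link_mbps * 1e6)
print(seconds / 3600)  # ~1.1 hours per gigabyte, before any overhead
```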

What if that small business wants to host its own servers in-house rather than paying to co-locate them in a data center elsewhere? That'll be almost impossible over a 2Mbps connection.

What if that small business wants to start using high-end videoconferencing to have remote meetings? Depending on how high end the system is, you might not even be able to sustain one meeting with only 2Mbps of bandwidth available.

Now let's add another layer of context: this 2Mbps baseline discussion is primarily happening relative to rural areas.

So what does that mean from the perspective of the future economic viability of your rural communities if all small businesses can get is 2Mbps? Are we saying that we're OK with the notion that rural communities won't be able to support small businesses that rely on the Internet? That if you want to start a small business you have to be in the city to have access to the connectivity you need?

See, this is where this baseline-vs.-next-gen canard gets dangerous. If the article had said that consumers only need 2Mbps, that would be one thing. But to say that 2Mbps will be adequate for small businesses ignores the realities of 21st century economic development.

Beyond these more specific points, let me also say how absurd I think it is for anyone to suggest that we should focus solely on universal baseline broadband and not put any thought to spurring the deployment of next generation networks.

First off, everyone agrees that superfast broadband is essential for the long-term economic health of any country in the 21st century. Piper is even paraphrased as saying so in this article: "superfast broadband is a great concept and essential to keeping Britain competitive in the global digital economy..."

With this in mind, why wouldn't we want policies that helped encourage superfast broadband deployment? This is especially true given two factors:

1. Superfast broadband takes a lot of time and energy to deploy.
2. The world's leading broadband countries are already way ahead of the curve.

So what this means is that all things being equal, a country like the UK is already behind the broadband curve, and yet it has people advocating that it do nothing proactive to attempt to at least stay even, let alone catch up.

Think about that. Piper's saying that while superfast broadband networks are vital to our future, it's not that important that we do anything about them, despite the fact that other countries are pouring billions into deploying superfast networks of their own.

Does that sound like a recipe for long-term success in the digital economy? To not just cede any hope of taking a leadership position, but to completely ignore the trajectory that other countries are on.

Of course, I understand that at the core of these debates is the age-old question of what government's role in the deployment of broadband should be, and that much of the push back against the idea of a national broadband plan that focuses on superfast broadband is the assumption that such a plan will include massive government subsidization of networks.

But I think that's too simplistic a way of looking at things. Instead I say that government's role is to identify what its citizens need, what they currently have, and then find ways to close that gap. Now, that could mean government writing a check, but it doesn't necessarily have to.

The heart of the matter is that the UK, the US, and every country will some day need superfast broadband. There's the reality that it's going to take a lot of time, money, and energy to build these superfast broadband networks. And there's the likelihood that the market alone won't get these networks built.

Now the question needs to be, what can government do to close that gap? Because without government playing an active role in doing so, then the best a country like the UK can hope for is universal baseline broadband while the rest of the world steams ahead at superfast speeds into the 21st century.

There are so many things wrong about the way we debate broadband policy, but one of the most fundamental errors is conflating regulation of the Internet with regulation of broadband.

Now, I don't necessarily blame people for this confusion as the terms "Internet" and "broadband" themselves are not as well defined as they could be.

But here's a simple way to think about it: we need to regulate networks differently from traffic, just as we regulate roads differently from the cars that traverse them, or electrical networks from the appliances that use them.

We need one set of regulations for how physical networks are built and operated, and another for the things that happen on these networks.

While it may seem like an obvious proposition, too often broadband policy debates ignore or gloss over this distinction.

Arguably the most egregious example of this is the idea that net neutrality regulation and the push to reclassify broadband represent massive new pushes to regulate the Internet.

If done correctly, net neutrality rules and broadband reclassification won't have any direct regulatory contact with the Internet. That's because these issues deal with how broadband networks are operated, not with how the Internet is used.

Of course, I'm not naive enough to think that this issue is cut-and-dried. There are a number of areas where the lines blur between operating broadband networks and using the Internet.

For example, content delivery networks--which are what host and deliver most of the Internet's websites, content, and apps--often strike up deals with broadband operators to get preferential treatment to deliver their traffic.

Unfortunately, there's not necessarily any easy regulatory answer to the many complicated questions that attempting to regulate either broadband or the Internet pose.

But I do know this. If we continue to allow our broadband policy debates to blur the lines between regulating broadband and regulating the Internet, then we're never going to get really good broadband policy, or even productive broadband policy discussions.

To get us headed in the right direction, we need to start by recognizing that regulating the building and operation of broadband networks requires different solutions than regulating the delivery of sites, services, applications, and content on the Internet.

So the FCC just released its first technical paper in support of the national broadband plan. It purports to be analyzing the "broadband availability gap" and solutions for closing it.

Presumably this document is intended to make the data-driven case to justify decisions that are soon to be made in the reform of the Universal Service Fund to subsidize broadband deployment to those that don't have access to it today.

While there's lots to discuss regarding the assumptions and recommendations they made, I want to start our analysis by focusing on the FCC's continuing tendency to be uneven in their support of fiber and policies that encourage fiber deployment.

In some ways, this paper is incredibly supportive of fiber. For example, check out these quotes from the report:

"...fiber is expected to be the primary backhaul choice for service providers because it offers a scalable, future-proof backhaul solution."

and,

"...a fiber-only architecture has one significant strategic advantage. As broadband needs continue to grow, fiber emerges as the only last-mile technology capable of meeting ultra high-speed needs."

and,

"...any solution that brings fiber closer to the home by pushing it deeper into the network puts into place an infrastructure that has long-term strategic benefits. On balance, therefore, we need to weigh this strategic benefit against the higher associated cost to evaluate the value of a fiber-only architecture..."

I read these quotes as basically suggesting that we need to make the deployment of fiber a focal point of our national broadband policies.

And yet elsewhere in the report the FCC ignores fiber completely, like in Exhibit 4-B, a chart listing the capacity of different broadband technologies to support simultaneous streaming that for some reason fails to include FTTP.

Or in their discussion of BHOL (busy hour offered load, essentially how much bandwidth a network must be able to deliver to each user when everyone's online at the same time), they don't say a word about how fiber blows every other technology away in this category.
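
To see why fiber stands out here, consider a rough busy-hour calculation. The capacities and subscriber counts below are illustrative round numbers of my own choosing (a standard GPON fiber split vs. a hypothetical cable node), not figures from the FCC's paper:

```python
def busy_hour_per_user_mbps(shared_capacity_mbps: float,
                            subscribers: int,
                            fraction_online: float = 1.0) -> float:
    """Bandwidth available per user if fraction_online of the
    subscribers sharing a link are all active at once."""
    return shared_capacity_mbps / (subscribers * fraction_online)

# A GPON split shares ~2,488 Mbps downstream among 32 homes;
# a hypothetical cable node might share ~160 Mbps among 200 homes.
print(busy_hour_per_user_mbps(2488, 32))   # ~78 Mbps per home
print(busy_hour_per_user_mbps(160, 200))   # ~0.8 Mbps per home
```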

Another omission is the relative cost of bandwidth delivered by different broadband technologies. They do cite the cost of bandwidth delivered via satellite as being prohibitively high, but they ignore the additional value users receive by getting a lot more bandwidth for their buck through fiber than through any other kind of broadband.

I also think this report might be factually wrong when it comes to the ongoing costs of fiber. The FCC claims fiber has the highest ongoing costs, yet I have a hard time believing that when operators have shared that going all-fiber produces a 70-80% savings in operating costs. Plus, fiber should be much cheaper to upgrade over time, since all that's needed is a change in the electronics, whereas DSL and other networks will require additional fiber deployment.

Perhaps the most frustrating omission of all is any discussion of the lifetime cost of different investment decisions. The FCC seems to imply that the best approach is an incremental one, where first we upgrade telephone lines to 12kft DSL, then to 3kft DSL, then to fiber. But how much will that cost compared to just putting fiber all the way to the home in the first place? How much money will be wasted on DSLAMs and field electronics? How much opportunity will be missed by rural Americans who can only access limited bandwidth while the rest of the world is operating at 100Mbps?
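
To make the lifetime-cost question concrete, here's a toy comparison with purely hypothetical per-home costs (none of these numbers come from the FCC's paper); the point is simply that stepwise upgrades can end up costing more over their lifetime than building fiber once:

```python
# Hypothetical per-home capital costs for each step of the
# incremental path vs. building fiber-to-the-home up front.
dsl_12kft, dsl_3kft, fiber_overbuild = 800, 1500, 2500
fiber_first = 3000

incremental_total = dsl_12kft + dsl_3kft + fiber_overbuild
print(incremental_total, "vs", fiber_first)  # 4800 vs 3000
```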

Instead of answering these more holistic questions, the FCC decided to just look at the bottom line of how much these broadband networks will cost to build, and then chose their recommendation based on the cheapest option.

The FCC says they want to make decisions based on data, and yet their report is incomplete if the intent was to provide a comprehensive comparison of the options available to us. And even worse, the FCC seems to be ignoring their own data, which suggests that the right approach is to focus on policies that encourage fiber deployment ever closer to end users.

The FCC says they're trying to be technology neutral, and yet they base their whole report around the notion that since 12kft DSL and 4G wireless appear to be the cheapest to deploy initially, they're the best option, with no consideration of the long-term ramifications of investing in infrastructure that we're going to have to significantly upgrade again in less than a decade.

So in summation, when it comes to fiber, the FCC has decided to take a narrow view of what's most important about broadband, to ignore its own data that supports fiber-centric policies, and to basically set the country on a path towards mediocrity.

I'm not suggesting that we need to set the goal that every last home in America get wired with fiber. I understand that there are some that just may be too far out there. But we should be looking at those outliers as the exception not the rule. We should be setting standards that strive to connect as many Americans as possible to the only broadband technology that's truly competitive globally. We need to stop assuming that fiber's too expensive and start looking at these issues more critically and holistically.

I had assumed that given the tremendous time, energy, and taxpayer dollars that went into the national broadband plan that the FCC would've made the effort to do this for us, but I guess I hoped for too much.

Because of this I think the onus is going to be on us, the supporters of fiber, to correct the FCC where it has erred and to make sure that we don't end up setting policy based on faulty assumptions and inaccurate or incomplete data.

So if you see something in this report that doesn't look right or that's missing, please feel encouraged to speak up! Add a comment to this post! Write your own blog post! Submit comments to the FCC! Do something!

We can't allow these oversights to go unchallenged. Our country's future depends on making sure that we use the right data to make the right broadband policy decisions.

Yesterday FCC Chairman Genachowski announced his intent to reclassify broadband as a Title II service from its current position as a Title I service.

Without getting into policy wonkdom, the gist of this is that broadband will move from being largely unregulated to being somewhat more regulated, which will give the FCC the authority it needs to regulate what happens on broadband networks and how they get built.

Not surprisingly for an issue as polarized as net neutrality, everyone's unhappy. Net neutrality opponents are referring to this as a nuclear bomb, as 9/11 for the Internet. Net neutrality supporters are cautiously supportive, but only because as recently as last week there were rumblings that the FCC wasn't going to do anything substantial.

Basically the FCC is in the middle of one of the biggest tug-of-wars in telecom policy history, with each side claiming that if they give an inch the world will end.

But I'm seeing a potentially huge problem with the tenor of things to date. It seems like the FCC is thinking about this policy more from the perspective of trying to make people happy than focusing on what's best for America.

They introduced their desire to move broadband to Title II, but hedged that push in a number of ways to appease the incumbents. So their forbearance from many of the rules associated with Title II service appears, at least from the outside, to be less about good policy and more about trying to minimize how much this will piss off incumbents.

While this might just be political rhetoric, there also seems to be a strong sense that part of what led to this announcement was the backlash from net neutrality supporters over rumblings that Chairman Genachowski was on the verge of deciding not to rock the boat following the FCC's loss in the Comcast P2P case.

These are both incredibly troubling.

For one, Genachowski needs to realize that the only way incumbents won't scream bloody murder, threaten to sue, and slow down broadband investment is if he does whatever they tell him. It doesn't matter how much he forbears from the Title II classification, he's going to get tremendous pushback from the private sector.

For two, Genachowski should be able to take strong policy stances on his own without being pressured into it by activists.

We're not going to make any forward progress in these incredibly contentious debates if the FCC Chairman's too worried about trying to make everyone happy. We need an FCC Chairman who's willing to take stands on his own based on what's going to make the best communications policy for America, not based on the politics of who it might piss off.

We need an FCC Chairman willing to tell the incumbents that they will be regulated, that the more they fight the tougher those regulations might be, and that they need to be willing to compromise, otherwise they won't get any special considerations.

We need an FCC Chairman willing to talk down the grassroots activists, to not allow idealism to cloud the need to be pragmatic, to acknowledge that sometimes laws can't go as far as true believers want them to, that sometimes we don't have the resources to do everything that's right and needed, and that we can't write off incumbents as evil, because they have to play a major role in moving the country forward.

We also need an FCC Chairman who's willing to provide leadership to Congress on these complicated issues, which frankly most of them have at best only a passing understanding of. We can't just sit back and only take what Congress is willing to give. We need an FCC that can pressure Congress to get off their hands and help the country take action where needed.

Dear Chairman Genachowski: we need you to step up and lead, to realize that you can't and won't ever make everyone happy, and that what's more important is that we get the best possible communications policy to move our country forward.

So consider the gauntlet thrown down at your feet. You can continue to talk about a third way as some feel-good compromise, but I don't care about whether anyone likes what you're doing, I just care about figuring out what's going to be the best policy for America. I can only hope that you do too.

So there's a lot of talk about how, for good or bad, public broadband competes with private networks.

Supporters of public broadband say that the impetus for these projects is a need for greater competition.

Critics of public broadband paint it in terms of unfair government-subsidized competition that distorts markets.

But I've got a different view: I think public broadband is complementary to private.

Starting at a high level, public broadband projects fill that gap between what communities feel they want and need in terms of network capacity and performance, and what existing private operators are willing and able to deliver.

Private operators have legitimate fiduciary responsibilities to their shareholders that force them to focus their limited resources on projects that offer the greatest returns. Naturally this limits how much they'll invest in any given community, how fast they'll ramp up that investment, and which communities they choose to invest in.

Assuming we all agree that every American wants, needs, and deserves access to lots of bandwidth at low costs, then there's no denying that a gap exists between what the market needs and what private operators alone are able to supply.

This is where I see the value of public broadband as complementary to private networks.

Public operators have different motives than private operators. They're driven more by off-balance-sheet benefits rather than profits, more by getting everyone connected than focusing only on those that can pay the most money.

Because of this, public operators have a much broader range of projects they can justify investing in.

This is an especially important fact when considering the challenge of connecting rural America. Private profit-maximizing entities will always be limited in how far out into the country they're willing to go, as the lower the density, the higher the costs and the lower the revenue potential of an area.

In fact, I'd argue that pretty much all of the most successful rural broadband projects, the ones that have created the most well-connected rural communities, have been public networks. A simple truth that I've shared before on this blog is that our country's rural broadband challenges won't be solved by private profit-maximizing entities. We can only truly address rural broadband if the operator cares about people more than profit.

Now this isn't an indictment of private operators or the desire to make a profit. In fact, public broadband is also complementary to private in the way it enables private operators to expand their networks more quickly and at lower cost, generating even greater profits.

For example, I'm starting to hear rumblings that the availability of ubiquitous high capacity public fiber is becoming a key determining factor in where next generation wireless networks are first being deployed.

In areas that have these networks, private wireless carriers are realizing that tapping into these networks is cheaper than if they were to try and build their own fiber. It's also much faster to deploy as now all they have to do is worry about putting up radios and plugging into the existing fiber.

These same complementary benefits of public fiber networks are being realized by private operators to support their wireline businesses as well. Rather than building new long-distance runs where they either don't exist or the ones that are there are at capacity, some private operators are making the smart business decision to instead run on public networks.

And on a more contrarian note, I wonder if the competition public networks create is actually complementary to private networks as well.

Is this competition driving down prices and potentially hurting private profits? Yes, but let's think about that a bit more. If prices were artificially high prior to public competition, then what impact might that have had on suppressing demand? How many people and businesses weren't getting service because it was too expensive or not fast enough? Sure, public competition may be lowering the profit derived from each user, but it also could be expanding the overall number of customers.

I've also long wondered when or if the presence of public competition will spur private providers to innovate more in their service offerings. To date, public competition has primarily led to lower prices and faster speeds. But if one argument against public broadband is that government can't innovate like the private sector, then maybe public broadband will spur all sorts of new innovation in private broadband that might not have happened otherwise.

I bring all this up today to point out that I think we need to expand the debate around public broadband. Simply saying that we need public networks because we need more competition is missing the bigger picture.

We don't necessarily need more competition. What we need are better, faster, cheaper broadband networks. And public broadband is helping achieve that goal all across the country in ways that complement the efforts of private operators.

Our debate over getting America wired shouldn't be an either/or proposition. It's not about public vs. private broadband. It's about getting all Americans connected as quickly, cheaply, and robustly as possible, and utilizing all of the resources at our disposal to achieve that goal.
