October 2009 Archives

If Verizon Stops Deploying FiOS, Then What?

| No Comments | No TrackBacks

As Verizon nears the end of its initial $23 billion rollout of FiOS, it seems increasingly likely that they're going to focus more on marketing what they've already built rather than expanding their fiber footprint further.

While I'm admittedly disappointed that they're not going to continue investing in extending FiOS to all their customers, I can understand the business rationale behind this decision. I also realize that just because they're slowing down now doesn't mean they won't pick back up at a later date.

But even still, this shift away from deployment raises some important questions related to America's national broadband policy.

First of all, as an advocate of full fiber networks, I'm more than a little concerned about what's going to happen to the growth of full fiber connections in the US with Verizon no longer deploying. On the one hand they're going to be focusing more on marketing, which should help significantly increase the number of people using fiber. But on the other, a huge percentage of fiber's growth has been attributed to Verizon's efforts. So while fiber deployments have been on a steady upward swing over the past few years, are we going to look back at 2010 as the year that the pace of fiber deployment declined?

Considering that the rest of the world is ramping up fiber deployment in a big way, the possibility that our fiber growth may actually slow now that Verizon's not investing as heavily is terrifying.

And that brings us to the second major point. I've heard buzz in DC that many policymakers have the impression that Verizon's investment in FiOS (and to a lesser degree AT&T's investment in U-Verse and the cable companies' investment in DOCSIS 3.0) is a sign that the broadband market's working and that significant government intervention is not needed to make sure that next-generation broadband networks get deployed.

But what happens to that political equation when you take out Verizon as a major investor? If they're no longer putting billions into fiber, and assuming no one else steps up to fill that void, then how can we say that our country's headed in the right direction?

The fact that we might be facing a decline in the growth of fiber deployment should stop any government official who cares about America's international standing in the digital economy in their tracks. We need to be seeing exponentially greater growth, not less!

And there's another trend here that policymakers can't ignore: if we rely solely on the market to drive broadband deployment, then we're going to create a nation of haves and have-nots.

Just look at Verizon. They've basically only deployed FiOS to the most economically attractive areas. They haven't gone to many poor neighborhoods, inner cities, or rural communities. And even more troubling, in most of the cities they've entered they've built out only the more well-off areas while ignoring the poorer neighborhoods.

Now again, I don't necessarily blame Verizon for doing this. They've got finite resources, and their job is to maximize profit, so it makes sense that they'd focus their investment on the areas with the greatest chance at realizing the largest returns.

But from a broadband policy perspective, I think we'd be remiss to ignore the fact that the market alone does not work if the goal is to bring world-class broadband to all Americans.

I hope this message makes it through to our policymakers. Too often they seem ready to accept the notion that the market alone can solve all of our problems, but America's leading next-generation broadband deployer has proven that that's not the case.

I also hope this message makes it to many of my fellow fiber believers, as I see this moment as a call to action for the rest of us. To some degree we've been able to hide behind Verizon's push, taking credit for progress being made even though that progress was largely attributable to one company.

The time has come for those of us ready, willing, and able to be deploying fiber in a big way to start doing it. Not only is the opportunity unlimited from a business perspective to capture markets with truly next-generation broadband, but our country needs us to step up and tackle this challenge. We can no longer count on one big corporation to push the ball forward on achieving America's full fiber future. It's up to us to make it happen.

RUS Skating On Thin Ice

| 1 Comment | No TrackBacks

One of the most notable things I observed at yesterday's Senate Commerce Committee oversight hearing on the broadband stimulus was the precarious situation that RUS finds itself in.

For starters, Sen. Rockefeller went out of his way multiple times to suggest that if he'd had his way, RUS wouldn't have gotten any money from the broadband stimulus. He said as much at the beginning, and then made another jab at the end when, referring to both Adelstein and Strickling, he said that they represent one and a half agencies, alluding to his belief that RUS doesn't even constitute a whole agency.

It was interesting to hear a Senator call them out so directly, though it's not exactly surprising given the agency's many issues: its historic inability to distribute all of its funds in a timely manner, its penchant for giving money to areas that don't need it, and its structure, which favors profit-making deployers over community broadband efforts.

But that leads to another interesting happening from yesterday's hearing: watching Administrator Adelstein attempt to introduce his agency and its past in glowing terms.

Now, everyone acknowledges that Adelstein had nothing to do with RUS's past failures, and everyone seems to like him and want to see him succeed. But he's got a monumental challenge ahead of him, all while having to walk a delicate tightrope.

The monumental challenge is obviously getting RUS working again, in particular the broadband grant, loan, and loan guarantee programs. While I get the sense they've got good people over at RUS, it can be hard to break through established habits and groupthink to chart a new course for the organization's future. It can be especially hard to introduce new ideas that fall outside the box of the organization's current view of the world.

The delicate part is how Adelstein frames RUS's past relative to its future. I think he risks discrediting himself if he goes too far in praising everything RUS has done and doesn't address head-on the many concerns that Congress and others have about the agency's efficacy. Yet he obviously can't go too far in criticizing it either, as then he risks alienating his troops by marginalizing their past efforts.

And while all of this is going on, the stakes couldn't be any higher. Already there are a number of Congressional types who are ready to divert all future funding away from RUS. And even those who still support RUS must be rethinking that support to some degree, given all the pressure during the stimulus discussions not to fund RUS further.

So what this all leads me to believe is that in many ways this stimulus funding is RUS's last chance to prove that it can be an effective, efficient means of supplying the capital needed to overcome our country's rural broadband challenges.

If they can get money out (relatively) quickly to the right projects without being overly burdensome in their oversight, then they can start proving that under new leadership they're righting the ship, not just being good stewards of taxpayer dollars but also acting as an agent of change.

If they can't, if the money is slow in coming, if it ends up funding any of the wrong projects, if the stimulus distracts them from getting out their regularly appropriated funds, then I think it's likely they will not only not be included in any future broadband stimulus appropriations, they may also risk losing their authority over rural broadband altogether, with Congress potentially looking to shift those powers elsewhere.

I'll admit that this worst-case scenario may never come to pass: there are some Senators who may be willing to defend RUS to the grave, it can take a while to make any significant changes in Congress anyway, and there may not be anywhere better in government to handle these responsibilities. But I do think that through the convergence of the availability of these stimulus funds, the ongoing formulation of a national broadband plan, and the installation of someone as widely respected as Adelstein, RUS faces its best, and possibly last great, opportunity to redefine itself and to prove to Congress that it can be an effective steward of taxpayer dollars.

I think the key to what it'll take for RUS to seize this opportunity can be found in Sen. Rockefeller's closing comment, where he echoed the sentiments of his colleagues by stressing the importance of doing the broadband stimulus right, and that to do that both Adelstein and Strickling were going to have to be clever. That just doing things the way they've always been done isn't good enough. And that both agencies need to prove themselves worthy of receiving additional funds in the future.

So I say to RUS: don't shrink from this opportunity! Don't allow yourselves to fall into the rut of doing business as usual. Now is your opportunity to redefine your agency for the 21st century. Now is your time to showcase that you are the best stewards of rural America's broadband future. I charge you to show the world what you're capable of. Rural America's counting on you.

Why Is The White House Ignoring Broadband?

| No Comments | No TrackBacks

I try not to be too alarmist in my proclamations about the state of broadband policy in DC, but sometimes I can't help but get piqued when it feels like broadband's being ignored by the very people who we so desperately need to make it a top priority.

To that end, I have to get this off my chest: I think the White House is ignoring broadband.

Now I don't mean the entire White House. For example, I've had a chance to meet Susan Crawford, President Obama's broadband adviser, and I know for a fact that she's not ignoring broadband.

But when I look at some of what's coming out of other parts of the White House, I get the sense that broadband just isn't that important to this administration, or if it is, then they're not showing it enough by their actions.

Take this quote I came across yesterday:

"A top White House economist says spending from the $787 billion economic stimulus has already had its biggest impact on economic growth and will likely not contribute to significant expansion next year."

It's important to note that this isn't just any White House economist; this came from Christina Romer, the chair of President Obama's Council of Economic Advisers.

Why did this make me so upset? Because it completely discounts the impact of the broadband stimulus!

As I just said Wednesday, no broadband dollars have gone out yet. So they're basically saying that stimulating the deployment of broadband isn't part of what's had the biggest impact on economic growth.

I read this as meaning one of two things. Either they're saying that deploying broadband doesn't positively impact the economy in a big way, or they're admitting in a roundabout way that the amount of money set aside in the stimulus for broadband isn't enough to have a significant impact.

If it's the former, then that's terrifying as that means they still don't understand that broadband is the key infrastructure that will drive economic growth in the 21st century.

If it's the latter, then I'm even more confused.

Why wouldn't you allocate enough money to drive the deployment of the key infrastructure of the 21st century when you had the chance to do so?

Or if this is, as many have described it, merely a down payment on our broadband future, then why not talk about how allocating more money to broadband in the future needs to be a top priority for Congress?

Instead, there's talk of no more money from Congress for broadband, and of the fact that we're likely to get a poor man's national broadband plan.

Yet does this mean the White House has stopped asking Congress for money? Nope. In fact, just in the last few days the President has been asking for more money. But for what? What could be more important than investing in our broadband infrastructure?

Sending every senior in America a $250 check. Total bill for doing this? $13 billion. Or put another way, nearly twice as much money as was allocated to broadband in the stimulus.

Don't get me wrong. I'm all for helping out our seniors as I totally respect the service that they've done for our country. But I'm having a hard time understanding how sending them all $250 checks will have a greater economic impact than investing in our country's broadband infrastructure.

What this demonstrates to me is that the issue isn't that there's no more money, it's that there's not enough will to rearrange our priorities so that broadband's at the top of any list when it comes to what we should be investing taxpayer dollars in. I get the sense that if the President really wants something, he can get it. Yet with trends like this I can't help but question how important broadband really is to him.

Making this even more flabbergasting is that this President was elected on a platform where broadband was supposed to be a significant plank. There were claims that this administration truly understood the importance of broadband and was going to make it a top priority.

Still today you'll hear the White House pay lip service to the benefits of broadband in various speeches. But actions speak much louder than words. And so far, based on those actions, it seems like the White House is ignoring broadband.

The ramifications of this are more than a little troubling. Let us not mince words: without strong leadership and commitment from the White House, we can't make significant progress forward into our broadband future. And if we don't start making more significant progress to bolster our broadband infrastructure, America won't be a leader in the 21st century digital economy. It's that simple.

So now the ball's in the White House's court. Will they continue to ignore broadband? To not make it a top priority? To talk more than act? To not acknowledge its importance in spurring economic growth? Will they continue to put it on the backburner?

I sincerely hope that I am being overly alarmist in this post. That great things are in the works behind the scenes that I'm not privy to. But until I see more real progress than mere rhetoric, I can't help but worry that the White House is not paying enough attention to our country's most important 21st century infrastructure.

Broadband Stimulus Has Failed To Date

| No Comments | No TrackBacks

Let's stop kidding ourselves: to date, the broadband stimulus has failed.

The original purpose of the stimulus was to get money flowing quickly to create jobs. We're now more than eight months in and no money's gone out. So the stimulus has failed to get money flowing quickly.

The stimulus has also failed when it comes to creating jobs. Not only has no money gone out to help create these jobs, but it seems like no one's even seriously talking about the broadband stimulus in terms of creating jobs any more.

Instead, most of the discussions center around fixing America's broadband shortcomings, but on that front the stimulus has failed too, as people are waking up to the fact that $7 billion isn't nearly enough to solve all our broadband problems.

So that's the big picture of how the stimulus as a whole is failing: it's not working quickly enough, it's not creating jobs, and it's not up to the task of fixing all our broadband woes. But it's important to note that the process itself also seems to have failed.

First it took months for the rules to come out, which might have been OK if the rules weren't so underwhelming. Not only did they lack vision, but they were largely divorced from the realities of how networks are built: forcing projects to slice and dice service areas to meet definitions, creating inefficiencies; requiring that technologies be chosen today that may or may not be available to purchase once the money arrives; and setting timelines for how fast the money must be spent that may be impossible to live up to. The list goes on and on.

Then once the rules were out, applicants had only a month to get their applications in, which demanded a herculean effort given all the information the applications required.

Then we sat around for a month basically hearing nothing.

During this time people were applying to be volunteer reviewers, but there seemed to be little serious effort being made to vet the reviewers for competency and bias. And we still don't know if any of these volunteers are actually being used to review anything.

Which brings up another clear failure: the breakdown of the review process. The fact that there was no initial weeding out of bad applications before they were sent to the states is more than a little troubling. It meant that states that were already ill-prepared to properly review these applications had to wade through more than should have been necessary. As a result, most states recommended a larger number of projects than they would have if that initial culling had happened, which will likely dilute their recommendations.

Now we're all left in the position of waiting again, unsure of what all the states had to say, with no idea as to what influence those recommendations will have, and without any real sense for what's coming next and how much longer we have to wait until we start seeing real progress.

And then to top it all off, we're likely soon going to be asked to give feedback on how to improve this process for the next round before we even get a chance to see the results of the first round and know whether or not this seemingly slapdash process worked!

With all this being said, I don't necessarily blame the good people at NTIA and RUS for these failures. NTIA has never given away this much money before and quite simply didn't have the staff initially to handle the task, and RUS has never been known for its ability to get money out the door quickly. It's also insane to think that the stimulus was passed, money appropriated, and rules were being discussed months before the heads of NTIA and RUS were even confirmed.

So, as many have said, in many ways this process was doomed to failure before it began. And it's hard not to label what's happened so far a failure no matter how you look at it.

And yet, I still have hope that this process can be saved. Where we currently stand is that the entire weight of this process now falls on the shoulders of two men: Adelstein and Strickling, the respective heads of RUS and NTIA.

If they can parse through whatever insight has been gathered through whatever vetting process the applications have endured and make the right decisions to invest taxpayer dollars in real projects that can deliver real results we can learn from, then they have the potential to save the stimulus from being a colossal failure.

They can then also seize the opportunity to learn from their mistakes in the first round and come back with a second round that makes more sense, that doesn't get so caught up in the urgency of doing something that it loses sight of doing the right thing.

Despite the doomsday title of this post, I do still believe that things can right themselves in the end, especially as everything I've heard about Adelstein and Strickling is that they're the right people to tackle these challenges.

But that being said, until we start getting some signs that positive progress is being made, we can't afford to beat around the bush any more about what's happened so far with the stimulus. We can't sit back and assume that everything's going to be all right when all signs point to the fact that it's not.

That's why I'm writing this post today. To throw down the gauntlet to Adelstein and Strickling: so far the broadband stimulus has been a failure. The pressure's now on you guys to make sure it doesn't stay that way.

Congress Needs To Understand The Basics Of Broadband

| No Comments | No TrackBacks

Last week in a meeting with a highly respected colleague of mine in DC, I experienced one of the most eye-opening and disappointing moments of my time in the world of broadband policy.

The conversation revolved around my argument that if we keep trying to patch the holes in FCC regulations with band-aids, we'll never realize truly effective reform. If we want real change we need to reframe the FCC's mission for the 21st century, namely to focus on the availability, affordability, adoption, and openness of bandwidth. (More on this argument here.)

What stopped me in my tracks is that while he agreed with the ideas in theory, the reason he couldn't get behind adopting these new principles was that he simply didn't believe that Congress understood the concept of bandwidth well enough to fully comprehend the benefits of this converged approach.

Now, it's no surprise that Congress doesn't know as much about broadband and the Internet as we'd like. And for the most part I don't blame them. What people don't realize is that Congressional staffs have limited manpower, so everyone pulls triple duty and beyond. Because of this, very rarely do you meet a staffer whose sole focus is understanding the intricacies of broadband technology and policy.

I've met many staffers charged with handling telecom for their Representative or Senator who were thrust into the position with little to no prior knowledge of or experience with broadband-related issues. Layer on top of that the fact that they've got a ton of other areas they're working on, and I don't see how we can expect them to understand all the ins and outs of our intricately esoteric world at the nexus of technology, business, policy, and philosophy thoroughly enough to craft policies that help rather than hinder it.

Quite frankly, it seems like an impossible task.

Yet at the same time, I can't see how we can consider limiting the scope of our ambitions because of our assumptions about what Congress has the capacity to understand.

The other thing I've learned about those Congressional staffers is that they're very bright, extremely hard working, and dedicated to trying their best to understand as much as they can.

I refuse to believe they can't understand something as basic as bandwidth. In fact, I'd say that if they still don't get it, then that's more our fault than theirs as we obviously haven't been doing a good enough job explaining it. And as a result of our failure, our policy ambitions are suffering, being limited by unnecessary ignorance.

I'm not trying to suggest that we should expect Congress to understand everything that's going on under the Internet's hood, but rather that they must have some understanding of the basics: of terms like bandwidth, of how in broad terms the Internet came to be. Otherwise, how can we expect them to be capable of creating broadband policy that goes beyond bumper-sticker slogans?

Sure, it's easy to say, "Internet for all!" and "The Internet should be open!" But where the rubber of good intentions meets the road of making policy, we can't afford to have policymakers that don't know where they're supposed to be going or how they're supposed to get there.

And perhaps even worse: we can't afford to scale back the policies our country needs to move forward because of an assumption that Congress can't or won't understand the issues at stake.

I think Congress is ready to listen. But the onus is on us to reach out to them and figure out how to talk about these complex issues in terms that they and the public at large can understand.

Because I fear that without an understanding of the basics of broadband, Congress will never be able to create the kinds of forward-thinking broadband policies that our country needs to make the rapid progress required to remain a leader in the global digital economy.

Simple, Justifiable Bandwidth Goals For America

| No Comments | No TrackBacks

After yesterday's piece on how defining broadband as 3Mbps down and 1Mbps up is inadequate, I want to share more specific thoughts on the kinds of bandwidth goals America should be setting.

An important first step is to realize that we should have at least two tiers: baseline served and fully served.

Baseline served refers to the lowest level at which we consider a network to be delivering broadband. This is also the level that 100% of Americans (or as close as possible) should have access to.

Fully served refers to the speeds at which we consider a person, building, or area fully served relative to the bandwidth demands of that time.

The second vital step is to acknowledge that we need different goals for different types of users, in particular differentiating between what residences need and what businesses and community anchors need.

Finally, we need goals not just for today but also for tomorrow that are primarily based on the needs of users rather than the limitations of technologies.

Without further ado, let's take a look at proper goals for residential broadband:

Residential
Today
Baseline - 1Mbps up/down
Fully Served - 10Mbps down, 2Mbps up

2015
Baseline - 10Mbps up/down
Fully Served - 100Mbps down, 20Mbps up

Now let's take a moment to consider why these are the best goals for today:

Today, Baseline @ 1Mbps up/down - The simplest explanation for this is that I think it would be criminal to define broadband in terms of Kbps a decade into the 21st century. But diving a bit deeper shows that with 1Mbps up/down you can use a lot of the broadband apps available today: watching video from YouTube or Hulu, videocalling over Skype, using hosted apps like Google Docs, and, of course, surfing the Internet and checking email. And from a broadband tech perspective, pretty much everything can deliver this much capacity today; if a technology can't yet handle 1Mbps, then we probably shouldn't be taking it very seriously any more.

Today, Fully Served @ 10Mbps down, 2Mbps up - For starters, I wish I could say 10Mbps up/down, but I know that's going to get a lot of providers up in arms as their networks can't handle that much upstream bandwidth, largely because of mistakes they made in forecasting future bandwidth demands. But I have to admit that with 2Mbps up you'd have enough bandwidth to use most every broadband app available today, though sending very large files will still take a long time. In terms of 10Mbps down, as I laid out yesterday, that amount of bandwidth can handle any of today's generation of apps, plus it's enough capacity to support multiple simultaneous users reasonably well. It may not be up to supporting lots of people streaming HD video at the same time, but with a 10Mbps connection you can do pretty much everything else you'd want to on today's Internet.

Now let's consider the goals for 2015:

2015, Baseline @ 10Mbps up/down - If need be we could scale this back to 10Mbps down, 2Mbps up, but my thinking is twofold: five years from now we'll be using a lot more upstream capacity, and if there are areas without this much capacity today, then when we build out to reach them to achieve this baseline goal we sure as heck better not be investing in technologies that don't feature a whole lot more capacity, especially upstream. It would be silly to put more money into last-generation technology, so I'm assuming the new technologies we'll be deploying will feature sufficient upstream capacity to reach these goals. In terms of why to set the baseline at 10Mbps: quite simply, I think we need to get into the mindset that what's fully served today should be tomorrow's baseline. I wish we could say that everyone should get access to the same capabilities, but knowing that that's a challenge many shirk from, and acknowledging that there's a lot that can be done with a 10Mbps up/down connection, I'm trying to make sure that the goals we set are generally considered realistic.

2015, Fully Served @ 100Mbps down, 20Mbps up - Another case where I'll admit that I'm limiting what I think we need in order to be more "realistic" about what we can actually accomplish. I'd much rather say 100Mbps up/down, but I know that calls into question the long-term viability of most technologies other than full fiber networks. And again, I'm willing to acknowledge that 20Mbps upstream is a lot of bandwidth that will enable a lot of things, including uploading HD video in real time. In terms of setting the goal of a 100Mbps Nation, I firmly believe that striving for anything less would be a national embarrassment. When so many other countries are setting goals of 100Mbps and beyond, how can we expect to be globally competitive by setting our own goals at anything less? And we can't use our size and relatively low density as an excuse, as Australia has set itself the goal of reaching 90% of homes with 100Mbps. Put simply: if we want Americans to have the same opportunities as citizens of other countries, both to compete and to participate in the next generation of the digital economy, then we must at least be attempting to level the playing field when it comes to the availability of bandwidth. Plus, from a practical standpoint, research like the Need for Speed report from ITIF has shown that the average house of the near future will be using 90Mbps+ of bandwidth, assuming it's available to use.

The next set of important goals to set relate to the bandwidth needs of community anchor institutions and businesses, referred to for now as "Commercial" to make things simpler:

Commercial
Today
Baseline - 10Mbps up/down
Fully Served - 100Mbps-1Gbps up/down

2015
Baseline - 100Mbps up/down
Fully Served - 1-10Gbps up/down

The rationale behind these numbers is pretty straightforward. Commercial broadband presumably serves buildings with lots of people sharing the same connection to get online, like a school or office building. Simple math shows that if ten people are using the same connection at the same time, you need 10Mbps just so that each can have baseline access at 1Mbps.
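
To make that shared-connection math concrete, here's a quick sketch of the arithmetic (the function is my own illustration, not anything from an official methodology):

def users_supported(total_mbps, per_user_mbps):
    # How many people can share a pipe while each keeps a minimum floor?
    return int(total_mbps // per_user_mbps)

print(users_supported(10.0, 1.0))   # a 10Mbps line covers 10 users at the 1Mbps baseline
print(users_supported(100.0, 1.0))  # a 100Mbps line covers 100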

The reason we should set a range for fully served, both today and tomorrow, is that there's a range in the size of buildings and number of potential users that need to be served. A small business with ten employees has different needs than a school where 1,000 kids with laptops are trying to get online simultaneously. I've made the argument that all schools need fiber-like speeds of 100Mbps and beyond today, and some could already make use of 1Gbps. Additionally, any business that wants to host public-facing servers on-premise needs sufficient bandwidth to support lots of hits to whatever sites or apps it's hosting.

In terms of the goals for 2015, this is a matter of simply bumping the goals up the chain. I'd hope that within five years every business and community anchor institution in this country could have ready, affordable access to 100Mbps. If we can't accomplish this simple goal then I think our national broadband plan has failed. Additionally, if we're not delivering next generation connectivity to those buildings that are considered fully served, how do we expect to be able to enable the next generation of Internet apps?

Assuming we can reach the goal of ubiquitous 100Mbps commercial access, the technologies supporting those speeds should be able to scale to 1Gbps and beyond relatively easily. Or rather, we should only be investing in technologies that can scale in this way; otherwise we're going to be permanently limiting the horizon of our possibilities in optimizing our society around the use of these next-gen networks.

Finally, I think we also should be looking at setting goals for mobile wireless, but acknowledging wireless's many limitations, we should consider setting these goals a tier behind our residential goals, in other words:

Mobile Wireless
Today
Baseline/Fully Served - 1Mbps up/down

2015
Baseline/Fully Served - 10Mbps up/down

Even these lower goals may be overly ambitious, but at least they give us something to work with that ensures any wireless deployment is relevant to users' needs and that we're treating wireless as it should be treated in our national broadband plan: as a complementary rather than competitive service to wireline, given its limited capacity and reliability.

There's lots more that could be said about setting proper bandwidth goals for America, but in this post we've covered the basics and laid out a simple, straightforward, realistic, and justifiable set of goals that reflect the needs of users while also being internationally relevant.

I'm open to discussions about whether or not we should set the bar higher so that we can strive to lead rather than follow the rest of the world, but I cannot accept setting the bar any lower. I cannot feel comfortable with keeping our goals low so as to be more "realistic" when other countries aren't limiting themselves in this way. I cannot understand why we'd base our goals on the current limitations of technologies and the trajectories of private-sector deployment when what we really need to be looking at are the bandwidth demands of apps and the uses of these networks.

The question shouldn't be, "What can we get?" Instead we must answer the question of, "What do we need?" Then we can determine how to get it.

Quite simply, America cannot afford to set mediocre bandwidth goals. We won't be competitive in the global digital economy without a robust enough broadband infrastructure. So our future, our country's future, and our children's future depend on setting goals that, while achievable, are also aspirational, that are based on a realistic analysis of what we need, and that we can then focus our deployment efforts on achieving.

3Mbps Down and 1Mbps Up Are Inadequate Bandwidth Goals

| No Comments | No TrackBacks

So apparently there's some level of consensus getting entrenched among DC policy circles that setting the residential broadband bar at 3Mbps down and 1Mbps up is adequate.

Well I'm here today to refute that thinking, to show that this isn't enough bandwidth to support today's applications let alone tomorrow's, and to lay out the ramifications of what happens if we set inadequate goals for broadband.

To start with, let's take a look at some of the apps that already exist today that need more than 3Mbps down and 1Mbps up:

OnLive - This distributed gaming service allows users to play their favorite console games without having to own the console. Instead the game is hosted on remote servers and delivered to TV sets through a thin-client set-top box. In addition to gaming, this technology can be used to deliver educational simulations. If a user wants to be able to experience these games in HD, they need 5Mbps of downstream bandwidth.

PBS - PBS is digitizing all of its archives of video content, much of it in HD, and making it available to the public and in particular to schools across the country, enabling students to do things like watch reports from the NewsHour with Jim Lehrer about historical events from the last thirty years. If a student wants to be able to access an HD video from PBS, they need 8Mbps of downstream bandwidth.

Livestream - This site has an app that enables robust webcasting for a whole range of purposes, whether it be a rock concert, government meeting, public forum, or other event. The professional version of the app can deliver high quality full-screen video. If a webcaster wants to be able to stream video of this quality, they need 1.7Mbps of upstream bandwidth.

Home monitoring - There are a host of solutions offering to help secure homes with video cameras that can be viewed over the Internet. While most tout their ability to operate using little bandwidth, the reality is that means they're delivering lower quality video. These cameras often can deliver higher quality video if sufficient bandwidth is available, plus we need to factor in that a single home can have multiple cameras running. If a homeowner wants to have four cameras streaming low to high quality video at the same time, they need anywhere from 1-100Mbps of upstream bandwidth. (The 100Mbps is for MPEG HD video.)

These are just a sampling of the kinds of apps available today that require more than 3Mbps down and 1Mbps up.

Even more important is to recognize that the real demand for bandwidth doesn't come from any single app but rather from the simultaneous use of multiple apps. Between desktops, laptops, netbooks, game consoles, and media players, many if not most households already have multiple devices ready to make use of broadband.

But what's the point of having multiple computers if you can't use them at the same time? If little Johnny's in the living room playing OnLive, does that mean little Susie shouldn't be able to watch a video from PBS in her room because there isn't enough bandwidth available? Should Mom have to wait until the kids go to bed to talk with friends and family on a videocall? And how much bandwidth is Dad left with then?

If a household has three computers but can only get 3Mbps of downstream bandwidth, then that means if everyone's online at the same time they can only get 1Mbps each. That's the bare minimum that's needed to do basic web surfing on today's Internet effectively and efficiently.
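
To put numbers on that squeeze, here's a small sketch using the downstream figures cited above (the three-app household itself is hypothetical):

# How far does a 3Mbps downstream cap go against today's apps?
cap_down_mbps = 3.0
apps = {
    "OnLive HD gaming": 5.0,   # Mbps down, cited above
    "PBS HD video": 8.0,       # Mbps down, cited above
    "basic web surfing": 1.0,  # rough per-user floor
}
print(f"simultaneous demand: {sum(apps.values())}Mbps vs. {cap_down_mbps}Mbps available")
for name, mbps in apps.items():
    verdict = "fits" if mbps <= cap_down_mbps else "does NOT fit"
    print(f"  {name} at {mbps}Mbps {verdict} even on its own")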

We need to understand what setting the broadband bar too low means in practical terms. By saying you're "served" if you get 3Mbps down and 1Mbps up, you're basically relegating everyone at the bottom to being unable to use the top tier of today's apps, let alone benefit from the next generation of tomorrow's apps. It also means discouraging simultaneous usage of even the basic apps of today.

Put another way, imagine if we had the same attitude towards electricity. It'd mean not being able to make toast while you're drying your hair. Or it'd mean only being able to run one of your home's multiple computers at a time. When thought of this way it's an absurd proposition, and yet that's what we're on the verge of doing with how we define who's fully served by broadband.

We also have to consider the ramifications of setting the bar too low from the perspective of the decisions that government must make on what broadband to subsidize.

First off, we risk investing taxpayer dollars in inadequate technologies, in deploying broadband that's already outdated before it's even deployed.

Secondly, if we rush to get the unconnected served by these lesser technologies, we may put rural areas in the position of permanently being second-class digital citizens, both by equipping them with technology that has inadequate capacity and by preventing them from being eligible for future government funding to build the networks they need.

One of my biggest concerns in all of this is that we're not being technology neutral in setting the broadband bar. I'm worried that we're allowing what's "realistic" to infect decisions that should be first and foremost about what we need. If we allow the capabilities of today's technologies to be the sole determinant of where we set our broadband goals, then how are we going to get to a tomorrow where bandwidth's much more widely available, much more capacious, and much less expensive?

I'm not saying we can ignore the constraints of different technologies, or that we should put aside any technologies that can't deliver next-generation speeds, but instead that if we let their limitations dictate our actions then we're going to be holding back the entire country.

We have to set the broadband bar based on the kinds of capacity that we need for people to be able to benefit from all the apps that the Internet makes possible.

That's why I think we should start the conversation about who's fully served by broadband today at 10Mbps down and work our way up from there. At 10Mbps you can do everything that today's Internet has to offer. And you can support the simultaneous usage of multiple apps. 10Mbps is also a speed that most upgraded cable and DSL modems are already delivering today, plus it's a speed that the next generation of wireless should be able to deliver in the next year or so, if we are to believe their claims.

At the end of the day, we can't afford to set meek definitions. We can't allow ourselves to set broadband benchmarks that are more appropriate for the 20th century than the 21st. We can't ignore the apps that can already make use of bandwidth greater than 3Mbps down and 1Mbps up. And we'd be remiss if we didn't acknowledge the fact that there's a wave of new apps on the horizon that simply won't work over broadband that's that slow.

So let's stop kidding around and start setting the broadband bar where it needs to be, establishing definitions based on what users need today, and then work diligently towards bringing this kind of capacity to all Americans.

Dear NTIA/RUS: Why Not Let The Public Have Its Say?

| No Comments | No TrackBacks

So here's where things currently stand regarding the broadband stimulus as I see them:

- It's been eight months since the stimulus passed, and the only money to be awarded has been to four states for mapping.

- It's been roughly six months since the public had a chance to provide any input on this rulemaking, review, and approval process.

- It's been two months since applications were submitted, and we have no idea if anyone's reviewing anything, and we're a day away from states having to make their hurried recommendations.

What's next? Nobody knows. Presumably we're going to hear about some more mapping grants going out. Hopefully we're going to get to hear what projects states recommended. At some point incumbents get to have their say. And eventually decisions will be made on who gets what.

Also, before long the public should be asked to comment on how this whole review process has gone in order to help inform how the next round should be handled, despite the fact that the first round is a long way from being done.

In the meantime, the public and applicants are left waiting without any idea of what's going on or any way to help contribute to the process.

So here's my thought: why not let the public have its say on who should get what money?

The way to do this could be relatively simple and straightforward.

Once states have submitted their recommendations, hopefully NTIA and RUS will have at least completed an initial culling of all the applications that aren't up to snuff. By combining these two lists NTIA and RUS can put out their short lists of finalists.

At that point we put the finalists on a site where the public can vote for and comment on specific projects. We'll have to monitor the site closely to ensure it's not abused, and ultimately the results will just help inform the final decision-making rather than allowing the public to dictate agency actions.

But in doing this we can collect a lot of good feedback from the public about these projects: the good, the bad, and the ugly. We can also get a sense for how popular various projects are relative to others in their state and others like them across the country, and how committed local community officials are to seeing them succeed.

Not only would this provide some cover for the hard decisions NTIA and RUS have to make, but it would also give the public some way to contribute and feel better informed and more involved with this process. I think doing this would do a lot of good for settling some of the sense of unease among those of us who have been observing or participating in the process.

This public site could also be the forum in which NTIA and RUS ask follow-up questions of finalists. While there may be some questions that need to be asked privately, I think there are a lot that we could ask of all applicants and require that they answer on the public record. For example, I'd like everyone to answer: "What are your plans to make sure this project is sustainable? For networks, what accommodations do you have for reinvesting revenue in upgrading and expanding capacity? And how are you going to ensure that prices stay low for customers in places where you're going to be a natural monopoly?"

There are a host of other questions I can think of that I'd like to pose to applicants publicly. In asking these questions we can get a better sense for how prepared, thoughtful, and committed an applicant is to serving a particular area.

While it will take some work and resources to build a site like this, maintain it, and make sure the results are usable for decisionmakers at NTIA and RUS, ultimately I think getting the public more involved will pay dividends, and a site like this could help make sure the best applicants are receiving funding.

On a related note, even with this site I wouldn't wait to announce all of the winners. In fact, from a PR point of view, I think the best thing to do would be to pick the top ten no-brainer applications that you know you want to fund and get them out there ASAP as winners. Then suggest that you need help making the other tough decisions out of a huge list of worthwhile finalists, and that's why you're enlisting the public's support.

By following this thread NTIA and RUS can transform this situation from a PR nightmare, where no one knows what's going on and the public's getting antsy about what's happening with their tax dollars behind closed doors, to a huge PR win, proving that these agencies are committed to engaging the public in new ways using technology.

We Must Anchor Our Communities With Fiber

| 2 Comments | No TrackBacks

I've recently learned that there's still some debate going on as to whether or not all community anchor institutions need fiber. Let's put that to rest here and now.

Before going any further, let me admit upfront that I understand that we might not be able to bring fiber to every community anchor institution in the country. In some limited circumstances, point-to-point wireless technologies like microwave may be the best or only option for getting these buildings connected. But the point still stands that they all need access to fiber-like speeds.

When I say "fiber-like speeds" I mean bandwidth that starts at 100Mbps and goes up from there. The reasons for needing this much capacity are clear and manifold.

Take schools, for example. A few weeks ago, I wrote a post that laid out a simple equation: to support one classroom with 20 computers that each want to access an HD video stream encoded at 8Mbps from PBS's archives requires 160Mbps of simultaneous bandwidth.

The really important thing to take away from this example is that unlike residential broadband, where there's an assumption that not everyone will be online at the same time, in schools there's the likelihood that everyone will want to get online simultaneously, since they're all in class at the same time. And in fact I'd hope that every computer lab is being used all the time, since most schools have far fewer computers than students.

But what about those schools that have embraced the concept of one laptop per child? I've heard of schools like that in the US today. So how do you support 1,000 kids in the same building with 1,000 computers all trying to get online at the same time?

Putting aside what capacity would be needed so they could all stream HD video, to simply surf the Internet each computer needs at least 1Mbps to have a reasonable experience, where more time is spent learning rather than waiting for pages to load.

So 1,000 computers at 1Mbps each means that school needs at least 1Gbps today.

Going even further, the high school I went to, which was large but not the biggest in the Minneapolis suburbs, had more than 2,000 students. If they were able to equip every student with a laptop, they'd need more than 2Gbps.

These are the fiber-like speeds that schools need to support today's applications, let alone tomorrow's. Imagine if those 2,000 computer-toting students all want to start streaming HD video to supplement their classes! Soon we're talking about a single school needing 10Gbps+ of bandwidth, which only fiber can reliably deliver.
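
For those keeping score, here's the school arithmetic from above in one place (a simple sketch; the per-device figures come straight from this post):

def aggregate_mbps(devices, per_device_mbps):
    # Peak demand when every device in the building is online at once.
    return devices * per_device_mbps

print(aggregate_mbps(20, 8.0))    # one classroom of HD streams: 160Mbps
print(aggregate_mbps(1000, 1.0))  # a 1,000-laptop school just surfing: 1Gbps
print(aggregate_mbps(2000, 1.0))  # a 2,000-student high school: 2Gbps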

But schools aren't the only community anchor institutions that need fiber. Let's consider hospitals.

In a hospital you're somewhat less likely than in a school to have everyone online at the same time, but what you do have is a lot of big files moving around, like MRIs and CT scans, combined with the urgent need to transfer data quickly.

Advanced imaging technologies are giving doctors incredible new insight into the inner workings of a patient's body, but to get that insight you need to handle very large files, often hundreds of megabytes. Here's a paragraph describing the range of file sizes produced by digital imaging technologies:

"Medical image quality continues to increase thereby increasing the image size. The size of a medical image averages 38MB (megabyte) for digital radiography, 20MB for fluoroscopy, 225MB for angiography, and as high as 350 MB for Echo... Digital pathology (DP) is an emerging trend in the digitized image archive sector. DP images have high resolution and each image size can vary from 0.5 GB to 2 GB. A 100 bedded hospital performs 40,000 to 45,000 radiological examinations each year, which amounts 2GB of storage space a day, or up to 1 terabytes (TB) a year."

To keep things simple, let's say the average result is just a 100MB file.

Now let's look at how long it takes to upload or download a file of that size at various speeds. First we must convert MB into Mb (1 byte equals 8 bits), so 100MB equals 800Mb. Then divide that by your bandwidth:

800Mb @ 1Mbps = 800 seconds
800Mb @ 10Mbps = 80 seconds
800Mb @ 100Mbps = 8 seconds
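
And here's that conversion as a small sketch you can extend to other file sizes and speeds (a minimal illustration of the math above, nothing more):

def transfer_seconds(file_mb, link_mbps):
    # megabytes * 8 bits per byte, divided by megabits per second
    return (file_mb * 8) / link_mbps

for speed in (1, 10, 100):
    print(f"100MB @ {speed}Mbps: {transfer_seconds(100, speed):.0f} seconds")
# prints 800, 80, and 8 seconds, matching the figures above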

The time it takes to move a single 100MB file at 10Mbps might not seem too bad, but you must realize that if a hospital only has 10Mbps for the whole building then it's unlikely that all 10Mbps would ever be free to transfer a single file. Also, what happens if you have multiple files to send or receive at the same time? Or multiple people are trying to transfer simultaneously? Or what if you're trying to transfer one of those 1GB files mentioned above?

Now think about this in terms of an emergency room situation, where ER doctors need to get access to these files quickly, where they may need to be able to consult with remote specialists efficiently, and where seconds wasted waiting can mean the difference between life and death.

In these circumstances, having insufficient bandwidth means putting lives at risk unnecessarily.

And this is only looking at one pretty basic use of broadband to improve healthcare. There are a host of other broadband applications available for use today. But in order to support them as well as the needs for moving around digital imaging files, hospitals must have access to fiber.

Let's take this discussion a step further. While community anchors are often referred to in terms of public institutions like schools, libraries, and hospitals, I think when it comes to how fiber can anchor communities it's important to also consider the needs of major employers in any given area, because without them to anchor employment, communities can't survive.

There's lots of talk about how fiber can create new jobs and attract new businesses to communities that get wired, but what's just as important to economic development is how fiber allows existing businesses to stay put.

I've heard of many employers across the country, especially in rural America, that want to keep growing where they started but can't because they don't have access to 100Mbps connectivity and beyond.

Whether it be a manufacturer that needs to be able to communicate with the top materials engineers from around the country to build their next generation of products, or a catalog retailer with a booming online business, there are companies across the country that are begrudgingly having to consider taking their jobs elsewhere if they can't get access to fiber. (I'm going to be working on finding and sharing stories like these in greater detail on this site in the weeks ahead.)

These are all clear and present reasons for why if we want our kids to have access to the next generation of education, if we want everyone to benefit from the opportunities of telemedicine, and if we want our businesses to be able to grow where they are, then we need to anchor our communities with fiber.

But the unfortunate reality that most communities face is that today many of these anchor institutions are stuck starving for bandwidth with nothing better than T1s available, which can only deliver 1.5Mbps and often cost hundreds if not thousands of dollars a month.

That's too little bandwidth for too much money to enable the next generation of healthcare, education, and economic development. And while this lack of access to affordable, substantial bandwidth is a problem across the US, it's especially acute in rural areas, where the lack of competition has kept prices high and speeds low, leaving many communities unable to fully use the broadband applications of today, let alone be able to prepare themselves to utilize the apps of tomorrow.

We've got to find a way to get everyone's head out of the sand on these issues. We can't afford more milquetoast generalities that hold onto 20th century mindsets about the bandwidth needs of community anchor institutions. We are already a decade into the 21st century. It's time we recognize what's possible today if only we had enough bandwidth, and commit ourselves to coming up with a plan to anchor every community's public and private institutions with fiber, or at least fiber-like speeds, across the entire US as soon as possible.

Because without it, we're going to be stuck dealing with the bandwidth constraints of the 20th century while other nations forge ahead, fully equipped to take advantage of all that the 21st century has to offer.

Paying For A National Broadband Plan

| No Comments | No TrackBacks

Perhaps the biggest issue to be resolved in any national broadband plan is figuring out how we're going to pay for whatever goals we set.

The FCC has recently put out an estimated cost of $350 billion to upgrade our country's infrastructure to deliver 100Mbps to every home, which is definitely a big number.

Throughout these discussions the FCC has been careful to note the expectation that the bulk of this money will come from private sources. But what I worry about is that we're conflating private capital sources with private providers.

We need to acknowledge that private providers alone may not have the capacity to invest that much money on their own over the next few years. They've already sunk a lot of money into their existing networks, and for the most part are only looking to incrementally upgrade. Plus they have a fiduciary responsibility to their shareholders to be prudent in what they do invest in, which is why rural areas are often left behind.

That's why we need to make sure the conversation about how to pay for an ambitious national broadband plan focuses on answering the question of how we make broadband a more attractive investment for private capital, regardless of who's doing the deployment.

There are viable projects across the country now stuck spinning their wheels because credit's dried up, yet despite the economic downturn there's still a ton of money out there; it's just sitting scared on the sidelines looking for a safe place to invest.

The problem broadband deployers face is that their networks are considered a risky investment, especially in rural areas.

Also, regulatory uncertainty surrounding broadband acts as a disincentive, as investors don't want to put up money today if the regulatory environment is going to change tomorrow in any significant way that may threaten those investments.

So to get private capital flowing to broadband projects we need to work towards clearing up any of that regulatory uncertainty, and we need to put in place incentives that make broadband less risky and therefore more attractive to investors.

This brings us back to the idea of loan guarantees as a mechanism to unlock private capital. By government stepping up to share the risk, we can transform broadband networks into safe investments for private capital and free up billions for all manner of broadband projects. I'm going to revisit our fast-track partial loan guarantee proposal in more detail next week.

For now let me just say that I think it's vitally important we address this issue of how to incentivize private capital to invest in broadband. The biggest question I'm always asked when discussing plans for nationwide fiber deployment is where the money's going to come from. And I firmly believe that these guarantees could be the key to unlocking the billions we need to get the job done.

Conversely, if for some reason we don't want to have private capital involved, we could follow Christopher Mitchell's suggestion that government simply set aside $35 billion in budget authority so that it can hand out $350 billion in direct government loans.
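The arithmetic behind that pairing of numbers is worth spelling out: under credit budgeting, the budget authority set aside only has to cover expected losses on the loans, so the loan volume it can support is the budget divided by the assumed loss rate. A quick sketch, noting that the 10% loss rate is simply what the $35 billion and $350 billion figures imply, not an official estimate:

```python
# The arithmetic behind the direct-loan suggestion: budget authority
# covers expected losses, so loan volume = budget / assumed loss rate.
# The 10% rate is implied by the $35B/$350B figures, not official.

budget_authority = 35_000_000_000
assumed_loss_rate = 0.10

loan_volume = budget_authority / assumed_loss_rate
print(f"${loan_volume / 1e9:.0f}B in direct loans")  # $350B
```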

Given the positive economic impact that would be realized from doing this, it's hard to argue against the idea. But I do still prefer guarantees as they don't require government to write a check, take on all of the risk, or do all of the vetting of loan applicants.

Either way, the main point is that we need to find a way to pay for our national broadband plan. That means looking beyond the question of how to make broadband deployment more attractive to private providers, and figuring out either how to make broadband networks more attractive to private investors or how government can step in to get the job done if private entities aren't able or willing to step up.

A National Broadband Plan Needs A National Fiber Plan

| No Comments | No TrackBacks

Too often broadband policy debates get caught up in attempts to be technology neutral, losing sight of the significance of individual technologies, the most significant of which is fiber.

When you think of the Internet, you should think of fiber: the vast majority of the interconnected networks that make up the Internet are fiber.

When we talk about middle mile and backhaul networks, those are most often fiber.

When operators of remote broadband networks complain about not having affordable access to the Internet, what that really means is not having affordable access to the long-haul fiber running through or near their communities.

When anyone's discussing the next generation of wireless access, that means finding a way to get fiber to every wireless tower.

The same holds true for most next generation wireline networks, which generally all rely on laying fiber ever closer to homes to deliver higher speeds.

And when you talk about the next generation of healthcare and education, those conversations presume the presence of reliable, high capacity, symmetrical access capable of handling lots of simultaneous usage, which only fiber can truly deliver.

A national fiber plan encompasses all of this. It recognizes that you can't have an effective national broadband plan without a comprehensive national fiber plan because fiber is so critical to delivering truly next generation broadband.

A national fiber plan should look at all of America's fiber assets--be they public, private, or somewhere in between--and consider them as a homogeneous pool, as America's network, as the primary veins and arteries of America's digital ecosystem.

A national fiber plan should set high standards for America's network, demanding that it be universal, reliable, and affordable. It should put aside who owns what and look at the capabilities of all the networks as one, and in so doing be able to find the holes and weak spots where access isn't universal, reliable, or affordable.

With the problem areas identified the national fiber plan can then look at who owns what and identify the best way to leverage existing resources to fill those holes in America's network.

A national fiber plan should set standards for which buildings need fiber today. Putting aside the argument over whether homes need fiber, a national fiber plan would clearly state that all schools, hospitals, libraries, and other community anchor institutions where lots of people can be expected to use the network simultaneously need fiber, and it should include a specific implementation strategy for getting them all wired.

A national fiber plan should strive to make sure that all businesses everywhere at least have the option of getting fiber, lest they end up having to move out of their communities to get the connectivity they need to grow.

In facilitating this deployment of fiber to community anchor institutions, a national fiber plan should have a mechanism to ease the deployment of fiber to wireless towers, because without a robust wireline connection you can't take advantage of all the capabilities of new wireless technologies.

A national fiber plan should also make sure that middle-mile and backhaul fiber networks are competitive enough in all areas to keep prices down, and if prices are too high then there need to be ways to bring them down, either by inducing competition or by regulating prices. Otherwise, even if we get high capacity broadband networks built out to homes, they won't be able to offer their full capacity at an affordable price to consumers.

In creating our national broadband plan, we can't afford to ignore the importance of fiber to everything that we want to accomplish. We can't allow "technology neutrality" to get in the way of sound, effective policymaking. This isn't about picking winners or losers, it's about creating the best possible national broadband plan. And to do that we need a comprehensive national fiber plan.

Should Stimulus Dollars Fund 3G Deployment?

| 1 Comment | No TrackBacks

In reading through some more of the stimulus application database, a question keeps popping into my head: should we be considering funding deployments of 3G wireless?

The biggest reason I ask is that 4G/LTE deployment appears to be just around the corner. Unless incumbents are lying about their intent, we should see large scale 4G/LTE deployment starting as early as next year.

Given that the amount of stimulus dollars we have is limited and may be a one-time thing, I can't help but wonder if we can afford to invest in lagging edge technology, in networks that may be duplicative, or in networks that may not be financially self-sustaining.

Calling 3G a "lagging edge technology" might be a little extreme, as it provides true value today and should continue to do so for another couple of years. But even still, it's hard to get excited about investing taxpayer dollars in technology that's arguably going to be outdated not long after it's deployed.

I know we shouldn't hold back deployment just because incumbents might upgrade their networks soon, especially in rural areas, which are likely to be upgraded last. But we can't ignore that nothing would be sillier than subsidizing a 3G buildout in an area that's soon to get 4G/LTE capabilities.

That brings us to my final point: 3G networks have suspect financial sustainability, both because of their limited capacity and because of the potential for competitors to come in equipped with new technology that eats their lunch. The last thing we'd want is to dump a bunch of money into 3G networks that not only deliver inadequate capacity but also end up out of business in the not too distant future.

Let me now say that I'm not necessarily suggesting that all applications that use 3G technology are bad. I just want to make sure we're not getting caught up in thinking about this broadband stimulus only in terms of access. I think it's vitally important that we look at the networks we subsidize as infrastructure. With this mindset the emphasis should be on networks that will be relevant in the long term, and I don't think that throwing up a bunch of 3G radios qualifies as that.

Even more egregious is that some of these 3G projects don't mention any form of fiber deployment. I'm hoping this doesn't mean they're seriously considering relying on T1s for backhaul. Quite frankly: 3G + T1s does not equal infrastructure.
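A rough back-of-envelope comparison shows why. The speeds below are commonly cited ballpark figures, not measurements from any particular network:

```python
# Back-of-envelope: T1 backhaul vs. 3G air-interface capacity.
# Both speeds are rough, commonly cited figures, not measurements.

t1_mbps = 1.544        # capacity of a single T1 line
hspa_peak_mbps = 7.2   # commonly cited HSPA 3G peak downlink rate

# Even the peak rate of a single 3G carrier oversubscribes one T1
# several times over, before accounting for multiple sectors per tower.
print(f"Oversubscription: {hspa_peak_mbps / t1_mbps:.1f}x")  # ~4.7x
```

And that's before the next upgrade cycle, which only widens the gap.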

Unfortunately, simply saying that stimulus dollars should focus on 4G and LTE when it comes to wireless technologies isn't necessarily the best answer either. Just look at the middling to poor reviews Clearwire's been receiving for the 4G networks they're deploying. And they're not even dealing with rural areas.

The point of all this isn't to say that all wireless is bad and not worth investing in, but rather to suggest that if we worry more about access than infrastructure we're likely to get burned in the long run. If we focus too much on which technologies can connect the most people at the lowest cost, then we're going to end up with the cheapest broadband. If we emphasize getting people online at any speed with any technology, then we're going to be sacrificing the long-term big picture for short-term gains.

I do recognize and respect the need to get the unserved online in any way at any speed as quickly as possible, but I also think we need to be good stewards of taxpayer dollars, and that putting a lot of money into technology that's about to be last generation, that provides access but not infrastructure, seems like a big mistake.

That's why if I were running NTIA or RUS, I'd take a long, hard look at any application that focuses most of its attention on deploying 3G, to make sure we're investing in networks that can be considered infrastructure and that will be relevant and financially sustainable for the foreseeable future.

Net Neutrality: A Problem In Search Of A Solution

| No Comments | No TrackBacks

With net neutrality firmly back in the spotlight, we're again hearing it decried as a solution in search of a problem.

To some degree I actually agree with this sentiment, primarily because there really haven't been all that many cases brought to light where the spirit of net neutrality is being grossly violated. And where bad practices have been identified they've been generally resolved through a combination of the court of public opinion and the threat of government regulation.

But at the same time, I have another way of looking at it. I think net neutrality is actually a problem in search of a solution.

The problem I see isn't necessarily that network operators are messing with traffic, but rather that there aren't any clear rules of the road for what traffic management is OK and what isn't.

Because of this I know of smaller operators that are holding back innovation in their networks to make sure they don't run afoul of these nebulous rules. And these problems will likely only get worse as new ways to manage traffic emerge that will create more uncertainty about what's OK and what's not.

This is especially true as the concept of managed services begins to take hold, which opens a whole other can of worms about who should get access to what and what rules need to be in place to protect free and open competition on these next-gen networks.

The undeniable truth is that network operators need to know what they can and can't do in terms of managing access to and traffic on their networks, and app developers and content creators need to understand what to expect as they create experiences that leverage these networks. Without some level of certainty here, everyone will be hesitant to invest in expanding their capabilities in all the directions that digital technologies make possible.

Some have tried suggesting that we don't need any more rules, and that the market alone will take care of sorting out what's right and wrong. But I think doing this relies too much on trusting network operators to place the public good over their private profit motives. Remember, these private operators have a fiduciary responsibility to their shareholders to maximize profits. That doesn't make them evil, that's just the way the system works. And because of this, it may be too tempting to muck with traffic to drive up profits if there aren't rules in place protecting open competition between services on the Internet.

On a related note, some in Congress have tried to prevent the FCC from taking ownership over this space and laying out the rules of the road. I think this is a really irresponsible thing to try to do. Someone needs to be in charge of this, and there's no agency better positioned to do so than the FCC. Also, we can't expect Congress to pass laws that have any chance of keeping up with the pace of technological evolution around network management and next-gen applications. Put simply: we need rules of the road, and we need someone with the flexibility to set and maintain those rules as conditions change over time.

But this is easier said than done. While some claim net neutrality's a solution in search of a problem, in reality I think we have yet to find anything resembling a specific, actionable solution to the regulatory uncertainty surrounding net neutrality. And that's another big problem: the more uncertainty there is, the greater the likelihood that any attempt to legislate or regulate net neutrality will just get stuck in the courts. Witness the fact that Comcast has already taken the FCC to court, even though it wasn't really punished for what was a pretty obvious violation of the spirit of net neutrality.

One of the biggest challenges in finding a solution to net neutrality is that the term means so many different things to different people. It's almost impossible to find a solution to a problem that isn't clearly defined.

Any solution to net neutrality will also have to balance being specific enough to remove the uncertainty around what's OK and what's not with being broad enough to accommodate future advances in technology. It will need to be rigid in defining the consequences for bad actors yet flexible in adapting to the circumstances of each unique situation.

We can't afford to put in place narrow rules that limit innovation and investment, but at the same time, if we're too broad then we're likely to leave too many loopholes and gray areas, which are ripe for litigation.

That's why I think any solution to the net neutrality problem has to be incremental and has to be prepared to evolve over time.

Because of this, I'd start simply.

A first step would be to firm up truth-in-advertising rules around broadband so that consumers can know what they're getting.

A second step would be to enshrine "thou shalt not degrade traffic" rules with strong penalties for abusers. This is an issue on which general consensus seems to have already been reached.

The third step would be to set up a mechanism at the FCC to track the evolution of network management techniques and application development trends, so it can study and recommend what's OK and what's not going forward. A key mission for this entity would be to build a better understanding of how managed services work and what impact they have on network operators' investment in upgrading the open bandwidth portion of their networks' capacity.

The final step I'd recommend is to establish a voluntary standard for best practices associated with open networks. Perhaps this could be created as a brand that operators that are committed to delivering a truly open network could use to help differentiate themselves from their competitors. There could even be additional incentives like tax breaks for anyone who lives up to this standard. I sometimes worry that we focus too much attention on punishing bad actors when we could also be finding ways to reward good behavior.

The most important thing to remember is that net neutrality is a problem and it needs a solution, but at the same time we can't get caught up in thinking that the sky is falling and that we need to rush to enact imprecise legislation or regulations that may end up doing more harm than good.

I do believe there is a solution out there to overcoming the problem of uncertainty when it comes to managing networks. So let's all work together to figure out how to create the best solution possible to maximize competition, investment, and innovation in our broadband marketplace.
