July 2010 Archives

While there's nothing new about lamenting America's fall in the international broadband standings, I continue to be amazed at how that gap seems to be widening, not just in the capacity and cost of bandwidth customers have access to, but in how quickly other countries are moving forward with real plans to achieve real progress relative to the US.

Here in the US we barely have our first broadband plan. We're still caught up in debates over whether or not we're setting the standard for broadband too high, whether or not government should play any role in the deployment of broadband, and whether or not the agency charged with overseeing our country's communications infrastructure even has the authority to regulate broadband.

Now let's compare that to the rest of the world.

In China, China Telecom and the State Grid Corporation recently signed a strategic framework agreement to promote the deployment of fiber-to-the-home. While China obviously has a different political reality to work within, it's almost unfathomable to imagine America's major telecom and utility players coming together on an agreement to work together on deploying large scale fiber.

In Australia, there was the recent news that the federal government is going to buy and retire Telstra's copper network to make way for its fiber network. Yet here in the US we've got politicians protecting copper and refusing to buy in to a fiber-powered future. Can you imagine the US buying up all of AT&T and Verizon's copper plant? Not in a million years.

And in Japan the government recently declared the goal of a fiber pipe to every home by 2015. Of course Japan is a much smaller country with a much higher density, but notice how they set the goal of fiber to EVERY home, not just fiber to most homes. Yet here in the US, our government has no problem setting lower standards for its rural residents, creating second-class digital citizens.

What these stories highlight is that the gap between America's broadband infrastructure and the rest of the world isn't just an academic exercise. It's real, it's widening, and if we as a country can't get past our navel-gazing and excuse-making then we're going to end up in a position whereby our country's trying to compete in a global digital economy with 20th century infrastructure while countries like China, Japan, and Australia equip their innovators with 21st century networks.

If we feel any pride as a nation, if we have any sense about what our economic future can and should look like, then we need to make sure these stories about where the rest of the world's heading aren't swept underneath the carpet.

The time to act is now. The rest of the world realizes this and is moving forward full steam ahead. The question is when will America wake up and start getting serious about our international competitiveness as a whole rather than focusing almost all of our attention on how do we get every American connected to yesterday's networks.

Below is the abstract I submitted to the Open Technology Initiative's call for papers for their Broadband Act of 2011 invitation-only event in late September.

I thought I'd share it with you all as it sums up points I've tried to make over the last year and a half related to our country's attempt at crafting a national broadband plan.

I'll continue to expand upon these ideas in the coming weeks, and hope that they're of sufficient merit to warrant an invite to what sounds like will be a tremendous event in September.

"The Need For A New Regulatory Paradigm For Broadband"

The Internet and the broadband ecosystem that supports it are too complex and revolutionary to try and regulate in an incremental fashion. The time has come for a complete re-imagination of how government regulates 21st century communications.

Communications policy has always evolved alongside communications technology. New technology begat new regulations. The problem now is that there are layers upon layers of policy silos that aren't necessarily relevant to the silo-busting nature of the Internet. Government regulates voice differently from video while new technology has destroyed and redefined these distinctions.

To bring communications policy up to speed with communications technology, and especially to have any hope of regulations being able to keep up with and not hinder technological evolution moving forward, America needs to establish a new core to frame what government's role and purpose are in regulating communications in the 21st century.

This new framework must be based on the most fundamental principle of this new communications paradigm. While there are numerous issues related to 21st century communications that are vitally important to America's future, the most fundamental is the universal availability of high capacity, high reliability, low cost bandwidth.

Without access to bandwidth America's citizens, businesses, and institutions won't be able to take full advantage of all that the Internet makes possible. Issues like driving adoption and use and protecting net neutrality are meaningless without the capacity to deliver the applications that make these issues matter in the first place.

With the availability of high capacity, high reliability, low cost bandwidth at the core of America's 21st century communications strategy, a focused plan can be developed to specifically address these issues, to identify America's current bandwidth capacity, its future needs, where gaps exist, and what strategies can be pursued to close these gaps.

But to be successful, this plan needs to rid itself of preconceived notions, to kill the sacred cows of broadband policymaking. America can't afford to wait for the private sector to deliver bandwidth only where competition dictates investment. America can't waste resources on protecting the notion of technology neutrality when what's needed is a fully articulated plan that's aware of the strengths and limitations of various broadband technologies.

With a new bandwidth-centric plan in place for connecting all Americans, all of the other existing communications policies and regulations can be revisited within this new unified context. Dealing with issues of voice and video can start by ensuring that sufficient bandwidth exists to enable the delivery of these applications, and then rules can be developed around how they can be used and how this use can be encouraged.

America can't afford to keep taking baby steps down the wrong path. Instead we need to reset our nation's course to position us to take the bold leaps forward that are needed to keep up with the rest of the world.

During the course of packing and getting ready for our move to Lafayette, LA, I happened to meet someone who works at the Commerce Department.

While making small talk, I shared my interest in fiber. The gentleman had an interesting response: "Oh, do you mean like BTOP?"

I was a bit taken aback by this at first as he didn't seem to be overly knowledgeable about broadband, and yet he immediately knew about BTOP. It's when I asked him what he does that I learned he works in the Commerce Department, which explained how he knew about BTOP.

But probing a little further, I found out that while he didn't work directly for NTIA, he had been a witness to what had happened during the BTOP program.

We shared a quick back-and-forth about how we both thought NTIA had some good people working over there who were put into an impossible situation. I mean, how are a couple dozen people supposed to properly distribute four billion dollars in a year and a half with limited institutional experience running a grant program of this size?

Then things got interesting: he admitted that NTIA was basically in crisis management mode and that he had been asked to step in and help review grant applications because he had had experience as a grant reviewer previously.

I had to ask him straight out: does he know anything about broadband? And he candidly admitted that he doesn't.

He seemed like a bright, nice guy, but I'm not even sure he knows the difference between different broadband technologies. He's a self-described finance guy, and yet he had no awareness about the business models of broadband before reviewing these applications.

While I know each BTOP application was supposed to be reviewed by multiple reviewers, and hopefully the ones with less knowledge like him were put onto teams that had a lot more knowledge, I still can't help but be disappointed by this confirmation of my fears.

The greatest complaint that I picked up from everyone I knew who applied for BTOP money is that it didn't seem like NTIA truly understood the business models that were submitted to them. In particular, many with the most advanced, creative models felt like their applications had been dismissed for being overly complicated.

Well if some portion of NTIA's reviewers were just numbers guys, with no nuanced understanding of the business of broadband, wouldn't it make sense that the simplest models won out over the most complex?

What's most frustrating about this is that it suggests that because of the limitations of its reviewers, NTIA likely missed an opportunity to identify and support the most innovative models for broadband deployment.

Any application that appeared overly complex likely didn't get a fair consideration as its reviewers may not have fully understood what they were reading.

And another unintended consequence of this is that if your reviewers don't know broadband and can only rate applications based on NTIA's scoring criteria, then what you create is an environment where the projects that get funded aren't necessarily the best but rather are the projects with the best-completed applications.

I, for one, would have much preferred NTIA picked the best projects with the most innovative models rather than the projects that had the simplest models and the best paperwork.

Unfortunately, we can't turn back the clock on the broadband stimulus. Luckily, the sense I get is that NTIA has been able to step up its game in the second round. And since the bulk of the money's being given out in this second round, hopefully we haven't blown too much of an opportunity to get the greatest return on our investment in broadband.

But let this be a lesson for the future. Just because a government agency already exists and has some experience giving out money for broadband does not necessarily mean they're prepared to manage an exponential increase in available funds.

There's been a lot of buzz recently around Congress questioning the FCC's decision to set a universal broadband goal of 4Mbps down 1Mbps up by 2020 relative to the FCC's 100 Squared goal of 100 million homes having access to 100Mbps by that time.

Congress thinks that by setting a low universal broadband standard the FCC is relegating rural America to being second-class digital citizens who will always be at a connectivity disadvantage compared to city dwellers.

The FCC's argument is threefold:

1. That the 4Mbps/1Mbps standard reflects the average speeds that Americans use today
2. That their 4Mbps/1Mbps standard for universal broadband is actually higher than almost any other country's
3. That their plan is to revisit this standard every couple of years so they can adjust these numbers as needed over time

But there are significant problems with all three of these assertions.

1. How can we justify setting standards that state that it's good enough that rural Americans will have access to the average speed needed for today's applications ten years from now? That's the very definition of creating second class digital citizens, and seemingly ignores the likelihood that by 2020 life-changing apps that require 10Mbps and beyond will be commonplace.

2. What the FCC conveniently fails to note is that those other countries with lower standards for universal broadband also have much shorter timeframes for achieving them. Sure they may only be shooting for 1Mbps but their goal is to have that universally available within the next two to five years, not the next decade.

3. This is where it's hard not to perceive the FCC as being rather ignorant of the potential impact of their decisions. It's easy to say that we can just revisit the speeds in the future, but these aren't academic discussions, these decisions have real-world consequences.

What the FCC seems to fail to understand is that whatever standard they set will have a direct impact on the kind of broadband networks that get built out, and even more importantly, whatever broadband networks get built are what rural America's going to be stuck with for the foreseeable future.

Investment in broadband is done with an understanding that whatever networks get built are expected to be used for years in order to realize a return on that investment. So a 4Mbps/1Mbps network built today is likely expected to still be in use by 2020.

So if we invest all our money in 4Mbps/1Mbps networks, how do we guarantee that our broadband infrastructure can be leveraged to deliver exponentially greater capacity in the future?

Imagine a scenario where the FCC doesn't revisit its universal broadband standard until 2012. That means we'll have had two years of investment in networks that can support 4Mbps/1Mbps but with no guarantees that those networks can reliably deliver more bandwidth. So let's say the FCC wises up and increases the universal standard to 10Mbps/10Mbps. Now what are we supposed to do with all the networks that have been built to only support 4Mbps/1Mbps?

Luckily, the FCC isn't the only player in this equation, and at least RUS seems to have an understanding of this fundamental challenge of rural broadband as they've focused the bulk of their stimulus funding on full fiber networks that can scale to meet any bandwidth demand.

But that doesn't change the fact that if the FCC sticks to its 4Mbps/1Mbps universal standard, that we risk wasting a lot of investment by saddling rural America with inadequate networks that may have to be overbuilt in the not-too-distant future.

While I fully respect the need to get everyone connected as quickly as possible, we can't ignore the reality of what it's going to take to make sure all Americans have equal opportunity to realize the full potential of the Internet.

If we continue to duck the hard questions by kicking this can down the road, we risk wasting our precious limited resources building networks that are inadequate for America's needs and thereby miss the opportunity to protect the future of our rural communities.

And I, for one, have to agree with Congress that this is an unacceptable scenario, that we can't afford to create second-class digital citizens, and that if anything rural America needs high capacity networks even more than urban America does.

Traveling Europe With A 50MB Bandwidth Cap


I spent the last two weeks with my wife eating our way through Paris and Italy.

We went into the trip with very little preparation due to our hectic schedules and the relatively last-minute nature of the trip, except for one important step: I bought 50MB of international mobile data for my iPhone.

I did this because if you just go and use your iPhone without a prepaid plan, you're going to be paying by the KB, and 10MB can cost $200.

To my astonishment, I still had to pay $60 for my 50MB. More than a dollar a megabyte. And there was no discount by volume as it's $120 for 100MB. Though they did provide one discount: overage cost $5 per MB instead of $20.

I decided that 50MB should be about right as a quick test showed that downloading email and using Maps didn't use much data, and I could load a webpage for about 1MB or so. I also assumed I'd have access to Wi-Fi hotspots at various points to conserve my mobile data.
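Putting rough numbers to that budgeting exercise, here's a quick sketch of the math I was doing in my head. All the per-activity figures below are illustrative assumptions, not measurements:

```python
# Rough trip data budget, using the kinds of estimates described in the post.
# All per-activity figures are assumptions for illustration only.
CAP_MB = 50.0

daily_usage_mb = {
    "email checks": 0.5,   # downloading mail is light on data
    "maps lookups": 1.0,   # GPS plus map tiles, used throughout the day
    "web pages": 2.0,      # ~1MB per page, so budget roughly 2 pages/day
}

trip_days = 14
estimated_total = trip_days * sum(daily_usage_mb.values())

print(f"Estimated usage: {estimated_total:.1f}MB of {CAP_MB:.0f}MB cap")
print("Fits under the cap" if estimated_total <= CAP_MB
      else "Over the cap: cut back on web browsing")
```

With these guesses a two-week trip comes in just under the cap, which is about how close the real trip ended up being.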

I noticed that voicemails can involve a lot of data transfer, though, so I set up a Google Voice account where I could listen to all of my voicemails online through my computer as I knew I'd have Internet access in most of our hotel rooms.

As we got onto the plane in Philadelphia headed for Paris I reset the bandwidth tracker built into every iPhone and swore that I wouldn't take my eye off it throughout the trip. Here's a recounting of my experiences:

Paris - We get off the plane bleary-eyed into a cab that leads to a lovely day in Paris filled with baguettes, a bike ride down the Seine to Notre Dame, and a perfect first French dinner outside at a bistro, finished with a chocolate croissant from a bakery across the street.

My data usage stays relatively low until I casually click on a couple of links to sites within emails. After loading not more than a few webpages my usage bounces up towards 10MB. Even though I know browsing's data intensive, this experience leads me to be ultra vigilant and view only a handful of webpages through the duration of our trip.

We spent the rest of our time in Paris seeing the sights, eating the food, and enjoying quality time together. Even better was that I was using email and maps indiscriminately while avoiding the hassle of trying to find open wi-fi networks, and yet we made it through four days and I was only in the teens, megabyte-wise.

Venice - We only spent a day in Venice, but the iPhone could never have been more essential. We took the water bus in, saw our stop come up, were preparing to get off, and then watched our stop go right on by. Apparently they don't stop at every stop automatically.

So we get off at the next stop and have to navigate what's described as a "10 minute walk" by the water bus captain back to the last stop, and we can get there by just going to the first street and taking a left. Heh... In his defense that street was the main thoroughfare and did kinda get us back there, but it wasn't straightforward and it wasn't 10 minutes. Our saving grace was using Google Maps on the iPhone.

We had it on constantly to make sure we were heading in the right direction, and for the most part the GPS had a perfect bead on where we were so we could closely track our progress. Having it on so constantly did seem to sip data a bit faster, but it's still an inherently low bandwidth kind of operation, so our usage was staying on track.

Florence - Not a lot to report here mobile bandwidth wise. We ate some tremendous food, drank a lot of wine, and wandered the streets. I continued using email and maps indiscriminately, and we stayed on a good trajectory over these four days. I was going to be able to make it to Rome with more than 10MB left.

Rome - While I felt strong going into Rome cap-wise, a throwaway click onto a webpage and the need to read a document sent to me as an attachment started getting me uncomfortably close to my 50MB cap.

I started changing my usage behavior. I suppressed the urge for the idle email check. I relied more on real maps and quick glances rather than walking around with it on and in my hand.

The iPhone did continue to amaze me at its versatility in making our lives easier. When our cab driver at the train station didn't understand much English nor my butchering of the pronunciation of where we needed to be going, I simply showed him the address and where it was on the map. When we were walking into the Vatican and I hadn't had access to a printer to print out our tickets bought online, I just put my iPhone up to the glass showing the email confirmation. There were numerous times it helped us bridge the language gap and that's without any specialized translation apps!

By the end of the trip I'd somehow managed to stay just under 50MB, ending up at 49.6.

Reflecting upon this experience, it's interesting how this cap absolutely did start to change my behavior and cause anxiety. I had to be conscious of conserving my data, of suppressing my usage.

While I know this might be an extreme case, why aren't we talking more about the impact of bandwidth caps on suppressing the utilization of broadband? I don't know how big of an impact it has but anecdotally I can vouch that it certainly has some.

I'm also a bit flabbergasted at why this international mobile data was so expensive. Is mobile data really that expensive in Europe for everyone? Or just for foreign travelers? I'm a bit embarrassed that I don't already know the answer to that myself, but I'm definitely going to find out!

A final interesting development to share bandwidth wise began in Florence when I started to download a lot more to my computer as I'd run out of movies I brought with us on the iPad and was starting to rent them through iTunes. Over the next week I downloaded a handful of movies that consumed gigabytes worth of bandwidth.

What this highlights is the reality of how wireless caps suppress media consumption. This simple act of renting movies can consume many times more bandwidth than the caps that are being set in most every country for wireless broadband. And I wasn't even downloading the HD versions, as my connection felt like DSL.
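To put rough numbers on that claim, here's a quick sketch comparing a week of rentals against a couple of hypothetical wireless caps. The movie size and cap figures are my assumptions for illustration, not actual plan terms:

```python
# Rough comparison of movie-rental bandwidth vs. wireless caps.
# The movie size and cap values are illustrative assumptions only.
MOVIE_SD_GB = 1.5                       # a standard-def rental, roughly
caps_gb = {"2GB wireless cap": 2.0,     # hypothetical mobile plans
           "5GB wireless cap": 5.0}

movies_per_week = 5
weekly_gb = movies_per_week * MOVIE_SD_GB   # 7.5GB in a single week

for name, cap in caps_gb.items():
    ratio = weekly_gb / cap
    print(f"{name}: a week of rentals is {ratio:.1f}x the monthly allowance")
```

Even with conservative standard-definition sizes, one week of casual renting overshoots a typical mobile cap several times over, which is the point about wired connections doing the heavy lifting.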

If we want everyone to be able to buy into this new economy for digital media, we need to make sure everyone has access to wired connections with much more robust caps to handle the heavy lifting of high quality video delivery.

But I digress on this tangent for now, as what I'm most concerned about at this moment is that I didn't unintentionally blow through the bandwidth caps my hosts might have had on the wired connections I was downloading movies through!

In any event, our trip to Europe was perfect and in large part precisely because of this 50MB of data. So all in all, I'd say it was the best $60 I spent, other than maybe that truffle steak we devoured in Paris. C'est la vie!

Earlier this week I came across this interesting study by BroadbandChoices that suggested consumers care more about broadband reliability than speed.

While our national broadband plan does touch on the issue of broadband reliability, suggesting it's a topic worth researching further and one that's essential for applications like public safety and smart grid, it's pretty much silent on the issue of whether or not America's broadband infrastructure is currently reliable enough.

And yet, here are consumers saying that reliability is more important than capacity. So let's consider this issue a little further.

First, a fundamental truth of broadband reliability is that shared networks are less reliable than dedicated ones, and lower capacity networks are less reliable than higher capacity ones.

Put into practical terms, that means that DSL networks tend to be more reliable than cable or wireless networks as each customer essentially has their own pipe vs. sharing with all their neighbors. That also means that fiber's more reliable than any of these other networks as it has by far the most capacity.

Second, broadband reliability also includes issues related to the physical characteristics of networks. For example, DSL reliability is limited by distance, so if you're too far from the central office your connection won't work as well as if you're closer. Also, networks with active components in the field are less reliable than those with only passive components. Since copper networks require powered electronics throughout, they're less reliable than passive fiber networks.

Third, broadband reliability encompasses the nexus of technical and business decisions made in connecting your local network out onto the Internet at large. So if your community doesn't have a big backhaul pipe to the Internet, or your provider decides to oversubscribe that backhaul connection to the extreme to try to drive greater profits, your network will be less reliable.
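These factors can be sketched with a toy model: the worst-case throughput each subscriber sees on a shared access network whose backhaul is oversubscribed. Every number here is hypothetical, chosen only to illustrate the dynamic:

```python
# Toy model of the backhaul-oversubscription factor described above.
# All figures are hypothetical, for illustration only.

def per_user_mbps(backhaul_mbps: float, subscribers: int,
                  active_fraction: float) -> float:
    """Worst-case share of the backhaul for each simultaneously active user."""
    active = max(1, round(subscribers * active_fraction))
    return backhaul_mbps / active

# A 1Gbps backhaul shared by 500 homes: compare quiet vs. peak hours.
quiet = per_user_mbps(1000, 500, 0.05)   # late night, 5% of homes active
busy = per_user_mbps(1000, 500, 0.50)    # after school, 50% of homes active

print(f"Quiet hours: {quiet:.0f}Mbps per active user")
print(f"Peak hours:  {busy:.0f}Mbps per active user")
```

The same connection that feels fast at midnight drops by an order of magnitude at peak, which is exactly the "schools let out and surfing slows to a crawl" experience described below.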

What's so frustrating is that while our national broadband plan isn't completely silent about these issues, it also doesn't really address them head on within the context of ensuring Americans have access to reliable broadband networks. Instead it almost seems to assume that broadband networks are and will continue to be reliable.

And yet the exact opposite is the case anecdotally all across America today. I think of the times I've been at an outdoor event with tens of thousands of people and my iPhone stopped working. Or when I was visiting relatives in Cape Cod and learned that when schools let out in the afternoon Internet surfing slows to a crawl. Or when I try to use videoconferencing software at home and the call is dropped or the video quality deteriorates. Or any time there's a major national event and the Internet as a whole nearly shuts down, unable to handle the massive influx of traffic.

While I know this last point often has more to do with the reliability of websites and servers rather than broadband networks, it's another example of the reliability issue that we need to be finding solutions to if we intend to be able to rely on all the things broadband makes possible to improve our day-to-day lives.

It was bad enough to know that the FCC largely ignored issues of reliability when the topic seemed like a more academic exercise. But now that we're hearing that customers themselves demand networks that put reliability over capacity, it's even more galling that there's been little to no serious debate in DC around how we can improve the reliability of our country's broadband infrastructure.

So as we move forward, let's not forget that capacity is not the only issue that matters. When push comes to shove, it's reliability that matters most, as without it how can we rely on the Internet?

About this Archive

This page is an archive of entries from July 2010 listed from newest to oldest.

June 2010 is the previous archive.

August 2010 is the next archive.

Find recent content on the main index or look in the archives to find all content.