June 2009 Archives

The vast majority of today's Internet traffic is delivered on what's referred to as a "good enough" basis. As there are often many hops between where content or an application is hosted and the end user, no one can guarantee service delivery, but since things generally work well enough not to be a nuisance, "good enough" delivery is good enough for most users and use cases.

But this "good enough" mindset seems to have infected and shaped too many people's perceptions regarding the kind of broadband America needs, leading some to think thoughts like:

- If we have adequate capacity to handle today's applications then what's the point of more bandwidth? What we have is good enough.

- If speeds are getting higher and prices lower, regardless of how incrementally slow that progress is, then that's good enough.

- If broadband providers advertise enough speed, it doesn't matter whether they can actually deliver it or whether the service is reliable and low-latency; promising adequate broadband is good enough.

- If rural areas can get online at any speed then that's better than nothing and therefore good enough.

But you know what? I reject all of these notions as not good enough, as inadequate to support the goals of a country that throughout its history has always striven for greatness.

Touting that we have sufficient capacity for today's apps means nothing as apps always fill whatever pipes are available. And bigger bandwidth apps can't come into existence without bigger pipes to run over. So pointing to the lack of bigger bandwidth apps as the reason why we don't need bigger bandwidth pipes is not good enough.

Realizing incremental growth is a good thing and we have made some progress towards a better broadband future, but we can't win the race to be a leading nation in the digital economy by taking uncoordinated, unguided baby steps while other countries are leaping forward with purpose. That's not good enough.

Basing our entire broadband future on the speeds providers claim they can deliver means we'll never have networks that can live up to their promises nor will we ever realize the many possibilities of living in a world where networks are robust and reliable. That limited way of defining broadband success just isn't good enough.

Getting the unserved online at any speed is important, but we're doing rural America a disservice by not setting the more aspirational goal that all Americans deserve equal access to the best broadband. Rural Americans need next-generation connectivity as much if not more than urban dwellers, so goals that create second-class broadband citizens aren't good enough.

As we go about formulating our first national broadband strategy, it's my sincere hope that we don't allow ourselves to limit our expectations to that of achieving a networked future that's only "good enough." What we need now is an aspirational plan that doesn't define success based on what's happened yesterday or is happening today but on what our country needs to compete in the decades ahead.

The simple truth is that to be leaders we have to lead, to be competitive we have to compete, to realize all that's possible in the 21st century we need a 21st century plan.

Setting our sights on a broadband future that's only "good enough" isn't good enough for the America I know and love. So let's put aside these low expectations and commit ourselves to a plan that aims for greatness, that defines success not solely based on incremental progress but on taking giant leaps forward. Because that's the only mindset that I consider to be good enough for determining our country's broadband future.

One of my biggest pet peeves is the claim that some day we'll live in a world without wires, that all of our broadband connectivity will eventually be delivered over the air.

While I don't rule out the possibility that some new technology could be invented that makes this possible, the reality of today's technology is that wireless is having enough trouble trying to keep up with the increasing demands of mobile applications and devices.

Just take a look at AT&T's struggles handling iPhone traffic. There have been reports from across the country that wherever too many people with iPhones come together in one place--be it at a baseball game, a concert, a conference, or otherwise--the network starts to slow down dramatically. I've experienced this myself, even to the extreme of not being able to make phone calls or send text messages, let alone surf the web, and at events as relatively mundane and preplanned as the Washington Nationals' home opener.

And these were issues with the old iPhone. What's going to happen now that the new video-capable iPhone is out? YouTube has already seen mobile video uploads increase 400% a day since the iPhone 3GS came out. And these trends don't just apply to the iPhone: over the last six months YouTube reports having seen a 1700% increase in mobile video uploads.

Also important to point out is that this is just one video application--recording and uploading video--that mobile devices are or soon will be capable of. Tons of mobile phones now offer on-demand and broadcast TV as well as other video to watch. Some phones are enabling live video communications between users. And there's a growing trend of webcasting live streams out to the world from a mobile device.

All of these applications don't just require mobile connectivity, but a lot of sustained bandwidth as well. So what's going to happen to these mobile networks when everyone's using their mobile devices to push video over the network? They're having a hard enough time supporting limited web browsing and text messages!

Further increasing demand for mobile bandwidth is tethering, the ability some phones have to let a laptop use the phone's wireless connection to get onto the Internet. There are already rumors that the reason AT&T doesn't yet support the new iPhone's tethering feature is that they're afraid their network won't be able to support the increase in demand for bandwidth. Making this even more notable is that most of the time when someone wants to tether their laptop to their phone they're in a mobile use case. So again, the demand created by mobile apps alone outstrips the capacity of mobile networks.

Also worth noting is that regardless of whether or not this rumor of lacking capacity is true, there's no denying the limitations of wireless when Apple and AT&T force some apps--like the one from Slingbox that lets you watch your home TV service from your iPhone--to use WiFi rather than the cellular network to transfer data. By doing this they're basically admitting that more bandwidth-intensive applications need wireline connectivity.

An interesting sidebar to this discussion is this recent survey that showed nearly half of Americans would drop their mobile broadband service if they ran into financial trouble whereas only 10% would cancel their home broadband service. While some of those home connections may be wireless, the vast majority are wireline. So I take this to mean that the market already recognizes the value of reliable wireline service relative to less reliable wireless when it comes to their primary connection onto the Internet.

The simple, undeniable reality of wireless is that its capacity constraints are due to clear technological limitations.

For one, wireless spectrum is artificially limited, which restricts the throughput and overall capacity that you can get wirelessly. If we were able to get lots more spectrum that would relieve some of these issues but not all.

That's because the more significant limitations are found at individual towers. When building a wireless network you only build enough supply for whatever demand you estimate will be there. But what happens when that demand increases dramatically, like for an event or, even worse, an emergency? You can't necessarily snap your fingers and upgrade that wireless capacity, which is why we run into issues so often whenever lots of people are trying to use their phones in the same place.

And some of these issues are purely evolutionary: when you're deploying a new network you can overbuild capacity only so much, so as people adopt it, service is likely to degrade during the periods before the network is upgraded with more capacity. This reality gets worse the larger the service footprint of a wireless network--and in particular of a wireless tower--as there's a bigger area in which too many users may try to get online wirelessly at the same time.

Fiber, on the other hand, can always have enough capacity and can easily be upgraded to meet any level of demand whenever it's needed. It's the only broadband technology that can do this.

And as I've said many times before, you can't have robust wireless access without a robust wireline infrastructure. Those towers need big fat pipes to plug in to if they're going to offer lots of reliable bandwidth.

So for all these reasons, barring some unforeseen technological advancement, we will never live in a world completely without wires.

That being said, wireless is an essential component of a fully realized 21st century broadband network. It's the extension cord that brings service everywhere you don't have a wired outlet nearby.

The key point in all this, though, is that we should not burden wireless networks with expectations that are too high, that someday they'll be able to replace wireline networks, as that's setting them, and in turn us, up for failure.

Instead we need to acknowledge the limitations of wireless, and keep wireless broadband focused on supporting mobile apps and devices. Because as I detailed throughout this article, wireless has plenty of work on its plate just to keep up with the burgeoning demands for mobile bandwidth for mobile devices.

So let's stop trying to lump wireless and wireline broadband together and start realizing that they're complementary, not competitive.

So in one camp you have folks who think broadband deployment should be purely market-driven, all broadband networks should be owned/operated by private companies, and these networks should be vertically integrated with services.

In another camp you have those who believe all broadband networks should be publicly owned and operated, and that these networks should operate as dumb pipes.

The public guys don't think we can get world-class connectivity to everyone without government intervention, and that the old model of providing telecommunications services is holding back the new paradigm of abundant bandwidth and open competition.

And the private guys don't think a public network can be run efficiently, will be adequately invested in over time to continue driving innovation, or should be allowed to duplicate and compete with private investment.

Yet at the same time these debates are going on, a common theme coming out of both sides is the need to focus on public/private partnerships as a way to solve our broadband dilemma. But what does a "public/private partnership" mean or look like?

There's a lot that's possible when we step beyond any specific ideology and start considering how best to marry the strengths of both public and private models into something greater.

To get this conversation rolling, let me suggest some guiding principles that public/private partnerships should be striving to achieve:

Make as much capital available as possible.
Getting the best broadband to all citizens is hugely capital-intensive. While we should absolutely acknowledge the tens of billions private industry is investing each year in its networks, we should also make full use of the many tools at government's disposal to make more funding available, whether by incentivizing continued investment through tax credits, by encouraging greater availability of private capital through government partially guaranteeing loans, or by having government serve as a direct source of capital through loans or grants.

Aggregate demand and reduce barriers to deployment.
By working together, public and private parties can be educating and inspiring the public to better integrate the use of broadband into their lives, thereby creating more customers and improving the business case for broadband deployment. Also, by the public sector doing more to streamline the government processes needed to facilitate deployment, and by the private sector doing more to respect their public interest obligations, we can make broadband deployment more economically viable and efficient.

Concentrate that investment on non-duplicative networks.
Leaving the deployment of broadband purely up to market forces has proven to result in communities with more attractive markets getting all the investment in upgrading capacity while less attractive markets get left behind. Public/private partnerships should be geared towards ensuring that all citizens get equal access to the best broadband so that everyone can have a level playing field to compete on in the 21st century. That means less of a focus on facilities-based competition and more on making the best broadband ubiquitous.

Encourage as much competition and innovation as possible.
As facilities-based competition is too capital-intensive to support more than a couple of big players, and as the key to fostering innovation is embracing the principles of openness that have made the Internet so effective, the best way to accomplish these goals is to have networks supported by public/private partnerships be open to multiple service providers competing on the same infrastructure alongside unfettered over-the-top competition.

Offer many opportunities for private companies to make money.
I am a believer that market-driven, for-profit entities are often best at driving new efficiencies and creating new innovation, so public/private partnerships should embrace every opportunity to incorporate that mindset into their model. And everyone needs to understand that just because something's called "municipal broadband" doesn't mean there aren't many potential roles for private enterprise to make money.

With these thoughts in mind, I want to propose a more fully realized model for public/private partnerships in broadband deployment.

I'll start with the premise that the underlying network infrastructure should be publicly owned, and that that network should be open so that we're maximizing opportunities for competition. Now let's unpack this a bit.

To get these networks funded we should leverage a mix of public and private capital sources. As readers of App-Rising.com know, I'm particularly fond of government partially guaranteeing loans from the private capital markets as a way of maximizing the leveraging of government dollars while minimizing risk. By doing this we reduce the outlay needed to be made by government while providing a way for private industry to make money. It's really the ultimate public/private partnership.
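
To put rough numbers on why partial guarantees stretch government dollars so far, here's a back-of-the-envelope sketch. The guarantee share, default rate, and size of the private loan pool are all assumed figures for illustration, not numbers from any actual RUS or stimulus program:

```python
# Illustrative only: assumed guarantee share, default rate, and loan pool.
# Shows why a partial guarantee can stretch government dollars further
# than direct grants or loans of the same size.

guarantee_share = 0.80                 # government backs 80% of each loan (assumption)
expected_default_rate = 0.05           # 5% of guaranteed loans go bad (assumption)
private_loan_pool = 10_000_000_000     # $10B lent by private capital markets (assumption)

# Expected cost to government: only the guaranteed share of defaulted loans.
expected_gov_outlay = private_loan_pool * guarantee_share * expected_default_rate

# Leverage: private dollars deployed per expected government dollar spent.
leverage = private_loan_pool / expected_gov_outlay

print(f"Expected government outlay: ${expected_gov_outlay:,.0f}")  # $400,000,000
print(f"Leverage on government dollars: {leverage:.0f}x")          # 25x
```

Under those assumptions, $10 billion of private lending exposes the government to an expected $400 million in losses, roughly 25-to-1 leverage on public dollars, versus 1-to-1 for a direct grant or loan program of the same size (with the caveat that government carries the contingent risk if defaults run higher than assumed).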

Then, to get these networks built, we can outsource the design and build work to private contractors. In some instances local public entities may have the resources on staff to help get this job done, but in most cases these projects would be better served by bringing in outside experts. In this way the networks will be built with the public interest in mind yet still provide opportunities for private companies to drive new efficiencies and make a profit.

Next, to operate these networks: while they are owned by a public entity, it makes sense to outsource the management work to a private company. So again, profit can be made, but the public will be protected against any predatory practices since they own the network and can kick off whoever's managing it if they're abusing their position.

To maintain these networks and upgrade them over time, use whatever money the public owner earns from the network, and also consider ways to either charge private service providers or give them the opportunity to invest in upgrading the network's capacity to better support their services and applications.

The most exciting part of this model is in the offering of services, which will most definitely be driven by private companies in search of a profit. Only now, because of the public's involvement in helping finance and owning the networks, these private providers can focus their investment on developing new services rather than deploying and maintaining physical infrastructure. Because of this public/private partnership we'd be able to realize a whole new era of open competition that will drive a host of new innovation.

So there you have it: through a more complete model for public/private partnerships we can achieve the best of all broadband worlds, balancing the need to respect the public interest with the benefits of embracing profit-driven efficiencies and innovation.

Often I worry that when people talk about "public/private partnerships" they stress too heavily one side of that equation or the other. Either they get caught up in the ideology of public networks without acknowledging the benefits of having profit-driven private involvement, or they think a "public/private partnership" should only mean government doing more to make the life of private companies easier as they go about their market-driven ways.

But there is a better solution to be had through pursuing a more fully realized definition of what public/private partnerships can mean.

Now I'm not saying that what I've laid out here is necessarily a silver bullet to solve all our broadband problems as there are lots of details still to be worked out. And I'm not claiming ownership over these ideas as many of them are already being put into practice both in the US and abroad.

But if we are to tackle the massive challenges associated with bringing the best broadband to all Americans so that everyone can compete equally in the global digital economy, then we need to be leveraging all the tools at our disposal to their greatest degree. And the best way to do that is to embrace all the possibilities of public/private partnerships in order to help us achieve our broadband goals.

Taking The FCC's Website To 2.0


So the FCC's website is horrendously outdated. While perhaps cutting edge in the 90s, it's painful to use today, especially the public commenting system, which makes it difficult to navigate, to find what you're looking for, and even to read that information once you find it. For the best walkthrough and subsequent indictment of how bad things are, check out David Isenberg's post lamenting his experiences with cries of, "Oy, oy, oy!"

I can tell you that as a first-time commenter this year I've found the whole process to be frustrating, artificial, and distant. It feels like I'm just throwing my comments down a hole. While I know they're being read, as they have to be by law, I don't know how well they're actually being considered, nor is it immediately obvious if anyone's commenting on my comments. It's also cumbersome replying to the comments of others, as gathering and reviewing all the data is time-consuming, and the response mechanism again makes me feel disconnected from the decision-making process.

Yet we now live in an era where interactive participation is the new norm on websites, where there's a host of new technologies that can be used to facilitate conversations and interactions, where the public can engage in a true dialogue with government to aid in the formulation of effective public policy.

Luckily, it also looks like we're soon to have an FCC Chairman in Julius Genachowski who understands the value of embracing this new level of online engagement and who has already voiced his commitment to establishing a new FCC website that utilizes new technology rather than holding onto old ways of doing things.

So to help further this conversation along I wanted to share some of my thoughts on how we can enable an FCC 2.0.

Most of my thinking in this area revolves around the need for the FCC to have its own social network.

Basically, if you're an individual or an organization that wants to comment on an FCC proceeding, you sign up for an account, which opens up a slew of features including:

Easy comment/reply submittals - Site remembers your information, provides easy-to-use forms, and archives all your comments/replies.

FCC proceeding tracker - Flag relevant proceedings, get notifications for deadlines, follow comments as they come in, and track replies to comments, including your own.

More robust reply system - Instead of just written replies citing comments, have things like reply trees that branch off of comments so you can track threads easily, more informal discussion boards to facilitate conversations about comments, private boards that groups can setup to discuss and coordinate responses to comments, also consider introducing public voting/ranking system to inform which comments have the most public support. Another thought would be to enable a system for easy in-line commenting so that replies can target specific areas rather than just having to be standalone documents.

Commenter ratings - The site should have some way of highlighting the opinions of accomplished and/or frequent commenters. Perhaps the public could vote on whether they like or dislike comments, and that input would produce a commenter rating. Or social network members could recommend other commenters; the more commenters who recommend you, the higher your rating. And the higher your rating, the more prominently your comments might show up in search results. (A rough sketch of what such a rating formula might look like follows this list.)

Commenter classifications - This is a bit more out there, but I'd like us to consider having commenters identify which camp they're in: Public, Private, or In-between. For many communications-related issues there are clear fault lines between those who believe in a market-driven private and a government-guided public approach, so why not get the biases out there? It may help foster collaboration between like-minded entities thereby refining messages and hopefully helping bring us closer to consensus on various issues by uniting mindshare.

Public requests for information - Instead of being a formal RFI process, I'd encourage the FCC to set it up so they can request information from their social network on a much more regular and informal basis. By collecting all the best brainpower on communications issues in one place they'd be one step away from any answer to any question they might have.

Deputize some commenters - I include this idea because setting up a social network like this will likely create a lot more input coming into the FCC. The challenge that creates is the point I mentioned earlier that they're mandated by law to read all public comments, and they already have a massive challenge getting through the number of comments they get with today's system. What I'm suggesting to resolve that is create some way to deputize trusted commenters to extend the FCC's reach. Count their actions reading and responding to comments as part of the FCC's overall response. This may not work for everything, like the official comments on FCC proceedings, but perhaps it could for the new class of conversation this site would be enabling.

Empower some commenters as moderators - While it will have to be closely watched to ensure any power given isn't used to suppress any voices, by charging some members with the responsibility to keep certain conversations moving forward we can make sure this site continues to keep discussions progressing.

Consumer complaint section - If the FCC wants to truly be useful to the public and not just those of us inside the Beltway trying to make policy, then it makes sense to have there be a place in the FCC 2.0 for the public to notify the FCC of suspect or illegal activities. This section could serve as the repository of information and rallying point for coordination around identifying the misuse of communications technologies.

Regular video updates - This can be as simple as having an intern make a videoblog recapping the happenings of the last week at the FCC, or it can become a more robust way of helping the FCC communicate its messages to the public that could include comments from FCC Commissioners and public service announcements related to communications issues that are meant to go viral through sites like YouTube. The FCC should embrace the fact that video's the dominant form of communications in the 21st century and start using it more. This function could also be crowdsourced to the social network, as there are already some in the broadband policy space who've begun creating their own videos in an attempt to help keep the public informed and up-to-date.
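
To ground the commenter ratings idea from the list above, here's a minimal sketch of one way a rating could be computed. The weights, the field names, and the choice to blend public votes with peer recommendations are all assumptions of mine for illustration; whatever formula the FCC actually used would need to be designed and published openly:

```python
from dataclasses import dataclass

@dataclass
class Commenter:
    name: str
    likes: int            # public "like" votes across all of their comments
    dislikes: int         # public "dislike" votes
    recommendations: int  # endorsements from other registered commenters

def rating(c: Commenter, vote_weight: float = 1.0, rec_weight: float = 5.0) -> float:
    """Toy rating: net public votes plus peer recommendations.

    Weights are arbitrary assumptions; recommendations count for more on the
    theory that endorsements from other commenters are harder to game."""
    return vote_weight * (c.likes - c.dislikes) + rec_weight * c.recommendations

commenters = [
    Commenter("frequent_filer", likes=120, dislikes=30, recommendations=8),
    Commenter("first_timer", likes=5, dislikes=1, recommendations=0),
]

# Higher-rated commenters would surface first in search results.
for c in sorted(commenters, key=rating, reverse=True):
    print(f"{c.name}: {rating(c):.0f}")
```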

---

These are just some of the many ideas that an FCC 2.0 can be incorporating into not just their website but also the way they do business so as to drive new efficiencies and open new opportunities for the kind of more robust public input that can help the FCC be more effective in its actions.

The biggest point to remember in all this is that arguably more important than the technology of the network is the sociology behind how you bring people together. Notice that most of my ideas aren't about technology; they're about how to better facilitate conversations and the dissemination of information.

Also know that I'm not claiming ownership over any of these ideas. I'm just trying to pull together the first ones that pop to mind so that we can start sketching out what we should be expecting from an FCC 2.0 website.

I know there are lots more ideas out there, though, so if you have any not mentioned above please speak up and share them in a comment below! If you have any thoughts that build off of anything I've listed in this post, or if you don't agree with one of my suggestions, let us know!

The more voices we hear from in this process the stronger the outcome will be.

The time has come for the FCC and its website to enter the 21st century. Let's help them make it happen in the best way possible!

What Happens When Everyone Becomes A Server?


I recently framed the debate about how much bandwidth we need around what happens when we reach a point where everyone's using high quality video applications all the time.

But there are other ways that demand for bandwidth is trending upwards, like the explosion in technologies that cause broadband users to start acting like servers and therefore demanding a lot more upstream capacity.

The most infamous example of this is peer-to-peer (P2P) file sharing. Instead of computers downloading content (like movies, music, etc.) from servers, they link together to form networks in which computers download content from each other, with each computer sharing its files through its upstream connection to the Internet.

Consumers are embracing new server-like devices such as the Slingbox, which allows you to access your cable TV service at home from anywhere in the world through a broadband connection.

You can even buy consumer-grade home servers for personal use, like backing up files locally and then being able to access those files from anywhere in the world.

And there are lots of new developments coming online furthering this trend, one of the most exciting recent examples being the release of Opera Unite, a new feature in the latest version of the Opera browser that uses the browser to turn your computer into a server.

They've released this with just a few basic apps that enable things like file and photo sharing, a remote media player, and a web server for hosting a website. But what's really interesting is that they've created a framework on which anyone can develop new services that take advantage of this new server-in-a-browser paradigm.

While there's no guarantee this particular product will change the world, it helps highlight the growing reality that soon we're going to have computers everywhere in the home that are going to want to act like servers.

Because of this, we need broadband networks that can deliver upstream service robust and reliable enough to handle these growing demands for bandwidth.
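
Some rough arithmetic shows how quickly upstream-light connections hit the wall. The stream bitrate and the upstream tiers below are assumed, ballpark figures, not measurements of any particular provider's service:

```python
# Illustrative arithmetic only: assumed stream bitrate and upstream tiers.

stream_kbps = 1_500          # assumed bitrate of one placeshifted video stream
upstream_tiers_kbps = {
    "basic DSL":    384,
    "faster DSL":   768,
    "cable":      1_000,
    "FTTH":      20_000,
}

for name, up in upstream_tiers_kbps.items():
    streams = up / stream_kbps
    verdict = "OK" if streams >= 1 else "can't sustain even one stream"
    print(f"{name:>10}: {up:>6} kbps up -> {streams:.1f} simultaneous streams ({verdict})")
```

Under these assumptions, even a single placeshifted stream overwhelms typical DSL and cable upstream tiers, while a fiber-class upstream barely notices it.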

And quite frankly, no broadband technology other than fiber has proven itself capable of fully supporting these new demands for upstream capacity. Every other broadband technology has clear capacity and reliability constraints that limit their ability to welcome and nurture the growth of upload-intensive applications.

If we want to have a country where every computer can become a server, where every consumer can become a producer in the digital economy, then we need to set the goal of becoming a Full Fiber Nation; otherwise we will never realize the full potential of this class of next-generation Internet applications.

Just dawned on me that yesterday was the four-month anniversary of the American Recovery and Reinvestment Act becoming law. Four months since over $7 billion was set aside to "stimulate" the deployment of broadband.

Yet this has really been a "stimulus" in name only as to date not a single dollar has been turned into deployment. Zero, zilch, nada.

Some have rightly argued that the "stimulus" has stimulated interest among communities in formulating plans to get themselves wired. But I've heard just as many stories about projects that are frozen waiting on the hope that they might get their hands on some of that free government money.

I am optimistic that we're further along as a country in terms of getting serious about getting wired now than we were six months ago, but I can't hide my frustration over the pace at which government is moving to turn dollars into deployment.

It sometimes feels like we're conducting broadband policy through a dialup modem: there are just too much data and too many decisions to get through in a timely manner.

I don't necessarily blame the hard-working folks at NTIA and RUS as they've got to be overwhelmed given the task in front of them and the resources at their disposal. And I certainly don't envy them as it seems likely that they're going to be beat up no matter who they decide to give the money to since there will be many more who don't get any.

But that doesn't mean this process can't be sped up.

I completely respect the need to be careful in spending this money to make sure we don't invite waste, fraud, and abuse into the system, but off the top of my head I could name a dozen worthwhile projects that could be making use of these funds today.

They could be turning dollars into deployment at this very moment, getting unserved communities connected, finding new ways to encourage demand and adoption, and, most importantly, creating jobs.

These are all people that are proven and projects that are truly shovel-ready. So what are we waiting for? Are we really saying that avoiding waste, fraud, and abuse is more important than stimulating broadband deployment and supporting the economic recovery?

Another idea I've been discussing is that of establishing a fast-track partial loan guarantee program at RUS that could free up private capital to get people deploying sooner rather than later. And I'm sure there are more good ideas out there for how to do this better.

What's really driving me crazy is that we don't know any more today about how this money's going to be spent than we did four months ago. I know people are working hard behind the scenes, but we've seen none of the fruits of that labor. And despite a brief flurry of public engagement a few months ago, there's been no obvious way to help influence the process as everything's happening behind closed doors.

Again, I'm not trying to point fingers, but I can't help but feel flabbergasted at how all this is playing out. And, quite frankly, I'm not sure why more people aren't upset.

We as an industry seem to have begrudgingly accepted the way things are going as our fate, believing that this is the best we can hope government to do.

The thing is, I think most of the people who normally would be complaining are either looking for money of their own and therefore don't want to upset anyone, or are more worried about the money being spent poorly than about when it's spent.

And these are both valid concerns, especially the latter as even if these government agencies have the best of intentions they're still likely going to make some mistakes.

But for now I simply wanted to point out that four months have been wasted since the stimulus passed, four months during which thousands of homes could've been getting connected and thousands of people could've been getting hired into new jobs. And other than an uptick in interest among communities that see this pool of money as their opportunity to get wired, we still have not seen any broadband "stimulated" by this "stimulus."

The worst part is that, the way things are currently set to line up, I'm going to be writing the same thing four months from now, still lamenting the lack of dollars turning into deployment.

Could there be a better way to do this, to get money out more quickly while still being responsible about it and maximizing its effectiveness? I say, "Yes!" All we have to do is open our minds to new ways of doing business.

Will we embrace these new ways of doing business so we can get about stimulating broadband to a greater degree in a shorter timeframe? That, my friends, remains to be seen.

Continuing my exploration of the comments to the FCC on how best to formulate a National Broadband Strategy, I wanted to dive into those filed by Brett Glass.

Brett heads up LARIAT, a wireless Internet service provider in Wyoming that was "among the first, if it was not the very first, of the world's terrestrial high-speed WISPs." Brett's also been a long-time independent and vocal voice on matters of broadband policy, speaking from the perspective of a smaller network operator who's committed to bringing broadband to those who don't have it. In fact, in their service area in southeastern Wyoming only 5% of their customers have access to any form of wireline broadband.

His first major point relates to the definition of broadband: "A local cable TV plant, which delivers video, telephony, and Internet services over the same cable but separated by frequency is an example of a 'broadband' system. Note that the term 'broadband,' properly used, does not imply that Internet is even among the services carried by the system, nor does it imply any specific performance level."

On the one hand I agree with this statement, as there are applications that can utilize broadband networks without ever touching the Internet, but at the same time I can't ignore the thoughts expressed by David Isenberg and others in their comments, summarized by their tagline: "It's the Internet, stupid." These suggest that the FCC's primary concern regarding broadband should be its role in getting people to the Internet. Ultimately I think they're both right, as I don't see how we can talk about broadband without including the Internet, and yet it's also a mistake to ignore broadband's uses beyond the Internet.

Brett goes on in his comments to stress that "data throughput is not the only criterion that should be used to evaluate the quality of broadband service. Many users who switch to LARIAT's terrestrial wireless broadband service do so not because throughput was slow but because the applications they wished to run (e.g. VoIP) could not tolerate the satellite system's high latency. For a VoIP user, an online auction bidder, or a participant in an online game, a data throughput of 200 Kbps, with reliable latency to the backbone of 75 milliseconds or less, is actually far more desirable than a 5 Mbps satellite connection with an average latency of a third of a second and a jitter (variation in latency) of almost as much."

This is an extremely salient point. The definition of broadband has to be based on more than just bandwidth.
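
To see why, here's a quick sketch comparing the two connections Brett describes for a bulk download versus a latency-sensitive application like VoIP. The 10 MB file size, the 150 ms one-way delay budget (a commonly cited ITU-T planning guideline for voice), the LARIAT jitter figure, and the treatment of the quoted latencies as one-way delays are all assumptions for illustration:

```python
# Illustrative comparison of the two connections Brett describes.
# Assumption: quoted latencies are treated as one-way delays.

connections = {
    "LARIAT fixed wireless": {"throughput_kbps": 200,   "latency_ms": 75,  "jitter_ms": 10},
    "satellite":             {"throughput_kbps": 5_000, "latency_ms": 333, "jitter_ms": 300},
}

FILE_MB = 10           # assumed bulk download
VOIP_BUDGET_MS = 150   # commonly cited one-way delay target for good voice quality

for name, c in connections.items():
    download_s = (FILE_MB * 8_000) / c["throughput_kbps"]   # seconds to fetch the file
    voice_delay = c["latency_ms"] + c["jitter_ms"]           # crude delay-plus-jitter-buffer proxy
    voip = "usable" if voice_delay <= VOIP_BUDGET_MS else "painful"
    print(f"{name}: {download_s:.0f}s download, ~{voice_delay}ms voice delay ({voip})")
```

Under these assumptions the satellite link fetches the file 25 times faster, yet it's the 200 Kbps fixed wireless link that can actually carry a decent phone call, which is exactly the distinction a bandwidth-only definition misses.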

Continuing on: "The Commission should recognize that many service providers make exaggerated speed claims, quoting the maximum raw bit rate of their equipment as the speed of the service. (This is tantamount to a used car salesman claiming that an automobile is a '120 MPH car' because this is the largest number on the speedometer.) Such claims, if they are not prohibited entirely as a deceptive marketing practice, should certainly not be honored when the actual performance of a connection is evaluated."

Another great point. The definition of broadband shouldn't be based on the speeds that are promised but on the bandwidth that's actually delivered. And, in fact, we should be looking seriously at preventing deceptive advertising practices where network operators claim they can deliver a certain level of service but in actuality can't.

Brett's last point in this section suggests that the FCC should allow the market to decide what speeds should be offered vs. dictating them. I'm torn a bit on this one. On the one hand I agree with Brett, as there are great disparities between what network operators pay for backhaul connections to the Internet, which becomes a problem if you have to deliver ultra-high-speed broadband in an area with high-cost backhaul connectivity. At the same time, I worry that if we leave this entirely up to the marketplace we may only see small incremental improvements in capacity over time and that we could end up with a lot of money invested in lesser networks.

Because of this I'd suggest that instead of dictating what levels of service network operators have to offer, we define broadband based on the capacity of the access technology. So when we say we want 100Mbps symmetrical to every home, that doesn't necessarily mean a network operator would have to offer that service today if it's prohibitively expensive, but what they would have to do to be considered true broadband is invest in networks with the capacity to offer a real 100Mbps in-network to every home. For wireless this number would be different, perhaps 5Mbps or 10Mbps. In this way we can encourage the deployment of next-gen networks without dictating what services have to be delivered.
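
Here's a minimal sketch of how that capacity-based test might be expressed, using the example thresholds above (100Mbps in-network for wireline, 5Mbps for wireless). The thresholds and names are illustrative assumptions, not a proposed rule:

```python
# Illustrative classifier for the capacity-based definition sketched above.
# Thresholds are the example figures from this post, not official numbers.

CAPACITY_THRESHOLDS_MBPS = {"wireline": 100, "wireless": 5}

def is_true_broadband(technology: str, in_network_capacity_mbps: float) -> bool:
    """Judge the network by what its access technology can deliver in-network,
    not by whatever service tier the operator chooses to sell today."""
    return in_network_capacity_mbps >= CAPACITY_THRESHOLDS_MBPS[technology]

print(is_true_broadband("wireline", 100))  # FTTH-class plant: True
print(is_true_broadband("wireline", 6))    # legacy DSL plant: False
print(is_true_broadband("wireless", 8))    # capable wireless plant: True
```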

Brett goes on to make some good points about the state of wireless spectrum relative to smaller operators, namely that the spectrum auction process favors the big carriers and that smaller operators like LARIAT need more spectrum and/or the ability to add power to their signals. As I'm not well-versed in the wireless world I won't comment on these ideas at this time.

In the next section, Brett touches on the need to solve the "middle mile" problem, but not just in terms of getting more middle mile networks out there, which is where most of the discussion seems to focus, but also on making existing networks available for operators like himself to use. He shares that in his area he's not far from a number of nationwide "information superhighways" but that none of them are willing to work with him, and the local ILEC he has to use instead engages in anti-competitive price gouging.

While Brett doesn't get into many specifics about how to solve this, I like his general suggestion of first incentivizing the backhaul operators to open up and work with other network operators, and then, if that doesn't work, requiring them to do so. Essentially feed them carrots to try and get them moving, but also let them know there's a stick ready to drop if they don't.

Brett's final points have to deal with everyone's favorite issue: net neutrality. Interestingly, he claims that "so-called 'network neutrality' regulation...would impact small and competitive ISPs like LARIAT far more than large incumbents, potentially destroying alternative providers and eliminating competitive options for consumers."

He then goes on to share that the FCC's treatment of this issue so far, most recently centered around the Comcast/BitTorrent case, has created so much uncertainty over what is and is not OK in terms of managing networks that it has driven away potential investors from LARIAT.

We heard a similar sentiment from Michael Johnston in a VidChat on this site last summer: because the FCC told Comcast what not to do but didn't say anything about what it can do, Jackson has had to hold off on offering a higher tier of service that would have included restrictions on P2P usage to keep their costs under control.

Brett goes on to suggest that the FCC "always seek to solve such controversies by first incenting and stimulating competition, and intervene further only when such efforts are unsuccessful and there is clear evidence of anticompetitive behavior and/or market failure. LARIAT further requests that any regulation of network management practices be accomplished via due process - rulemakings subject to public comment and discussion - rather than 'case by case' adjudicatory actions, so that it is clear to all, in advance, what the rules actually are."

So Brett thinks the FCC should be reactive rather than proactive in dealing with net neutrality, but also that when reacting they should have a public rulemaking rather than case-by-case adjudicatory actions.

The problem is I'm not sure if those two things fit well together. If the FCC's being reactive rather than proactive then network operators can never know "in advance what the rules actually are" until someone steps over the invisible line that triggers a reaction. I can see this leading to smaller operators having to be really hesitant with everything they do as they can't afford to fight any battles at the FCC, and larger operators operating with impunity because they can afford to fight those fights.

Also, I can't help but feel that having to go through a public rulemaking every time an operator steps over that invisible line could turn into a massive headache that will slow everything down as the FCC has to deal with an influx of public comments that they have to sort through and make sense of.

In order to manage this process more efficiently I think the FCC has to be at least a little bit proactive in working to clearly establish what's OK and what's not relative to network management. When it comes to setting those rules, instead of a traditional public rulemaking I'd rather there be some body dedicated specifically to tackling these challenges that could bring interests from both sides of the table together along with an arbitrator of some sort to work through the arguments of all sides. Basically, get the network operators, the public interest groups, and the applications developers all around the same table to work through what's OK and what's not, what's harmful and what's helpful.

This entity could also be proactive in studying what's happening in the marketplace on a regular (quarterly?) basis to keep up on the latest technologies and techniques, which can help keep all parties informed as to what the latest developments are and also get out ahead of issues a bit before they blow up into huge problems.

I know I'm still being overly vague in these ideas, and in general I agree with Brett's concerns, but I think we need more than a reactive public rulemaking process to effectively and efficiently deal with the complex issues surrounding net neutrality.

FCC NBS Comments: Why Not Follow Japan's Lead?


Last week was the deadline for submitting initial comments to the FCC on the formulation of a national broadband strategy (NBS). Not surprisingly this has been a hot topic, with more than 8,000 pages submitted by a wide range of parties interested in our country's digital future.

Now we've entered the period during which the public is encouraged to review and reply to these initial comments up until July 7th.

To help facilitate this process I'm going to be summarizing and replying to both specific comments as well as to general themes that I see emerging across multiple comments.

To start with, let's focus on the comments submitted by the Japanese government. Given the fact that they're at least five to ten years ahead of us in terms of getting themselves wired with next-gen networks and inspired about using these networks, who better to listen to when it comes to formulating our NBS than them?

Right away in reading their comments it becomes apparent how much further ahead they are than America is. For example, they released their first in a succession of national strategies all the way back in 2001. Their initial goals were to provide "high-speed Internet" to 30 million households and "ultra-high-speed Internet" to 10 million households within five years. They achieved this goal by 2003, only two years after their initial strategy was set in motion, which shows what can be accomplished when a government sets clear goals and puts in place specific plans to achieve them.

With the deployment challenge well on its way to being overcome, they focused their attention on policies that would encourage the effective use of ICT (or Information & Communication Technologies) to tackle social challenges in areas such as e-government, healthcare, and education. Central to the goals of this strategy was "propelling the deployment of optical fiber...and eliminating all zero-broadband areas by 2010." It's interesting how they've linked the demand side of the broadband equation to the need to encourage the deployment of fiber and universal broadband.

Tied into all of this was a particular emphasis on "establishing a fair competition environment." They continue, "As for fixed communications we have been promoting to ensure a competition environment in the broadband market by opening networks." And to quote at even greater length:

"We have enforced the promotion of competition to realize the diversity of interconnection by other operators. The Government of japan has introduced enforcing policies for interconnection tariffs at proper rates, which has ensured that costs for using infrastructure of broadband have been low, while incentives for facility investment have not been diminished...A variety of business opportunities in the broadband market was encouraged and the range of consumer options was expanded, allowing the Government of Japan to enable operators to enter the broadband market in Japan without the need for establishing transmission facilities...This creating of an attractive market achieved an increase in the number of broadband users, together with faster and more affordable broadband services."

I have to admit, I found these remarks to be remarkable. Here in the US the argument against open networks is that they'll dissuade investment, and therefore undermine competition, and therefore innovation, and therefore lower prices. But what Japan's saying here is just the opposite. They're claiming that by opening up networks they've encouraged investment and competition, which has led to innovation, lower prices, and greater demand for broadband.

I realize that the dynamics of the Japanese marketplace aren't the same as in the US, but that doesn't negate the fact that in a country considered to be in the top 3 in terms of connectivity, embracing an open mindset has been the best way to facilitate evolutionary growth.

Also notice, though, that Japan is careful to acknowledge the need for interconnection tariffs at "proper rates" to make sure there's a sufficient rate-of-return to justify investment in infrastructure.

Another interesting thing they've been doing is "to achieve the optimum usage of the radio spectrum through frequency reallocation to keep pace with technology advancements, since 2002, the government has surveyed usage, published the results, and evaluated the extent of efficiency in radio spectrum usage taking into account the opinions of the public."

Now compare that to the US's approach of seemingly having no idea who's using what spectrum and having little to no plan for how to maximize the efficiency of that spectrum. I'd imagine that if we were to embark on a similar analysis of how America's spectrum's being used (or not used) that we'd find a ton of under-utilized airwaves.

Japan also applies its open mindset to wireless as well. "To promote provision of mobile broadband services, the Government of Japan places priority on allocating frequency for operators who establish a plan to open their networks to other operators..."

I like what this approach implies as it seems like instead of saying "Thou shalt be open!" they've focused on providing incentives to get the market moving in the direction they want. By favoring and fostering open networks they can get where they want to go incrementally and organically.

Too often in the US I think we get caught up in trying to open up all the networks at once, which industry then pushes back on by saying that nothing should ever be open. If instead we focused on incremental steps, like prioritizing stimulus funds towards projects that feature open networks and rewarding those operators that do open their networks, we could start making more real progress rather than getting stuck in rhetorical debates.

Arguably the best point Japan made was what they've done to foster connectivity among schools and libraries: "Another example of support has been the Local Intranet Infrastructure Subsidiary since 1998, which has been providing for the building of a local public network in each area connecting its respective schools, libraries, and town hall in order to upgrade the quality of education, public administration, welfare, healthcare, and disaster prevention in these hitherto disadvantaged areas."

While this program is focused primarily on underserved areas, the idea that we need to get all schools, libraries, and public government buildings interconnected with fiber should apply to every community. This is a clear goal that we can set out to accomplish that will give the public better access to ultra-high-speed broadband, improve the ability of these community anchor institutions to operate efficiently, and, if done right, make the deployment of fiber further into a community easier and more economically viable.

The final thing I wanted to point out is that while they don't define them specifically in this document, the Japanese government does go out of its way to distinguish between "high-speed" and "ultra-high-speed" networks. By doing this they're clearly demonstrating that they understand that while getting everyone connected with high-speed broadband is important, they can't afford to take their eye off the ball of also working to get everyone hooked up with ultra-high-speed broadband.

I think for the US that should mean making sure we don't focus all our energy on getting everyone hooked up to 10Mbps and below service. That we can't afford to be on the lagging edge of broadband technology. And that we need to make sure that while we work to get everyone connected at any speed, that we're also striving to get the best broadband to everyone as quickly as possible.

And speed really is of the essence here. As I mentioned at the beginning of this post, Japan's at least 5-10 years ahead of us in terms of their connectivity, and that's a conservative estimate as they've been working on a clear plan of attack for 8 years already. If we don't get our act together soon, they could be 10, 20, 30 years or more ahead of us. In fact, since they don't appear to be slowing down any time soon, if we don't get a bold and specific action plan together we may never catch up, let alone have a chance to surpass them.

The reason this is so important is that while we're still trying to figure out how to get the networks built, Japan's going to have a head start of multiple years on us in terms of figuring out how to incorporate the use of these networks into the fabric of their society to drive new efficiencies and open up new opportunities. In this document they're already referring to broadband as part of their "social infrastructure."

So the takeaways from Japan's comments are clear:

- We're way behind them in terms of getting wired and inspired.
- What enabled their success was setting out a clear plan with bold goals.
- They've found that open networks are the best at fostering competition and innovation.
- And a particular emphasis should be placed on wiring community anchor institutions with fiber.

Again, I'm completely aware that the circumstances of the broadband players and marketplace in Japan are different from the US, but at the same time the principles of what makes for a healthy broadband marketplace are universal. So I'd suggest we take heed of what's worked in Japan and use it as the basis for what can work here in the US. That's not to say we should follow everything they've done in lockstep as I'm a firm believer that through American ingenuity we can find an even better path to our digital future, but at the same time we shouldn't ignore the lessons they've learned that have positioned Japan as a global broadband leader.

How I'd Spend NTIA's BTOP Billions On Deployment


There's a lot more to NTIA's BTOP program than just deployment--including money for demand stimulation, public computing centers, and broadband mapping--but for now I want to take a moment and outline in broad brushstrokes how I'd spend the roughly $4 billion they have that can be allocated to deployment.

First I'd set aside $1 billion for best-of-breed testbed projects.

These projects will be selected based on the level of connectivity being deployed, the number of unserved homes getting connected, the amount of local buy-in--measured not just in dollars but in programs set up to make use of the networks and in how much demand already exists for what they will deliver--and the potential for these models to be scaled up if successful.

I'd look at who's successfully built sustainable networks in the past first, and I'd analyze applicants to favor those projects that are truly shovel-ready.

I also wouldn't necessarily worry about having these projects reach the largest number of people at the lowest cost, as the point with these funds isn't to get everyone connected now; it's to allow multiple models to establish themselves that we can learn from to inform our decisions of where to invest next.

As a fiber guy, my preference would be to focus on those projects that bring future-proof networks to unserved areas so we're not having to subsidize deployment again in a few years. To keep things simple I'd look at funding the ten best $100 million projects, or if there are a number of smaller ones with merit perhaps five $100 million and ten $50 million projects.

Then I'd take another $1 billion and focus it on projects that can get large numbers of unserved homes online as quickly as possible. For example, one possibility might be to peel off $100 million and use it to get as many Native Americans on Indian reservations set up with satellite as possible. With that money 100,000-200,000 homes could be installed, possibly with computers too, and have their service subsidized for the next two years.

In general I'm not a big fan of satellite as I think it's too expensive, doesn't have enough capacity, has too much latency, and won't be able to scale to meet future demand for bandwidth. That said, it's going to take time to deploy big broadband networks, especially in areas as rural as reservations tend to be, so I'd consider using satellite in this instance so we can get people online at any speed ASAP, though I'd be very open to other wireless and even wireline technologies as well. The point with this billion is that it be used to get large numbers of unserved online as quickly as possible.
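
For a sense of whether $100 million stretches that far, here's the per-home arithmetic at both ends of that range. Only the $100 million figure and the home counts come from the paragraphs above; the cost split is an assumed breakdown for illustration, not pricing from any actual satellite provider:

```python
# Illustrative per-home budget for the $100M satellite idea above.
# Only the $100M figure and home counts come from the post; the cost
# split below is an assumed breakdown for illustration.

budget = 100_000_000
for homes in (100_000, 200_000):
    print(f"{homes:,} homes -> ${budget / homes:,.0f} available per home")

# At the 100,000-home end, one assumed way the ~$1,000 per home could split:
install_and_equipment = 400                              # assumed dish, modem, installation
monthly_subsidy = (1_000 - install_and_equipment) / 24   # remainder spread over two years
print(f"~${monthly_subsidy:.0f}/month toward service for two years")
```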

Then I'd take the last $2 billion and use it to start wiring and interconnecting schools, libraries, and healthcare facilities with fiber. By getting these community anchor institutions upgraded to 21st century connectivity we provide a way for everyone to get access to it even if it's not in their homes.

But I'd want to be smart in how we build these networks so that we're leveraging existing assets as much as possible and so that the networks we're subsidizing are designed in such a way that they can easily be used to extend fiber further into communities.

The simple truth is that T1s aren't sufficient to support the connectivity needs of these community anchor institutions. They need fiber. And by focusing a good chunk of money on these networks we can get a lot of good work done that will both afford more people access to big broadband while also laying the groundwork for easing future deployments.

So let's review:

- $1 billion for testbed showcase projects
- $1 billion for getting unserved online quickly
- $2 billion for wiring community anchor institutions with open fiber

The last point I'll make about this issue today is that I'd be careful about worrying too much about making sure every state gets an equal amount of funding. While there is language in the stimulus act mandating that every state get at least one grant, that doesn't mean they all need to get a grant for deployment.

The reason I say this is that I don't think it's wise to artificially restrict who's eligible based on their geography. For example, what if one state has two great projects that are certain to be successes and another state doesn't have anything comparable? I think we should be more worried about funding the right projects than about treating all states equally.

And in fact by doing this we can hopefully help inspire states that are lagging in their planning to get their act together. I don't want anyone to feel like they're entitled to get money but instead believe we should reward those that are truly ready to make a difference in their communities wherever those communities may be.

This doesn't mean we should ignore the rule saying every state should get a grant, just that we shouldn't worry about every state getting a grant for deployment.

So there you have it, a very rough, very broad look at how I'd spend that $4 billion on deployment in a way that can give us the greatest bang for our buck.

What do you think? How would you spend this $4 billion on deployment?

If I Ran AT&T... I'd Be Worried


In the latest installment of my imagining what I'd do if I ran AT&T, it's time for me to admit I'm worried about the future of my company given where things are trending.

As everyone knows my wireline phone business is swirling around the toilet bowl, bleeding customers on a quarterly basis.

My basic DSL business is struggling to compete with cable as they offer more bandwidth, TV service, and frequently voice through VoIP, furthering my losses of landline customers.

And now there's news that because of the ailing economy we're having to slow down our already slow deployments of U-Verse.

I can't help but feel like the entire access side of our wireline business is dying with no great hope on the horizon for it ever getting better given our current trajectory.

Yet my people are telling me that everything's fine as AT&T's wireless business is booming, but even there I'm pessimistic.

For starters, I have to be realistic in acknowledging that one of the biggest reasons for our wireless success has been that we have the iPhone. And while I know we have that exclusivity locked up for a little while longer, that doesn't mean we'll have it forever. So what happens when we lose it? How are we supposed to maintain that edge?

But even more troubling is that while everyone loves their iPhones, I'm hearing a lot of chatter about how much they hate AT&T's network.

Customers are having trouble getting reliable connectivity in major cities.

Whenever too many iPhone users get in the same area, our networks keep buckling under the load, like at this year's South by Southwest and at Apple's own Worldwide Developer Conference.

When Apple announced the new iPhone 3GS, the crowd actually booed upon hearing that our networks aren't ready to support the new phone's features like MMS and tethering.

And we're facing a massive pushback over our early upgrade pricing that's causing some commentators to suggest iPhone users not upgrade, wait out their contract, and then jump ship when the iPhone becomes available through another provider.

So with all this in mind I can't help but feel like our wireless success is fragile and hinges too much on something we can't totally control.

And yet despite these massive challenges I am also optimistic about the new innovation coming out of my company.

Our U-Verse TV service has received generally favorable reviews and is the largest, most successful deployment of IPTV in the country to date.

We're working with Cisco to deploy their Telepresence technology across the US and around the world.

Our AT&T Navigator advanced GPS service just won an award from Frost & Sullivan.

And we just released the cutting-edge AT&T Synaptic Storage as a Service, which moves us into the cloud computing space by offering a business-class storage service.

So there's lots to be excited about, and yet I'm still worried. I wonder who actually even knows we're doing all this cool stuff? How many people see us as an innovator? And how much can these new business lines be counted on to bolster our flagging wireline and fragile wireless businesses?

With demand for bandwidth and interest in utilizing networked services and applications at all-time highs I should be overwhelmingly optimistic about the future of my company, and yet I'm not.

I can see so much opportunity to leverage our many assets, and yet I can't ignore the tremendous amount of work and energy it takes to get a massive company like AT&T; moving as quickly and as nimbly as is needed to take advantage of that opportunity.

I want to see my company competitive in the 21st century, but we have trouble getting out of a 20th century mindset, like with our stance on not taking advantage of U-Verse's IP nature to ride on open networks that aren't our own.

I want us to look beyond short-term costs and profits so we can realize the long-term benefits of things like investing in the next generation of PEG and local community media, creating an ongoing engine of locally produced, relevant content to feed through our networks.

I want to believe that, as great a company as we grew to become in the 20th century, our brightest days are still ahead of us.

And yet on this day I'm having a hard time doing that. I feel the weight of holding tight to the old paradigm on my shoulders. I'm confused as to what to do next to work towards a future that looks much different than the past we've known. And I grow weary at the mere thought of even attempting to get my company and its millions of moving parts all coordinated and moving in a new direction.

I have not given up hope, mind you, but I am worried. Our future isn't certain, and I'm concerned that without a significant re-imagining of who we are, AT&T's days as the dominant telecommunications provider in the US may be numbered.

But I am optimistic that with the resources of money and quality people at our disposal, anything's possible. And I look forward to leading my company into a bright new future, though for now it's time to go back to being a lowly blogger until the next opportunity for me to take the reins and imagine what I would do if I ran AT&T.

Discussions surrounding how much bandwidth we need are missing the point by striving to list all the different types of applications people use.

The question we should be asking is simple: how much bandwidth do we need to support a future where everyone's using video all the time?

Let's unpack that question a bit.

First off, video applications are what drive most demand for bandwidth. And the higher a video's resolution, the more bandwidth it needs.

Secondly, video applications include a whole lot more than just watching YouTube, Hulu, and porn, encompassing everything from videocalls to video security to webcasting video of yourself to the world.

Someday in the not too distant future we're all going to be using high-resolution video applications all the time: to talk to friends and family, to communicate with doctors and teachers, to conduct business, to see what's happening in the world, to learn and to teach, and, of course, to be entertained.

We're going to be spending hours online every day using high-resolution video applications.

Or at least we're all going to want to use them, assuming we have enough bandwidth in our broadband for the apps to actually work.

To support the possibilities of a world where high-resolution, two-way video is ubiquitous, we need broadband networks with the capacity, symmetrical speeds, scalability, reliability, and low latency to support high-quality high-resolution two-way video all the time, even if everyone's using it.

We need networks that can deliver the speeds they promise, and we need speeds starting at 100Mbps and going up from there in order to handle the traffic from multiple users online at the same address at the same time.
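Here's a quick sketch of how that 100Mbps floor adds up for a single household; the per-application bitrates are my own assumptions about what high-resolution two-way video might demand, not measurements of any particular service.

```python
# Rough bandwidth demand for one household using several high-resolution
# video applications at once. Per-stream bitrates are illustrative assumptions.

streams_mbps = {
    "HD videocall (send + receive)": 16,        # assumed 8 Mbps each direction
    "HD movie stream": 10,
    "HD distance-learning session": 8,
    "Video security cameras (upstream)": 6,
}

total = sum(streams_mbps.values())
for name, mbps in streams_mbps.items():
    print(f"{name:35s} {mbps:3d} Mbps")
print(f"{'Total for one household':35s} {total:3d} Mbps")
# About 40 Mbps at these assumed bitrates -- before any headroom for other
# traffic or higher resolutions -- which is why 100 Mbps is a floor, not a ceiling.
```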

We cannot forget that the next generation of the Internet is not about webpages and email, it's all about video. And as we continue to adopt more and more high-resolution video applications, we can't afford to have networks that deal in bandwidth scarcity, that can't offer lots of upstream capacity, and that can't support lots of simultaneous usage.

That's why I'm a believer in fiber as it's the only broadband technology that welcomes a future of ubiquitous, always-on, high-resolution, two-way video and that has the capacity to support whatever demands we throw at it.

While I tend to be someone who's cautious in praising municipal broadband as the ultimate answer, I can't allow misinformation about it to go by without reacting to make sure the truth gets out, especially when it's as fundamentally wrong as this quote from a recent editorial in the Charleston Gazette:

"But commercial providers generally offer more reliable and faster service - few of their subscribers are likely to switch to a slower municipal service to save a couple of bucks."

The reason this statement stood out to me as so blatantly false was that I'd recently finished reading this article from Christopher Mitchell's new site MuniNetworks.org, in which he lays out how municipal fiber networks in Lafayette, LA and across the cities in Utah involved with the UTOPIA project are delivering citizens not just lower overall costs but also faster speeds and lower prices per Mbps.

Through UTOPIA you can get 100Mbps symmetrical at home. In Lafayette they're offering 50Mbps symmetrical for less than $60 a month.
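To put the "lower prices per Mbps" point in concrete terms, here's the simple arithmetic; the private-provider plan I'm comparing against is a hypothetical, not a quote from any specific company, and I'm rounding Lafayette's "less than $60" up to $60.

```python
# Price per Mbps: Lafayette's municipal fiber vs. a hypothetical private DSL plan.
# The DSL plan's price and speed are illustrative assumptions.

plans = {
    "Lafayette fiber (50 Mbps symmetrical)": (60, 50),
    "Hypothetical DSL plan (3 Mbps down)": (35, 3),
}

for name, (price, mbps) in plans.items():
    print(f"{name}: ${price / mbps:.2f} per Mbps")
# Lafayette works out to $1.20 per Mbps; the hypothetical DSL plan to $11.67.
```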

Now find me a private provider offering similar speeds and prices. You can't as they don't exist.

I should take a step back for a moment and acknowledge the biases of that editorial's author, Frank Rizzo, who's a council-member-at-large for the city of Philadelphia. He watched his city's municipal wireless project crash and burn, so I can understand why he may think municipal broadband is a bad idea.

But what I take umbrage with is his conflating municipal wireless with municipal fiber. Saying that all municipal services are slower than those provided by private operators is patently false. And in most cases the reason a municipality took up the challenge of laying fiber is that private providers weren't delivering enough bandwidth to support local economic development, so the community took its digital future into its own hands.

Also troubling is the assumption that because the wireless Philadelphia project failed, all municipal broadband is doomed to fail, rather than acknowledging that the mistakes Philadelphia made were its own fault and due primarily to a fundamentally flawed business model.

I'm all for a robust debate about the merits of public vs. private broadband, but let's make sure we stay true to the facts and don't muck up the issues with untrue statements like the one made in the editorial linked to above.

I've been getting a lot of requests recently from city, state, and national entities to help lay out the case for why we need more bandwidth, particularly as it relates to specific applications that can improve our lives through broadband.

To best accomplish this task I've decided to focus on specific verticals, starting with the top ten applications for bettering healthcare through broadband.

Let's take a look at the list:

Telesurgery - Ranging from a surgeon observing and consulting on a surgery remotely through high quality videoconferencing to actually performing the surgery through advanced robotics, telesurgery enables anyone to leverage the skills of the best surgeons from anywhere, using broadband that's high capacity for video and low latency so there's no lag between the surgeon's intent and the robot's actions.

Teleradiology - Rather than being limited to the medical expertise housed in any one healthcare facility at any given time, teleradiology allows for the results of MRIs, CAT scans, and X-rays to be reviewed by specialists from around the globe 24 hours a day. And as these test results consist of a series of high resolution images that convey what's happening in a patient's body they need high capacity networks to speedily transfer the files so that medical opinions can be turned into medical care as quickly as possible.

Remote examinations - Whether in a medical facility, at home, or outside and on the go, through broadband doctors can examine patients anywhere through a combination of videoconferencing and integrated diagnostic tools. In this way doctors can start examining patients immediately after injuries happen, they can consult with patients who get to stay in their homes and save the drive into the office, psychologists can conduct psychological examinations on prison inmates without moving them from their cells, and so much more. All using broadband to empower two-way high quality video communication.

Patient monitoring - In addition to specific examinations, through broadband medical professionals can keep tabs on ailing patients without having to physically be at their sides at all times. Robots can monitor ICU patients, retirement facilities can share personnel between buildings to save costs as they keep a 24-hour vigil, and bed-ridden patients at home can have their health monitored around the clock. All possible through ubiquitous high-speed broadband.

Patient/family communication - It used to be that an ailing patient's only connections to the outside world, whether in the hospital or at home, were the telephone for talking to friends and family and the television for keeping up to date on what's happening in the world. Now through broadband they can not only have instant access to the world's news, they can communicate face-to-face with loved ones through videoconferencing while also being able to learn about their condition and treatments as they recover.

EMRs - Electronic medical records replace traditional paper-based patient records and make it so that doctors always have access to all their patients' relevant medical data, including history, lab results, allergies, and more. No more waiting for paper records to be mailed or faxed between offices. No more lost pages or missing information. And EMRs can include advanced features like an automatic drug interaction checker to make sure different doctors don't prescribe drugs that conflict with each other.

Personal Health Records - Personal health records are the patient-facing component of EMRs. They allow patients to access all their medical information from anywhere, including the status and results from their most recent tests and exams. Personal health records can also include a number of additional functionalities, like pain and food diaries, symptoms checkers, libraries of relevant medical information, exercise managers, and more. And all this patient-generated data can be shared with their doctors to help in diagnosis and treatment of ailments.

Video translator network - Translation services are currently largely limited to the telephone, but with broadband available these services are beginning to leverage high quality two-way video to help communicate with patients in emergency rooms specifically and hospitals more generally. With the patient, doctor, and translator all able to see each other, broadband is enabling better communication between people who speak different languages. And this technology can be revolutionary for the effective real-time translation of sign language.

Doctor social networks - Broadband-powered social networks can help doctors stay up to date with the latest medical advances and techniques and connected with their peers across the globe. Whether it's to tap expertise for advice on specific consultations, to generally improve their overall knowledge, or just to get support from others having to deal with the same high-stress job, these social networks can improve doctors' ability to provide the best care to their patients. And these same social networks can also be used to unite nurses, EMTs, and other medical professionals.

Patient social networks - The sociological construct of a patient social network already exists in the form of patient support groups where people suffering from the same maladies can come together, share their stories, and support each other. Now broadband is enabling these support groups to go online and use social networking technologies. These social networks afford patients access to a greater number of fellow sufferers, more tools to share their stories with and learn from others, and most importantly the ability to participate in these groups even if they're not able to physically move from their rooms. In these ways, broadband is helping patients feel supported thereby improving their mental state and therefore their ability to at least feel better and often recover more quickly.

---

It's important to note that while some of these applications may seem more relevant to hospitals than homes, and therefore less relevant to the debates surrounding residential broadband, they all can benefit from the ubiquitous availability of high-speed connectivity.

For example, doctors can consult from anywhere they have a broadband pipe big enough to watch live video, whether to sit in on a surgery or to receive the images from an MRI. Remote examinations become even more powerful when they can be conducted from the field and not just in medical facilities or homes. And the translators who provide services over the video translator network can work from home using consumer-grade videoconferencing technologies. All of this is possible once high-speed connectivity becomes universally available.
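To put a number on the image-transfer point, here's a minimal sketch; the 500MB study size is my own assumption for a multi-image MRI or CT series, and the connection tiers are just representative examples.

```python
# Rough transfer time for an imaging study at different connection speeds.
# The 500 MB study size is an illustrative assumption.

study_megabits = 500 * 8   # 500 MB expressed in megabits

for label, mbps in [("T1 (1.544 Mbps)", 1.544),
                    ("Cable/DSL (10 Mbps)", 10),
                    ("Fiber (100 Mbps)", 100)]:
    minutes = study_megabits / mbps / 60
    print(f"{label:20s} {minutes:6.1f} minutes")
# Roughly 43 minutes on a T1, under 7 minutes at 10 Mbps, and well under a
# minute on 100 Mbps fiber -- the difference between waiting on the wire and
# getting a specialist's read right away.
```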

With these applications that can clearly benefit our lives in mind, the next question I often receive is what's next? What applications will be created once the next-generation of broadband networks are everywhere?

But I'm not sure if these are the right questions to ask.

Instead of heading out looking for the next killer app we should first consider how the applications listed above will be improved by broadband networks with higher capacity, better reliability, and lower latency.

Higher capacity networks will open up the ability to deliver higher quality video. So instead of video that's the size of a playing card or full-screen but low quality, high capacity broadband networks will enable the delivery of two-way HD video that can fill a big-screen TV.

More reliable networks will help ensure that mission-critical applications like remote monitoring always work and never fail due to shoddy connectivity.

And networks with lower latency will make real-time communications feel more natural by not introducing any lag between when people speak.

There undoubtedly will be a whole new class of applications that can take full advantage of the capacity, reliability, and low latency of next-generation broadband networks, but for now let's focus on equipping our entire healthcare system with the applications listed above and on putting the networks in place that can support their evolution into integral parts of our lives.

In terms of what we need from next-generation broadband networks to enable all these applications to establish themselves and evolve, the answer's simple: we need everyone to have access to the best broadband. We need broadband networks with limitless capacity so that as demands increase, so can the supply needed to support them. We need broadband networks that we can rely on to always work. We need broadband networks with low latency to enable as close to real-time delivery as possible.

And while the benefits of next-generation broadband to how we administer medical care are profound, they're only the tip of the iceberg.

Up next look forward to the Top Ten Broadband-Enabled Education Applications!

An Incremental Approach To Broadband Mapping


The debate over how best to map broadband continues to rage on, as evidenced by this most recent Wall Street Journal article.

Yet what's troublesome is how much these discussions focus on what Connected Nation's doing wrong rather than exploring the right way to do broadband mapping.

This is especially worrisome as I'd argue that no one has fully cracked the nut on the best approach to mapping broadband.

And given that we're about to spend hundreds of millions of taxpayer dollars on trying to tackle this challenge, I want to attempt to push this dialog forward in a more constructive manner by proposing a new approach to broadband mapping.

The key to my proposal is that instead of trying to have one entity attempt to do all the mapping in one fell swoop, we focus on an incremental methodology that starts with establishing a baseline of data and then encourages others to add to it.

More specifically, I want to see us start with the most basic question of, "Where's broadband?" Produce maps like Virginia's that show where broadband--or symmetrical service over 768Kbps--is available on as granular a level as possible.

Then make this data available to the public to build additional layers of data on top of.

For example, with this baseline in place I'd want to go and round up data from all the FTTH providers to show where their networks are available and what speeds they offer at what prices. Doing this could then inspire their DSL and cable competitors to step up and show what they have to offer rather than risk creating a vacuum in which customers flock to fiber.

Another layer of data could be actual speeds and/or quality of service, with data drawn from user speed tests like BroadbandCensus.com, a network of committed network monitors with more sophisticated monitoring tools, or any other mechanism. In this way customers could start to know more about whether they're truly getting the service they're paying for, and ultimately do so in a granular way that can identify stronger and weaker parts of the same network.

Along these lines, someone could create a layer of data showing the usage restrictions like bandwidth caps of the various broadband networks on a regional basis.
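As a sketch of what this kind of layering could look like in practice, here's one way a baseline availability record and a user speed-test layer might share a common geographic key; the field names, block ID, and values are assumptions for illustration, not a proposed standard.

```python
# Illustrative sketch of layered broadband-mapping data keyed to a common
# geography (a census block, say). Field names and values are assumptions.

baseline_availability = {
    # "Where's broadband?" layer: block ID -> advertised service
    "510594504003": {"technology": "DSL",
                     "advertised_down_mbps": 3.0,
                     "advertised_up_mbps": 0.768},
}

speed_test_layer = {
    # User-contributed measurements layered onto the same block IDs
    "510594504003": [{"measured_down_mbps": 1.9, "measured_up_mbps": 0.5}],
}

for block, service in baseline_availability.items():
    tests = speed_test_layer.get(block, [])
    if tests:
        avg_down = sum(t["measured_down_mbps"] for t in tests) / len(tests)
        gap = service["advertised_down_mbps"] - avg_down
        print(f"Block {block}: advertised {service['advertised_down_mbps']} Mbps down, "
              f"measured {avg_down:.1f} Mbps, shortfall {gap:.1f} Mbps")
```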

Arguably even more important than these supply-side metrics would be the opportunity to layer on data related to adoption and usage.

By layering on data showing take rates for broadband we can identify areas where it's available but not being adopted so we can target demand-side programs.

Also, if we start collecting more hard data about the actual usage of broadband we can figure out which communities are gaining the most benefit from using broadband in order to learn from them about what's working so we can share those lessons with other communities.

Building on this idea, we could also start layering on data about demand for bandwidth and comparing it to how much supply is available, with areas where there's a gap between demand and supply clearly showing which markets are underserved by market forces alone.

Trying to collect all this data at once would be a monumental challenge. Not every mapping entity has expertise in all these areas, and by spreading out the work it can be done more efficiently.

Also, by taking an incremental approach with multiple entities layering data we can leverage the best ideas from every state. Instead of trying to be prescriptive in saying that there's only one right way to do mapping an incremental approach will embrace the fact that we're still learning what the best way to do this is. In this way each state can be encouraged to innovate in finding new layers of relevant data to add and new ways to make the layers of data more relevant to each other.

And perhaps the best argument in favor of an incremental approach to broadband mapping is that we can't have these maps be static. They're going to need to evolve over time. We don't need to map broadband just once; we need to get a baseline set today and then have it evolve so we can track our progress over time.

I can't say that I have all the details figured out for how this incremental approach to broadband mapping will work. There's still a lot to be hammered out, especially things like making sure the maps are standardized so they're interoperable and so that the state maps can be collated into a larger national map.

But I do feel like we're going to be better served by taking this mapping process one step at a time rather than throwing our lot all in with one model or another.

Let's start with the most basic question of, "Where's broadband?" but instead of making that the endgame let's have that be the starting point upon which a robust, multifaceted, user-supported, incremental approach to broadband mapping can be built.

One of my biggest pet peeves advocating for fiber is the jokes made about its dietary namesake. And not just from laymen; I once had an FCC commissioner make a similar crack about fiber's nutritional properties at a conference.

Yet the analogy of fiber optics as a laxative is actually spot on.

For starters, the Internet is constipated.

While data zips across the country on beams of light carried by fiber optic cables, when it reaches the vast majority of last-mile access networks it hits copper wires, which, regardless of whether it's DSL or cable, will always be slower than fiber.

Another phrase often used to describe this is the last-mile bottleneck, and it's traditionally been one of the primary limiting factors in the development of reliable, high quality Internet applications.

The reason you can't watch the higher quality video available online today at home is that you don't have enough bandwidth. I've tried watching the HD video on Hulu.com but it doesn't play smoothly despite my subscribing to the fastest residential broadband service available ten blocks from our nation's Capitol. There's not enough pipe to push all that data through.

One of the reasons why video may stutter or degrade is that there's not enough capacity in the last-mile network to support lots of simultaneous usage. It's amazing how much better my cable connection works late at night on a Sunday as opposed to during the week. And I've heard horror stories up in Cape Cod where, around 4pm when all the kids get home from school, connectivity goes south quickly.
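Here's a minimal sketch of why that evening slowdown happens on shared last-mile networks; the node size and shared downstream capacity are illustrative assumptions, not any particular cable plant's real numbers.

```python
# Why a shared last-mile segment buckles at peak times.
# Node size and shared downstream capacity are illustrative assumptions.

shared_downstream_mbps = 200   # assumed capacity shared by one neighborhood node
homes_on_node = 250            # assumed homes sharing that segment

for active_share in (0.05, 0.30, 0.60):   # fraction of homes online at once
    active_homes = homes_on_node * active_share
    per_home = shared_downstream_mbps / active_homes
    print(f"{active_share:.0%} of homes active: {per_home:5.1f} Mbps each")
# At 5% active (late Sunday night) there's plenty to go around; at 60% active
# (4pm on a school day) the same plant leaves only about 1.3 Mbps per active
# home -- not enough for smooth high-quality video.
```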

All this isn't to say that the last-mile is the only place where bottlenecks can form, but traditionally it has been the most constant place where the Internet gets plugged up.

So how can we relieve the Internet's constipation? By feeding it a diet rich in fiber.

The more fiber you have the more capacity that's available to handle Internet traffic. And when you lay fiber all the way to users' front doors then you fundamentally redefine their connectivity paradigm, evolving from an era of bandwidth scarcity to one of abundance.

As the Internet's backbone is all fiber, anything with lesser capacity will be a bottleneck.

So let's acknowledge that the next-generation of the Internet means extending the power of fiber optics to every house, and let's blow out the old constipated Internet by embracing fiber as the ultimate laxative.
