App-Rising.com covers the development and adoption of broadband applications, the deployment of and need for broadband networks, and the demands placed on policy to adapt to the revolutionary opportunities made possible by the Internet.

App-Rising.com is written by Geoff Daily, a DC-based technology journalist, broadband activist, marketing consultant, and Internet entrepreneur.

App-Rising.com is supported in part by AT&T; however, all views and opinions expressed herein are solely my own.

February 2008 Archives

February 1, 2008 7:22 AM

Online Video's Dirty Little Secret

Recently my wife and I decided to forgo paying for cable TV service in order to end our penchant for wasted evenings and weekends spent channel surfing.

This isn't to say we've stopped watching video. We both very much enjoy movies at home and in the theaters, for example.

I will admit that we haven't yet embraced online movies but I have found myself frequenting the major TV networks' websites on a much more regular basis since turning off cable. Therein you can find full-length episodes of pretty much all your favorite shows.

Now, part of this is professional curiosity and research as I've long written about the business of online video for StreamingMedia.com.

But there's also something else that's a much bigger driver of this change in my behavior.

It quite simply is online video's dirty little secret: there are hardly any commercials!

That's right: over the last three full-length episodes I've watched, I've seen maybe a half dozen commercials. While the commercial breaks are most certainly still there, lining up with how the show would be broken up on TV, more often than not there isn't a commercial to fill all those slots.

Even when there is a commercial, it's a commercial, as in singular. No more three-plus-minute commercial extravaganzas; it's just one 15-30 second commercial and you're back to your show.

The result of these limited commercials is that I'm about a thousand times more likely to watch a show from beginning to end, instead of catching the channel surfing wave whenever the commercial break starts and rarely making it back to my show by the time that break ends.

And that's ultimately the intent of limiting commercials online.

TV was long the only game in town, and the networks had no way of knowing in real time what impact their decisions regarding commercials were having on viewership.

Now on the Internet, content owners have direct insight into user behavior, and in turn they're beginning to recognize that the ultimate goal of an online video is to get a user to watch it through to the end, and that if they load that video up with long commercial breaks, the odds of someone sticking around drop dramatically.

It really is a fantastic evolution of the content world, one that is directly resulting in a better user experience and, in so doing, ultimately improving content owners' bottom line through that paradigm-shifting, enabling technology we call broadband.

February 4, 2008 7:45 AM

Why Can't Local Media Be a Competitive Edge for TV Providers?

One of the big issues I've been exploring and will continue to dive into over the next few months is identifying the problems with PEG and working towards finding 21st century solutions to bolster its ability to further the cause of local community media.

In my last post, John McHugh of ParkTV in St. Louis Park, MN submitted this thought-provoking comment:

"Cable TV companies could use the strengths of the local channels to differentiate themselves from satellite tv. Within a region/cluster, successful PEG channels in one service area aren't the same as a single PEG channel in another. The company should have non-specific mention of public meetings, school sports/events and local interests programming, tag it with the ever-present asterix -- whose footnote reveals "not all systems have the same local channels". Why not? They claim to compete on value."

In an extended conversation with Chuck Sherwood, senior associate with TeleDimensions and a lifer in the PEG community, I learned that while not every cable provider is actively fighting against its obligations to support and carry PEG channels, none of the major operators have taken the bold step of supporting PEG above and beyond those obligations.

But I have to ask the same question as John: Why not?

I mean, why would it be a bad thing to stock your cable system with locally generated content that can't be found anywhere else? Assuming the content's compelling and relevant, wouldn't that give you a huge edge over your competitors?

It seems like such a simple, obvious idea, especially considering the increasingly competitive cable market.

And imagine what would happen if it worked and then you had multiple cable providers fighting to offer the best in local content.

It seems possible that we could enter a world where cable operators weren't working against PEG but instead fighting for it. Where they're investing in local content producers and heavily promoting local shows in order to gain a competitive edge.

To get this dynamo going, there would likely need to be a way for operators to ultimately make money off of local content beyond retaining and attracting subscribers. But that doesn't seem impossible either: I know from talking with Chuck that there are the beginnings of a push to understand how advertising and PEG might co-exist.

And the great thing is that even if the cable operators weren't making much or any money off of ad placement, most do offer ad production services and in producing those local ads for local media they could likely upsell many advertisers on buying airtime on national channels.

The biggest piece of this puzzle, though, is getting the audience to watch local content. My sense is that part of the reason why the major cable operators, new and old, haven't embraced the possibilities of PEG as competitive advantage is that they don't see it as high demand content. Quite frankly, there's a sense that PEG only means boring government meetings, educational content from schools, and crazy public access shows. And that the only audience for this content is the smallest of niches.

If PEG could prove itself a source of highly in-demand content, then perhaps we could flip the paradigm from PEG needing cable to cable needing PEG, and in so doing expand what PEG means and the audience it serves.

And if cable providers in competitive markets would wake up and realize the potential to gain a leg up over their rivals by embracing local content, we could start taking steps towards a future where supporting PEG isn't seen as an obligation but instead a sound business decision that can further both operator and community interests alike.

February 5, 2008 8:36 AM

Article RoundUp: More Broadband From More Places (And Less From Others)

There's been a spate of stories related to new developments in the availability of broadband over the last couple of weeks. Here's a handful I found particularly interesting:

Shareband Launching Bonded DSL in Seattle
Shareband is a British startup that enables business and residential customers to bond ADSL lines with other pipes to enable bigger, faster, and more stable Internet access. While the technology has proven that it can work, like all DSL it's limited by distance. And ultimately, bonding multiple copper lines together seems like little more than a stopgap: a technology that can fill a valuable niche for the next few years but that has an uncertain future.

HughesNet Customers Say Service Sluggish
While satellite broadband provider HughesNet is in the midst of testing its newly launched bird that promises better coverage and greater capacity, there's a growing buzz around the limitations of the service it provides today. Customers are complaining that if you exceed the rolling bandwidth cap, which can range between 200MB and 1500MB, your service will be throttled to less-than-dialup speeds of just 7-14Kbps for the next 24 hours. And customers who are paying $80 for what's supposed to be 1.5Mbps down and 200Kbps up service are sometimes only realizing 150Kbps down and 6Kbps up. These are speeds at which the Internet is almost unusable, calling into question the viability of satellite as a true broadband alternative.
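
To put those complaint figures in perspective, here's a quick back-of-the-envelope sketch. The cap and speed numbers come from the complaints above; the 1MB web page size is my own assumption, and I'm treating 1MB as 8,000Kb for simplicity:

```python
# Back-of-the-envelope math on HughesNet's reported caps and throttling.
# Cap and speed figures are from customer complaints; the 1MB page is assumed.

def mbps_to_mb_per_hour(mbps: float) -> float:
    """Megabits/sec -> megabytes transferred per hour (1MB = 8Mb)."""
    return mbps / 8 * 3600

# At the advertised 1.5Mbps down, how long until the 200MB rolling cap is hit?
cap_mb = 200
hours_to_cap = cap_mb / mbps_to_mb_per_hour(1.5)
print(f"Full-speed hours before hitting a {cap_mb}MB cap: {hours_to_cap:.2f}")

# Once throttled to 7Kbps, how long does a single 1MB web page take?
throttled_kbps = 7
seconds_per_page = 1 * 8 * 1000 / throttled_kbps  # 1MB = 8,000Kb
print(f"Minutes to load a 1MB page at {throttled_kbps}Kbps: {seconds_per_page / 60:.1f}")
```

Roughly eighteen minutes of full-speed downloading exhausts the 200MB cap, after which a single media-heavy page takes around nineteen minutes to load, which is why customers describe the throttled service as worse than dialup.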

40Mbps DSL? Rim Semi Claims High Speeds at Long Distances
Rim Semiconductor Company is demonstrating new technology that it claims will enable 20-40Mbps at a distance of more than 5000 feet. While I support all efforts to increase our nation's broadband capacity, and these claims are impressive relative to the speeds being realized by other big broadband DSL initiatives, I'm hopeful that public opinion won't get caught up in the hype of technologies like this and DOCSIS 3.0 (the new cable standard promising 100Mbps to the home). I say this because ultimately, if you believe that we will one day in the not-too-distant future need gigabit connectivity to the home, then the only viable last-mile solution is fiber. Sure, it's possible someone might innovate a new way to stretch the capacity of copper, but I'd rather we spend our time, energy, and money on deploying proven fiber technology than on research that can only hope to eventually reach gigabit speeds and will ultimately demand new investment in networks anyway.

February 5, 2008 10:52 AM

The Problem with the Bush Administration's Broadband Report

Last week the Bush Administration released a report entitled "Networked Nation: Broadband in America 2007" that highlights what it sees as the results of its successful national broadband strategy.

While it's been widely lambasted as overly optimistic and completely ignorant of America's falling position in international broadband rankings, that's not what I see as being its most significant shortcoming.

That honor goes to its utterly milquetoast attitude towards broadband.

When I think "national broadband strategy" I don't think "how can we be adequate" or "how can we do just enough to get by".

What I want to see are some aspirational goals: a national broadband strategy that says "here's where we are, but here's where we could be going", a national broadband strategy that doesn't stop at claiming to know what's enough but instead strives to be all that it can be.

If anyone or anything wants to claim the mantle of being a national broadband strategy, it has to start from the perspective of what it wants to achieve.

For me that begins with a few straightforward goals:

- Broadband to everyone.
- Everyone to broadband.
- Broadband as an ever-increasing standard.

Broadband to everyone means 100% availability. It means that every man, woman, and child can access broadband from their homes and businesses.

Everyone to broadband means 100% of people being aware of what broadband can do for them and wanting to incorporate its use more in their day to day lives.

And broadband as an ever-increasing standard means we all come to the understanding that the more capacity there is in the networks, the more we can do with it, and that even if we don't know how this additional capacity will be used tomorrow, we should still be doing everything we can today to realize ever higher speeds.

Based on these three basic principles, according to my scorecard we're still a long ways away from being able to proclaim "Mission Accomplished".

Even though broadband is available most places, we're still not at 100%, even if you count satellite as broadband. We're getting close, but we must acknowledge that there are some significant challenges ahead in filling those last remaining holes in our country's broadband coverage.

Quite frankly, we're a helluva long ways away from everyone using broadband. Only recently have we passed 50% of homes subscribed to broadband, and I'd argue that most of those homes really aren't making the most out of what broadband can deliver. Getting everyone online isn't going to be easy, as there will continue to be many who don't see a need for it, but the more people we can get online, the more powerful the Internet becomes as a tool for effecting real societal change.

And don't get me started on the lack of expansion in the definition of broadband. The FCC is still stuck in neutral at 200Kbps in one direction as the definition, and there continues to be in-fighting about how much bandwidth we'll ultimately need. This is why I've taken the general position that more is better, and that we cannot accurately say how much is too much, so instead let's focus on getting as much as we possibly can.

But just because we're a ways away from accomplishing these goals doesn't mean that we shouldn't aspire to them.

If we want America to continue to be great in the 21st century, we can't waste time patting ourselves on the back about how far we've come when we need to instead be marshaling all of our resources to tackle the goal of how can we aspire to be better.

And as I've stated before, this isn't just a matter of beating other countries. All we need to do is continuously work on being better than we were yesterday and are today.

If we can only understand and agree on the basic premise that our deployment and use of broadband is in its infancy and that by nurturing its growth we can revolutionize society, then perhaps we can recapture that American spirit to strive tirelessly towards a better tomorrow powered by broadband.

February 6, 2008 8:48 AM

The Missing Link for Broadband: Demand

When talking about the need for broadband the focus is almost always on supply, or how do you get more capacity to more people.

Only recently has the drumbeat started to pay more attention to demand, or getting people to subscribe to and use broadband.

In general, I see the demand side of the broadband equation as being as important as, if not more important than, supply: you can build all the capacity you want, but if no one's using it, then what's the point?

But there's a problem as this renewed focus on adoption is being vocalized in large part by broadband providers, who many will claim aren't interested in societal change through broadband so much as bolstering their bottom line by adding new subscribers.

And any concerted effort to increase take-rates might seem like the government trying to get more people to buy cell phones or drink more milk, which can then lead to higher revenue and bigger profits for already well-off multi-billion dollar corporations.

But that's only looking at the situation on the surface. Let's dive a bit deeper.

First off there's the basic rule of thumb about networks: the more nodes (or users) the more powerful the network becomes. So the more people subscribed to broadband at home the more potential ecommerce customers, the more sellers, the more content creators, and the more content watchers there are. So greater demand equals a greater Internet.
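
That rule of thumb is often formalized as Metcalfe's law: the number of potential connections between users grows with the square of the number of users. A tiny sketch of the idea (the user counts here are arbitrary, purely for illustration):

```python
# Network-effect rule of thumb: n users can form n*(n-1)/2 potential
# pairwise connections, so value grows roughly with the square of n.

def potential_connections(users: int) -> int:
    return users * (users - 1) // 2

for users in (10, 20, 40):
    print(f"{users} users -> {potential_connections(users)} potential connections")
```

Each doubling of the user base roughly quadruples the number of potential buyer/seller and creator/watcher pairings, which is the sense in which greater demand equals a greater Internet.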

Secondly there's the basic reality that if we're setting out to revolutionize society through broadband that we ultimately need everyone on the network. Making government resources available online doesn't do a whole lot of good if most people can't access those resources. And if our goal is to get everything online, then we also need to get everyone online in order to take advantage of that.

The third major point I want to make is that the more demand there is for broadband, the bigger the carrot to spur deployment.

If you talk to Verizon about FiOS, they'll admit they're cherry picking, or deploying first into neighborhoods with demographics that suggest more homes that are willing and able to subscribe to broadband. That's just basic business sense.

But what would happen if everyone was demanding broadband? Not half of homes, but all homes, clamoring for broadband? Imagine the impact that would have on deployment.

Network operators could focus less on reaching the highest value customers and more on delivering service to as many customers as possible.

A radical increase in demand for broadband could also have a profound impact on competition as new entrants would feel emboldened to break into more new markets as there'd be a bigger pie of customers for them to try and claim a slice of.

The math is pretty simple: if 50% adoption can support two or three providers, then 100% adoption could create a marketplace where four or five providers can realize take-rates large enough to justify deployment.
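
That math can be sketched as a toy model. Note that the market size and break-even figure below are hypothetical numbers made up purely to illustrate the point; they aren't from any real market:

```python
# Toy model: if a broadband market splits evenly, how many subscribers
# does each provider get? All figures below are hypothetical illustrations.

def subscribers_per_provider(households: int, adoption: float, providers: int) -> float:
    return households * adoption / providers

households = 100_000
break_even = 15_000  # hypothetical subscriber count needed to justify deployment

# 50% adoption split three ways vs. 100% adoption split five ways:
print(subscribers_per_provider(households, 0.50, 3))  # ~16,667 per provider
print(subscribers_per_provider(households, 1.00, 5))  # 20,000 per provider
```

Doubling adoption more than offsets splitting the market among five providers instead of three, so every entrant clears the (hypothetical) break-even line more comfortably.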

So it seems obvious to me that one of the biggest steps we can take as a country into our broadband future is to do more to spur demand for broadband and, in doing so, create additional demand for higher speeds.

And as demand increases, supply will undoubtedly race to catch up. Or, if it fails to, then consumers and communities will be inspired to start exploring other options to get the connectivity they need.

Spurring demand for broadband isn't about lining the pockets of broadband providers; it's about recognizing the value greater participation in and understanding of the Internet has in furthering the goals broadband advocates like myself have been espousing.

In the end will the multibillion dollar corporations get richer? Yes, but they don't have nearly as much to gain as the country as a whole does by focusing much more attention on spurring demand for bandwidth alongside supply.

February 7, 2008 11:20 AM

US Ranks First Worldwide in Use of Info Tech

Last week a London business school professor released a new set of rankings aimed at gauging which countries are making the most out of using information technology to increase productivity, dubbed the Connectivity Scorecard.

Unlike the many studies that track America's falling broadband star to middle-of-the-pack mediocrity, in this area we can still proudly hold our heads high as we came out at the head of the class.

The study looked at 30 different factors related to the use of telecommunications technologies to boost social and economic prosperity. While the US did get demerits for its 20th century telecom infrastructure, the degree to which individuals and corporations utilize technology to spur productivity was able to overcome this limitation.

But despite our victory, we're a long ways away from being able to proclaim Mission Accomplished, as evidenced by the fact that our score didn't top 7 on a 10-point scale, primarily because of the missed opportunities broadband provides.

Instead of seeing this as a chance to pat ourselves on the back for what we've done, I see this as an opportunity to serve as a rallying cry for how much more we can be doing.

Additionally, it highlights that despite our lagging connectivity we're still the country that's best poised to take advantage of the information technology revolution.

On the flipside, it shows that simply having connectivity is not enough as the much vaunted fiber networks of South Korea were only good enough to rank them at #10 on this list.

In other words, having technology is not as significant as using technology.

This is an important lesson to keep in mind moving forward as it's easy to get caught up in the hype around new last mile technologies or killer apps and lose sight of what really matters: improving society through the use of that technology.

February 8, 2008 12:09 PM

Cable's Big Bandwidth Problem

Everyone knows that cable companies have some problems with their broadband service.

Theirs is a shared network where one neighbor can have dramatically negative impacts on another's service.

There's the Comcast/Bittorrent brouhaha over the cable giant blocking or degrading sometimes-legal P2P traffic.

There are the mysterious bandwidth caps that punish users for consuming too much bandwidth.

And the deployment of FTTH networks like Verizon's FiOS is usurping cable's throne as offering the fastest advertised broadband around.

But even so, the reality of where cable systems are and where they're heading shocked me a bit as I read this DSLReports article about Canadian cable provider Videotron rolling out North America's first sighting of DOCSIS 3.0, the technology that's supposed to enable cable to hit those magical 100Mbps speeds.

The first thing that caught my eye was the packages Videotron is offering. While downloads at 30, 50, and even 100Mbps are fantastically impressive, they stood in stark contrast to the 1 or 2Mbps uploads being offered.

That disparity is remarkable to me, and it suggests that the move to DOCSIS 3.0, while potentially great for downloading movies and the like, will have next to no impact on the aspect of the Internet that makes it truly revolutionary: its capacity for two-way interactivity and communication.

This reality is especially depressing in light of some testing I've been doing of a videocalling-on-the-TV product called TVBlob over my Comcast consumer cable connection, which, despite my paying for the fastest service they offer, couldn't deliver more than 512Kbps upstream, leaving a lot of real estate on the TV screen unused.

But that's not the most frustrating part. Alongside these massive upgrades in download capacity come pretty restrictive bandwidth caps. While I do have to give credit to Videotron for being transparent in the fact that they do have caps and clearly stating what those caps are, I'm not sure what good having an ultrafast connection is if I can only download 30 or 50GB a month.

Don't get me wrong. I'm almost positive I don't move that much data around today, but once I get my HDTV if I had 30Mbps to the home you can bet that I'd want to start downloading HD movies. And last time I checked, a single movie on Blu-ray can take up 30GB.
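
Run the numbers and the mismatch is stark (a rough sketch; real-world throughput would be somewhat lower than the advertised rate):

```python
# How fast does a 30Mbps connection burn through a 30GB monthly cap?

def hours_to_download(gigabytes: float, mbps: float) -> float:
    """Hours to move `gigabytes` at a sustained link rate of `mbps` (1GB = 8,000Mb)."""
    return gigabytes * 8 * 1000 / mbps / 3600

movie_gb = 30   # one Blu-ray-sized HD movie
cap_gb = 30     # Videotron's lower monthly cap
link_mbps = 30

print(f"One {movie_gb}GB movie at {link_mbps}Mbps: {hours_to_download(movie_gb, link_mbps):.1f} hours")
print(f"...which consumes {movie_gb / cap_gb:.0%} of a {cap_gb}GB monthly cap")
```

In other words, the connection is fast enough to pull down a Blu-ray-sized movie in a little over two hours, but doing so even once would blow through the entire lower-tier monthly cap.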

It's like giving me a bigger straw to drink from an endless smoothie, only the more I get used to drinking faster because I like it the more it's going to cost me.

It's not that I'm against metered bandwidth (more to come on that next week), it just seems like we're rapidly entering a world where bandwidth should be more available but it's coming with more restrictions.

We need to find a path to a future where we have more bandwidth and fewer restrictions, while still acknowledging that whoever is making that bandwidth available needs to be able to keep their business viable.

February 11, 2008 9:15 AM

Broadband Access: Not the First Thing on Voters' Minds

Last week I found a short post by Ann Treacy on the Blandin on Broadband blog about an eye-opening fact: only one percent of Americans think that broadband access is an issue.

Actually, that number's even more startling considering the source: the EETimes, which bills itself as the source for global news for the creators of technology.

Not only that, but broadband scored its one percent competing against nine other technology-related issues; it couldn't even move the needle when competing against tech issues in front of a techie audience.

To show how low the Internet ranked, the issue of Internet policy didn't even register one percent.

On the one hand these results frustrate me as they highlight the continuing lack of understanding and fervor around the potential of broadband to impact all aspects of technology development and beyond in a positive way.

On the other, not only am I not all that surprised by these results, but I think they demand we reconsider how far we've come and how far we still have to go in creating a nation of broadband believers.

The simple truth is that the vast majority of Americans think that they have all the broadband that they need.

More places than not have at least one provider, and many have two or more.

You can get 750Kbps and higher, which can handle basic Internet traffic just fine, and at a price that isn't unreasonable for a large percentage of customers.

The average joe doesn't know why they'd want, let alone need, more capacity. And that applies to at least some portion of the tech elite as well.

Because of this, I think it's important that we temper our lamentations over the current state of American broadband and focus instead on being prophets for the potential of big broadband. I fear that sometimes we go overboard in talking about how bad things will be if we don't act now, and in turn cause the American public to collectively shrug their shoulders at a problem they don't really see as being a problem today.

Our focus in talking about why we need more broadband should stay on exploring all the good that's possible through more broadband to try and increase understanding of and demand for the broadband we all want to see come to life in America.

I understand that this is a very urgent issue with potentially devastating consequences for our country in the global 21st century economy, but I worry that if we spend too much time trying to sound the alarm to a disinterested and, more importantly, largely satisfied public, our cries for movement and action will fall on deaf ears.

February 12, 2008 9:03 AM

America's International Internet Advantage

A couple weeks ago I wrote about the cutting of two major undersea Internet cables near Egypt, which led to a 50% drop in connectivity for some entire countries.

I commented about how I was surprised this doesn't happen more often given the fact that international Internet traffic is being delivered on hair-thin strands strung across the ocean floor. In continuing to read I learned that this is a fairly common occurrence; the unique part of this story was that multiple cables serving a similar area were cut around the same time.

In researching the topic, I came across this picture.

It shows a rough approximation of the major undersea fiber optic cables that unite the world's communications.

What I found most interesting in this image was America's connectivity. Quite simply, it looks unparalleled anywhere else in the world.

There are upwards of a dozen major trunk lines branching off from each coast reaching to Asia and Europe.

I looked at that map and realized that if a cable were cut off the coast of the US, it likely wouldn't have any noticeable impact on consumers at all, as there are plenty of alternative pipes through which traffic can be routed.

I'm not sure if it's America's leadership in all things Internet that led to this reality or our country's unique geography with major coasts on both oceans.

But either way it's nice to know we can go to bed each night without having to worry that an errant anchor will prevent anyone in this country from enjoying the hunt for fresh content, applications, and services in the global marketplace.

February 14, 2008 9:55 AM

The House of the Future Has No Broadband

Yesterday I came across a story about plans to rebuild The House of the Future at Disneyland.

The original House was built in 1957 and was intended as a showcase for space age technologies, but within a decade reality had caught up with fantasy and the House didn't seem quite so space age, so it was shut down.

Today the spirit of the House is about to be reinvigorated as a collaborative project between the Walt Disney Co., Microsoft Corp., Hewlett-Packard Co., LifeWare, and Taylor Morrison.

It will feature a host of smart home technologies like a touchpad countertop that can identify groceries, a closet that can help pick out outfits, and an internal network for sharing media within the house.

Hmmm....can anyone else see what's missing here?

Judging by the description given in that Wired article, not one word is mentioned about having the home connected via big broadband to the Internet.

Call me flabbergasted, confused, disappointed, and charged up about changing that.

How can the supposed "House of the Future" be a house of the future without any mention of broadband?

I mean, sure, new interfaces and consumer electronics will open up revolutionary possibilities, but none more so than the Internet, and very few of those new interfaces and devices won't have at least some interaction with the Internet.

What this house needs is a major corporate sponsor to step up and grab the reins of this opportunity to engage the American public with the message of why broadband is an essential component of our future.

What better place than Disneyland to spread the good word about broadband in a House of the Future setting?

Just think of the possibilities...literally everything in that house could be hooked up to the Internet. And it'd provide an opportunity to highlight some of the cutting edge applications made possible by big broadband.

To me showcasing broadband at the House of the Future is a no-brainer. Now it's time to see if I can find like-minded allies who are thinking the same way.

February 15, 2008 8:25 AM

Comcast's Metered Highway Meets Vuze's Muddy Racetrack

Earlier this week Comcast submitted comments to the FCC defending its right to slow down traffic on its network.

The analogy they used to describe how they manage P2P traffic was that of an on-ramp to a busy highway that regulates how quickly cars get on and off in order to manage congestion and keep traffic flowing.

Vuze, the P2P video company that initiated the FCC’s inquiry into Comcast’s practices, responded with an analogy comparing what Comcast’s been doing to a horse race where Comcast owns the track, is running its own horse, and is slowing all the other horses down.

Many others have chimed in lambasting Comcast’s defense over the last few days, citing this as a prime example of a network operator exerting undue influence over the bits traveling through their pipes.

But I have to say, despite my desire for a free and open Internet, critics of Comcast are missing some very key points.

First off, cable systems are shared networks. So when they say that one user’s P2P usage can negatively impact their neighbor’s, that’s the truth.

This isn’t a smokescreen to allow them to squeeze out legitimate Internet traffic; this is them admitting that the growth in demand for bandwidth is outstripping the capacity of their network to handle it.

Which leads to my second point: what would happen if they couldn’t manage traffic on their network to reduce congestion? Won’t that result in more congestion and therefore reduced performance that will harm all applications and use of the Internet for Comcast users?

That is the point everyone seems to be missing.

To use an analogy to help explain, let’s go back to the horse race. So Comcast’s network is a race track. But it’s a race track that’s only designed to handle X number of horses weighing Y pounds running Z races in a day. As you start adding more, heavier horses running races more frequently, the track will begin to deteriorate, creating unsteady footing that slows down horses and leads to more crashes that prevent them from finishing the race.

On a racetrack with limited capacity there can be only a certain number of horses running a limited number of races before it starts to degrade performance for everyone.

And again, this isn’t a matter of Comcast being evil; it’s them admitting the limitations of their network.

So now, what to do? Do we force them to allow all horses onto their race track to run as much as they want, thereby likely destroying the track and bringing the network down? Or might we be better served focusing on making sure that they’re being upfront with the public as to the limitations of their network, so that we can make informed decisions as to whether or not we want to bet on their track?

Now I know that these issues aren’t this cut and dry, especially in areas where cable is the only viable broadband option (like where I live, 8 blocks off the National Mall). And the topics of net neutrality and network management are regaining momentum following Congressman Markey’s re-introduction of his reworked Internet freedom bill earlier this week.

So for these reasons, I’m excited to announce that next week will be Net Neutrality Week on App-Rising.com. Every day you’ll get a post explaining a different facet of this complex issue as well as thoughts on what solutions are available to help us find some level of resolution to this contentious debate so that we can move forward with the many other pressing topics for us to tackle as we continue the Great Broadband Debate of 2008.

February 18, 2008 9:26 AM

What is Net Neutrality?

What is Net Neutrality?

To some it means preventing network operators from selectively slowing down some traffic to favor others, especially their own.

To others it means preserving freedom of speech on the Internet, ensuring no one is able to control what you say or do when you’re online.

To some network operators it doesn’t mean much at all as their networks have enough capacity to deliver what they promise.

To other network operators it’s an attack on their fundamental right to manage traffic on their private networks.

To me, it means many things.

Firstly it’s a description of what the Internet is at its core: a neutral network of networks that allows data to flow freely between them.

Secondly it’s a bipolar misnomer. Net neutrality in truth means two things: Internet neutrality and network neutrality.

Internet neutrality refers to the portion of a network that provides broadband access.

Network neutrality refers to the overall ability of an operator to manage their network.

Protecting the rights of consumers to get the level of Internet neutrality they expect when they sign up for broadband is vitally important to supporting a free and open marketplace for goods, services, and applications online.

Protecting the rights of ISPs to manage traffic on their network is essential to their ability to deliver a reliable Internet experience and to support the development of in-network applications.

What all of this falls under isn’t the banner of Net Neutrality so much as that of Network Management.

Whether we’re talking about Comcast’s right to delay P2P traffic, AT&T’s right to censor a webcast, or Verizon’s right to deliver applications that run faster inside FiOS than outside, these all boil down to what ISPs can and can’t do when managing their networks.

Just because we’ve boiled all this down doesn’t mean the dynamics at play are any less complex. But throughout this week I hope to demonstrate that by understanding the dynamics at work, we can find relatively simple solutions that allow us to move forward without falling down the rabbit hole of government as regulator, forced to make an endless stream of decisions about what is and isn’t legitimate network management.

February 19, 2008 10:46 AM

All Bits Are NOT Created Equal

A core tenet of what most people think of as Net Neutrality is that broadband providers should not be allowed to discriminate between different types of Internet traffic on their networks.

Underpinning this thought is the idea that all bits are created equal and should therefore be treated equally.

But is that really the case? Are all bits born with the same inalienable rights? Should they be?

For one, different applications demand different things from the network. Voice or video calls are very sensitive to latency, while downloading a file isn’t. Watching full-screen video on-demand requires a lot of bandwidth, email doesn’t. Why should they all be treated the same?
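To make this concrete, here is a toy sketch in Python of how a prioritizing scheduler might drain a queue of packets, serving latency-sensitive traffic first. The traffic classes and priority values are invented for illustration; they don’t describe any real ISP’s policy.

```python
import heapq

# Hypothetical traffic classes and priorities (illustrative values only).
PRIORITY = {
    "voip": 0,   # latency-sensitive: serve first
    "video": 1,  # bandwidth-hungry but buffered
    "web": 2,    # interactive but tolerant of small delays
    "bulk": 3,   # file downloads: whenever there's room
}

def schedule(packets):
    """Drain a batch of (app_type, payload) packets in priority order.

    The arrival sequence number breaks ties so packets of the same
    class are served in the order they arrived.
    """
    heap = [(PRIORITY[app], seq, app, data)
            for seq, (app, data) in enumerate(packets)]
    heapq.heapify(heap)
    return [(app, data) for _, _, app, data in
            (heapq.heappop(heap) for _ in range(len(heap)))]

arrivals = [("bulk", "movie.iso"), ("voip", "frame-1"),
            ("web", "page.html"), ("voip", "frame-2")]
print(schedule(arrivals))
# The VoIP frames jump the queue; the bulk download waits its turn.
```

Treating all four classes identically would instead serve the movie download first, simply because it arrived first.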

For two, some bits should have higher priority than others. Take security. Why shouldn’t video from a community protection solution like the SafetyBlanket get priority during an emergency situation over the kid down the street pirating movies?

For three, prioritization, if done properly, offers the potential for a host of new efficiencies in the network. If there’s more capacity in the network than is available out to the Internet, why shouldn’t an application deployed in-network be able to get more bandwidth than one in the cloud?

With prioritization comes the potential to revitalize the QoS available over the Internet, to introduce a new era of the experiences that can be enabled. The goal isn’t to make things worse; it’s to make things better.

That said, sometimes discrimination can be necessary as well, as I explored with regard to Comcast and the constraints of its shared network last week. Yet there’s a funny thing I often hear when discussing Net Neutrality with people who’ve invested in full fiber networks: it’s a foreign concept to them. They can’t understand the need to slow down traffic as they’ve already got plenty of capacity. So one simple answer to solving the NetNeut dilemma is to get more capacity (aka more fiber) into the network.

My final point is another example of how all bits are not created equal. It’s my belief that the bits I send and receive through the Internet are much more important than the bits of my broadband provider. I do not believe they should be able to slow down what I’m doing to favor their own traffic; in my mind that practice falls nowhere near the classification of “best efforts”. I think the same can be said about degrading traffic through passive neglect, where my bits are slowed down not because my broadband provider is actively trying to slow me down but because it’s gobbling up all the resources on its network. Though where to draw that line on a shared network isn’t an easy question to answer.

In the end, most NetNeut advocates I know agree with the idea that not all bits are created equal. They understand that there can be benefits to more active network management and they claim that their intent is not to forbid any of the legitimate reasons to prioritize traffic I listed above.

Trust me, I wish we were in a full fiber world with sufficient capacity to render this issue moot, and ultimately it’s my belief that that’s the world we should be shooting to realize. But until that time, if we let policy get in the way of reality, instead of protecting consumers and preserving openness for application developers we may end up hurting rather than helping all parties involved.

February 20, 2008 10:24 AM

Net Neutrality Is Terrifying

Here’s what everyone’s been missing about the whole net neutrality thing: the reason the big, evil telcos and cable companies don’t like it isn’t because they’re greedy, it’s because they’re scared.

We’ve already covered how Comcast’s FCC comments reveal the vulnerability of their network. Now imagine this: net neutrality passes, cablecos can no longer limit P2P traffic, and P2P usage spikes, driven in part by the buzz built up around this brouhaha. Mightn’t that cripple cable networks? That’s 25 million broadband users negatively impacted by this seemingly likely scenario. That would certainly scare me if I was a cableco.

While the telcos are weathering the P2P storm more readily, net neutrality is equally scary to them. Though often painted as a thinly veiled threat, when telcos talk of how net neutrality legislation would cause them to invest less in upgrading the capacity of their network, there’s really another dynamic at work.

In building their business cases for investing billions in FiOS and U-Verse, Verizon and AT&T have based their revenue projections on the assumption that some of the additional capacity they’re putting into their networks can be used to deploy their own advanced services and to offer a higher quality of service to other application developers in-network. In fact, it’s my basic understanding that this is where they’re counting on realizing almost all of their future growth.

The problem with most net neutrality regimes is that they call this fundamental right into question. If all bits are created equal and broadband providers can’t prioritize any traffic, then they can’t monetize their investment in building out capacity the way they’d planned. And if you take away the portion of their revenue they were counting on for future growth, it only makes sense that they would have to revisit the viability of making a multi-billion dollar gamble in upgrading their networks. Not because they want to punish people but because what was already seen as a risky gambit would become a whole lot scarier with strict net neutrality in place.

So now we can see how the resistance to net neutrality hasn’t been driven by pure greed so much as fear. Fear that net neutrality might cripple shared networks, and fear that net neutrality will rob telcos of a key piece of their business model.

Some might still say that greed has everything to do with it and that broadband providers should be pouring billions into their networks without regard for their bottom line because that’s what’s best for the country. But we can’t forget: these are private, for-profit companies. We can’t force them to invest in their networks, but we can certainly dissuade them from doing so if we seek legislation that limits their ability to monetize their network.

But less network investment isn’t what scares me the most about net neutrality. The reason I find it terrifying is that, at least in some interpretations, it negates the possibilities of smart networks that can identify different types of Internet traffic and provide each with the bandwidth it needs to succeed. Net neutrality calls into question the viability of delivering applications in-network, where they can realize a higher level of quality of service than if they reside in the cloud. Instead of protecting Internet freedom, I fear that net neutrality may hold us back from realizing this next generation of the Internet.

Net neutrality also frightens me due to its amorphous, largely undefined nature. Any time the government tries writing broad legislation with the wind of public support at its back but without a solid knowledge of the complex dynamics at work, I cringe. The specter of watching the FCC try to determine everything that is and isn’t OK to do on your network sends chills down my spine. But those fears are nothing compared to the all-too-likely prospect that whatever legislation or regulations are passed will throw us into a decades-long battle in the courts to define what really is and isn’t OK when it comes to managing traffic on a network. A drawn-out legal battle would almost certainly gum up the gears of any broadband-related legislation and possibly put an end to productive dialogue over how the government can spur the deployment and adoption of broadband altogether.

For these reasons and more, net neutrality can be a very scary thing. This is about far more than the good of net neutrality supporters vs. the evils of the big broadband providers. And if we don’t recognize this moving forward then I’m terrified of where all this will take us.

February 21, 2008 12:06 PM

Seven Steps to Solving Net Neutrality...For Now

Throughout this week I've been exploring the dynamics of the debate around net neutrality in an attempt to help frame these issues in a new and hopefully more productive light.

Now I want to dive into some initial thoughts on how we might find some kind of resolution to this mess.

The first step is to recognize that from certain angles the two sides aren't that far apart. For example, both agree that network operators should not be allowed to intentionally degrade legitimate network traffic, especially in order to favor their own. The only disagreement here is whether or not legislation is needed or if the FCC's four principles regarding broadband are sufficient.

The second step is for the pro net neutrality crowd to agree that network operators should be allowed to manage traffic on their networks and that they should retain the right to explore the possibilities of delivering applications in-network, so long as doing so doesn't slow down the broadband service they're selling to consumers. In fact, if we're going to have a bill protecting the rights of consumers we should also include one protecting these rights of network operators.

The third step is to enact some form of truth-in-advertising legislation or regulations. We need to close the gap between what some broadband providers are marketing and the level of service consumers are realizing. Companies should still be able to market the theoretical maximum of their service, but alongside that should be a number that more accurately reflects the speeds consumers will realize. Maybe it's a minimum guaranteed level of access or the average speed customers will get on a day-to-day basis. However we frame this, it's important that consumers know what they're getting when selecting a broadband service.
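As a sketch of what such a label might contain, here's a hypothetical calculation pairing the advertised maximum with the typical measured speed. The function name and sample numbers are invented for illustration:

```python
def broadband_label(advertised_mbps, samples_mbps):
    """Build a hypothetical truth-in-advertising label from speed-test samples."""
    average = sum(samples_mbps) / len(samples_mbps)
    return {
        "advertised_max_mbps": advertised_mbps,
        "typical_mbps": round(average, 1),
        # What share of the advertised speed customers actually see.
        "delivered_pct": round(100 * average / advertised_mbps),
    }

# A day's worth of made-up speed-test samples for a nominal 5Mbps cable tier.
label = broadband_label(5.0, [4.1, 3.2, 1.8, 4.6, 2.9])
print(label)
```

A label like this would let a shopper compare two providers on the speed they deliver, not just the speed they promise.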

Along these lines, the fourth step would be to expand those truth-in-advertising requirements to include full disclosure of what traffic a particular network is managing. If Comcast is going to degrade P2P traffic, their customers have the right to know about it. If Comcast doesn't want people to think they can't use P2P at all on their network, then they should have to clearly state the timing and type of management they're doing. I say don't prevent them from managing P2P traffic but force them to be honest about it, and then, assuming competition is working, consumers will be able to make informed decisions about whether or not they want to continue using that service.

The fifth step would be to set up a task force charged with monitoring network operators in order to protect consumers' interests. This task force would bring to light instances where network management practices don't align with the service being marketed. You wouldn't even necessarily need to make this an official group, as there seem to be plenty of tech heads out there who would love the opportunity to catch big, evil broadband providers doing bad things and out them. Just give them a way to get the attention of the FCC when they find something happening that suggests wrongdoing.

The sixth step would be to establish consequences for broadband providers caught slowing down legitimate traffic without advertising that they're doing so. The first level of punishment could be to simply have them update their marketing materials and send them to their customers indicating their new treatment of particular types of traffic. Then there can be additional levels, like forcing them to give free service to affected customers for a period of time.

The seventh and final step in my plan to resolve net neutrality is to not pass net neutrality legislation. I've already written about how net neutrality is terrifying, so why show our hand now and just end up stuck in court? Why not leave the threat of net neutrality legislation in place, hanging above their heads like the sword of Damocles? Let them know that if they keep getting caught misbehaving, there won't be any stopping the support that will build behind passing full-blown net neutrality.

Now all this being said, I had to include the "...For Now" in the title of this post as this is an amorphous ever-evolving area. Tomorrow a brand new application could come along that eats up even more bandwidth than P2P. The day after that broadband providers who also offer cable TV could decide to start aggressively attacking the free and open delivery of video over the Internet. The day after that we could wake up to a world where fiber to the home is the norm not the exception and with all that capacity these worries go away.

The point is that the best we can do today is establish a plan that puts the rights of consumers first without ignoring the rights of broadband providers while keeping an eye towards the fact that the dynamics of all this could change tomorrow.

February 22, 2008 3:23 PM

Reviewing Markey’s “Internet Freedom Preservation Act of 2008”

Last week Rep. Ed Markey (D-MA) introduced the Internet Freedom Preservation Act of 2008, co-sponsored by Rep. Chip Pickering (R-MS). What better way to finish up our Net Neutrality week than with a running commentary of my first read through this bill? (You can download a PDF of the full text here.) Read on for the thoughts that went running round my head while reading through this bill:

- To start with, I’m not a huge fan of “Internet Freedom Preservation” as it places the emphasis on the Internet and not the consumer. I’d prefer something more like Consumer Broadband Bill of Rights.

- I’m happy to report that at least someone in Congress is acknowledging the “profound benefits for numerous aspects of daily life for millions of people throughout the United States” that the Internet has had.

- Here’s the problem with this attempt to establish what net neutrality is: it all hinges on how you define the word “unreasonable”. Unreasonable is relative: what’s unreasonable to you might be different from what’s unreasonable to me. I read the bill as qualifying “unreasonable” relative to the perspective of the “use for lawful purposes”, but what do you do if that use is reasonable to the user, lawful to the public, but unreasonable to the network operator? This is the word that future litigation over net neutrality will hinge upon.

- I don’t understand why one of the items commands that the FCC look into whether or not broadband network providers are adding charges for quality of service to certain Internet applications. I can’t say for sure if they are actually doing that today, but they’ve been pretty upfront that that's what they want to do in the future, so the only thing left to figure out is “whether such pricing conflicts with the policies of the US” which takes us right back to defining what is and isn’t unreasonable.

- I like that it acknowledges the need to look into reasonable network management practices for prioritizing emergency traffic, but I’m not sure if parental controls are worth looking into alongside everything else. I say that assuming these are opt-in systems that parents can opt out of.

- Very interesting, they’ve got a bit in there inquiring about whether or not having sufficient bandwidth lessens the need for protecting against unreasonable network management. Ultimately this seems to be the clearest-cut answer to resolving many of these questions.

- I’ve hit their call for having broadband summits across the country. While it’s not a bad idea, what frustrates me is that they only suggest using these summits to gather info about availability, price, and competition. Aren’t there more efficient ways to get that info? Like BroadbandCensus.com? And if we’re going to take the time, effort, and expense to conduct big meetings, then why not also charge the FCC to try and educate and inspire people as to the uses of broadband? It’s time we start thinking outside the box a little more.

- On the plus side, they specifically call for the use of Internet technologies to allow more people to participate in these summits.

All in all, I don’t mind this bill. While it still uses much of the same language that troubled me from the last version, it doesn’t attempt to write any of it into stone. Instead it’s trying to say “Look, we know what we think needs to happen, but we understand that there may be other complexities at work, so let’s sit down with the American people and try to figure this out.” And that’s not a bad thing.

That said, I am worried that this is just a Trojan horse for net neutrality. I mean, what do public summits about the availability of broadband have to do with determining proper network management? Shouldn't these summits be conducted alongside the broadband mapping bill working its way through Congress? I sincerely hope that they aren't a hidden attempt to get people together and preach the gospel of net neutrality as I don't think that would help further productive dialog in any way, shape, or form.

February 25, 2008 10:21 AM

Why Adobe AIR Might Revolutionize the Internet

You know the Internet's coming into its own when the release of a new underlying Internet technology results in a big story in a major publication like the New York Times. And that's just what's happened with this article (reg. required) about Adobe's new platform, Adobe Integrated Runtime, or AIR.

Without getting into the gory details, what AIR enables is this: anything built with Flash to run in an Internet browser can now be built to run as a standalone desktop application.

An example is Adobe's own Buzzword. It's an online word processor that runs in your browser. Now with AIR, it can work as a standalone application.

The biggest advantage of breaking free from the browser is that these formerly in-browser applications, which only worked when you were connected to the Internet, can now also be used offline, like on an airplane. Yet they retain their networked focus, leveraging the Internet when connected to do things like store files remotely and offer more robust opportunities to collaborate with others.
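The offline-first pattern at work here can be sketched in a few lines: edits go to a local store, changes made while disconnected are queued, and everything syncs when connectivity returns. This is illustrative Python, not ActionScript, and every class and method name below is invented; a real AIR application would handle this through Adobe's own APIs.

```python
class OfflineDocStore:
    """Hypothetical sketch of an offline-capable desktop app's data layer."""

    def __init__(self, push_remote):
        self.push_remote = push_remote  # callable that saves a doc to the server
        self.local = {}                 # documents cached on the desktop
        self.pending = []               # doc ids edited while disconnected
        self.online = False

    def edit(self, doc_id, text):
        self.local[doc_id] = text       # always save locally first
        if self.online:
            self.push_remote(doc_id, text)
        else:
            self.pending.append(doc_id)  # remember for later sync

    def set_online(self, online):
        self.online = online
        if online:  # connectivity restored: flush the queued edits
            for doc_id in self.pending:
                self.push_remote(doc_id, self.local[doc_id])
            self.pending.clear()

# Usage: edit a document on a plane, then sync on landing.
synced = []
store = OfflineDocStore(lambda doc_id, text: synced.append(doc_id))
store.edit("memo", "draft 1")  # offline: queued locally
store.set_online(True)         # back online: "memo" pushes to the server
```

The word processor works either way; the network, when present, just adds remote storage and collaboration on top.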

But there's something even bigger at work with regards to the future of AIR's potential.

I had the opportunity to sit down for an early demo of AIR back in the fall at Adobe's San Francisco headquarters. They walked me through a couple of examples, but the one that really caught my eye was a prototype built by eBay.

Their thinking behind developing a desktop application was so they could offer additional features to their most loyal users, like more real-time tracking of bids and easier listing of new items.

But here's the remarkable thing: for anyone downloading the AIR version of eBay there's no longer any need to go to www.ebay.com. Everything you used to do on their site can now be done through this desktop application.

So what this portends is a future where instead of navigating to your favorite website in your browser, instead you could install a desktop version of that site that would give you additional features while also allowing you to do things without being connected to the Internet.

I'm not sure I've done a good enough job describing just how revolutionary this could be, but know that I'm going to be keeping a close eye on this space, and as I find examples of how AIR fundamentally shifts the relationship you can have with your favorite sites I'll share them here on App-Rising.com.

February 26, 2008 11:27 AM

Hey FCC: Don't Give All That Auction Money to the Treasury!

It looks like the FCC's wireless spectrum auction is nearing its end as bidding is slowing to a stop in round 101.

For those not in the know, this initiative is auctioning off the wireless spectrum that's slated to be freed up early next year as broadcast TV makes the transition from analog to digital.

And in terms of raw dollars, it's been a smashing success. The thresholds that were put in place to guarantee that spectrum could only be won with sufficiently high bids have now almost all been surpassed.

Originally, the FCC was charged with the goal of putting at least $10 billion into the US Treasury from these auctions. Today, total bidding stands just below $20 billion.

So let me say it again: why are we putting money raised from selling spectrum for communications into the same pot of money that goes to fund the million and one other things the government does? Why don't we take at least some of that money and invest in our communications infrastructure?

With these billions we could lay a lot of fiber, or cover wide swathes of rural America in multi-megabit wireless, or pursue programs that help educate and inspire the public about the use of broadband in order to increase demand for and participation in all the wonderful things the Internet makes possible.

I know that much of this money has already been allocated by the government for other purposes. But my guess is that those allocations were based on initial estimates, which proved to be low.

So now that there's more money available, how should we spend it? Is the best option to just shovel it into the general fund, where its impact will be diluted across all the responsibilities of government, or shouldn't we be considering the possibility of taking at least the overrun and applying it to initiatives that will help bolster America's communications?

I mean, what could be more important than that? And if you think about it, this wouldn't have to be a matter of investing in communications for communications' sake. For example, we could identify the problem areas that Congress would want to spend the money on and show them how we can find solutions to many, if not all, of these problems through the utilization of broadband.

You want to bolster homeland security? Fine, let's build a virtual fence of motion sensitive cameras connected to the Internet and therefore accessible from anywhere.

You want to improve government services? Good, then let's make all public meetings available as webcasts online and try to transition all instances where paper forms are required to websites that can eliminate lines and waste.

You want better healthcare? Great, then let's not only fund broadband networks to hospitals, let's focus on equipping medical professionals with the tools needed to utilize that connectivity, like electronic medical records and patient monitoring applications.

You want to expand educational opportunities? Fantastic, let's make sure that all schools have access to 100Mbps and work towards further integrating the use of information technology into the classroom.

I mean really, this is just too easy and obvious. And think of all the good that could be done in any or all of these areas with the $10 billion overrun these auctions have realized.

So if anyone's listening at the FCC or in Congress: let's do the right thing and keep at least some of the money raised by these communications auctions focused on bolstering communications in America.

February 27, 2008 10:11 AM

MSNBC.com's Debate Coverage Produces Bad User Experience

Last night I had my first moment of regret over having recently canceled my cable service in order to rely solely on Internet video for my entertainment needs.

I was eagerly anticipating tuning in online to what was likely the last presidential primary debate between the Democratic frontrunners. I had done the same for last week’s debate on CNN.com and had a great experience watching live video for over an hour. This week did not deliver on that promise.

As I tuned in for the start of the debate on MSNBC.com, I was immediately greeted by the vagaries of online video. It was choppy, with movement starting and stopping at random. It was buffering constantly, providing the complete opposite of the all-important goal of an unbroken viewing experience. Its audio fell out of synch, with the video lagging far behind the audio. And even that audio was marred by strange metallic noises that faded in and out.

Long story short, it was an altogether unwatchable experience.

Now, it’s important to note that this was my first truly bad online video experience in quite some time. I’d even begun to foolishly convince myself that the day of robust, reliable Internet video had arrived. So when I began encountering these problems, I assumed first that the problem must be local. So I spent the next ten minutes closing and re-opening my browser, switching between Firefox and Safari, checking my connectivity, and power cycling my cable modem. Nothing worked.


February 28, 2008 12:37 PM

Chickens Coming Home to Roost for Comcast

The maelstrom surrounding Comcast's decision to interfere with P2P traffic on its network gained a new front today as law firm Gilbert Randolph LLP filed a class action lawsuit on behalf of Dr. Sanford Sidner and all the citizens of Washington, DC who've subscribed to Comcast's broadband services in the last three years.

Here's an excerpt from their press release: "The Complaint alleges that Comcast advertises and represents that it provides the "fastest Internet connection" and "unfettered access to all the content, services, and applications that the Internet has to offer." These representations allegedly are false because Comcast intentionally blocks or otherwise impedes its customers' access to peer-to-peer file-sharing applications."

While I've tried to argue in favor of network operators' right to manage traffic on their private networks, I don't see how Comcast is going to be able to defend against their indefensible practice of not delivering the service that they advertise.

You can't advertise one speed but only deliver another without giving customers a realistic idea of what that gap will be and how it will fluctuate throughout the day.

You can't say you provide "unfettered access" and then start fettering that access by actively interfering with legal traffic.

And these truth-in-advertising ideals extend beyond just Comcast to examples like the recent uproar over some wireless companies offering "unlimited" service but then enforcing strict bandwidth caps.

In my mind these issues aren't about preserving private rights to manage traffic; these are about consumer protection and making sure consumers know what they're buying when they pay for broadband.

And, unfortunately for Comcast, I don't think they're going to be able to find enough cover under the defense that their customer service agreements only promise best-efforts service and that they specifically retain the right to manage traffic that could be harmful to their network.

It's not that these aren't legitimate arguments, it's that the vast majority of users don't understand what these things actually mean.

For example, most people don't know that they're not getting the 5Mbps service they're paying for, and as long as their broadband works reasonably well they probably don't care.

Additionally, Comcast's P2P policies don't affect the average broadband user, as most of them don't use P2P applications.

And Comcast likely won't get much support for its claims of needing to manage P2P traffic to preserve the experience of others on their network as it suggests their networks aren't capable of delivering the service that they promise.

Also working against Comcast is that even though their practices and policies are only impacting a narrow niche of users, the users being hit tend to be the ones who are most vocal in defending their rights.

Ultimately there's no defending a status quo where a provider isn't delivering on their broadband promises.

In fact, the only way that I see Comcast winning this is if they're able to prove that the average consumer knows what "best efforts" service means when they buy it, and that P2P applications constitute traffic that's harmful to their networks and must be stopped, even if the content being delivered is legal.

While I'm no legal expert, I'd hate to be the lawyer charged with proving these points.

February 29, 2008 2:10 PM

Kids Teach Global Classmates

I came across a tremendously uplifting article by way of my friends over at the Blandin on Broadband blog about the use of broadband in education.

This article details the recent happenings of the fifth annual Megaconference Jr., an event sponsored by Internet2 that united 215 schools in 13 countries through two-way videoconferencing.

The event lasted for 12 hours, with a dozen schools serving as video jockeys for an hour, helping introduce videos that were produced by the kids to share with classmates from around the world.

I can't express how excited I am by initiatives like this. Talk about a way to open kids' eyes to the fact that there are people just like them living around the world!

Can you imagine if we could realize a future where an event like this isn't a big-time once-a-year deal but rather an integrated part of the curriculum?

Instead of reading and regurgitating American history, kids could be producing videos and creating presentations that they share with other classes around the state/country/world. Those other classes could then, in turn, do the same and share information back. What better way to learn about Chinese history than talking with a classroom of Chinese students?

I'm a firm believer that one of the best opportunities for learning in the 21st century is to get kids engaged with producing content in original ways, like through making a video. And having the ability to share that content with others can only help.

Also worth noting in this story is that the Minnesota school whose participation is profiled was specifically selected to serve as a video jockey host site because of its fiber network.

Yet another example of how those communities who've invested in 21st century broadband are reaping the benefits!