March 2008 Archives

Tune in to Freedom 2 Connect!


Should've sent this out earlier. I'm currently sitting in a session on open fiber at the Freedom 2 Connect conference.

You can watch the webcast live by clicking on this link: rtsp://harmony.law.harvard.edu/f2c.sdp

You can also access the live chat going on between many of the people in attendance (the seats are filled with open laptops) by clicking here: https://f2c08.campfirenow.com/1de9e

And I've just learned that there's a chat going on in Second Life available here: http://slurl.com/secondlife/Capitol%20Hill%202/132/193/22/

Come join the fun!

Comcast's Two-Pronged P2P Response


Comcast has recently come out with an aggressive pushback against accusations of misconduct in its treatment of P2P traffic on its network.

First, the good news. They've announced a new relationship with BitTorrent through which they claim to be developing protocol-agnostic traffic management tools and investing in increasing the capacity of their network so that it's not as threatened by the bandwidth demands of P2P applications.

While many are wary that the significance of this announcement is more smoke than substance, I have to at least applaud the effort to reach across the aisle.

We're going to get NOWHERE if network operators and applications developers can't open a productive dialog about how to make things better for everyone. If those two sides work together, the possibilities of what we can accomplish are limitless. If they continue to be adversarial, then we're likely to be very limited in what gets done for a long, long time.

The other new angle to this story is that Comcast is now claiming that even if the FCC decides Comcast was wrong in what it did, the FCC doesn't have the power to enforce any of its rulings in these matters.

Have to admit, while this approach isn't surprising in the slightest, I find it more than a little confusing. Wasn't a key argument of net neutrality proponents that legislation was needed to grant the FCC or another agency the power to enforce rules like this?

So doesn't that mean that Comcast basically just made their argument for them, that legislation is needed in order to give some agency the power to enforce the principles of an open Internet?

Maybe it's just me but this seems like a silly approach. If I were them I'd just say, "Hey, look, we're changing our policies because of pressure from consumers, which proves the market's working, so back off."

Not what it seems like they did say, which was, "Yeah we're managing traffic. So what? They're our networks and besides, it's not like there's anything you can do about it."

I just can't see how picking a fight helps them any.

That said, I'm hopeful that the announcement of their new relationship with BitTorrent is an honest attempt to find a better way to address the vulnerabilities of their networks.

The Latest From Lafayette, LA


Had the great fortune to chat yesterday with Terry Huval, head of Lafayette Utility System, which is in the midst of deploying a full fiber network to the community of Lafayette, LA.

We chatted about the latest happenings with this initiative, which include their selection of Alcatel Lucent's technology, the issuing of their bonds, and the start of construction for the first phase of deployment, which is scheduled to begin serving its first customers in January 2009.

But there were two other nuggets of news that really caught my eye as they proved LUS's desire to be progressive in deploying one of the most advanced communications networks in the world.

First off, Terry shared with me their plans to offer high speed intranet or LAN services for free to enable consumers and small businesses to transfer data in-network at speeds much faster than the Internet connections they're paying for.

So say you've signed up for LUS's baseline broadband, which will likely be around 10Mbps. Because of these free LAN capabilities, you'll be able to establish point-to-point connections to other users on LUS's network that go beyond the speed of your broadband connection, supporting burstable speeds of up to 100Mbps for in-network data transfer.

What might this enable? Imagine sharing an HD home movie with a neighbor in minutes instead of hours, or a small business being able to send large datasets across town exponentially faster than it would take over the open Internet. No longer will you be limited by your Internet connectivity but instead you'll be able to take greater advantage of the capacity fiber provides.
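To put some rough numbers on that, here's the back-of-the-envelope math. The 4GB movie size and the 2Mbps upstream figure are my own illustrative assumptions, not LUS's:

```python
# Back-of-the-envelope transfer times for sharing a home movie in-network
# versus over a typical Internet connection. The 4GB file size and the
# 2Mbps upstream figure are illustrative assumptions, not LUS numbers.

def transfer_minutes(size_gb: float, rate_mbps: float) -> float:
    """Minutes to move a file of size_gb gigabytes at rate_mbps megabits/sec."""
    size_megabits = size_gb * 8 * 1000  # 1 GB is roughly 8,000 megabits
    return size_megabits / rate_mbps / 60

movie_gb = 4.0  # a hypothetical HD home movie

for label, mbps in [("typical cable upstream (assumed)", 2),
                    ("LUS baseline broadband", 10),
                    ("LUS in-network burst", 100)]:
    print(f"{label:33s} {mbps:4d} Mbps -> {transfer_minutes(movie_gb, mbps):6.1f} minutes")

# Roughly 4.4 hours at 2 Mbps, about 53 minutes at 10 Mbps, and about
# 5 minutes at the 100 Mbps in-network burst rate.
```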

I've written before about how a similar initiative in Vasteros, Sweden led the ratio of outbound to in-network traffic to flip from 80/20 to 20/80, a trend made even more spectacular by the fact that overall traffic increased a thousandfold on their new full fiber network.

It's my fervent belief that leveraging the in-network capabilities of full fiber networks holds the potential to revolutionize our relationship with the Internet and how we use connectivity to establish stronger bonds within our community. So needless to say I'm ecstatic to hear that LUS has chosen this path as I believe their decision will be rewarded handsomely by enabling the creation of a more powerful network.

The second major tidbit I learned relates to one of LUS's initiatives to bridge the so-called digital divide by offering low-cost Internet service to TV sets.

The idea is that many people may want TV and phone service but aren't yet convinced they need broadband. So LUS is going to enable them to pay a low fee to rent a special set-top box and for very basic Internet access--slower than their base level broadband--so that they can surf the Web from their TV.

Now Terry admits that this service will be limited: it likely won't be able to do things like play YouTube videos, and the set-top box itself won't have the storage or the ability to support an endless array of peripherals the way a full-fledged computer would.

But users will be able to visit webpages, use email, and handle the other basic functions of being online. And because it's LUS's mission to deliver its services for 20% less than its local competitors, it'll essentially work out so that you pay the same to get TV and this limited Internet product from LUS as you would to get TV alone from the cable company.

The overall idea behind this is to provide another way for people to get introduced to the advantages of being online so that they might find inspiration to upgrade to the true broadband connectivity LUS's full fiber network can deliver.

While surfing the Internet on your TV is not new, this is the first I've heard of a broadband service provider offering this kind of package to entice new users to get online. It's really an interesting idea, though I'm going to be curious to see how customers react to it.

To date, surfing the Internet on your TV has not been a huge success, but this is different. This is about providing Internet access to a home that wouldn't otherwise have it, which makes it harder for me to guess how readily users will adopt it. I've known I could never live anywhere without broadband ever since I got stuck in a house my junior year of college with nothing more than dialup.

When I heard Terry describe a service where you couldn't watch YouTube, where you didn't have any storage, where you likely were extremely limited in the Internet applications you could use, I found myself cringing at the thought.

But again, that's me, someone who has written about and relied on broadband for my livelihood for the last five years.

Ultimately, I welcome any effort to expand the pie, to get more people excited about the Internet, and to eventually entice more people into equipping themselves with broadband.

So in the end I think this is an innovative approach to tackling the digital divide from a different angle, and I couldn't be more excited to see how it plays out, because if it works then we'll gain another important arrow in our quiver as we all work together to convince America that broadband's great and that everyone needs to be online.

All in all I continue to be impressed by the passion, inventiveness, and focus on their local community that I've seen expressed both by Terry individually and by his entire team. But even more than that I appreciate Terry's practicality. He understands that mission one is getting the network built, mission two is making sure it can handle the load reliably, and mission three is acknowledging that realizing an FTTH success story requires more than building a great network; it's about finding ways to engage the community in actually utilizing the capacity being put into the ground.

On a final note, I wanted to thank Terry and the LUS team publicly for their sponsorship of my travel down to Lafayette the week after next to speak at TechSouth, the preeminent regional technology show in the area. I'll be presenting on the possibilities of broadband applications as well as how their present and future demands for bandwidth necessitate the eventual deployment of full fiber networks.

It should be a great event and you all can look forward to reading a lot more about it as I blog from the show floor starting April 8th.

The Internet? Bah! circa 1995...


Sometimes you stumble across articles that both make you upset and make you think.

When I first read this Newsweek article entitled "The Internet? Bah!" I found myself getting angry at passages like this:

"...I'm uneasy about this most trendy and oversold community. Visionaries see a future of telecommuting workers, interactive libraries and multimedia classrooms. They speak of electronic town meetings and virtual communities. Commerce and business will shift from offices and malls to networks and modems. And the freedom of digital networks will make government more democratic.

Baloney. Do our computer pundits lack all common sense?"

To be honest, I couldn't believe I was reading this, until I noticed something odd: the date for the article was 1995. Turns out Newsweek had decided to repost an old article about the early days of the Internet.

But what's interesting is how right and how wrong this diatribe was in arguing that the Internet is overrated and not likely to revolutionize society.

Let's go through the article piece by piece to break it down to see how far we've come and how far we still have to go.

The article continues:

"The truth is no online database will replace your daily newspaper, no CD-ROM can take the place of a competent teacher and no computer network will change the way government works."

Well, at least for me, the Internet has replaced my daily newspaper, and not just recently either. While I used to enjoy reading it growing up, as soon as I hit college the Internet became my primary and often sole source of news. Why be limited to words on paper when there's an endless trove of perspectives, insight, and news to be found online?

The author is definitely onto something when stating that CD-ROMs can't replace teachers--the use of technology in education is often overhyped--but I found it extremely shortsighted to say that no computer network could change the way government works. Have we realized a huge paradigm shift? No, but today you can do a lot of things online that used to require waiting in line, and increasingly governments are becoming more transparent through making documents and video available online. Is it still early days and is adoption still lagging? Yes, but for those people who find and use online resources, they rapidly find them to be essential to enabling a richer relationship with government.

"Every voice can be heard cheaply and instantly. The result? Every voice is heard. The cacophony more closely resembles citizens band radio, complete with handles, harassment, and anonymous threats. When most everyone shouts, few listen."

There's definitely still some truth to this. We've got more people blogging than ever, and finding the information you need amongst all that banter can be challenging, to say the least. But what this author seemed to miss is the potential for Internet technologies to find ways to overcome the limitations of finding information online. Don't get me wrong: we've still got a big problem with information overload. But I'd argue it's getting better through the development of more robust and dynamic search capabilities and the increasingly common trend of aggregators of all sorts popping up to make finding related information easier.

"How about electronic publishing? Try reading a book on disc. At best, it's an unpleasant chore: the myopic glow of a clunky computer replaces the friendly pages of a book. And you can't tote that laptop to the beach. Yet Nicholas Negroponte, director of the MIT Media Lab, predicts that we'll soon buy books and newspapers straight over the Internet. Uh, sure."

The big thing missed here is the oncoming potential of epaper. The concept was around in 1995, but no one thought it was right around the corner. And even today, more than a decade later, you can argue that epaper is still more promise than reality. But in many ways it's an inevitable technology that will revolutionize our relationship with the printed word. Additionally, probably 80% of what I read I do on a computer screen, and I've never found it to be a bother. Plus there are a number of other new technologies coming down the pipe that promise to make the viewing experience on computers even better, like LED screens.

"Won't the Internet be useful in governing? Internet addicts clamor for government reports. But when Andy Spano ran for county executive in Westchester County, N.Y., he put every press release and position paper onto a bulletin board. In that affluent county, with plenty of computer companies, how many voters logged in? Fewer than 30. Not a good omen."

This isn't a limitation of technology, instead it points to the basic need for things like marketing and awareness building. The Internet is the only platform that can reach everyone and get them to engage and interact on an issue. The only problem is getting them to know what's there, how to use it, and why it'll make their lives better. If people aren't showing up to take advantage of online resources I don't think that's the Internet's fault, I see that as being our own failure to bring our community together.

"Then there are those pushing computers into schools. We're told that multimedia will make schoolwork easy and fun. Students will happily learn from animated characters while taught by expertly tailored software.Who needs teachers when you've got computer-aided education? Bah. These expensive toys are difficult to use in classrooms and require extensive teacher training. Sure, kids love videogames--but think of your own experience: can you recall even one educational filmstrip of decades past? I'll bet you remember the two or three great teachers who made a difference in your life."

This is one passage I agree with wholeheartedly. The use of technology in education has been overhyped and exemplifies that we can't benefit from the use of technology for technology's sake. Too often I hear about schools mandating the use of technology, teachers that don't know how to use it, and students that end up getting less personal attention than they did before. Don't get me wrong, I see boundless opportunity in the use of technology to help further education. But we've got to be careful not to embrace technology before understanding the problems at hand and how best to solve them.

"Then there's cyberbusiness. We're promised instant catalog shopping--just point and click for great deals. We'll order airline tickets over the network, make restaurant reservations and negotiate sales contracts. Stores will become obselete. So how come my local mall does more business in an afternoon than the entire Internet handles in a month?"

I'm torn on this one. On the one hand, the author sounds foolish; everyone I know buys their airline tickets online, more restaurants than ever let you make reservations online, and the promise of "instant catalog shopping" has been realized. At the same time, there is a bit of a disconnect as many people would still rather be able to go touch something before buying it. For some goods the Internet is rapidly replacing retail, but for others it's serving more as a complementary opportunity to compare prices.

"What's missing from this electronic wonderland? Human contact. Discount the fawning techno-burble about virtual communities. Computers and networks isolate us from one another."

Again, I agree and disagree with this sentiment. I totally agree that human contact has been missing from Internet interactions, and that too often the use of the Internet is isolating people from those around them. At the same time I've seen a grandma see her new granddaughter who's an ocean away using videocalling; I use email to stay in closer more regular touch with people than was ever possible via snail mail; and I still believe that the next generation of the Internet will be uniting local communities. We most certainly do need to be vigilant in making sure we don't fall too far down the rabbit hole of isolation the Internet can create, but the flip side to this coin is that the Internet holds the potential to unite us like we've never been before.

Long story short, I'm saddened that so much of what was said in this article more than a decade ago is still true today. But I'm also heartened that at least some of the fears about the Internet expressed back then are being resolved.

We've still got a long ways to go but sometimes it helps to pause and contemplate how far we've come so that we might better guide our journey moving forward into a better tomorrow.

I've been coming across a lot of great articles recently, so I wanted to share a handful for your reading pleasure:

Patriot Act haunts Google service
Story about a Canadian university that faced the choice of a million-dollar upgrade to create an in-house system or adopting Google’s Apps for free. They chose free and were happy with it, until they learned they were on the wrong side of privacy laws and worries about US snooping. Now similar concerns are deterring the uptake of these hosted applications elsewhere. Proof positive that there are still a lot of growing pains left to endure and overcome as we try to transition from the old world to the new.

Practicing Patients
In-depth New York Times article primarily about a site called PatientsLikeMe.com that records, tracks, and anonymously researches the health status of patients with a variety of conditions. More generally this is an exploration of the privacy issues associated with your medical information, which has long been held to be of the most sensitive variety.

Healthcare IT: Saving Lives, Saving Money
Important post on Cisco’s High Tech Policy Blog, ostensibly about S. 2408, the Medicare Electronic Medication and Safety Protection Act, which champions e-prescriptions by mandating their use for Medicare patients. What I found most significant was the brief discussion about how government is starting to realize that investment in health IT isn’t just a cost, as it can drive new efficiencies that produce overall net savings. In the case of this bill that means $3 billion.

PacketFront on Open Access
PacketFront doesn’t always seem all that eager to pursue attention for the open access municipal broadband model that it champions, so it was interesting to read this extended explanation of its benefits and answers to specific questions posed by the good people at the Blandin Foundation. This is a must-read for anyone interested in or intrigued by open access networks.

The Next President’s Internet Policy
I’m eagerly awaiting the opportunity to meet David Isenberg for the first time at his event Freedom to Connect 2008 next week in DC. He’s a tremendous thinker and the man responsible for this wonderful list of ten talking points for the next US President. I’ll be working on a more in-depth reaction to this to post in the near future.

Competition Works! Sort Of...


There are many in the telecom world who believe the ultimate answer to all things broadband is competition. If you can just get that dynamo of capitalism churning then you'll create incentives through competition to spur investment and innovation.

I've come across a few stories recently that suggest quite clearly that competition can work.

The best evidence of this is the impact of Verizon FiOS. You're starting to hear a lot more talk of DOCSIS 3.0, a new standard that promises to boost cable broadband speeds to over 100Mbps.

Why are you hearing this? Not because cable companies want to invest in their networks. They spent billions less than a decade ago and would much rather continue realizing a return on that investment.

Nope, the real driver of DOCSIS 3.0 has been Verizon's decision to lay fiber to the home across much of its footprint, allowing it to offer faster speeds than current cable systems are capable of.

In fact in reading around, I'm not sure if I've heard of any DOCSIS 3.0 deployments, at least by the major cable companies, that weren't aimed at areas that have FiOS.

So this is a terrific example of the impact competition can have on innovation and investment. But there's a problem: FiOS isn't everywhere.

As such this competition is leading to a situation where the rich are getting richer.

You could almost argue that communities with FiOS don't need DOCSIS 3.0 as much as communities without it. They already have one big pipe; is it really such a great thing for them to get a second one before the vast majority of the country even has one choice for 20Mbps+ broadband?

Another area where it's been interesting to watch the impact of competition has been the brouhaha surrounding Comcast's P2P traffic shaping.

I'm a little surprised it took them this long to take this approach, but Verizon and AT&T are finally capitalizing on the chance to turn this into a PR opportunity by openly expressing their support for P2P applications and their involvement in developing the P4P standard, which aims to make P2P more efficient for network operators and faster for users by focusing on delivering data between peers on the same network.
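The core idea behind P4P is easy to sketch: rather than picking peers at random, a client favors peers that sit on its own network so traffic stays local. Here's a toy illustration of that preference; it's just the concept, not the actual P4P protocol, which has ISPs publish topology hints to guide the selection:

```python
# A toy illustration of locality-aware peer selection, the basic idea behind
# P4P-style efficiency gains. Real P4P involves ISPs publishing topology
# information; here each peer is simply tagged with the network it sits on.

import random
from typing import List, NamedTuple

class Peer(NamedTuple):
    address: str
    network: str  # the ISP or autonomous system the peer belongs to

def choose_peers(candidates: List[Peer], my_network: str, count: int) -> List[Peer]:
    """Prefer peers on the same network; fall back to off-network peers."""
    local = [p for p in candidates if p.network == my_network]
    remote = [p for p in candidates if p.network != my_network]
    random.shuffle(local)
    random.shuffle(remote)
    return (local + remote)[:count]

# With the locality preference, most selected peers share our network, so more
# of the swarm's traffic never has to leave the ISP's own pipes.
swarm = [Peer(f"10.0.0.{i}", "comcast") for i in range(5)] + \
        [Peer(f"192.0.2.{i}", "verizon") for i in range(5)]
print(choose_peers(swarm, my_network="comcast", count=4))
```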

This is exactly the kind of innovation we want to see in a competitive marketplace. One company screws up, so its competitors move in with innovative solutions that (hopefully) improve the consumer experience.

And these changes stand a better chance of benefiting everyone, not just those in areas where robust competition exists, as they should improve the experience of everyone on AT&T's and Verizon's networks rather than only the people in areas where fiber is being deployed.

But even beyond that, I'm hopeful that this marks the early days of a new era where broadband providers begin innovating more purposefully with the goal of differentiating their services from their competitors.

If you don't want to just be a dumb pipe provider, then you've got to find new ways to add value to the services you offer consumers. Simply delivering the triple play and competing on price and the number of HD channels and on-demand content isn't enough.

Not only that, it's my belief that the first companies to recognize the value of developing innovative new services to gain a competitive edge stand to be big winners as they gain first mover advantage.

And hopefully we can get this dynamo started sooner rather than later, because if you can get competition to spur investment and innovation by multi-billion dollar companies, you have the potential to really start moving the needle.

"Killer Apps" Jumps the Shark


Everyone knows what a "killer app" is, but last night I witnessed the moment when this phrase jumped the shark.

TGI Fridays is now offering a new menu of special "killer apps," though these apps refer to appetizers rather than applications.

Still, I wonder: do these appetizers justify a visit to TGI Fridays, just as a videocalling application demands the use of broadband?

Or from another angle, does TGI Fridays need a killer app? I mean, they've got locations everywhere. A lot of people already eat there. And the value proposition of going there is clear: if you're hungry you can go there to eat.

Broadband is most places, but still not everywhere. Not everyone is already online. And the value proposition for broadband tends to be quite muddy for most.

At the same time, TGI Fridays does need a killer app to help differentiate itself from the competition, which offers similar menus and prices. It's competing in a cutthroat world where some customers keep coming back out of habit, but attracting new customers means creating new reasons to stop by.

In this way I can see parallels between TGI Fridays and the various broadband competitors. To date their services have been roughly equivalent. Many subscribers stay with the same service because it's what they know rather than because it's the best. Customers can be very price sensitive. And so in order to attract new subscribers, broadband providers need to find the new killer apps that will drive demand for their service.

So in the end, maybe TGI Fridays' killer apps aren't a sign that the term "killer app" has jumped the shark; instead they exemplify the challenges providers of all sorts face in differentiating themselves in a competitive marketplace.

All this being said, I'm much more optimistic about killer apps for broadband having the opportunity to revolutionize society than the killer apps from TGI Fridays, though I have been known to underestimate the awesome power of a new flavor of fried chicken wing before...

How I'd Define Broadband


Everyone acknowledges the FCC's definition of broadband at 200Kbps is outdated and in need of an update, as I have covered earlier.

There have been some attempts to set a new definition, but none that have really taken hold. For example, Congress has tried to redefine broadband as the speed necessary to deliver one high definition video stream, but that never went anywhere in part because it's hard to say what constitutes an HD video stream.

To some degree Congress was on the right track, though. I think it's important that however we redefine broadband, the definition relates to what that connectivity can do rather than how many Kbps or Mbps it offers.

So to start with, having broadband should mean having the ability to watch video with real-time or close to real-time playback. Sure, you can argue that video can be watched over any connection speed; you just have to wait longer for it to download. But in my mind any connection that wants to call itself broadband has to be able to support instant-on video.

Using HD video as the benchmark doesn't make a lot of sense given its amorphous definition, the fact that most broadband today is not capable of delivering a true HD video stream, and the reality that most content providers still don't have their video available online in HD yet.

Instead, I look towards the other end of the spectrum. While video can be delivered at any bitrate, the most common bottom end for video lies somewhere in the 300-500Kbps range. That includes videos on YouTube and any live streaming where the video window's about the same size. This range gives you full-motion, OK quality video, basically the baseline for what consumers have shown themselves willing to watch.

Now this is all downstream. If you want to use videocalling applications with similar quality video, then you need the same amount of capacity upstream as you have down. And since so many next-gen apps require as much upload capacity as download, if not more, any definition of broadband should strive to be symmetrical.

Also important to note is that no matter which direction you're sending video, you always need some overhead in terms of higher capacity than a video's bitrate in order to guarantee quality playback.

Add all this up and I come to a number of 750Kbps. If you have 750Kbps, then you can watch some, but not all, Internet video, and therefore you have broadband.
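For anyone who wants the arithmetic spelled out, it's nothing fancy; note that the 50% headroom multiplier is my own rule of thumb, not an industry standard:

```python
# The simple math behind the 750Kbps figure. The 50% headroom multiplier is
# my own rule of thumb for smooth playback, not an industry standard.

baseline_video_kbps = 500  # top of the 300-500Kbps range typical for web video
headroom = 1.5             # extra capacity so playback doesn't stutter

broadband_floor_kbps = baseline_video_kbps * headroom
print(f"{broadband_floor_kbps:.0f} Kbps each way")  # -> 750 Kbps, symmetrical
```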

I'd suggest that we move immediately to replace 200Kbps with 750Kbps symmetrical as how we define broadband.

I had the good fortune to attend an event on Capitol Hill yesterday hosted by the Internet Innovation Alliance (IIA) entitled "Closing the Broadband Gap in Rural and Underserved America."

It was a first-rate panel anchored by IIA's own Bruce Mehlman and Larry Irving, and featuring Rob Atkinson, president of the Information Technology and Innovation Foundation; Emily Parker, program officer of the US Libraries program for the Bill and Melinda Gates Foundation; Dr. Mark McElroy, COO and SVP of communications for Connected Nation; and Alec Ross, EVP of external affairs for One Economy.

I'll summarize my impression of their comments in reverse order from how they spoke.

I'm doing this to emphasize how strongly I agreed with Ross's comments related to the importance of spurring broadband adoption as much, if not more than, its deployment. He even went so far as to say that before we spend a dollar on increasing supply, we should first consider spending it on demand.

This argument is one of the very underpinnings of what App-Rising.com, and to a large extent my professional career, is all about.

Increasing supply of anything without increased demand is most often pointless. And the best way to get more supply of anything is to increase demand for it.

Which leads to McElroy, who hit some of the highlights of Connected Nation's model and underlying philosophy. They build maps to find areas that don't have broadband, which spurs investment. And they establish teams of local leaders to generate plans to get communities using the Internet to a greater degree.

While their model has come under increased scrutiny in recent weeks, I'm still a supporter of any and all efforts to upgrade our understanding of what broadband is out there and, especially, to engage communities in a dialog about how to better use broadband.

Parker shared an overview of how the Bill and Melinda Gates Foundation invested $350 million to help get computers and connectivity to libraries across the country. She emphasized the importance of libraries as community technology centers, but lamented that not all communities have been reinvesting in keeping their technological capacity current.

The Foundation is currently in the midst of determining the next set of goals for schools that it will try and help achieve, one of which is high speed Internet access, defined by them as 1.5Mbps.

Kicking off the panel was Atkinson, who framed some of the challenges of deploying broadband given America's low population density, contrasting that with a place like Sweden, where despite a largely rural populace only 400 homes in the whole country lack broadband access. The reason? They've invested $11 billion over the last few years to upgrade their infrastructure. For the US, he suggested an aggressive policy of offering tax credits for the deployment of broadband.

Atkinson was also the first to strike a contrarian note when the panel moved into an open discussion. While most of the panel cited the need for and benefits of competition, he called its efficacy into question in rural areas, in particular as it relates to giving new entrants additional incentives just for the sake of spurring competition.

I've long wondered how competition can be the answer to increasing capacity and availability in rural areas; if we're having trouble getting one company to invest, how can we expect to get two, especially when the more competitors there are, the smaller the slice of customers each one gets?

I managed to sneak in a question at the end about the gap between claims that it's too expensive to get big broadband to rural areas and the fact that rural areas are likely the ones that could benefit most from that connectivity. Unfortunately I included in that question my belief that the ultimate goal should be a fiber pipe to every home.

This led the answers to focus on questioning if that truly is the goal. After a brief discussion a consensus emerged on the panel that rural broadband deployment should focus more on getting current broadband technology to everyone than next-gen technology to anyone.

I completely agree that the first order of business in considering the rural broadband challenge is making sure that everyone in America has access of at least 750Kbps or higher.

But at the same time why aren't we setting a longer term goal that's much more aggressive?

One argument put forth was that if you start talking about getting too much bandwidth into rural areas, the cost becomes so great that it can scare off all deployment, be it public or private, because it requires too much of an investment.

I understand that as well, but I still don't see why we can't set a long-term goal of a fiber pipe to every home, no matter where it is.

In the meantime, this panel did a great job of laying out some of the most important things we can be doing to spur deployment to rural areas today:

- Robust mapping so private providers can know where the gaps are and move to fill them in

- Local community teams and technology centers that can spur the adoption and use of the Internet, growing the demand that can drive deployment

- Tax credits and other incentives for the companies willing to deploy to rural areas to help make the business case more attractive

The only thing missing from this discussion was an advocate for municipal broadband. I have to admit I still have some reservations about public entities competing with private enterprise for consumer dollars, but I can't deny the reality that in many rural areas some form of municipally owned, financed, and/or operated deployment might be the only way those communities can expect to have their infrastructure upgraded in the next twenty years.

And for those communities who have seen the light and understand how important it is to be as wired as possible in the 21st century, I see no reason why they shouldn't be allowed to forge their own digital future if incumbent providers are unwilling or unable to deliver the network a community says it needs.

Content Continues Pouring Online


The floodgates are officially open for content flowing onto the Internet.

Sure, you still can't get quite everything, and much of what you can get comes tied to dubious usage restrictions, but there's no denying the tidal shift of the last year: premium content used to be scarce online, and now every major TV network offers free full-length first-run episodes and all the major film studios are working with multiple online distribution partners.

But this story isn't limited to the possibilities of online video supplanting traditional television and DVD distribution. The Internet is also witnessing an upsurge in the availability of content through experiences only possible on the Internet.

Take for example Paramount Pictures, which just launched its VooZoo application for Facebook. What VooZoo does is give Facebook users access to clips from thousands of its movies to watch and share through the social network.

Or Marvel Entertainment, which has been pushing a Digital Comics Unlimited initiative that gives online access to thousands of comics you can read right on your computer (assuming you want to; I have to admit I tried this out and found the reading experience a little unnatural).

And if we want to expand the definition of "content" even further, then check out this Wired article that lists a handful of super powerful telescopes that are accessible over the Internet, either to control live or to request pictures from.

There's no end to the content continuing to pour online. As there's more good content we'll have more reason to go online. As we're online more, we're likely to find more content that interests us and the cycle can continue.

Ultimately, we should strive to make the full realization of the Internet a reality, where all content is delivered online in one fashion or another--your TV and movies, radio and CDs, books and newspapers and comics.

To get to that point won't be easy, but it's reassuring to know that the flow of content onto the Internet continues unabated. Only by filling the shelves of the Internet's library will we be able to entice consumer habits to shift from traditional distribution to the 21st century, where all the world's content is instantly at your fingertips.

Connectivity Secures Data On Your Thumb


Came across an interesting story this morning from ComputerWorld about how one government agency is beginning to utilize Web-enabled thumb drives to protect sensitive information.

USB thumb drives are those small storage devices, typically about the size of your thumb, that feature a built-in USB connector. Just plug one into your computer's USB slot and use it like an external hard drive or floppy disk to store data and take it with you on the go.

The taking it on the go part is what's been problematic. Government has gained a bit of a reputation for not safeguarding digital data as well as it could and should. Thumb drives can be especially bad for security as they can now store GBs in a small form factor that's easy to lose and steal. And since many of them aren't secured by even a password, they're essentially asking to be stolen and abused.

Until now. Washington's Division of Child Support was the subject of this article, and they're in the midst of upgrading their workers' thumb drives to SanDisk's Cruzer Enterprise edition, which ties into SanDisk's Central Management & Control software.

What this software enables is the centralized management of these thumb drives. Administrators can now keep track of what data is stored on which thumb drive, and if a drive is misplaced, it's password-protected and its memory can be wiped clean remotely.

This is another prime example of how everything's getting connected and what that connectivity can enable. Not only that, it illustrates how being connected to the Internet can support security efforts rather than just exposing an organization to additional risk.

Security is a major concern for the 21st century, especially as more and more of our sensitive personal information is delivered online. But done in the right way connectivity can be a boon to security and not merely a threat.

YouTube Gets Serviced, May Help It Go Niche


YouTube has made a name for itself as the leading site for sharing user-generated video.

Last week it announced its intention to expand upon its core competencies and become not just a site but a service, a service that empowers website owners to deliver video sharing to their users.

But wait: doesn't YouTube already allow anyone to upload video and embed their player in any webpage?

Yes, and by making that process simple while bringing together a massive audience, YouTube in effect became the Internet's video platform, with producers choosing to host their video on YouTube rather than on their own.

Well now YouTube's taking that a step further. With their new API, website owners can develop private label video sharing services that feature their own branding while leveraging the player technology and hosting capabilities of YouTube.

Yeah, but why is this such a big deal?

While many sites use YouTube to host video, it's far from an elegant solution when it comes to sharing video between members of a site's community. For example, uploading video has meant going to a third-party site like YouTube, which is also where all the management of that video has to be done.

Now, when site owners build their own private YouTubes, users will be able to do all of this without leaving their site.
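To make the mechanics a little more concrete, here's a hypothetical sketch of what that flow might look like from a niche site's point of view. None of the names below are real YouTube API calls; they're placeholders for whatever upload and player hooks the actual API exposes:

```python
# Hypothetical sketch of a "private-label" video sharing flow: the member
# uploads on our own site, the hosting platform stores and serves the video
# behind the scenes, and the page embeds the player under our own branding.
# The helper names are placeholders, not real YouTube API calls.

SITE_LIBRARY = {}  # the niche site's own record of its members' videos

def platform_upload(video_bytes: bytes, title: str) -> str:
    """Stand-in for the hosting platform's upload call; returns a video ID."""
    return "demo-video-id"  # a real implementation would call the platform's API

def handle_member_upload(member: str, video_bytes: bytes, title: str) -> str:
    """Accept the upload on-site and return branded embed markup."""
    video_id = platform_upload(video_bytes, title)
    SITE_LIBRARY.setdefault(member, []).append((video_id, title))
    return f'<div class="our-player" data-video-id="{video_id}">{title}</div>'

print(handle_member_upload("quilting_fan_42", b"...", "Binding a quilt edge"))
```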

OK...so what?

The overall upshot is that we should see more sites offering video sharing, which means more users, and ultimately more content.

But since YouTube's services will likely be affordable to all, there's another potential outgrowth of this: the rise of niche-oriented websites and content.

Today, much of what constitutes "user-generated content" is focused on either more private videos to be viewed by friends and family or more public videos intended to reach as large an audience as wants to watch it.

Niche-oriented content is that which is relevant and interesting to a particular niche of people.

As private video sharing should now be easier and more affordable than ever, I expect we'll start to see more and more niche sites offering it as a service.

As more niche sites begin offering video sharing, we'll start to see a lot more niche content be created.

As more niche content is created, we should have a lot more usage, assuming the content is good.

So ultimately what YouTube's announcement may mark is the beginning of a revolution in the way niche sites relate to user-generated content.

What excites me about this is that with each new site that introduces video sharing capabilities we gain a new node through which content can aggregate and flow. We've seen the impact of video sharing and user-generated content on bigger picture arenas like entertainment. But what I'm curious to see is what impact it will have as more niche sites come online.

It's hard to say how much this may explode, but one thing I do know is that it will undoubtedly drive greater demand for bandwidth, both upstream to upload videos and downstream to watch them. So we end up back where we always do: neck deep in another example of why we need 21st century broadband in America.

The New York Times ran a great article earlier this week entitled, "Tech's Late Adopters Prefer the Tried and True." It detailed how late adopters resist change to the patterns of use that work for them.

The lead example they gave was a 56-year-old gentleman who's always used the same Netscape browser and AOL dialup service to get online. Why? Because it just worked.

That's what I see as the biggest challenge for the next generation of the Internet: getting to the point where it just works.

To illustrate the point, let's compare the usability of TV vs. the Internet-delivered TV I've recently switched over to.

To watch TV, I turn it on, start changing channels, and it basically always works.

To watch a DVD, I turn on the player, put in a disc, and hit play. It always works.

To watch Internet TV, I have to turn on my computer (not really, it's never off), open an Internet browser, type in a URL, click through the menus on the website to find and select the video I want, start watching, and hope that the video doesn't stop, start stuttering, or fall out of sync with the audio.

The Internet isn't alone in failing the "It Just Works" test; cable VoD can be equally labyrinthine, typically requiring you to navigate hundreds of choices by just pressing up or down. But that's also why cable VoD has not been as successful as it could be.

The only way the vast majority of what's possible online starts getting used by late adopters any time soon is if we can find ways to integrate those uses more seamlessly and reliably into our lives.

Some of this is already happening with things like standalone music players that can play Internet radio. No longer do you have to start up a computer to listen to Internet radio, now you can access it much like a regular radio.

Videocalling is another Internet application with the potential to be revolutionized by upping its "It Just Works" quotient through dedicated devices that make videocalling as easy as dialing a phone.

Many companies are vying in this space but none have reached any significant level of mass market adoption. And in truth, this is all so young that it's even too early for many early adopters to have bought in yet.

But the best example of what can happen when you combine an Internet application with an "It Just Works" use case is what happened with the iPod. The devices made storing and navigating large music collections enjoyable, and made buying and loading songs easy. And look what happened.

Look what happened with YouTube. Sure there were other video sharing sites, but YouTube had the highest "It Just Worked" factor so it took off.

The only way we get everyone enjoying the benefits of being online is if we find ways to make "It Just Work." And the biggest winners in developing new uses for the Internet will be those that find a way to make "It Just Works" an easy part of our everyday lives.

You'd think with the amount of attention I've spent tracking the exaflood that there'd be little left to surprise me in terms of new news on the growing demand for bandwidth and storage, but you'd have thought wrong.

A new white paper by IDC came out earlier this week that starts by saying if you add up all the electronic data that existed in 2006 it'd equal 180 exabytes. That's right: the exaflood isn't some amorphous future concept, the data's already pooling up around our ankles.

But we really are seeing only the earliest signs of what's to come. According to their projections, the total amount of data in existence is growing by a factor of ten every five years.

That means by 2011 we'll be at 1,800 exabytes, or 1.8 zettabytes of information.
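The math behind that projection is just compound growth. The 2016 figure below extends the same tenfold-per-five-years rate past IDC's 2011 projection and is purely my own extrapolation, shown for illustration:

```python
# IDC's baseline: 180 exabytes in 2006, growing tenfold every five years.
# The 2016 line is my own extrapolation of that same rate, for illustration.

BASE_YEAR, BASE_EXABYTES = 2006, 180

def projected_exabytes(year: int) -> float:
    return BASE_EXABYTES * 10 ** ((year - BASE_YEAR) / 5)

for year in (2006, 2011, 2016):
    eb = projected_exabytes(year)
    print(f"{year}: {eb:8,.0f} EB  ({eb / 1000:5.1f} ZB)")
# 2006: 180 EB (0.2 ZB); 2011: 1,800 EB (1.8 ZB); 2016: 18,000 EB (18 ZB)
```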

To put that in perspective, 180 exabytes (or 180,000,000,000,000,000,000) is already greater than the estimated number of stars in the universe, according to this ComputerWorld article.

While these numbers deal with total stored data rather than the number of bits being delivered over the Internet, there's no denying the fact that all forms of data are increasingly becoming network-enabled.

Whether it's because you're storing and backing up data in the cloud, or pulling data from remote hard drives, or sharing data with friends, all trends point to a future where data rarely resides only locally on a hard drive.

And to be honest, I think that factor of 10 growth every five years might be low.

I say this not at all flippantly. With the evolution of digital media creation technologies like digital cameras and camcorders, microphones and webcams, never before has the average consumer had so many ways to create multimedia content, all of which demands a lot of data.

And at the same time, scientists are getting new toys as well with ever more powerful and sensitive telescopes and atom smashers producing unbelievably large file sets. Not to mention supercomputers breaking down the inner workings of DNA and modeling the galactic machinations of the universe.

Growth across all these areas may hold steady year to year, and that would be nothing to sneeze at, but at the same time we should also be prepared for the possibility that continuing technological innovation and burgeoning consumer awareness and adoption could push these trends for more data higher than we ever could imagine.

Seeing as how late last week I broke through the 5GB monthly bandwidth cap on my wireless service, I decided to keep forging ahead into this brave new world.

So far? No change. Even though I'm now up over 8GB in data sent and received this month, I've not seen any drop in performance.

In fact, over the weekend I hit a new high water mark for speed with my wireless card: 2.75Mbps.

Even my less than techie friends are blown away by this number. They understand that it's twice as fast as most DSL lines. While they may not know the specific bandwidth requirements for apps, they do know that you can do a lot with 2.75Mbps.

The problem, though, is that my wireless connection is highly inconsistent. At night it routinely tops 1Mbps, but during the day I'm ecstatic if I break 500Kbps, and too often it'll go through periods where the connectivity drops below 200Kbps, making activities like watching video nigh impossible.

Even more troubling, though, is that these speeds vary dramatically from one moment to the next. I can have 1Mbps, and the next second only get 150Kbps. The problem with this up and down is that it renders some Internet applications unusable. For example, I would never rely on my wireless connection to do a live videocall. Even putting aside wireless's upstream limitations, I wouldn't want to deal with watching a live video that runs great one moment and stutters the next.

On the wireline side of things I've also got some good news: last night I plugged in to download a big file and broke the 20Mbps plane for the first time.

That's the funny thing about cable broadband like I have. I often lament over how you don't get what you pay for since it's a shared network where your neighbor's use impacts your connectivity. But at the same time, I'm certainly not paying for 20Mbps.

True, I do have Comcast's fastest residential service that's available to me, and I know they're pushing the bursting capabilities of their network, where very fast speeds can be realized for short periods of time.

But even still, it was remarkable to see my gauge jump all the way to 23Mbps+. And boy did it download that file fast! I think it was a 50MB file I was downloading that finished in what felt like seconds.
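Here's a quick sanity check on that impression, treating the 50MB figure as a rough recollection rather than an exact size:

```python
# How long a ~50MB download takes at the speeds involved. The 50MB size is
# a rough recollection, so treat these as ballpark figures.

file_megabytes = 50

for mbps in (20, 23):  # the 20Mbps "plane" vs. the 23Mbps burst I observed
    seconds = file_megabytes * 8 / mbps
    print(f"{mbps} Mbps -> about {seconds:.0f} seconds")
# At 23 Mbps a 50MB file really does finish in well under half a minute.
```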

It's interesting that my wireless and wireline connections peaked on similar days, especially since the two peaks differed by roughly a factor of ten.

Now I've never looked into this to confirm a specific ratio, but I think it's safe to say that wireline capacity stays ahead of wireless by at least a factor of ten.

Just look at the numbers: the latest wireless broadband is touting 1-3Mbps; the latest wireline broadband, 10-30Mbps.

It'll be interesting to see how this trend evolves over time as both are realizing lots of investment but both also have a lot of uncertainty when it comes to competing technologies and slow adoption rates.

In any event, at this point as a user I couldn't be happier. Both my wireless and wireline connections are getting faster without me having to say or pay anything.

That said, they both still have a long ways to go until I consider them to be big broadband connections that can support all the wonderful bandwidth-intensive applications the Internet makes possible.

ESPN Violating Net Neutrality?


My future brother-in-law made an intriguing remark the other day: Doesn’t ESPN 360 violate net neutrality?

As background, ESPN 360 is a package of live and on-demand video that ESPN sells to broadband providers who can then offer it as a value-added service to their subscribers.

The dilemma is that the only way you can get the content is if you’re subscribed to a broadband provider who’s paid for the ESPN 360 service.

Having recently completed a whole week of posts on net neutrality, I started in on an in-depth explanation of the nuances to this debate. But in the end, I couldn’t bring myself to refute his initial assertion that ESPN 360 violates the spirit of net neutrality, at least in its broadest, most vague definition.

In a sense, it does discriminate based on the origin and destination of content. Before sending any video, it knows that you’re trying to access it from the right network, in other words the network that has paid to make that content available.

What I don’t know is whether or not broadband provider partners of ESPN prioritize ESPN 360 traffic in any way to give it a higher level of quality of service, but I’m going to work on finding out.

Of course, this is all a bit of an exaggeration. I don’t think any advocate of net neutrality would advocate for exclusive services like ESPN 360 to be outlawed. But that said, I think this misunderstanding by a relative layperson--my brother-in-law was one of the first to get an iPhone, but his interests lie in medicine rather than broadband--exemplifies how muddied the net neutrality message has become.

We’ve reached the point where net neutrality means everything from preserving free speech to protecting the rights of small companies to operate freely on the Internet to defending the rights of consumers. The problem is that by making the concept so big and amorphous, it obfuscates the real questions that need to be answered, and in so doing leaves the general public confused as to what net neutrality actually means.

The point I will continue to harp on in this blog is that content owners and applications developers working with network operators is not a bad thing, so long as those relationships don’t turn predatory against companies that rely on the open Internet and don’t infringe on the rights of consumers.

Perhaps by refocusing on more specific issues we can continue to make progress forward on finding a solution amenable to all parties involved. Otherwise, I fear continuing confusion over what net neutrality is will engender resentment from consumers towards totally legitimate products like ESPN 360.

A long-running theme on App-Rising.com is the exploration of how the Internet introduces new ways to interact with and visualize information. It's not just about on-demand access and search and sharing; the digital revolution can also change the nature of how we relate to something.

To that end, here are three examples of how it's doing this that I've found from across the web:

Nikon - Infinite Yardstick
I'm not 100% sure what inspired a Japanese camera/optics company to create this "infinite yardstick", but boy is it cool.

After the page loads you're left on a grid that's slowly moving to the left through a series of objects. It starts with objects the size of galaxies, like the Milky Way. Then as you make your way through the line you eventually end up at the scale of a single atom.

To navigate, either let it continue its slow scroll or click on the buttons that line the bottom. As you switch between units of measure, text will pop up on the screen describing that level of size. I skipped a lot of that text by clicking past it, as it didn't seem to have anything all that profound to say.

But even still, I found it exhilarating to be able to scroll from an atom to a galaxy all in one window. I'm still not certain I fully understand the sense of scale that separates these extremes as I'm only human, but at least now with this tool I've got a guide to help me get there.

Institute for Interactive Research - Don't Click It
While there's nothing that's necessarily all that bandwidth-intensive to be found here, I did find it to be a compelling site to visit.

The idea here is that they've designed a website that you can navigate without having to click your mouse. Instead, simply move your cursor around in order to open menus and select items.

It's a bit disconcerting at first, especially as every time you do click the screen switches to an image filled with static asking if you intended to click or not. But after a few moments it started to become a bit more natural.

Did this interface open my eyes to a whole new way of interacting with computers? Maybe, maybe not. But at a minimum it does offer up some tantalizing insight into what's possible moving forward.

Immersive Media - Demos
Having windows in which you can move your perspective around in 3D is nothing new to me. I've long enjoyed their possibilities for virtual tours when looking for apartments, I've recently found interesting Google's Street View push to drive camera-equipped cars around major cities, and I've also experienced computer-generated 3D environments, like the oceanscape that can be found here.

But this was something different. The link above takes you to the Demos page of Immersive Media, the company that builds 360 degree cameras. And what it shows is the use of these interactive 3D windows, only this time for video rather than still images.

Even though this might not be new, it was new to me, and it has a real wow factor. Check out footage of someone skiing or a fly-by over a wooded area. Both provide very compelling viewing experiences, almost as engrossing as the 3D that needs special glasses and pops off the screen.

Perhaps even more interesting is that I found myself watching the same clips repeatedly so that I could see things from different angles. Because of this, I'm hopeful we might start seeing these kinds of experiences become commonplace, especially for things like sporting events, as this really allows for a compellingly immersive experience.

The Internet: Not Just about Wired vs. Wireless

| No Comments | No TrackBacks

I read a remarkable article in the New York Times yesterday about Cuba's underground Internet.

It describes how, despite Cuba being largely cut off from the world, thousands of people are finding ways to get information into and spread it around the island. Whether it's smuggled satellite dishes, hard drives passed from hand to hand, or computer science students at the local university, information is constantly finding new avenues to reach people through the curtain of communism.

But for me this story was about more than some scrappy rebels sticking it to the man; it fundamentally shifted my thinking of what the Internet is.

By and large when I talk about the Internet and broadband I do so from a wireline perspective as my primary interest has long been the deployment and utilization of fiber-to-the-home networks.

I've recently been realizing that I can't turn a blind eye to wireless: speeds are increasing, it may be the best near-term solution for getting everyone online, and there are a number of compelling wireless-only applications.

But reading about how Cubans are sharing information by passing along memory sticks and cards and thumb drives made me think: is that delivery method an aspect of the Internet?

Technically speaking the Internet is a series of publicly accessible interconnected computer networks.

Well, if you're holding a hard drive in your hand that's arguably private, not public. And since most of these devices need USB or other slots to interface with computers, I'd be hard pressed to refer to them as "interconnected computer networks."

But that brings up another point: when is a local non-interconnected network a part of the Internet?

A perfect example of this can be found in a recent comment on a post of mine from Robert Hard:

"Very interesting piece, I'm apart of a community wireless network in Australia and due to some regulatory constraints here, the focus of our network has been much to do about local content and communications, rather than Internet sharing.

Despite the lack of formalized internet provision and members already having DSL Internet access. Our group continues to grow, fueled by people wanting to communicate and host content locally more than over the Internet."

The post he was responding to dealt with how a full fiber community in Sweden had seen the ratio of outbound to in-network traffic flip from 80/20 to 20/80 with the deployment of fiber.

So that leaves me to wonder: are these local networks part of the Internet, even if they don't interconnect with any other networks and may not even use the IP protocol?

Perhaps even more importantly, if they're not part of the Internet, then what umbrella do they fall under? More generic labels like digital distribution?

Where this has all led me is to expand my definition of the Internet to include all instances where digital information is being shared. Whether it's on a local network or through a hard-drive hand-off network, they all involve the distribution of data through digital means.

Now perhaps instead of lumping everything together under the Internet's banner we'd be better off coming up with new words to describe this new age we're living in.

But in either case, that New York Times article helped crystallize my thoughts about what the Internet fundamentally is, and in so doing made me realize that it's not just about wired vs. wireless, it's about revolutionizing the way we're able to access and share information through technology, no matter what you call it or how you do it.

Since writing earlier this week about the tension between offering higher speeds and stricter usage caps, I've decided to continue relying solely on my Verizon wireless card to support my broadband needs.

Driving this decision was my curiosity as to how long it would take to reach my 5GB monthly transfer limit in light of the fact I'm now watching most all of my TV over the Internet.

Well, I'm just about there: according to my calculations, I've moved more than 4.7GB through my wireless connection since the first of the month, less than a week ago.

And as I'm currently recuperating from a nasty migraine, I will undoubtedly be breaking on through to the other side sometime later today.

Luckily I've learned that I shouldn't have to fear losing service once I cross that line. According to a footnote on Verizon Wireless's website, once I exceed 5GB they reserve the right to limit my speed to 200Kbps.

But even still there's something funny about this whole setup. While Verizon helpfully provided some usage monitoring capabilities in the application you use to access their AirCard, it doesn't keep you up to date on your monthly usage.

In order to come up with that 4.7GB number I had to break out my calculator and start adding up all the sessions I've initiated in the last week.
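For what it's worth, the arithmetic itself is trivial. Here's a rough sketch in Python of the tally I did by hand, using made-up per-session numbers since the real figures only live in the AirCard software's session log:

    # Hypothetical per-session totals (in MB) copied down from the connection log.
    sessions_mb = [210, 485, 1320, 760, 930, 1010]

    total_gb = sum(sessions_mb) / 1024.0   # 1GB = 1024MB
    remaining_gb = 5.0 - total_gb          # measured against the 5GB monthly cap

    print("Used so far: %.2f GB" % total_gb)
    print("Left before the cap: %.2f GB" % remaining_gb)

There's no reason a carrier couldn't keep this same running total for you and show it right in the connection manager.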

I find this very frustrating, as there's little reason why they couldn't or shouldn't keep you up to date on this. I can kind of understand this practice when it comes to cell phones: by not making it easy for me to check how many minutes I've used, they increase the chances I'll go over, which lets them charge some ungodly per-minute rate from there on out.

But there is no analogous mechanism for my wireless broadband. Once I go over, they may just start slowing me down. In fact, as far as I can tell, there isn't a way for me to buy more capacity even if I wanted to.

Now I could be wrong about this. I haven't seen anything on the website about it, and they never mentioned any such option when I first signed up for service or at any time since, but perhaps once I go over the limit a friendly box will pop up asking if I want to pay $20 for another 5GB to stave off the oncoming slowdown.

But even still, this all points to one of the most bizarre trends I've read about over the last year. More and more broadband providers are installing and enforcing usage caps, yet too often customers aren't told what those caps are, they aren't given a robust way of tracking their usage, and when they go over there's no reasonable opportunity to pay to increase the cap.

From a network operator perspective I understand why sometimes caps may be needed, but from a consumer perspective this is an untenable situation.

As I've said before and I will continue saying on into the future: consumers have the right to know what they're buying when they sign up for broadband service.

And to take this a step further: having fully informed customers is essential to encouraging the future of consumer adoption of and reliance on broadband and the Internet.

But in the meantime, I'll let you know how things turn out once I cross 5GB. Wish me luck!

Don't Buy Blu-Ray! Blu-Ray Is The Future

| No Comments | No TrackBacks

There's been a lot of buzz in recent weeks about HD-DVD shutting down and Blu-ray coming out on top as the next generation optical disc format.

To bring everyone up to speed, HD-DVD and Blu-ray use blue lasers instead of the red of DVD players to store HD video and additional features on discs with the same form factor as DVDs and CDs. Like Betamax v. VHS thirty years ago, the two formats were locked in a high stakes battle over who would become the preeminent choice for movie distribution. But within the last month most major studios came on board with Blu-ray, which led HD-DVD's biggest pusher, Toshiba, to announce they're done making more players.

So my condolences go out to anyone who's already invested in an HD-DVD player. If I were you, I'd either go hoard as many HD-DVD titles as you can, or get ready to have a very expensive doorstop.

Now that this latest edition of the format wars has found its end, it should be time for you to run out and buy a Blu-ray player so you can get in on enjoying the flagship of the HD revolution.

But it's NOT! In fact, DON'T BUY BLU-RAY!!!

Why? For one simple reason: the real future for Blu-ray isn't in HD video, it's in enabling networked experiences in your living room, and there aren't any Blu-ray players currently capable of delivering those experiences.

To give you a sense for what I mean by "networked experiences" here's a quick rundown of what this will ultimately mean:

- Being able to download additional language tracks
- Having movie trailers that automatically refresh over time so they're always current
- Accessing ecommerce sites through the disc where you can buy movie merchandise
- Playing interactive games with scoreboards that track all users
- Contributing to social networks and discussion forums built up around movies

And the list can go on and on. Quite simply: everything you do on the Internet today may eventually make it into a networked experience through a Blu-ray disc in your living room.

But the problem is in the second half of my warning: there aren't any players that will let you have these networked experiences.

Now you might be fooled into believing such players exist, since the studios have begun to release the first discs with networked experiences, but you'd be wrong.

If you're going to buy a Blu-ray player today the best option is Sony's gaming console, the PS3. It has a Blu-ray player built in, it's cheaper than most standalone Blu-ray players, and this summer it's going to gain the ability to deliver networked experiences.

Otherwise, the first networked Blu-ray players aren't coming out until the fall.

Even worse, if you've already bought a Blu-ray player, you're likely going to have to buy another one if you want to have these networked experiences.

But all this being said, I do believe that the future of networked Blu-ray experiences is enormous.

There's a lot of talk about moving to a world without discs, where everything's delivered over the network. But the reality is that only a small percentage of broadband connections are fast enough to transfer HD video in a reasonably short period of time. Who wants to wait 10 hours to download a 2 hour movie?
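To put rough numbers behind that, here's a quick back-of-the-envelope sketch; the 20GB file size and 5Mbps connection speed are just illustrative assumptions, not measurements:

    # Back-of-the-envelope: how long does an HD movie take to download?
    movie_gb = 20.0      # assumed size of an HD feature film
    speed_mbps = 5.0     # assumed connection speed in megabits per second

    movie_megabits = movie_gb * 1024 * 8            # GB -> MB -> megabits
    hours = movie_megabits / speed_mbps / 3600.0    # seconds -> hours

    print("About %.1f hours to download" % hours)   # roughly 9 hours with these numbers

At those numbers you're looking at the better part of a night of downloading for two hours of viewing, and plenty of connections out there are a lot slower than 5Mbps.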

What Blu-ray allows for is to deliver the HD video on-disc but then connect that content with all the possibilities found in being networked to the Internet.

I'm not 100% sure optical media has a great long-term future as (hopefully) we'll one day live in a world where every home is hooked up to high capacity fiber.

But in the near term I think we're in for some wonderful treats over the next few years in terms of studios opening up new experiences that leverage network connectivity to expand the enjoyment we can have with a movie on a disc.

So definitely do get excited about Blu-ray and its potential as another application that uses broadband, but don't jump on board the bandwagon too soon and end up with yet another all-too-expensive doorstop.

Last week Connected Nation released a report entitled "The Economic Impact of Stimulating Broadband Nationally." In it they claim that if the US were to stimulate the deployment and adoption of broadband that we'd realize more than $134 billion worth of benefits.

Unfortunately, because of the reputation Connected Nation has in some circles, these findings have been called into question, with many labeling this as nothing more than another PR stunt aimed at pressuring legislators into passing legislation based on what some consider to be Connected Nation's failed model for broadband mapping and community organizing.

The best analysis I've seen so far of their report comes from App-Rising.com friend Ann Treacy over on the Blandin Foundation's broadband blog.

Here's her final analysis of the report's findings in Kentucky: "I can’t say that I think these numbers are rock solid. But I think it is undeniable that Kentucky is in better shape now than they were before the push for broadband."

And this is the point that matters most in all of this. I don't think it really matters what the precise number is for how much the availability and use of broadband can improve America. Who cares if it's $100 billion or $10 billion or $10 trillion? In the end there's no denying that cheaper, faster, more widely available and more widely adopted broadband would give our country a massive economic boost.

Beyond specific dollar amounts, let's also consider how far-reaching that impact could and will be.

It'll create jobs.

It'll save healthcare costs.

It'll reduce the amount of gas we consume and the emissions we produce.

And then there are the more qualitative benefits:

It'll expand educational opportunities.

It'll allow rural areas to get better healthcare and compete more readily in the global economy.

It'll drive the development of our 21st century information economy by encouraging greater innovation.

This list just goes on and on.

Why we even need another study showing the economic impact of broadband is beyond me. No reasonable, informed person can doubt the fact that we'd be better off as a country with more broadband and more people online.

And I would argue that there is no greater investment we could be making as a country in our future than in aggressively pursuing all available options for bringing our broadband infrastructure into this new millennium. What else could we invest in that would improve healthcare, education, government services, economic development, and beyond?

The question is no longer if we need to do this, but what we need to do, how we can do it, and how soon we can get it done.

Those are the questions that need answering; we shouldn't be embroiling ourselves in an argument over the specific dollar amount of benefits that broadband would enable us to realize.

In fact, rather than guessing how much of an impact broadband can have, why not set a goal of how much we want it to have?

I say by getting everyone hooked up to the Internet through broadband we should be able to realize a minimum of $1 trillion in benefits to the US.

Later this week I'll share with you how we might get close to that number, not by sitting back and observing broadband's impact but instead by leaning forward and actively pursuing its potential to revolutionize the way society works in the 21st century.

The Fickleness of Online Audiences

| No Comments | No TrackBacks

The meteoric rise of sites like Google and eBay has long dominated discussions over the potential for online businesses to realize explosive growth.

And this truly is a revolutionary aspect of the Internet, providing a platform that allows anyone to reach an audience of hundreds of millions of people instantly. Not only that, but this growth can be driven organically by pure word-of-mouth rather than having to rely on massive marketing campaigns.

But there's an uncertain underbelly beneath this remarkable revolution: just as easily as audiences can be attracted to a site, they can go away.

This isn't a new concept as fads are as American as apple pie, but it's an important one to consider whenever a new online business has been built up based primarily on page views.

The Internet makes matters worse, as there tend to be so many copycats that when one site gets popular you're bound to see dozens more like it sprouting up everywhere, giving consumers more choices to distract them.

The world of Internet startups is littered with the carcasses of companies who didn't understand how fragile an online audience is, but it's not just the small fish who are threatened by this reality.

Just last week, for example, Google's stock took a hit upon news that AdSense click-throughs--a measure of how many times people clicked on those ubiquitous Google text banner ads--had dropped 0.3%.

While there have been many different explanations for what led to that drop, the one thing that can't be denied is that if consumers decide they no longer want to click on those banner ads, then the very core of Google's business model will be seriously threatened.

Another example is eBay. On February 18th, a group of disgruntled users--upset over a recent hike in fees and a change in feedback policy--started a boycott of eBay, which has resulted in new auction listings dropping off by 13%.

Though neither of these examples has yet had a disastrous impact on Google or eBay, and neither is necessarily an instance of consumers running off to competitors, they still highlight how fragile many Internet businesses are.

Simply put: Internet users can be fickle. And if you've built your business on the belief that everyone's going to stay around and continue using your service indefinitely, you're likely in for a rude awakening.

While the Internet is the ultimate platform for reaching an audience of hundreds of millions, it's also the fastest way to lose that audience to competing services, dissatisfied customers, or the whims of a buying public always looking for the next best thing.

The More Bandwidth You Have The Less You Get

| 1 Comment | No TrackBacks

In my continuing adventures transitioning to watching all my TV online, over the weekend I had one of the starkest reminders yet of that cresting wave of the exaflood and how today's broadband isn't prepared to handle it.

On Friday evening I began having some troubles with my Wi-Fi router. Rather than tether myself to my cable modem, I decided to forego it and plug in my wireless AirCard from Verizon.

Now, most often the speeds I get on this card are at the low end of what I need to watch video--typically around 500Kbps. But recently at night and early in the morning I've been seeing huge bursts up past 1Mbps. And in fact Saturday night I hit a new record: 2.5Mbps.

Because of this higher speed I excitedly proceeded to begin watching Internet video as I have been over my cable connection, confident that I'd have enough bandwidth to watch unbroken video.

The good news is that I did have enough bandwidth to watch video, not quite full screen HD quality, but good enough quality at about twice the size of the average YouTube video window.

But there's a twist.

When I got the AirCard I asked if they had any bandwidth caps, and Verizon had no problem admitting they did: 5GB a month of combined upload and download traffic, which according to them is more than enough for average day-to-day use.

And this has been more than enough. I use my AirCard extensively while on the road and can say that even on heavier days I had trouble topping 100MB of combined upload and download traffic in a single 24-hour period.

But that was before this speed boost, back when watching video was more nuisance than pleasure. So I hardly ever watched any when connecting through my AirCard--never anything more than short YouTube videos, and very few even of those.

The twist in this story is that with all this additional bandwidth, I began to watch video as I would on TV. I watched a full-length episode on ABC, a handful of full-length episodes of some animation I like, and a few other assorted things over the course of Friday night into Saturday morning and again Saturday evening.

I didn't keep track of how many hours of video I watched, but it was certainly at least three or four and could've been a good deal more than that.

In any event, while I was clearly excited about the possibilities of watching video over my wireless card, I thought it prudent to go check the bandwidth usage meter that's part of my AirCard's software, which tells me how much data I've passed through the network.

While my previous high had barely topped 100MB, from midnight Friday to midnight Saturday I moved well over 1GB, even nearing 2GB, of data. That's right, I managed to blow through roughly a third of my monthly allotment of bandwidth in a 24-hour period.
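The rough math behind that jump isn't hard to reconstruct. Here's an illustrative sketch; the bitrate and hours are my guesses rather than measured values:

    # Estimating how much data a night of Internet TV pulls down.
    stream_kbps = 1200.0   # guessed average bitrate of the better-than-YouTube streams, in kilobits per second
    hours_watched = 3.5    # guessed hours of full-length episodes watched

    gigabytes = stream_kbps * 3600 * hours_watched / 8 / 1024 / 1024   # kilobits -> GB
    print("Roughly %.1f GB of video" % gigabytes)   # about 1.8GB with these guesses

A couple of guessed inputs and you're already in the neighborhood of what the meter reported, which is exactly why video changes the usage-cap math so dramatically.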

This incident clearly illustrated two major points of broadband in the 21st century:

- Video uses up a ton of bandwidth. While this isn't a new thought, it's still remarkable to see its impact quantified like this. I mean, I wasn't online any more than before, I was just watching more video while online, which led to a more than ten-fold jump in my usage.

- Higher speeds and stricter usage caps don't get along. This is the great paradox of broadband: everyone's pushing to up the speeds they deliver, but at the same time many are implementing caps on how much you can use those pipes. This creates a tension that needs to be resolved moving forward.

The best way I can think of this is through an analogy. Imagine you've got a bottomless milkshake. Before, you only had a tiny straw to suck through so the malt shop let you drink as much as you could. Now, they're rolling out new and improved jumbo straws, only they're no longer offering bottomless milkshakes as now you're able to drink twice as fast and consume twice as much, or more.

This may not be the most elegant analogy but it clearly illustrates the paradoxical bind some broadband providers are putting themselves in.

Yet I don't necessarily think it's their fault. For the most part they're scrambling to catch up to the revolutionary changes the Internet has been making to the telecommunications industry. The problem is less nefarious decisions being made by faceless corporations and more about a business model that's fundamentally flawed.

I understand that you can't keep making the straw bigger and allowing users to drink more milkshake without increasing revenue from that user. And the current all-you-can-eat model of broadband is firmly tied to that paradigm.

How this tension is resolved is one of the most important issues moving forward: as consumers continue to acquire a taste for massive amounts of milkshake, we need to be sure that all parties involved in making and delivering that milkshake stand to benefit.

If broadband is to flourish, then all parties must profit, or else some day soon we stand the risk of hearing the sound of straws sucking air at the bottom of the formerly bottomless milkshake of bandwidth.
