January 2008 Archives

Internet Proves Its Vulnerability/Resiliency


On Wednesday, two undersea cables were cut off Egypt's northern coast, causing 70% of the country's Internet network to go down.

This story is interesting for so many reasons.

First off, I sometimes think people forget that pretty much all international Internet traffic runs over incredibly long, hair-thin strands of glass stretched across the ocean floor. Sure, they're bundled together and wrapped in protective coating, but between the destructive power of man and the awesome power of Mother Earth, I sometimes wonder why we don't hear stories like this every week.

Secondly, the Internet is inherently interconnected, so when these two cables were cut, not only did Egypt lose 70% of its network but India lost over half its bandwidth. Last I looked, there's a little thing called the Middle East separating the two, so it's kind of startling how big an impact a problem in one country can have on another.

There's a lot of talk about the vulnerability of the Internet due to too much demand for bandwidth and not enough investment in capacity, but we mustn't forget its physical vulnerability in terms of the pipes that carry its traffic.

At the same time, it never ceases to amaze me how resilient the Internet is.

By Wednesday night, for example, India had service--albeit degraded--up and running again. Within a couple of weeks they expect to be back to full speed.

How did this happen? By rerouting the traffic that used to flow over the now cut pipes through other networks.

This truly is one of the most beautiful parts of the Internet: the way that networks interconnect and allow it to survive losing vital arteries by diverting more traffic through other pipes.

At the same time, I think this resiliency speaks even more strongly in favor of the need for more capacity in the network. The only way this resiliency works is if you have the capacity elsewhere on the Internet to route traffic through during an emergency like this. If this had happened but no one else had any additional capacity, then we wouldn't be able to reroute anything.

On the flip side, the more capacity and interconnectedness we have the more resilient the Internet becomes.

It's important we recognize how fragile the Internet can be and how important it is that we work towards bolstering its capacity, so that when--not if--something like this happens in America, we'll have the resiliency to survive without risking harm to industries like the financial markets, which have become so heavily reliant on the ability to communicate reliably and instantaneously.

Yesterday App-Rising.com co-hosted its first event in DC, joining together with fellow co-host EDUCAUSE and conference organizer the Congressional Internet Caucus Advisory Committee to put on a preconference seminar called "The Future of Broadband: Moving from Why to How."

I'll work on writing up thoughts on the other sessions to post shortly, but first I wanted to review what was said on my panel.

The intent behind this panel was to engage a voice that's too often ignored in broadband debates, that of applications developers, in order to discuss the demands their applications have for bandwidth.

The reason I feel this is an important topic to consider is that by understanding more of these specifics I hope to equip my fellow broadband believers with the ability to express the need for broadband in concrete rather than abstract terms.

And seeing as how video is the ultimate bandwidth hog, I focused this panel on video applications.

Joining me onstage were Tom Spengler, CEO of Granicus, a company that combines live and on-demand video of local government meetings with meeting notes, agendas, and votes to create what they've dubbed an integrated public record; John Hughes, CEO of GOSN, a company that is working to introduce a new era of Internet-empowered monitored video security for communities that actively prevents crime; and Gary Bachula, VP of External Affairs for Internet2, the big, big bandwidth private network that connects hundreds of university campuses, government agencies, and corporate research groups with massive connectivity.

After giving each the opportunity to share a 30-second elevator pitch about their respective companies, I asked each more specific questions about what they enable.

Tom shared that Granicus has more than 500 communities as customers and streams more than 1.5 million videos a month. He also alluded to the fact that many of their customers are finding new ways to use Granicus's platform to integrate online video into the workings of government--for things like distance learning.

John described a typical SafetyBlanket deployment: motion-sensitive cameras on buildings and pan-tilt-zoom cameras on light poles, all tied to 24/7 monitoring services. He also confronted the question of whether his product is taking us toward a Big Brother nanny state by stating that the SafetyBlanket is an external security system that only sees what's already in the public eye.

Gary discussed the roots of Internet2 as an attempt to create a private network that was five years ahead of the public Internet, and he expressed his dismay over how much further ahead than that they have gotten through their continued investment in capacity. He also mentioned that while Internet2 is a tremendous research tool, its use is becoming woven into the fabric of students' and faculty members' day-to-day lives.

Next we dove into the topic of bandwidth, exploring what the minimum and maximum connectivity each app requires.

For Granicus, that means 300Kbps for delivering video. They can serve customers with less than that; the video will just continue to degrade until bandwidth becomes so scarce that their system flips users over to an audio-only stream. By doing this, though, they're able to serve even those customers still on dialup. Tom also shared that they could ramp up the bitrate of their video as high as a customer would want; the challenge is that the majority of their users connect at very low speeds and are therefore unable to receive anything higher.
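As a rough illustration, the fallback behavior Tom described might look something like this tiered stream selection. (The 300Kbps figure comes from the panel; the lower threshold and tier names are made up for illustration, since Granicus's actual logic wasn't shared.)

```python
# Hypothetical sketch of a tiered stream-selection policy: degrade video
# quality as bandwidth shrinks, then fall back to audio-only so even
# dialup users still get something. Thresholds below 300Kbps are assumed.
def select_stream(available_kbps: float) -> str:
    if available_kbps >= 300:      # full-quality video tier (from the panel)
        return "video-300kbps"
    elif available_kbps >= 100:    # degraded video tier (assumed cutoff)
        return "video-degraded"
    else:                          # too scarce for video at all
        return "audio-only"

print(select_stream(500))   # video-300kbps
print(select_stream(150))   # video-degraded
print(select_stream(50))    # audio-only (works even on 56k dialup)
```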

For GOSN, while the SafetyBlanket is able to work over lower speed connections by sending JPEGs instead of video, it fundamentally is a high bandwidth application. In fact, to max out the capacity of the cameras requires 80-120Mbps per building.

With Internet2, I had Gary discuss the range of two-way videoconferencing technologies in use by people on their network. On the low end they start with traditional Polycom units streaming at 1.5Mbps. For those with newer units that number can jump to 3, 4, or 5Mbps--DVD quality headed towards HD. He mentioned the increasingly common use of high-bitrate MPEG-2 video streamed at 30Mbps. Then he talked about the really neat use of video conferences where 30 people stream video to each other at the same time, requiring about 2-3Mbps per user. He finished by taking things to the highest end: uncompressed HD at 1.5Gbps, and even experimental quad-HD at 6Gbps for a single stream.
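Some quick back-of-the-envelope math puts that juxtaposition in perspective (my arithmetic, not Gary's):

```python
# Comparing the bandwidth tiers from the panel.
polycom_mbps = 1.5              # traditional videoconferencing unit
uncompressed_hd_mbps = 1500     # 1.5 Gbps uncompressed HD
group_call_mbps = 2.5 * 30      # 30 participants at ~2.5 Mbps each
                                # (midpoint of the 2-3 Mbps figure)

print(group_call_mbps)                      # 75.0 Mbps for one group call
print(uncompressed_hd_mbps / polycom_mbps)  # 1000.0 -- one uncompressed HD
                                            # stream equals a thousand
                                            # Polycom sessions
```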

I found these three examples to be a fascinating juxtaposition of the amazing things that can be done with low connectivity, contrasted against the tremendously bandwidth-intensive applications that in my mind put to bed any notion that we're not going to want at least a gig to every home in the not too distant future.

Then I asked the question: if you were king for a day, what would you do to improve our country's broadband infrastructure?

Tom shared his company's vision for expanding and enhancing participation in local government, and said that to accomplish those goals we need everyone to have access to broadband and to actually get onto the network.

John expressed his company's need for a more secure, stable Internet, one that can guarantee the delivery of video from his system to the monitoring center as he and his customers can't afford to lose signal in an emergency.

Gary's main comment was quite simple: you can never be too rich, or too thin, or have too much bandwidth. He backed that up with the fact that every time they've increased capacity, new demand has filled it up.

While there were many other interesting items shared during this session, the last two I wanted to cover here are:

- Tom's suggestion that if we want to spur adoption of broadband in government, we should look at creating something like an e-rate program that could fund governments to purchase and integrate broadband applications.

- John's observation that alarm systems have historically taken over the phone line during an emergency, and that the SafetyBlanket does the same, only this time with the broadband connection. This idea led him to defend his need to prioritize traffic on a network in order to guarantee that video from an emergency can make it to the monitoring center, homeowner, and public safety officials.

All in all the message seemed to be clear: in order to support the continued development of broadband applications we need to make sure everyone has access and is connected, we need smarter more stable networks, and we need lots more bandwidth.

A good time was had by all, and I'd like to again share my gratitude with the Congressional Internet Caucus Advisory Committee for deeming this content worthy of a preconference event, and to EDUCAUSE, fellow broadband believers that I was honored to share the stage with.

If anyone has any questions about the information contained in this post, submit a comment and make your voice heard! I'll do my best to get whatever answers you need.

Cool New Apps to be Shown at DEMO


DEMO is a tech event all about unveiling new technologies.

In some past iterations I've found semi-interesting apps, but not much that I considered to be truly innovative uses of broadband.

Well I just came upon this NetworkWorld feature that highlights ten new products about to be unveiled at DEMO '08, three of which I thought worthy of sharing:

They're not the first site I've come across trying to establish itself in the music instruction business, but they've got established musicians involved from famous groups like the Allman Brothers Band and 3 Doors Down. Plus, they're very much focused on delivering high quality (and therefore bandwidth-intensive) video.

While they seem primarily involved with delivering on-demand video, it'll be interesting to see if/how they incorporate other broadband applications like live video and two-way video into their offerings as they get off the ground.

I don't usually tout technologies geared towards fighting piracy, but I thought this worth mentioning as its mechanism for doing so is a visual search engine. So they're claiming the ability to identify what a video is by scanning the video itself rather than relying on textual metadata that describes what's in the video.

Visual search is something I first encountered with a company called Imaginestics, whom I brought out to Fort Wayne, IN, to present at the Killer App Expo last year. In talking with their CEO, I learned that processing a visual search requires massively more bandwidth than a text search through a site like Google--we're talking at least ten times more bandwidth.

Here's one of the cooler things I've seen in a while. It's a site that will allow you to create movies using 3D avatars and real-life video quickly and easily.

Looking at the videos on their site the concept still seems somewhat rudimentary, but I can see so much potential in what they're attempting, basically allowing anyone to create and share a movie over the Internet from their computer without any special training.

When I think about how much money is often spent to produce 3D animations for training and corporate uses, it seems like this endeavor might revolutionize that, and in so doing make it accessible to a much broader range of use cases. Definitely an app worth keeping an eye on.

Internet Reinforces Local Bonds


So often when you talk about the benefits of the Internet the conversation focuses on its ability to break down the barriers of distance, to unite disparate friends and relatives, to provide access to resources not available in your local community.

But there seems to be a growing understanding that the ultimate benefits that the Internet brings to society may be found not in tying together things that are far apart but instead enhancing the relationships that exist locally.

For example, this Wired essay entitled "How Email Brings You Closer to the Guy in the Next Cubicle" explores how, while broadband should enable anyone to do anything from anywhere, what it's actually doing is putting a premium on living in concentrated areas. The reason for this is the efficiencies the Internet can bring to day-to-day communications with the people you interact with regularly.

I can personally attest that this is the case for me. While I do email a lot with my parents and friends back in Minnesota as well as with colleagues across the country and world, when I look back over my inbox I have far more emails sent to and from people who live within a few miles of me.

But this capacity of broadband to unite communities is perhaps best highlighted by one of the most stunning facts I've heard in a long time.

In catching up last week with Matt Wenger, CEO of Packetfront America--Packetfront enables the creation of open access, multi-service networks--we got to talking about Vasteras, Sweden, arguably the fullest realization of what an open access, multi-service network can be.

The fact that blew me away was how the deployment of a fiber network impacted that community's use of broadband.

Before this community fiber network was put in place, more than 80% of the traffic on local networks was external: pulling in information from and sending it out over the wider Internet.

After the fiber network came into being? That ratio basically flipped as now more than 80% of the bandwidth being consumed is for moving data around within the Vasteras network, so neighbors talking to neighbors rather than users pulling in data from all over the Internet.

It should be noted that just because the external percentage dropped doesn't mean people on that network are consuming outside Internet content any less. Instead, it's a sign of just how massively demand for in-network bandwidth has grown: literally more than a thousandfold.
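To see how the 80/20 flip and the thousandfold figure fit together, here's a quick sanity check using arbitrary traffic units (only the ratios come from what Matt shared):

```python
# Illustrative check that an 80/20 flip is consistent with in-network
# traffic growing ~1000x WITHOUT external traffic shrinking.
# Units are arbitrary; only the ratios come from the post.
before_total = 100.0
before_internal = 0.20 * before_total    # 20 units in-network
before_external = 0.80 * before_total    # 80 units to/from the wider Internet

after_internal = before_internal * 1000  # thousandfold in-network growth
after_total = after_internal / 0.80      # internal is now 80% of the total
after_external = after_total - after_internal

print(after_internal)    # 20000.0
print(after_external)    # 5000.0 -- external use actually GREW ~60x too
```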

This trend is totally and utterly remarkable to me.

Basically everything on the Internet to date has taken a server-in-the-sky mentality where you're almost constantly sending and receiving data out over the world wide web.

To think that that paradigm has now fundamentally shifted in communities like Vasteras is one of the most under-discussed potential outcomes of deploying a fiber network, and one that demands further exploration.

But never fear! While I don't yet have the answers to what's driving all that demand for bandwidth, my curiosity has officially been piqued and I'm on the hunt for more information.

Until then I'll leave you with this thought: the deployment of fiber networks can and should be considered the best opportunity we've had in a long time to not just hook people up to the global economy but also to reinvigorate the ties between people in their local communities.

Anyone deploying fiber, big or small, public or private, should be keeping this thought in mind, as otherwise we may end up missing out on one of the great unsung benefits of what broadband and the Internet enables us to realize.

Last night I was following along with the Republican presidential debate. At one point the dialog turned to the federal government's recently announced $150 billion stimulus plan to stave off a recession.

While almost all of the candidates stated their support, Governor Mike Huckabee took a different tack. He suggested that perhaps a more effective use of that government money would be to invest in roads. By doing so we could bolster our infrastructure while sinking money into American goods and labor to get the job done.

I'm no economist but this idea seems to make sense. The whole point of this stimulus package is to get more money flowing into and around the economy. But the tax cuts and rebates that have been proposed seem like a less-than-elegant way to do so, literally just throwing money back at people and hoping they spend it at all, let alone on something that benefits our economy. (Will it really be that effective if all the products consumers and businesses buy are imported from overseas?)

But why roads? Sure there are many in disrepair--especially around the DC area--and adding capacity to major thoroughfares in order to ease congestion certainly won't hurt economic development. But what if there were a better way to spend that money to spur growth in our economy?

At this point, anyone who knows me has to know where I'm taking this, but that's the thing that boggles my mind: how painfully obvious this idea seems to be, yet no one seems to be saying it. So here it goes: why not spend that $150 billion deploying fiber instead of asphalt?

I mean, I know of a host of American companies that supply the various technological parts needed to build a fiber network; of course we can use American engineers to design it and American labor to build it. If we wanted to, we could structure it so that we basically give the new fiber infrastructure to private providers, helping to dramatically improve their bottom line. We could even go so far as to build out the network to support a base level of free connectivity for everyone.

The reason this idea seemed so painfully obvious to me was an article I linked to earlier this week, which highlighted research showing a $137 billion gap between stated private investment and the overall investment needed to upgrade global broadband capacity.

Now I know that's a global number, but I'd imagine it's pretty safe to say that spending $150 billion here in the States would be sufficient to get a fiber pipe into every home, business, school, hospital, and church...basically every building in America.

Can anyone tell me why this is a bad idea?

More than ever the sentiment I shared when initially linking to the story of $137 billion--that that number really doesn't seem all that insurmountable--seems to ring truer and truer to me.

The federal government wipes its collective posterior with $100 billion projects. Just last night I read a statement from one Congressperson lambasting another for holding up some bill because of a mere $10 billion, as if that was an inconceivably insignificant number.

What am I missing here? With $10 billion, couldn't we wire a huge chunk of rural America with fiber?

I officially feel like I'm taking crazy pills.

The potential of broadband to revolutionize the efficiency of society is not disputed by anyone. Everyone agrees we need to do more to get everyone online at faster speeds and lower prices, and that broadband access is the key to our country's future.

So to me this seems like an incredibly simple equation.

Now whether or not we can convince anyone on the Hill this should be a priority...well...

(To be honest, I'm not overly optimistic as we're in the midst of a wireless spectrum auction expected to raise more than $10 billion, and yet I have not heard one word about the possibility of taking some or all of that money and applying it to the deployment of broadband, despite the fact that the auction's being run by the FCC! Instead, my understanding is the proceeds are going to be taken away from communications entirely and lumped into a general fund for all programs. Excuse me while I go pull all my hair out and jump out the window in frustration...)

On Wednesday I had the tremendous opportunity to sit down with Gary Bachula of Internet2 to engage for the first time with someone from that organization.

In all candor, going into that meeting I only had vague impressions of what Internet2 is and does, thinking of it solely as a high bandwidth network connecting universities for research. But it turns out it’s so much more.

First off, it truly is a high bandwidth network. They’re currently working on expanding the network’s capacity to enable things like the ability to establish a 10Gbps connection on-demand.

The original intent of Internet2 was to build a private version of the Internet that would be five years ahead of the public Internet and would serve as a testbed of sorts for new high bandwidth applications.

It turns out that vision undershot reality, as the growth of their network's capacity has outstripped the public Internet to the point where one now has to wonder if the public Internet will ever catch up with Internet2.

While Internet2 is heavily focused on uniting university campuses, it also boasts corporate research groups and government agencies among the entities on the network.

One of Internet2’s primary roles is as an enabling tool for research projects that require the transfer of massive data sets. To get a sense of how massive these can be: once the Large Hadron Collider in Switzerland is operational, it’s going to begin spitting out huge amounts of data pretty much continuously. Because of this, researchers trying to understand its findings will need to get updates on a regular basis. These updates will contain so much information that even over a 1.5Gbps connection each one will take four hours to download.
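For the curious, that four-hour figure implies each update is on the order of a few terabytes. (My back-of-the-envelope math, using decimal units and assuming the link runs at its full 1.5Gbps the whole time.)

```python
# How big is a data set that takes four hours to move at 1.5 Gbps?
link_gbps = 1.5
hours = 4

bits = link_gbps * 1e9 * hours * 3600   # total bits transferred
terabytes = bits / 8 / 1e12             # bits -> bytes -> terabytes
print(f"~{terabytes:.1f} TB")           # ~2.7 TB per update
```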

What I found even more interesting, though, is the ways in which Internet2 is providing an opportunity to put high bandwidth applications into action that may one day find their way onto the public Internet.

While these applications come in many different flavors, in particular my conversation with Gary focused on what’s happening with two-way, real-time video communication, as that’s the area I wanted him to cover on the panel App-Rising.com is co-hosting and moderating next week.

He went through the entire range of possibilities being explored, from pretty basic videoconferencing to HD one-on-one conversations at 30Mbps to an experience where 30+ live video feeds combine simultaneously for group discussions to uncompressed live HD streaming at 1.5Gbps.

I won’t get into any more detail than that for now as I want everyone to come to our event next week to hear the rest, but needless to say my eyes have been officially opened as to the vitally important role that Internet2 has and will continue to play in proving the concept of what this thing we call the Internet could one day be, given the influx of a heckuva lot more bandwidth.

In the end, I found myself getting more and more excited about what’s being done and what’s possible on Internet2. But as I did so, I couldn’t escape the nagging feeling of wondering when, if ever, the public Internet was going to be able to support anything near what’s possible on Internet2.

The hope is that Internet2 could be a proving ground, a place to vet new technologies, and a shining example of what’s possible, but my fear is that it will continue to outpace the public Internet to such a severe degree that it begins to feel more like a futuristic pipe dream than an inspiration and road map for what’s possible.

I guess when we still live in a country where a sizable segment of people don’t see a pressing need for broadband of any sort, where another major contingent believes that what we have today is sufficient, and where yet another group admits the need for more capacity but only in small, incremental steps, it can be hard to stay optimistic about the public Internet ever reaching the level that Internet2 is at today, let alone where it’ll be ten years from now.

But luckily staying positive and hopeful is one of the things I do best. And that faith only grows stronger when given the opportunity to commune with people like Gary who are equally passionate about the importance of these issues.

We’ve all got a lot of work to do to reach that vision for a future of ultra high bandwidth, but at the very least through acknowledging the accomplishments of Internet2 we can talk about the future through real-world examples of what’s happening and not merely grandiose rhetoric over what’s possible.

That's right: App-Rising.com will be co-hosting its first event next week in DC, joining forces with the Congressional Internet Caucus Advisory Committee and EDUCAUSE to put on a preconference activity to the State of the Net conference entitled "The Future of Broadband: Moving From Why to How".

On January 29th (next Tuesday) from 3-5:30pm at the Hyatt Regency in DC (full details and RSVP available here), anyone and everyone who's engaged with the debate on how to make America great through the deployment and use of broadband will convene for an event jam-packed with goodness.

It will kick off with opening remarks from FCC Commissioner Michael Copps.

Then it will lead into a presentation of an exciting new paper from EDUCAUSE called "A Blueprint for Big Broadband" that explores some of the options available to America for funding the deployment of big broadband fiber networks.

This presentation will then open up to a panel discussion featuring, among others, the paper's author, John Windhausen, who I've only met on the phone to date but who I'm rapidly gaining respect for as a leading thinker in this space, and Jim Baller, someone who I've long thought to have one of the most important, well-reasoned voices around.

And...drum roll please...it's time for my panel entitled "Beyond YouTube: Video Applications That Make Broadband Work"!

On it I'm featuring Tom Spengler of Granicus, John Hughes of GOSN, and Gary Bachula of Internet2. We're going to talk about how each of them is helping to deliver video in different ways for different purposes in an attempt to better society, while diving into the present and future demands for bandwidth as they relate to the development of video applications.

Needless to say I couldn't be more excited about this opportunity to join forces with a host of organizations and individuals that I respect and admire. And I encourage all of you out there that are based in or near DC to come out and join the fun as we discuss and debate the future of how we're going to make the most of broadband in this great country of ours.

Oh yeah, and it's free! Just RSVP to the info provided on this page.

One of the biggest issues in telecom these past few months has been the proper role of network operators in policing the delivery of illegal content across their pipes.

The MPAA has been pushing to require all universities and colleges to take more aggressive approaches to ferreting out and punishing the illegal distribution of content by students on their networks.

Last fall they began pushing the message in DC that federal funding of postsecondary institutions should be tied to their willingness to take on the challenge of hunting down pirates.

But then yesterday this story took on a new wrinkle as the MPAA admitted that their 2005 report, which showed college campuses accounting for 44% of illegal P2P filesharing, was wrong due to human error, and that the real number is closer to 15%--though even that number has been called into question by some as over-inflated.

The other main component to this thread is the noise AT&T has been making recently about the possibility of more aggressively filtering their network to identify and capture content pirates.

While to some degree I think AT&T should be commended for at least being willing to openly talk about this possibility--unlike the BitTorrent/Comcast fiasco of '07--at the same time it raises a lot of questions.

These issues are probably best summed up in this Slate article by Tim Wu, in which he looks at the potential liability AT&T may incur if it moves forward with these plans.

The gist of his comments is that under current common carrier protections, AT&T is not liable for what bits run over its network. The issue is that if it starts looking more closely at those bits, it may render that protection void and therefore be open to lawsuits of all sorts, plus stronger mandates to fight this piracy in more aggressive ways.

While the legal outcome of this question of liability is not yet firmly defined, I do wonder what might happen once AT&T starts looking more closely.

On the one hand, what if they looked and found that piracy isn't nearly as rampant as the MPAA makes it out to be? That could actually produce some tremendous insight for this debate.

On the other, what happens if their network is saturated with piracy? Will they then have to pursue a campaign of locking up paying customers? And again, there's potential for huge liability.

At the same time, it makes sense why they'd be making moves in this direction. AT&T wants a strong relationship with content owners, especially as it's trying to break into the video market. In fact, they may even be a little stuck now, as once you start down the road of discussing ways to cater to the content people, it's likely hard to step back from that table without engendering resentment and potentially harming relationships--an especially dangerous proposition during these early days of U-Verse when, despite AT&T's overall size, they only have a relative handful of TV customers.

But the point I wanted to make here doesn't have anything to do with AT&T specifically. Instead I think it's important to acknowledge the incredible amount of uncertainty in this space before diving into any form of mandates on how network operators should be joining the fight against piracy.

The biggest reason I believe that isn't even the consumer protection/privacy aspect of this. Instead, it's the fear that welled up in me when I read in Wu's piece that any form of filtering will have an adverse effect on network performance.

It's time again for everyone's favorite, the Broadband Article Roundup!

Here are some of the stories I've found interesting over the last week for your reading pleasure:

"U2 3D" Brings Hyperreal Arena Rock to the Multiplex
This story is about a new 3D recreation of a U2 concert that can be seen, with special glasses, at some IMAX theaters. While the article doesn't specifically mention the use of broadband, one factoid did catch my eye: recording this 3D video requires 20MB for a single frame. At 24fps, over the course of the production they ended up with a petabyte--or a million gigabytes--of data on their servers. And since they use the word "servers", that suggests that at some point all that data will be flying over a network somewhere.
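Running the numbers on that factoid (decimal units, and assuming the 20MB figure covers a full stereo frame) suggests that petabyte corresponds to hundreds of hours of raw footage:

```python
# Sanity check on the "U2 3D" numbers: 20 MB per frame at 24 fps.
mb_per_frame = 20
fps = 24
mb_per_second = mb_per_frame * fps       # 480 MB of data per second

petabyte_mb = 1e9                        # 1 PB = a million GB = 1e9 MB
seconds = petabyte_mb / mb_per_second
print(f"~{seconds / 3600:.0f} hours of raw footage")   # ~579 hours
```

That's far more than the concert itself ran, which makes sense for a multi-camera shoot with many raw takes.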

Fascinating story with video that tells the tale of how Kenya's largest broadcaster has started a YouTube channel with videos about what's happening in that country. While less than 1% of its citizens have access to broadband, this effort to leverage YouTube has been instrumental in allowing people outside of Kenya to keep up with what's going on, providing a direct link into what's happening on the ground.

$137 Billion Investment Required by 2010 to Close Gap Between Demand and Supply of Broadband Access
This article highlights the findings of a recent Nemertes Research study predicting that by 2010 demand for broadband will outstrip supply, and that closing the gap will require $137 billion globally--roughly 60-70% beyond what service providers are currently planning to invest. To be honest, that number doesn't seem all that big to me, and certainly not insurmountable. I should admit, though, that I say this in light of last night's Democratic debate, when Obama alluded to the $2 trillion we're going to end up spending in Iraq over the course of the war. It seems to me that if we can shake the couch cushions enough to support burning through more than $10 billion a month in Iraq, finding another $100 billion to invest in our own country's infrastructure shouldn't be that hard...right?

17 Ways You Can Use Twitter: A Guide for Beginners, Marketers, and Business Owners
Before going any further, Twitter is a microblogging service. To become a twitterer, you create an account on their site and then start writing short posts describing what you're doing throughout the day. While it's primarily gained popularity as a consumer tool for keeping in touch with friends, I've long wondered about its possibilities as a professional tool for businesses. Well, here's the first post I've seen diving into that topic, providing a number of interesting suggestions for how this unique online tool can be used to expand and enhance your business.

Had to include this as it's one of the neatest things I've seen in a while. Go to the site and you'll see a rectangle made up of smaller rectangles with the names of musical notes on them. Hold down shift and scroll over the notes and you'll begin to create music. You can choose from piano, strings, and drums. Whenever you play a note a colored circle expands featuring the light spectrum associated with that note. And for better or worse, this is a collaborative site, meaning anyone who wants to come to it and start playing can, which can lead to a bit of a cacophony. At the same time, it's kind of neat knowing that you're playing music of sorts with strangers. I highly recommend checking this out if you like neat, easy-to-use experiments with what's possible online.

Over the weekend while playing chauffeur to a pair of car-less friends in need of help running an errand I had a brush with greatness. That's right, on Saturday afternoon, around 1:30pm, I met Matthew Lesko.

Now, that name may or may not ring a bell, but I'd be willing to bet that well over half of you, my faithful readers, have seen him on TV. In fact, you might know who he is just by hearing these two clues: "Free Money!" and a question mark suit.

Not enough? Well how about this:

So here's a guy who's made his living off of the inefficiencies of government. Some have labeled him a shyster, someone whose sole interest is making money off of his claims of "Free Money!" He's even been listed as one of the 100 people who are screwing up America.

But when you actually talk with him you quickly realize that, while not averse to making money, he has a different mission: to unite people with the government programs that were put in place to help them. He's just someone who can't understand why we'd create all these government programs but then not provide the average citizen with a way to know about and apply for them, so he set out to change that.

In truth, though, any talk of what he's doing or why he's doing it was much less interesting to me than how he's leveraging the Internet and broadband to make it happen.

The best evidence of this was a quick look around the surroundings as we stood in his headquarters. One half of the space was taken up by the usual trappings of an office, while the other half was totally empty. While chatting with him, he mentioned that he used to also rent the three adjoining spaces to serve as a warehouse for his many titles. But now? No need as he's read the proverbial writing on the wall and is working on taking his business all-digital. He's realized that the future isn't in selling ink-on-paper books but instead in building out an online portal where all the information that used to be stored in massive tomes can now be searched and found quickly and easily.

But that’s not all, he’s also been tapping into the power of crowdsourcing, or leveraging his audience of informed readers who want to contribute to the success of other readers in more fully utilizing government programs. As a member of his club, you can pose questions of any sort regarding how to get government funding and people will answer these requests, helping you find new resources, fill out applications and grants, and much more.

Lesko’s utilization of broadband doesn’t end there. He’s also become an avid user of Ustream.tv, a site that allows anyone with a computer and a broadband connection to start streaming video of themselves live over the Internet. He'll often have a camera on while working in his office, but back in August he took this idea even further when he set up shop in front of the US Capitol building for a 72-hour Q&A session. Not only did he take questions from people who walked by but also from people around the globe who were watching live streaming video. You can see clips from that experience here.

In talking with him you can tell this is a guy who gets it; he understands what the transformative power of broadband can mean to his business. But while in many ways he is on the cutting edge of technology, he didn’t seem like an ubergeek, just a normal businessman--or at least as “normal” as a man who sews question marks onto all of his jackets can be--who’s trying to find ways to make his business more efficient and expand its reach to serve existing and find new customers.

And from what I saw, he’s well on his way to embracing the possibilities that broadband has to offer, and in doing so he’s setting himself up to successfully transition his 20th century business into the 21st century thereby securing his future for the next 30 years as the preeminent source for getting information on how to get Free Money! from the government programs that exist to dole it out.

While the ultimate amount of bandwidth every home will want and demand in the next ten years is very much up for debate, there is one thing I know first hand: the less bandwidth you have the less you’ll use the Internet.

This is the flip side of the oft-cited “if we build it, they will come” analogy, in which users find a way to fill up whatever capacity you put into the ground.

The reason I’m writing this now is because for the last couple of days my wireline cable access has been down and I’ve been forced to rely solely on my wireless EV-DO card to get online.

Now this isn’t the first time this has happened. Back in the fall, during the week of our move to a new apartment, I faced the same lack of wireline access, and any time I’m on the road I’m likely connecting wirelessly. So I know what it means to only have 500Kbps or so to work with.

In trying to exist with that little bandwidth, I’ve made a few observations:

- With less bandwidth it’s not just that you can’t do as much, it’s that you don’t want to do as much. For example, sometimes the speeds I get wirelessly are capacious enough to support watching a YouTube video the way God intended: instant on and with no buffering. But many other times it'll hang, the video trying to play back faster than it can download. So what ends up happening? Eventually I start dreading trying to watch any online video when I’m accessing the Internet wirelessly.

It’s not that I can’t watch YouTube videos; if need be I can always click on a video, hit pause, and let it load before playing it back. It’s just that I don’t want to, I don’t want to have to deal with the hassle of waiting or sitting through a video riddled with hiccups.
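That "pause and let it load" tradeoff boils down to simple arithmetic: whenever the video's bitrate outruns the link, you have to pre-download the shortfall. A minimal sketch, with illustrative bitrates rather than measured figures:

```python
# Rough model of the "pause and let it load" experience: how long must you
# wait before a clip can play straight through without stalling?
# The bitrates below are illustrative assumptions, not measured numbers.
def prebuffer_seconds(video_kbps: float, link_kbps: float, duration_s: float) -> float:
    """Seconds of head start needed so playback never outruns the download."""
    if link_kbps >= video_kbps:
        return 0.0  # the link keeps up; instant-on playback
    deficit = video_kbps - link_kbps          # kilobits short per second of playback
    return duration_s * deficit / link_kbps   # time to download the shortfall up front

# A 3-minute clip encoded at 700Kbps over a 500Kbps wireless link:
print(prebuffer_seconds(700, 500, 180))  # 72.0 seconds of waiting before you can watch
```

More than a minute of staring at a spinner for a three-minute clip is exactly the kind of hassle that trains you to stop clicking on videos in the first place.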

So while I do subscribe to the belief that the more bandwidth that’s available the more users will find ways to use that capacity, I also support the notion that the less bandwidth one has the less one not only can but will do with it.

- I’m increasingly reliant on bigger bandwidth networks for my profession. While this probably shouldn’t be a surprise, it is a frustrating realization on the morning of a day when wireline access is still shaky and I’m supposed to be working with not one but two different videocalling applications -- testing out TVBlob by talking with its makers in Italy through my TV, and conducting a video interview with Michael Curri, who’s based near Paris, about his work helping communities leverage broadband to promote economic development. These are two applications that either won’t work wirelessly or that, at the very least, I don’t even want to try; it’s just not worth the hassle of dealing with limited bandwidth.

- At the same time, to a large degree I really don’t need a big bandwidth network to do what I need to do. Sure I’m limited in trying out newer high bandwidth apps, but in terms of my day-to-day life, only having 500-750Kbps of throughput is surprisingly sufficient. I mean, I’m able to surf the Web, check my email, interact with most hosted applications like an online word processor, and even watch the occasional YouTube video (when the situation warrants the wait). After lightly chastising a couple techie-friends recently for their admitted lack of reliance on broadband, it’s startling to realize that I’ve been surviving with less than a meg of consistent service.

What this tells me is that 1. a lot of tremendous things are possible with less than a meg of access, 2. the less bandwidth I have the less I try to use it, and 3. we’re still a helluva long ways from the point where the use of broadband is not only ubiquitous but woven into the fabric of our day-to-day lives.

On one last note, I don’t want anyone to think I’m disparaging my EV-DO card. In all honesty, I love the thing. Being able to access the Internet from anywhere at anytime is a phenomenally freeing experience. With hotels often charging $10 a night for wireless Internet, the $50 I spend a month on my EV-DO service seems like a bargain. And while I’ve been lamenting the lack of bandwidth, this little card pretty reliably delivers half a meg of connectivity, with the occasional burst as high as 2Mbps. It may not be fast but it works when I need it, and that’s all that matters.
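The bargain math here is simple enough to sketch out, using the prices quoted above:

```python
# When does a $50/month EV-DO card beat paying for hotel Wi-Fi night by night?
# Prices are the ones quoted in the post above.
HOTEL_PER_NIGHT = 10   # dollars per night of hotel wireless
EVDO_PER_MONTH = 50    # dollars per month of EV-DO service

breakeven_nights = EVDO_PER_MONTH / HOTEL_PER_NIGHT
print(breakeven_nights)  # 5.0 -- five hotel nights a month and the card pays for itself
```

Anyone traveling more than a week a month comes out ahead, before even counting the freedom of connecting from anywhere else.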

It's time for another Article Roundup, where I pull together a handful of articles and webpages I've found recently related to the wild world of broadband for your reading pleasure!

Racine Sophomores Discover Asteroid
Some teenagers in Racine, Wisconsin, using technology provided by Calvin College in Grand Rapids, MI, and images from a telescope based in New Mexico, discovered a new asteroid. Only in a broadband world can that kind of geo-busting happen!

US Places Three Cities in Top Seven Intelligent Communities of the Year
After getting shut out last year, the US placed not one, not two, but three communities in the top seven list put out by the Intelligent Community Forum. That's huge progress that I'm going to follow up with and investigate as to what helped most. Considering the fact that all the descriptions mention broadband to one degree or another, I think I know what at least one of the key factors was.

New Edge to Offer Breakthrough Service for Prioritizing Traffic Over DSL
OK, so I have to admit this link is to a press release, but I thought it worth sharing to illuminate the reality that there are legitimate business cases for prioritizing traffic on an Internet access network. While I do believe pipes should be as open as possible, to suggest that all prioritization is inherently bad is quite simply false.

4Home Media Launches Broadband Home Health Service
Another press release, and it's for a product that doesn't exist yet, which I tend to dislike talking about, but it merited inclusion as a CES story about a healthcare product that uses broadband. While much of the buzz around remote monitoring has been about people getting creeped out, 4HomeMedia's service lets you set up a bunch of sensors around someone who's potentially ailing, like an elderly grandparent, and keep track of them online. It'll be interesting to see what kind of demand it generates when it launches later this year.

I had the great fortune to chat with Lynn Meikle of Meridian Township, Michigan on Monday.

You may have heard of Meridian recently as last Friday they made news by filing suit against Comcast to stop plans to move PEG channels from the analog tier to the digital one, which would then force analog subscribers to pay for a digital converter box to access them.

According to Meikle, the problems started with the introduction of a new statewide video franchise bill in Michigan, which has to date been interpreted as an opportunity for Comcast to revisit its local franchise agreements and pick and choose which parts it wants to abide by and which it wants to ignore.

One example of this is Comcast’s closing of public access studios, which they were charged with operating and making available to the public per the local franchise agreement. Now Meikle says the public access channel is waning as anyone reliant on those facilities to create content can no longer use them.

But the issue at hand in this lawsuit is Comcast’s desire to push PEG channels off its analog tier and onto its digital one. For them, it’s a matter of economics. By moving PEG off analog, they’re able to open up sufficient spectrum to add 40 additional digital channels and free up additional bandwidth on their network.

Meridian, on the other hand, has made the claim in court that there are federal laws that supersede the state franchise and protect the interests of PEG by guaranteeing their access to the platforms with the widest reach at the lowest cost.

But that’s only the legal way to look at things.

So if we've got the potential for there to be a form of HD that requires a minimum of 124Mbps within the next decade, to me that suggests that the only wireline access technologies we should be considering are those that can undoubtedly provide speeds in excess of 1Gbps to homes within the next ten years.
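Taking those figures at face value, the arithmetic behind the gig-to-the-home argument is a quick sketch (the 124Mbps per-stream number is the broadcast target discussed elsewhere on this blog, not a shipping spec):

```python
# How many super-high-definition streams would a gigabit connection carry,
# assuming the 124Mbps-per-stream broadcast target holds?
STREAM_MBPS = 124   # projected bitrate of one super-hi-def broadcast stream
GIG_MBPS = 1000     # a 1Gbps connection to the home

streams_per_gig = GIG_MBPS // STREAM_MBPS
print(streams_per_gig)  # 8 simultaneous streams, with headroom left over for everything else
```

Eight concurrent streams sounds like a lot today, but spread across a household of TVs, computers, and whatever comes next, it starts to look less like overkill and more like a floor.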

I'm not suggesting forcing anyone to do anything, but instead that government should not be incentivizing the deployment of any network that is not designed to eventually expand to meet the demand for a gig to every home.

On a related note, I think it's high time we start up the rallying cry of "No New Copper in the Ground!" It's been a couple of years, but the last I heard the majority of new home developments were still putting copper into the ground for telecommunications. This is utterly flabbergasting to me as it's my understanding that the cost of laying fiber is now roughly the same as putting in copper, yet fiber has lower operating costs and an exponentially higher potential for delivering bandwidth. What am I missing here?

I can understand why private interests are dissuaded by the huge capital expenditures fiber to the home demands. But from the perspective of creating a national broadband policy, the government needs to be thinking long term: anything it spends money on should be with an eye towards the let's-only-have-to-do-this-once long term, not the how's-it-going-to-make-us-money short term.

And all you have to do is look to the future of "HD" video to see that some day soon you too will have reason to want/need a gig to your home.

What Is HD?

| No Comments | No TrackBacks

I’ve lamented in the past over the government’s effort to redefine broadband as being a speed capable of streaming one high definition signal but I wanted to revisit this topic in more detail.

The truth of the matter is that “HD” has no single definition.

Even in the realm of standards, three different resolutions all claim to be HD: 720p, 1080i, and 1080p. Today most everything “HD” that you see through your cable provider is actually 720p or 1080i; only Blu-ray discs deliver full 1080p.

But that’s not the end of untangling this briar patch because labeling one’s video as “high definition” is a practice that’s common on the Internet, yet rarely does that label apply to video that’s 720p or greater. In fact, most often “high definition” online video refers to video that’s around DVD quality, typically delivered at about 1.5Mbps. This garners the hi-def label because the clarity is so much better than VHS-quality that it feels hi-def when really the label should be “higher” definition.

The definition of “what is HD?” swings dramatically in the other direction as well.

Quad HD is basically what it sounds like: high definition with four times as many pixels as regular, old HD.

The big daddy on the block, though, is Ultra High Definition Video, known in Japan as “Super Hi-Vision.” To get a sense for what that means, imagine a screen with sixteen times the resolution of what’s considered “HD” today (roughly 33 million pixels vs. the 2.1 million of today). This technology will also support 60 fps (frames per second), as opposed to the 24-30 fps most common today. Japan has set a goal of 2015 for this technology to be a viable commercial offering.

But here’s the really interesting thing: to make Super Hi-Vision work in the labs today it requires 24Gbps of throughput. That’s right - 24 Gigabits per second.

To be fair, before this technology enters the wild developers first must find a way to squeeze it into 124Mbps in order for broadcasters to be able to support its delivery. But even still, that’s 124Mbps to watch one video stream on one TV.
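Those numbers hang together, as a quick sanity check shows (7680x4320 is Super Hi-Vision's published resolution; everything else comes from the figures above):

```python
# Sanity-checking the Super Hi-Vision figures quoted above.
HD_PIXELS = 1920 * 1080    # ~2.1 million pixels in today's 1080-line HD
SHV_PIXELS = 7680 * 4320   # Super Hi-Vision's published resolution, ~33 million pixels

resolution_ratio = SHV_PIXELS / HD_PIXELS
compression_ratio = 24_000 / 124   # squeezing a 24Gbps lab feed into a 124Mbps broadcast

print(resolution_ratio)                 # 16.0 -- exactly sixteen times today's HD
print(f"{compression_ratio:.0f}x")      # roughly 194x compression needed
```

Needing to compress the signal by a factor of nearly 200 before it can even be broadcast gives a sense of how far out in front of today's networks this technology sits.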

While I haven’t specifically read about Super Hi-Vision’s use online, there’s little doubt that no matter how high resolutions go someone somewhere will want to try and deliver that video over the Internet. And as speeds to homes reach and surpass 100Mbps, it may be feasible.

But the point I intended to make here is to stress the fact that “HD” doesn’t mean one specific thing; it can mean many things. Additionally, “HD” is not a static, fixed number. It’s not like once everyone has a current generation HDTV that the push to higher quality video is over. In fact, in many ways it’s just begun, especially as all online video applications strive towards higher and higher bitrates. Plus, it’d be foolish to think that even the seemingly space-age technology of Super Hi-Vision is the endgame; there’s little reason to think that video resolution won’t continue to increase.

And what will we need as the quality of video continues to increase? More bandwidth, of course.

This is another example of why to me it’s not a matter of if we’ll ever need huge pipes into every house but rather when, because once one person has super high definition everyone’s going to want it. And even if the highest of hi-def never takes off, the push towards ever higher bitrates will never end, and therefore neither will the growth in demand for bandwidth.

Not to toot my own horn, but my submission to GigaOm's writing contest about what I'm optimistic for in 2008 related to the Internet was selected as an honorable mention!

You can see it about halfway down the list on this page.

This acknowledgment has made me even more optimistic, not because of any personal glory I may have won but because of the recognition my message received. It means that I'm not alone in my belief that 2008 can and should be a tremendous year for broadband, and that the incredible benefits it makes possible are goals worth working towards achieving for the betterment of society.

On another note, if you've read my piece earlier today on ConnectKentucky and haven't checked the comments I highly encourage that you do so as Brian Mefford stopped by and pasted in his lengthy response to Brodsky's highly critical article.

I'm looking to speak with people on both sides of this issue in the coming weeks to get a sense for where people stand, in the hopes of finding a position somewhere in the middle that we can all agree upon. More to come soon...

Don't Let Criticism of ConnectKentucky Obscure the Truth

| 1 Comment | No TrackBacks

Connect Kentucky has generated tremendous buzz surrounding its efforts to spur the deployment of broadband as it builds momentum to try and establish a national model for how states can encourage growth in the supply of and demand for bandwidth.

But last week the program came under harsh criticism in a piece by Art Brodsky through the DC-based public interest group Public Knowledge.

Among Brodsky’s criticisms are that Connect Kentucky’s lauded maps aren’t inclusive of all providers, that their local community teams don’t engage as fully as they could and should in bringing about real change, and that the underlying driver behind the Connect Kentucky team isn’t an urge to do right by their state but instead an obligation to promote the services of their alleged backers, the big telcos.

While I haven’t seen it circling the news wire yet, I’ve had the good fortune to read the response by Brian Mefford to what he describes as a wholly inaccurate and misleading article that ignores facts in its quest to frame Connect Kentucky as a telco front group. (I’ll be following up with him personally to get further reaction and to determine if his letter is approved to print publicly.)

At this point, I don’t know enough to make a judgment call on who’s right and who’s wrong. Most often the truth lies somewhere in the middle.

From everything I’ve heard in talking to people, the Connect Kentucky folks, despite their penchant for getting good press, are upfront and forthright in talking about the challenges they’ve faced and the failures they’ve had.

And in talking with them directly I’ve never got the sense that what they’re trying to say is that their model is the best and only way to do what they’re doing.

But for now I wanted to make one simple but extremely important point regarding this matter.

No matter what amount of truth there might be in Brodsky’s piece, we must not lose sight of the fact that the two core elements of Connect Kentucky’s model--mapping broadband availability and creating teams of local leaders to encourage the adoption of broadband--are essential to our national broadband policy.

Simply put: we’ve got to have a better sense for where broadband is and isn’t available, and we’d be remiss if we didn’t start taking a more proactive approach towards encouraging adoption and use of the Internet, which is arguably best achieved at the local level.

Don’t get me wrong, we can and should have a civil discussion over the merits of one way to accomplish these goals over another, but we have to all agree that these two goals not only have merit but should be top priorities for anyone interested in America’s broadband future.

Because even if everything Brodsky said were true, and Connect Kentucky is an evil telco-run organization worried about nothing more than the interests of their corporate handlers, that doesn’t change the fact that the spirit of what they set out to accomplish is both admirable and necessary.

Whether or not they’ve gone about it in the right way is the only question at hand. The only reasonable thing to do at this point is to continue working towards establishing national models for assessing the current state and supporting the future of broadband. Whether we use Connect Kentucky as the be-all-end-all model, a source of inspiration, or a series of lessons learned on what not to do, what they’ve been trying to do has to be done one way or the other.

One of the most lasting impressions from my trip to Vegas for CES last week was the prevalence not just of cool cameras and phones on display but those already in the hands of an attendee base heavily invested in the creation of content.

It was truly stunning how many people had their cameras and camcorders and audio recorders all out and recording, capturing different aspects of the days' events.

Presumably, many of them were trained on creating content for a blog or tech news site. So they're not just creating content, they're doing so with the intent of distributing that content over the Internet.

And perhaps none were more impressive from a tech-geek perspective than the two men behind one of my favorite blogs: Technology Evangelist. Ed Kohler and Benjamin Higginbotham often bring a full HD camera crew along with them to events, but this time around all they had was an unbelievably fancy looking phone through which they not only recorded high quality video but also uploaded it directly to their blog.

Needless to say, my "only makes phone calls with the occasional text message" cell phone was feeling rather inadequate while discussing these matters over a beer at the Venetian.

But back to my main point, which is that we live in an era where anyone and everyone can be and increasingly are trying to contribute their own perspective to the public, and in doing so we're all creating incredible demand for bandwidth in order to support the transfer of all this data.

The problem, though, is that with so many new voices it can be nigh impossible to keep track of everything that's being said.

Case in point, during a recent presidential debate one online forum registered enough comments that it'd take you more than 10 hours to read them all, despite the event itself only being a couple of hours long.

As another example, after the New Hampshire primary I jumped onto Google News and saw more than 4000 stories related to that news, and those were just the more reputable news sites. I'm sure when you add in all the personal opinions on blogs and message boards the number jumps at least into the tens of thousands.

And if the trends shown at CES continue, of more and more ways to capture audio, video, and pictures getting integrated into more and more devices, there's little reason to think this trend will slow any time soon.

We truly are entering a world where everyone's a content producer, where everyone can try their hand at the process of creating news, but in doing so we're also going to have to figure out a way to sort through all this noise in order to find the truth.

What I Liked at CES - Innovative Apps from South Korea

| 1 Comment | No TrackBacks

Given that country’s leadership in the deployment of fiber to the home, I suppose it shouldn’t surprise anyone that two of the most innovative broadband-connected devices on display came from South Korea.

The first is called the Virtual Studio from Darim. What it allows you to do is create presentations for the web where it looks like the speaker is standing in front of a futuristic screen that can virtually display any content, like PowerPoint slides or video.

The experience it creates is akin to a weatherman in front of a map, only you can also do things like reorient the position of where your head shows up on the screen and dynamically switch between content on the fly, complete with cool 3D transitions.

The system itself is a box that you connect a camera and a broadband connection to, controlled with a separate LCD touchscreen. It includes a hosting service and a host of value-added possibilities, like the ability to install a telepresence videoconferencing system alongside the presentation system.

While many of Darim’s products are for high-end professionals, Virtual Studio is aimed at the broader small-business market, with its basic entry-level version clocking in at $13,000. Still a significant investment, but much cheaper than the tens or hundreds of thousands of dollars analogous broadcast equipment can cost.

The other innovative Korean product I wanted to mention is SBN Tech’s Video Phone. (I should admit this is my generic name for it; there wasn’t any literature I could take away, and their website doesn’t seem to list this specific product.)

While videophones aren’t new, what I liked about theirs is first off its 10-inch touchscreen. It’s the biggest screen I’ve seen on a videophone and the picture looked great, though I can’t speak for its performance as they were running it over a closed network, not the Internet. Another advantage to this device is its attempt at interoperability. While a language barrier prevented me from understanding all of what they were saying, I did catch that they’ve built this videophone to work with those made by D-Link. Plus, like seemingly everything these days, it has built-in Wi-Fi, so there’s no need to connect to a computer to get online.

What I also liked was the fact that while today the device primarily just makes calls, it has the capacity to run additional applications on top of or around the live video. While it may be a challenge figuring out which applications to pursue first and how to do so without overly complicating the device, this was one of the few products I saw on the show floor that gave me a real wow factor.

The biggest issue I had with it, though, is its cost. I can’t remember the exact numbers, but it’s going to be $300-400 for the videophone, and then another $30 or so a month for the ability to make calls. That seems like a lot, especially per month, as there are a number of free videocalling options out there, but then I realized that $30 a month is roughly what people pay for voice service. So perhaps that number isn’t so high if we assume that some day videophones might become more prominent than voice-only ones.

Here's a short video showing what it looks like:

What I Like at CES - IBM's 3D Future

| No Comments | No TrackBacks

Instead of burying you with one mega-post I thought it best to split up my comments on specific products I saw out at CES into a series of shorter posts.

First up, I found myself somewhat surprised that one of the most innovative things I saw at the show came not from a new, small company or a large media company but instead from IBM. More specifically, their pursuit of the 3D Web.

As a quick background, IBM has made splashes recently with their strong interest in the virtual world of Second Life, where users create avatars in order to run (and fly) around a virtual 3D world in which they can interact with other users and build a wide variety of things, like clothes, buildings, and actions.

Also a while back I’d read about IBM’s internal project entitled the Metaverse, which was a 3D world intended for use by IBM employees to communicate with other IBM employees in a 3D environment.

So it’s not like I hadn’t heard of this before, but I still found myself surprised when I came upon IBM’s booth and the most prominent thing on display were kiosks touting their work experimenting with and developing a 3D Web.

In talking it over with their representatives, I learned that IBM is highly committed to this 3D future, whether it’s developing a virtual presence for a wide range of possible customers (like retail or real estate, where having a virtual mockup of a space can help close a sale) or building for internal use.

I find their internal use particularly interesting, as what started out as a skunkworks project has grown into an environment that supports the interactions of more than 5000 IBM employees. When I asked what the greatest benefits of using 3D virtual worlds have been, they cited two primary areas: collaboration and modeling.

On the collaboration side, they cited the positives of having more visual cues indicating who wants to be talking during a multi-person meeting (something that can be nigh impossible on a conference call) as well as the extremely positive sign that often after meetings they find pairs of people wandering off or staying behind to continue conversations. This suggests the conversations they’re having are real and engaging, despite being done through computer-generated avatars.

On the modeling front, virtual worlds, and Second Life in particular, offer robust tools for creating in-world objects, whether they be boxes or chairs or anything really. Because of this, IBM has found these tools useful for creating 3D virtual models of real-world products that people can get a feel for prior to manufacturing or physical prototyping. This makes a lot of sense and seems like it could be an incredibly powerful tool for many distributed workplace situations.

All in all I walked away highly impressed at the commitment IBM has made to 3D virtual worlds. And quite frankly, I couldn’t be more excited to see what having the resources of a giant like IBM committed to this space will mean for the continuing evolution of 3D environments for practical business purposes.

Working on a longer post that details my reactions to specific products I saw while in and around the show floor of CES, but for now I wanted to share my overarching feelings.

To be frank, I was more than a little disappointed.

While I'll admit my inability to see everything CES had to offer, I feel comfortable claiming to have walked at least a third and probably more than half of the exhibit space, which is a large enough sample to draw this conclusion: there wasn't much there that used broadband in an innovative way.

Sure, there were fancy shmancy webcams, boxes that bring Internet video to your TV or pocket, a variety of security systems, and other boxes that help you manage and remotely access your content, but there was next to nothing that sent or received bits over the network to do anything genuinely new, beyond adding features to or streamlining existing functionality.

Also disappointing was the near total lack of anything broadband-related that enhanced something other than entertainment. This shouldn't have been surprising given the overall bent of CES, but it's still frustrating that we have yet to reach a point where a show like CES is littered with cool new technologies that leverage broadband to improve healthcare, education, and government. These areas weren't totally ignored, but at best they were represented by a small handful of companies.

But at the same time, the degree to which entertainment is increasingly wired to the Internet is staggering. We're talking about more and more set-top boxes that bring online content to the TV, TVs that connect directly to the Internet, cars with built-in media systems, cameras that upload photos and video all on their own, home servers for backing up and managing your media, USB thumb drives with Wi-Fi that automatically back up to an online service, and more.

I don't think any of these things are going to improve society to any great degree, but they're all likely to find an audience eager to leverage their capabilities and in turn create more demand for bandwidth.

And despite my complaints I did find some cool things, which I look forward to sharing with you all in tomorrow's post.

As mentioned yesterday, I attended a panel of representatives of the biggest broadband providers in the US: Tom Tauke of Verizon, Jim Cicconi of AT&T, Joseph Waz of Comcast, Mr. Ali of Sprint (my apologies to him if he's reading this: I didn't write down his first name, it's not in the program, and Google has failed me), and Bruce Mehlman, co-chair of the Internet Innovation Alliance.

The panel started with short presentations by each. Here are a few notable items I took away:

- Waz mentioned that four out of ten of their customers are still analog, but that number is decreasing, and as it continues to do so it will allow Comcast to recapture bandwidth. How much that might improve the performance of their network I didn't get a chance to ask, but I'm going to follow up with him to find out.

- Tauke focused primarily on demand for bandwidth, showing a bar graph that compared the amount of bandwidth needed for different apps contrasted against the capacity of cable vs. DSL vs. FiOS. His overall assessment is that with demand continuing to grow, all carriers are and will continue to be forced to invest in their networks.

- Tauke also specifically mentioned the challenges of and need for a better way to reach rural Americans who don't have access to broadband today. I found this somewhat surprising given Verizon's push to divest itself of its rural assets, but at the same time heartening that they seem interested in directly engaging with trying to find a solution to this problem.

- Cicconi shared AT&T's attempt at creating a graphical representation of the Internet, citing the 320,000 nodes that make it up, and briefly discussing the fact that the Internet's a network of networks and therefore not controlled by any one person or company. He also alluded to AT&T's own investment to keep up with demand as it works on quadrupling the capacity of its overall network (read more about this here).

They then started in on a series of questions from the moderator. As the answers sometimes made points that were somewhat tangential to the questions, I'm just going to continue my use of bullet points to highlight those tidbits I found most interesting.

It's amazing how combining a seemingly endless show floor with limitless choice in nighttime activities can derail the best intentions of any blogger to post throughout the day. Never worry, though, as there have already been many interesting, enlightening experiences I'm looking forward to sharing with you throughout the week.

The first I want to delve into is the bizarre juxtaposition of the two sessions I was most interested in attending yesterday.

One was entitled "National Broadband Deployment: Are We There Yet?" and the other was "Finding the Right Bandwidth."

"National Broadband Deployment..." featured, among others, a representative from Google, the Consumer Electronics Association, and non-elected representatives of Congress and the Executive Office of the President.

"Finding the Right Bandwidth" featured a slate of wireline and wireless network operators, including AT&T;, Verizon, Comcast, and Sprint.

The unfortunate challenge I faced was that the first session was at noon at the Las Vegas Convention Center, and the second was at 1:30 at the Venetian. Being able to get between the two in less than half an hour was an uncertain affair; both buildings are huge, it can be a scrum to get a taxi at the convention center, and the only other option--the monorail--I was completely unfamiliar with.

So, much to my chagrin, I made the decision to forgo the "National Broadband Deployment..." session for the "Finding the Right Bandwidth" session in the hopes of finding a more interesting discussion regarding the use of broadband.

Looking back, I admittedly regret the decision to some degree, as the second session didn't plow much new ground, especially in the area I'm most interested in--the supply and demand of bandwidth. (I'll share my notes from that session in my next post.) But that's not what frustrated me most.

That honor goes to the unintentional symbolism of having two sessions about the availability (and, to a lesser degree, the use) of broadband, each largely featuring a different side of the debate (though Verizon and NetCoalition, a group of ISP organizations, were on stage as well), take place on opposite sides of the conference.

It's almost as if the overly polarized, dialogue-starved reality of the broadband debate became personified in the scheduling of these two panels: the arrangement paralleled the tendency of the two sides on issues like net neutrality not to engage in straightforward, productive dialogue.

I'm not trying to blame anyone for this happening, especially since I understand on a much smaller level the challenges of constructing a conference agenda that is inevitably imperfect due to availability and scheduling and the like.

But I couldn't ignore the irony of this arrangement.

The truth of the matter is that this intellectual stalemate that has put a chill over substantive telecom debate has to end. There are just so many issues that are going to have to be worked through over the next few years as the Internet continues its rise to prominence as the dominant medium of the 21st century.

If we want to have any hope of resolving these issues and finding the best path to America's broadband-enabled future we need to create an environment where people and companies can state their opinions and be heard and thoughtfully considered, where we can all disagree yet find a way to focus less on our differences and more so on our similarities, where we can uncover the threads of truth that connect us all and that we can all agree on.

And to do this, the only way is to have all parties come together and engage in a civil, upfront dialogue where we all respect the opinions of others, state our own opinions eloquently and without vitriol, and understand that on some issues compromise and a willingness to evolve one's positions is the only approach that will allow us to find those truths that can ultimately be what's best for our country rather than what's best for an individual company or person.

So here's to 2008, a year that I will both hope and fight for to be the time when we finally all come together, acknowledge the importance and validity of all parties involved in making the Internet great, and find a way to help everyone participate in and profit from a broadband-enabled future.

I'm on the ground in Vegas preparing to head out to explore what many wondrous tech treasures the mega-conference CES has to offer.

Last night, while walking the Strip, my mind started to wander, taking in the sights and sounds of a casino-lined midnight street. For whatever reason, this carnival atmosphere got me thinking about broadband and the similarities between it and Vegas.

Vegas is an amalgam of stuff built on top of stuff, not to the extreme of a New York City, but enough so that you can see how growth has been organic, driven not by a central plan but by the interests of companies and individuals looking to create what they think is the best way to part people from their money.

Through this uneven growth, many great successes have been realized and wondrous sights created, but you still can't shake the haphazard feeling of how today's Strip came to be.

This was the first thought that got me thinking about broadband and the way it's been built out: a series of private companies and individuals deciding what they think is best, without any semblance of a central, organizing plan.

There's another feeling I can't ignore when I'm in Vegas: the shallowness of it all. It's undoubtedly America's playground, but for what purpose? Do people feel satisfied when they leave (questionable)? Is there any real purpose here other than mindless entertainment (doubtful)?

Again my thoughts bled over to broadband, as I couldn't ignore the reality that to date the Internet's impact on society has been less than substantial: its use remains largely limited to delivering light entertainment, videos of dogs chasing their tails and games that only help pass the time.

But in thinking this through further I came upon one huge difference between broadband and Vegas: in Vegas everything is geared around the central precept of cajoling people into spending as much money as possible. You can't go anywhere without finding a new way to spend your money; in fact, most of it's been designed so as to force you to walk past more opportunities to do so than you might like. And the casinos have spent billions in an attempt to create environments in which you're compelled to reach into your wallet and get into the game.

Now contrast this intent against broadband. To date, network operators have done relatively little to encourage the use of their core service of providing broadband access. There's little to no effort to get people to consume more bandwidth, to become "addicted" to its use or at least feel compelled to give it a try.

To draw from another Vegas analogy, I sometimes feel like network operators are in the business of selling timeshares. They want to get people to invest in timeshares, but not actually use them. Even more bizarre, they're fine if people show up, but they're not doing much to introduce customers to all the wonderful things that can be done there. And overall their focus seems much more on building out their services to influence public opinion and drive new customers rather than trying to maximize the experience of their existing customers.

I'll admit this analogy may not be a great one as I don't literally mean to draw parallels between network operators and timeshare peddlers, but I used it to drive home my point: there's a whole lot more network operators could, and probably should, be doing to encourage demand for their services.

In some instances, there is some small movement in this direction, but what we need is a seismic shift, something far beyond incremental steps, if we are to fully embrace the possibilities of broadband.

And despite my seeming to rail against network operators in this post, in actuality it's my belief that we won't be able to get where we want to go as quickly as we need to without the network operators being more engaged in the pursuit of increasing demand for bandwidth.

They've got the direct relationships with customers; they've got the networks and the ability to introduce a new era of quality of service online; so they can be the greatest partners in the push to make America broadband-enabled. If only they'd take the lesson on full display in Vegas: once you get the customer, you should be working hard every day to increase their demand for your services.

But enough big-picture thought for now; it's time for me to focus on the task at hand: heading out to the conference to explore the latest and greatest in consumer electronics.

I'm going to have my laptop with me and am hoping to get some posts up throughout the day, but if not you can be sure I'll post a daily roundup of my experiences today and tomorrow in America's playground, now littered with the coolest cutting-edge toys.

Wish me luck!

I've come across another angle through which to consider the amount of bandwidth needed to support a 21st century society.

This article highlights how upgraded telescope sensors are providing the SETI (Search for Extraterrestrial Intelligence) project with a 500-fold increase in data from their search of the cosmos for people to talk to.

I think this is an important trend to note: the increase in data isn't only coming from more users coming online and the introduction of more bandwidth-intensive applications; there's also a push by almost anyone collecting research data from digital instruments to improve those instruments and therefore collect more data.

I mean, what SETI's doing is nothing new; they're just now able to record and process data that's many orders of magnitude bigger and more detailed than they could in the past.

And this is data that's likely to go directly out over the Internet, as the SETI project has long relied on the power of distributed computing through its SETI@home initiative to process the giant data sets its research creates. (As a review, distributed computing relies on using the idle processing power of many lower-powered, networked computers rather than one ultrapowerful supercomputer.)
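The distributed-computing idea is easy to sketch. Here's a toy Python illustration (the names and the trivial "analysis" step are my own invention, not anything from the actual SETI@home client): split a big batch of samples into work units, let independent workers process them, and merge the partial results.

```python
# Toy sketch of distributed computing in the spirit of SETI@home:
# a coordinator splits the data into work units, many independent
# workers each process a unit, and the results are merged.
# (Real systems ship units to volunteer machines over the network;
# here local threads stand in for those machines.)
from concurrent.futures import ThreadPoolExecutor

def process_unit(unit):
    # Stand-in for real signal analysis: just sum the samples.
    return sum(unit)

def distribute(samples, unit_size=1000, n_workers=4):
    units = [samples[i:i + unit_size]
             for i in range(0, len(samples), unit_size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partial_results = list(pool.map(process_unit, units))
    return sum(partial_results)  # merge step

# Same answer as crunching everything on one machine:
assert distribute(list(range(100_000))) == sum(range(100_000))
```

The payoff, of course, is that when the work units are expensive, the workers can run in parallel on machines all over the world--exactly the property that turns idle desktops into a supercomputer.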

The simple truth is that we're smack dab in the middle of an exponential growth curve in the amount of data that needs to be stored and processed. And while it's been ramping up for a while, we're now entering a time of truly epic growth, especially as new instruments and technologies allow us to gather more data than ever before.

Hey look, 4G technology--one of the many new wireless standards in development--hit 173Mbps in a recent field trial.

That's great. Having faster wireless broadband would be a wonderful thing.

But every time I see a story like this I find myself grimacing as I worry what fuel it may add to the fire that someday we'll live in a world without wires.

So I can't help but continue to hammer on this point: wireless is essential for anywhere access to the Internet but it doesn't have sufficient capacity to be the primary conduit.

How do I know? Well, 173Mbps may sound like a lot, but you can't forget that that's not per user; it's per tower or node. So the more users that try accessing the Internet through a given tower, the lower the speeds they'll actually see.
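A back-of-the-envelope calculation makes the sharing problem concrete (the even split is an illustrative simplification; real cellular schedulers are messier):

```python
# Shared capacity: a tower's throughput is divided among the
# users on it, so per-user speed drops as usage climbs.
# (Illustrative even split, not a model of any real deployment.)
def per_user_mbps(tower_capacity_mbps, active_users):
    if active_users < 1:
        raise ValueError("need at least one active user")
    return tower_capacity_mbps / active_users

# The 173Mbps field-trial figure, shared evenly:
for users in (1, 10, 50, 100):
    print(f"{users:>3} users -> {per_user_mbps(173, users):.2f} Mbps each")
# At 100 simultaneous users, each gets well under 2Mbps.
```

One fiber strand doesn't face this particular squeeze, which is part of why I keep insisting wireless is a complement, not a substitute.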

I feel the same way when people start talking about the latest and greatest copper technologies. For example, DOCSIS 3.0 is being hailed as the savior of cable systems in the face of fiber-to-the-home deployments like Verizon's FiOS. Sure, it promises 100Mbps to the home, but getting there won't be free, and the long-term outlook for capacity on copper is cloudy. The outlook is even more uncertain for technologies like Broadband over Power Lines, which are totally unproven in delivering 100Mbps speeds to homes over great distances.

Let's now look at this against the crystal-clear backdrop of fiber. An optical fiber can carry not just megabits per second but gigabits, even terabits (a gigabit is a thousand megabits; a terabit is a million). In the lab, a single strand of fiber has shown sufficient capacity to support all the world's Internet traffic.

I mean, this is just such a no-brainer to me: the Internet is fiber optics, so the goal should someday be a fiber optic cable to every house. No other last-mile access technology offers the limitless, future-proof capacity of fiber, so unless you think demand for bandwidth is simply going to stop, the only conclusion is that fiber to the home has to be the ultimate goal of any national broadband strategy.

For me that means the only question left to answer is whether we'll get these fiber optic connections in the next five years or the next fifty.

Again, this isn't an effort to discredit wireless and copper technologies, but instead an attempt to set wireless in its place as a complementary technology and to frame copper as a transitional technology relative to the eventual goal of a fiber pipe to every home.

It never ceases to amaze me how often the discovery of an optimistic story about the Internet comes alongside a pessimistic tale of failure elsewhere.

Today not one but two colleagues mentioned to me the recent partnership of Netflix and LG Electronics to develop a new set-top box, along with other devices, that enables the delivery of movies from Netflix over the Internet to the TV.

Yesterday I read about the demise of Wal-Mart's digital download service for movies.

Anyone else find this funny? The nation's largest retailer of DVDs couldn't make digital downloads work just as the nation's largest DVD-rental-by-mail company decides to give it a go.

This isn't an apples-to-apples comparison, though. Netflix has already been delivering movies to the computer as a free add-on to its mail service, under the simple premise that a movie delivered online saves the cost of mailing media. Wal-Mart, on the other hand, was totally new to the space, having to take a big step outside its core competency of making the delivery and retailing of physical goods as efficient as possible.

At the same time, Wal-Mart isn't the first company to underperform or completely fail at establishing a viable business in getting customers to pay for video delivered over the Internet, so despite Netflix's built-in user base, existing revenue, and relationships with content providers it's hard to get too excited about another movies-on-a-box offering. That said, eventually someone has to be a big winner in this space and Netflix likely has as good a shot as anyone.

But for now that's mere speculation as their announced partnership did not include any ready-for-market products.

Even still, my point holds true: too often stories of Internet success seem accompanied by Internet failure. For every opportunity seized so many others miss out despite the best of intentions.

Some of this failure is good, as it allows lessons to be learned that can be applied to later, better iterations of an idea; other failure is welcome, as the free competition of the Internet marketplace weeds out the weakest products and services, as is arguably the case with Wal-Mart's digital downloads.

But too often failure is attributable to a lack of awareness and acceptance among users, or resistance from entrenched interests, or a basic inability to understand and utilize the potential the tools of the Internet have to offer.

What we need is a fundamental reimagining of how we approach broadband initiatives, where we stop considering things as individual initiatives and start thinking about them as part of a larger whole--shifting our thoughts from "How can I use this cool app?" to "How can I improve what already exists through the use of broadband?"

Only then will we move past the one-step-forward-one-step-back pace of Internet development and make possible a society where the use of broadband is intrinsically woven into the fabric of our lives, leaving us all the better for it.

I've written about the exaflood more than once in the past, about how we're facing an oncoming explosion of demand for bandwidth to support all sorts of broadband applications.

Most recently I wrote about the need to stop talking about the exaflood as a maybe and start recognizing it as the reality it is.

Well if you're still on the fence about this, consider the findings discussed in this Computerworld article.

While I'm not sure I can recommend reading the article--it's filled with pretty technical info--the gist is that everyone from entertainment companies to universities to hospitals is dealing with data overload in the attempt to establish digital archives.

Here's a startling statistic: "According to Milford, Mass-based analyst firm Enterprise Strategy Group Inc., private sector archive capacity will hit an eye-popping 27,000 petabytes by 2010."

For those keeping track at home, 27,000 petabytes equals 27 exabytes. But for now that's beside the primary point: this is a lot of data, and finding ways to efficiently store it all has already become a significant challenge.
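For anyone who wants to check the unit math (decimal prefixes assumed; the 100Mbps link speed below is just an illustrative figure, not from the article):

```python
# Storage-unit sanity check: 1,000 petabytes = 1 exabyte
# (decimal prefixes: 1 PB = 10**15 bytes, 1 EB = 10**18 bytes).
PETABYTE = 10 ** 15
EXABYTE = 10 ** 18

archive_bytes = 27_000 * PETABYTE
print(archive_bytes / EXABYTE, "exabytes")  # 27.0 exabytes

# And why this matters for networks: moving just one petabyte
# over a 100Mbps link (an illustrative speed) takes a while.
seconds = (PETABYTE * 8) / (100 * 10 ** 6)
print(round(seconds / 86_400), "days")  # roughly 926 days
```

Numbers like that are why archives of this scale strain not just storage systems but the pipes feeding them.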

What's driving this growth are things like archiving email, video libraries, medical test results, and beyond. What do all these applications have in common? Their usage is on the upswing, further driving demand for storage.

Now you may ask what all this has to do with the need for broadband. Well, simply put, all this data needs to get into its intended archives in the first place, which means either being carried in on a physical disk or sent over a network. Additionally, the data in these archives is most often meant to be accessible by others, and while not all of these archives are connected directly to the Internet, many of them are, and arguably all of them eventually should be.

So here's an example of one broad category of applications that by the end of the decade will potentially be sending exabytes of data over the network. And this application largely has nothing to do with real-time delivery of data, and the projections don't rely on mass adoption by new users. This is simply an extension of what's already happening today.

When I look at the exaflood I do so in terms of the amount of data people want to send over the network, and from that perspective there can be no doubt the exaflood is on its way, assuming it's not already here.

The unresolved question is whether or not the exaflood will overwhelm the networks that have created the opportunities to send data around at high speeds. That's a more contentious issue, and one that, unfortunately, we'll either (a) wait for, hear nothing from, and therefore ignore, or (b) have crash over our unsuspecting, unprepared heads.

This isn't intended as a threat so much as, in my opinion, an effort to raise the level of dialogue about how important it is to get more capacity into the Internet.

And I'm hopeful that at least on this issue we can all be in agreement: more bandwidth is a good thing and we should support every effort to get it.

Online Music Gets Less (More) Restrictive


There's some big news in the world of digital rights management, or DRM: Warner has become the third major record label to sign on board with Amazon's DRM-less music download service.

It seems like a tremendous development--a sign that the music industry is recognizing that perhaps DRM is not the panacea for all its problems, and that being more liberal about what customers can do with their music might not be such a bad thing.

And then I read this story, about how the RIAA has opened a new front in its fight against pirates, now claiming in court that it's illegal for a customer to copy music from a CD they legally purchased onto their computer for personal use.

So what they're saying is that when you buy a CD, the only rights you're purchasing are the rights to play the music from that CD; you're not buying any rights to the music itself.

Now this claim doesn't surprise me too much. It's always been the case that when you buy a movie or album on disc, you're not buying the content so much as a license to play the video or music on that disc. But at the same time, users have years of experience taking the songs they buy on CD and moving them onto other devices, whether a computer or a personal media player like an iPod.

Because of this, it's flabbergasting to me that only now would the RIAA try to claim infringement over what has become a very common user behavior.

The arrogance of believing they might not only win this case but also usher in a new era in which consumers willingly pay extra for the right to play music they've already bought on their mobile devices or computers is galling. When will the recording industry learn the age-old adage that consumers will only pay more when they get more?

It's as if they don't understand how precarious a position they're in. More and more artists are recognizing that they don't necessarily need major record labels to make money creating music. And more and more consumers are realizing that there's a whole world of music available on the Internet that they didn't have access to before and can now get without having to worry about the scheming of major corporations.

Perhaps the most telling story I've read about this whole space is this one, which details how given the current pricing scheme of a site like iTunes it would cost a user tens of thousands of dollars to fill up the capacity of the latest media players with new music. And you know what? There are rumblings that record companies don't think the 99 cents per song iTunes is charging is enough.

There's undoubtedly a sense of madness with a touch of desperation in this space, but if record companies don't open up their eyes to the reality of their situation, their constant machinations to "protect" their content may lead to their untimely demise, when the music makers and their listeners finally realize they don't need major corporations to tell them what they can listen to and how they can listen to it.

Yet another example of how broadband is upending a traditional marketplace, introducing both new opportunities and new challenges to the status quo.

To start off the new year, I'd thought about putting together a post of all the 2007 year-in-review and 2008 predictions articles I'd come across as they relate to the Internet and the use of broadband.

While perusing my RSS reader yesterday, I discovered that I was far from the first to do this, perhaps best exemplified by Cynthia Brumfeld over at IPDemocracy.com. She's produced two charts: one of 24 Year-in-Review Articles and one of the Top 16 Articles That Feature 2008 Predictions.

So, having been beat to this proverbial punch, instead I want to highlight a few similar articles not included on her list that for good or bad made me think more deeply about the recent past and oncoming future of the Internet.

Consumer Apps: 2007 Year in Review
This ReadWriteWeb.com post highlights, in my opinion, how the broadband app space is still too focused on me-too applications that don't do much to further society. Social networking is the main area of discussion, followed by talk of personal publishing.

These apps are certainly not bad things; it's just that they tend to create a world that's somewhat separate from, even tangential to, the rest of society. People are expected to join these networks to be part of a group, but to what end? People are encouraged to publish writings about their lives, but who cares? It's not that social networks and personal publishing aren't being used for good; it's just that the majority of the effort put into them, by developers and users alike, doesn't seem aimed at any end other than filling up pages with content, creating shallow relationships with strangers, and sharing random tidbits with friends. Again, I'm not against these technologies; I'm just frustrated that we could and should be getting more good out of using them.

The last three trends they mention are IPTV, web office, and the iPhone. These trends really matter, as getting video over the Internet, finding new ways to be productive, and having mobile experiences that matter will be three of the biggest drivers of demand for bandwidth in 2008.

2008 Web Predictions
You'll find lots of interesting thoughts in this list of predictions from ReadWriteWeb.com. The only problem is that they all seem too insular--aimed at people in the know, describing things that will further the existing paradigm rather than introduce new opportunities.

Internet development, at least that which gets the most publicity, is still focused on "consumer" apps that deal with entertainment or communication. This reality is borne out by the fact that not a single prediction herein regarding web applications talks about the use of broadband in healthcare, education, or government.

I've decided that one of my missions for 2008 is to talk to more application developers in the hopes of convincing them to focus at least some of their efforts on developing things that could benefit society at large.

Drama 2.0 Predicts What Won't Happen in 2008
Is it a bad sign when the predictions article I agree with the most is the one predicting all the things that won't happen in the coming year? Predictions like "The Majority of Web 2.0 Startups Won't Develop Scalable Long-Term Business Models" and "There Won't Be Much Innovation" found in this Mashable.com post unfortunately ring too true.

And the reasons for these anti-trends are in large part the issues I've lamented earlier in this article. The Internet space is still too caught up in its own hype, developing within its own bubble, catering to the early adopters who are already there. If the Internet really wants to grow up, spread its wings, and find the market share needed to keep innovating, the broadband industry needs to start focusing more on engaging all of society with broadband apps.

Despite these concerns, I'm hopeful that '08 may be the year we see the first apps designed to be must-haves for everyone--apps that break free from the boys' club of the Internet to benefit society as a whole.

Mashable's 2008 Predictions: Mark's List
This list of predictions is still a little industry-centric, but I liked it because the predictions are all well-reasoned and explained and all seem likely to happen. If you're interested in websites becoming the new operating systems, the push towards common logins between applications, and/or the business of online video, I'd encourage you to read this post.

Looking back, looking forward: Best of 2007 and predictions for 2008
This article cites a decent mix of trends from 2007, including online video, open access, and casual gaming, but I have to admit my dismay at reading through their predictions for '08. Three out of the seven focus on the wireless space in some way, whether it's the devices or the networks.

The reason I bemoan this attention isn't that I'm anti-wireless--I think wireless access is a crucial piece of the broadband ecosphere. I'm just worried that we'll get so caught up in hyping mobile that we'll lose sight of wireline applications that demand more bandwidth, and that we'll be so distracted by the glitz of new mobile apps that we won't pay enough attention to the mobile apps that may benefit society: falling over ourselves about mobile video, for example, without considering the possibilities of mobile pain diaries or electronic medical records.

About this Archive

This page is an archive of entries from January 2008 listed from newest to oldest.

December 2007 is the previous archive.

February 2008 is the next archive.
