
ABOUT

App-Rising.com covers the development and adoption of broadband applications, the deployment of and need for broadband networks, and the demands placed on policy to adapt to the revolutionary opportunities made possible by the Internet.

App-Rising.com is written by Geoff Daily, a DC-based technology journalist, broadband activist, marketing consultant, and Internet entrepreneur.

App-Rising.com is supported in part by AT&T, but all views and opinions expressed herein are solely my own.


June 2007 Archives

June 4, 2007 5:17 PM

My Love/Hate Relationship with the Exaflood

A couple weeks ago, the Washington Post ran an editorial about the exaflood by Bruce Mehlman and Larry Irving, former presidential advisors on Internet-related issues and current co-chairmen of the Internet Innovation Alliance.

In it, they define the exaflood in the following terms:

“Yet as new content proliferates, today's high-speed connection could be tomorrow's traffic jam. The strain on broadband capabilities and the looming data deluge is often called the Internet exaflood. "Exaflood" stems from the term exabyte, or 1.074 billion gigabytes. Two exabytes equal the total volume of information generated in 1999. The Internet currently handles one exabyte of data every hour. This mushrooming amalgamation of data is pushing the Internet to its limits.”
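Before getting into my reaction, it’s worth unpacking those numbers. Here’s a quick sanity check of the editorial’s figures, sketched in Python (the one-exabyte-per-hour rate and the 1999 comparison are the editorial’s claims, not mine):

# An exabyte in binary units is 2**60 bytes, or 2**30 gigabytes --
# the "1.074 billion gigabytes" cited in the editorial.
gb_per_exabyte = 2**60 / 2**30
print(f"1 exabyte = {gb_per_exabyte / 1e9:.3f} billion gigabytes")  # -> 1.074

# At one exabyte per hour, daily Internet traffic would be 24 exabytes:
# twelve times the two exabytes the editorial equates with all the
# information generated in 1999.
exabytes_per_day = 1 * 24
print(f"Implied daily traffic: {exabytes_per_day} EB "
      f"({exabytes_per_day / 2:.0f}x the information generated in 1999)")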

Now, I have to admit, when I first heard the term “exaflood” a few months back, I hated it. The absolute last thing I’d ever want the explosive growth of the Internet to be compared to is a natural disaster. To me, it seemed too threatening, adding a foreboding “or else” to the larger debate around the need for continuing investment in broadband infrastructure.

Yet over the last few weeks, through discussions with colleagues and reading this editorial, my views have begun to shift about the term and what it can mean.

While “exaflood” is often used in conjunction with images of a congested Internet, unable to handle the growing traffic of information, flood analogies can also have a positive spin: a flood of praise, a flood of new customers, and so on.

A common thread between these two definitions is the sense of being overwhelmed.

Of course, being overwhelmed is rarely a good thing, even if whatever’s driving the flood is. For example, imagine I start making the coolest T-shirts in the world from my garage. Someone somewhere picks up on what I’m doing and starts promoting it on a popular website or a TV show.

All of a sudden, I’m flooded with more orders than I can fill. Customers calling in are greeted with busy signals. My email server overloads and online orders begin bouncing back to people. What should be one of the greatest days in my life has been negated by getting too much of a good thing.

Could much of this consternation have been avoided? Absolutely. I should’ve already had a manufacturing partner in place to help production keep up with demand. I probably needed to hire more people to answer phones. And I could’ve acquired more email capacity to keep from getting overloaded.

All this brings up another aspect of the flood analogy: in most instances, as long as you’re properly prepared, floods don’t have to result in disasters.

But also like a flood, we can’t properly prepare by focusing only on individual issues. We need to develop a comprehensive understanding of the big picture; otherwise we leave ourselves open to spending too much time fortifying some areas while neglecting the leaks in others.

So with all this in mind, I’ve come to view the “exaflood” not so much as a threat but as a rallying cry: a call to further educate ourselves about how the business and technology of the Internet work, so that when opportunities arise for the government to play a positive role in encouraging the continued growth and maturation of the Internet, it can do so to the best of its ability.

June 7, 2007 10:03 AM

Visualizing the Internet through Akamai

Just stumbled across an interesting section of Akamai's website.

Akamai is a content delivery network, or CDN, which manages a network of servers placed across the country and throughout the world. These servers enable the delivery of content, including webpages, live and on-demand video and audio, and applications.

The link above will take you to a visualization module they've created on their website that shows in real time the traffic going through their network. You can check out how many live and on-demand streams are in use and how many visitors they're serving per minute.

They've also got a real-time web monitor here that shows a map of the world highlighting where Internet traffic is particularly heavy. Also available is a mode where you can see what areas of the world have been hit most heavily by network attacks in the last 24 hours. (As I write this, Venezuela appears to be getting hit hard as they're the only country that's bright white.)

I can't confirm Akamai's claim that they deliver 20% of the world's Web traffic, but it wouldn't surprise me at all if that were true. And that's what makes these pages interesting, as they provide a window into what the Internet is doing at this exact moment.

For example, Akamai's currently serving over 30 million users a minute. If that 20% is correct, extrapolate that out and you can see that 150 million people are currently using the Internet.
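That back-of-the-envelope math is simple enough to make concrete. Here's a minimal sketch in Python (both inputs are Akamai's own figures, which I can't verify):

# Akamai's visualization shows roughly 30 million users per minute, and
# the company claims to carry about 20% of the world's Web traffic.
akamai_users_per_minute = 30_000_000
akamai_traffic_share = 0.20

# If users scale with traffic share (a big assumption), the global total
# is Akamai's figure divided by its share of the pie.
estimated_users_online = akamai_users_per_minute / akamai_traffic_share
print(f"Estimated people online right now: {estimated_users_online / 1e6:.0f} million")  # -> 150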

Of course this is an incredibly inexact science, and relative to the global population, the number of people online right now still represents only a small percentage.

But even still, I can't help but marvel at the thought of how far the Internet has come as a legitimate mass medium, or enabler of mass media, or whatever you want to call it. 150 million is a lot of people doing the same kind of thing at the same time all across the world.

June 11, 2007 10:58 AM

Broadband Alone is Not a Panacea

I stopped by the Broadband Policy Summit last week and had the opportunity to listen to a keynote address by FCC Commissioner Robert McDowell.

While he said a lot of interesting things about what we need to do to encourage the deployment of broadband, it's what he didn't say that caught my attention most: Not one word about encouraging the use of broadband applications.

Now, I have to give him credit that he didn't fall victim to the easy trap of talking in vague, grandiose terms about the potential of telemedicine, egovernment, and the like.

But even still, not one word about broadband applications.

When people talk about broadband deployment it's often couched in terms that suggest the deployment of broadband can be a panacea for all of society's ills. "If only we had more bandwidth, we could do all these amazing things."

I can't help but disagree with this sentiment on multiple levels:

1. If you build it, they might come, or they might not.
While it's true that demand for bandwidth generally trends in line with the increased availability of supply, we have not yet seen the revolutionary, nationwide adoption of Internet applications promised by the increasing penetration of broadband. For the most part, applications are still being used in small pockets. An implementation of videoconferencing here, the use of webcasting there.

There's a whole lot we can be doing with the broadband networks we have today that we aren't, and we can't assume that by deploying advanced broadband networks a wellspring of new applications will appear all on its own.

2. Through more broadband the US can be more competitive in the global economy, assuming we actually utilize these networks.
McDowell spent a good chunk of his time denouncing the gloom and doom of the US falling down the OECD's broadband penetration rankings. He opened my eyes to a number of shortcomings about what these rankings mean, but in the end the debate over how much broadband penetration the US does or doesn't have seems moot.

For me, what matters is less how many people are using the Internet than how they're using it. For example, imagine we reach a day where 100% of Americans subscribe to broadband at home, but no one's using the Internet for anything other than email, checking the weather, and watching funny clips on YouTube. In this scenario, what have we really accomplished? How much of a return would we as a society be getting from the investment and education spent on reaching this goal of 100% penetration?

3. We must encourage the deployment of broadband, while simultaneously promoting its use.
If we want to realize the full potential of the Internet, we can't do so solely by championing the deployment of advanced broadband networks. We also need to work equally hard at finding ways to educate and inspire the public to make fuller use of the bandwidth these networks provide.

We should be exposing kids in the classroom to the possibilities of working in the digital economy, not sitting back and waiting for a handful of geniuses to figure it out on their own. We must identify opportunities to help support the adoption of broadband applications by cash-strapped organizations involved in furthering the public good, like medical facilities, schools, and government agencies. We need to be incentivizing private companies to start looking more intensively at how the use of broadband applications can drive new efficiencies in their businesses.

Broadband alone will not solve all of our problems. But deploying broadband while making every effort to further its use and adoption does hold tremendous potential for exponentially increasing the positive impact of the Internet on all facets of society.

June 12, 2007 11:32 AM

Impressions from the P2P Summit

Hello from cloudy California! I'm out here attending the P2P Summit put on by the DCIA and the Digital Hollywood Summit, and I'll do my best to keep you all up to date on the happenings of these exciting events.

Yesterday I attended the one-day P2P Summit, which pulls together many of the players in the P2P (aka peer-to-peer) space to talk about where the industry is and what it needs to do to continue moving forward.

While I could go off on one of my long-winded rants about a number of interesting topics that were brought up, instead I want to provide you with a flyby of some of the interesting things that were said throughout:

- When you analyze the usage trends around the illegal distribution of popular content like first-run TV shows, it looks like an established marketplace: demand for shows follows regular patterns after they first air, demand for full seasons is greater than demand for individual episodes, and a handful of shows draw the majority of the traffic. (Unfortunately, I can't remember who said this, as I walked in on the middle of his presentation.)

- Eitan Efron from Oversi, a company that enables P2P caching for ISPs, made an interesting observation: while traffic shaping technology has begun to make a lot of noise as a way to reduce the impact of P2P traffic on networks, the advent of P2P streaming technologies like Joost makes this more difficult. Because streaming users are watching video in real time, if a network operator were to try and squeeze that traffic down it would create an immediate, noticeable degradation in quality, whereas squeezing P2P downloading traffic simply means it takes longer to download the file.

- In between sessions, I was chatting with Shelly Palmer, an outspoken thought leader in the transition to TV 2.0, and mentioned how my proximity to DC has led to a growing interest in policy issues. His response, and I paraphrase here: "We're all going to be involved in policy soon." It's his belief that too many established interests are already beginning to fight over Internet regulation, and that if the online industry wants to ensure we don't end up with federal legislation that hurts the Internet rather than helps it, we'll all have to step up and make our voices heard in order to better educate legislators as they navigate these complex issues and make tough decisions.

- On another non-P2P-related note, Jonathan Lee from MediaDefender, a provider of anti-piracy services and technology, made the interesting observation that just a few years ago one of his developers commented on how they'd never need more than the T1 line they already had. Today, the same company requires the equivalent of 6,000 T1 lines. (See the quick back-of-the-envelope calculation after this list.)

- Back to the world of P2P, Katie Mitic from Skyrider, a company focused on helping organize and monetize P2P traffic, highlighted how if you add together the install bases of all the P2P clients in the world, it's bigger than any other application out there. For the most part the different clients don't talk to each other, and I can't confirm her claims about having the biggest overall install base, but even still it's remarkable to think about how what once was a fringe technology has now become relatively mainstream, which strongly suggests P2P traffic isn't going anywhere.

- Along these same lines, another speaker, who will remain anonymous due to my shoddy note-taking, made the observation that there are 100-150 million searches happening on each of the major P2P networks every day, a volume equivalent to that of a search giant like Yahoo! and further proof of how far P2P has come.

- Conversely, Michael King, CEO of Abacast, a company that enables live P2P content delivery, discussed how, while his company's technology is designed to scale to support huge audiences, he struggles with making this pitch to hesitant customers when his biggest test case topped out at 10,000 simultaneous viewers. That's not to say they haven't tried hosting bigger events; rather, this highlights the challenge of getting a big group of people to show up and do the same thing at the same time on the Internet, especially for a live event, which has a distinctly limited window of opportunity for users to join in.

- On the same panel, Chuck Kalmanek, who works in AT&T Labs on issues related to network performance, showed a graph highlighting a fascinating reality: while P2P traffic has long been the biggest driver of bandwidth consumption, the recent rise of video sites like YouTube has just in the last few months pushed HTTP traffic past P2P, and if the recent growth curves hold true, that gap may widen in the coming months.
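Finally, a footnote on the MediaDefender anecdote above: the jump from one T1 line to 6,000 is easier to appreciate in raw bandwidth terms. Here's the quick math in Python (the 6,000-line figure is as quoted from the panel; the T1 rate is the standard 1.544Mbps):

# A T1 line carries 1.544 Mbps. MediaDefender reportedly went from a
# single T1 to the equivalent of 6,000 in just a few years.
T1_MBPS = 1.544
lines_then, lines_now = 1, 6000

bandwidth_then_mbps = lines_then * T1_MBPS
bandwidth_now_gbps = lines_now * T1_MBPS / 1000
print(f"Then: {bandwidth_then_mbps} Mbps; now: ~{bandwidth_now_gbps:.2f} Gbps, "
      f"a {lines_now // lines_then:,}x increase")
# -> Then: 1.544 Mbps; now: ~9.26 Gbps, a 6,000x increase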

June 13, 2007 11:40 AM

Trendspotting from the Digital Hollywood Summit

Yesterday I attended the first full day of the Digital Hollywood Summit, and what a day it was! It began with me sitting down to enjoy a blueberry bran muffin and winding up in conversation with a woman who manages former Playboy bunnies, women who make their livings in large part from subscription websites where they upload photos and video of themselves in various states of undress. And it ended with dinner at a Mexican restaurant on the Santa Monica pier with one of the forefathers of online gambling.

In between, I bore witness to a series of panels chock-full of experts discussing the past, present, and future of online content delivery.

Here's a rundown of some of the more interesting tidbits I gleaned:

- An oft-discussed topic in the content delivery space is the willingness of consumers to watch long-form video on their computer screens. While attending a session on Television 2.0, Rishi Malhotra of HBO added a new wrinkle to this line of thought: "It's not about the screen you're looking at so much as the chair you're sitting in." I found this to be a very savvy observation that deflects attention from the nature of the screen towards the nature of the viewing experience.

- Along related lines, in a session about how Hollywood can enable the next level of consumer entertainment experience, Lewis Henderson of the William Morris Agency made the general comment that we no longer live in a world where screens should be looked at separately. Today, all screens can be monitors for displaying content of all sorts, and we shouldn't hold on to the outdated distinctions between them.

- I generally agree with Henderson's comment, though in a later session, Silvio Scaglia of Babelgum made the bold statement that, "TV over the Internet cannot compete with traditional TV. That's why we should focus on different things for online video than simply replicating what's on TV." This suggests the need for more differentiation between the screens. In the end, I think both sides are right, though I'm not yet ready to place any bets on how these distinctions will play out over the next few years.

- While much of this event is filled with positive energy about how far the industry has come, there was also a consistent thread of speakers expressing their concerns about the many challenges online video faces moving forward on the road to more mainstream adoption.

Jordan Levin of Generate gave a historical context in saying, "It used to be hard to get a show made and on the air, but it was easy to get attention." Today, he continues, it's easy to make a show and put it online, but it's incredibly difficult to get the attention of users.

The challenge of finding content amid an exponentially growing amount of available content will make it "harder and harder to achieve mass audiences," or so said Jared Hoffman, also of Generate, on a different panel.

John Penney of HBO highlighted how the overall pie for advertising dollars is relatively static, with about $65 billion in TV dollars and $65 billion in direct marketing dollars that anyone trying to build an online content business based on advertising has to fight over. And much like the stock market, "Whenever there's a gain by one party, someone else is likely losing out on money."

- My final thought for the day is a totally tangential observation I made: while I believe they're recording the audio of all the sessions, for most of the sessions there was not a video camera in sight. In many ways this blew my mind. Here we are at a conference focused on the technology and business of producing and distributing video, located a stone's throw from Hollywood, featuring an exhibit hall full of companies that help enable the workflow of working with video and crawling with bloggers/podcasters/reporters carrying cameras to conduct interviews, and no one's capturing video of the sessions.

I understand that video production can be expensive, and video of panel discussions isn't always compelling, but it seemed as though if any conference were going to record everything on video, it'd be this one. That points to an important element to remember when considering the present and future of demand for bandwidth: video as a medium is still very nascent, and video production, while increasingly prevalent, is not yet ubiquitous.

I often talk in terms of demand for bandwidth from the user side and supply of bandwidth from the network operator side, but the other element to supply is having a proper supply of content to drive demand for the bandwidth networks provide. Some day, every event will be captured on HD video, and the growth to get to that point should prove to be another major driver of demand for bandwidth.

June 19, 2007 10:44 AM

Celebrating Innovation Powered by Broadband

After a weekend spent partially on horseback pondering last week’s adventures at the Digital Hollywood Summit, I’ve had a chance to step back and look at the big picture of what it all means.

The impact of YouTube and BitTorrent on the entertainment industry was clearly evident as I perused the exhibit floor and later waded through a bag full of glossy marketing materials.

YouTube, through the many different offerings enabling content owners to employ social media tools that power the creation of communities around content.

BitTorrent, through the growing number of companies that leverage some form of P2P, often built off of the BitTorrent protocol, to drive new economies of scale for content owners in the delivery of their content.

These two areas, social media and P2P distribution, with YouTube and BitTorrent as the most notable examples, represent the two types of applications with the biggest appetites for bandwidth today.

They also both exemplify the evolving nature of the demands placed on broadband networks: Internet video is no longer merely being downloaded, it’s also being uploaded, whether to be posted on sites like YouTube or to serve as a relay point in a distributed P2P network.

But despite accounting for big chunks of Internet traffic today, both technologies and the marketplace for the services they enable are quite nascent. Social media is limited by the number of people who record video and can understand the opportunity of putting it online. Legitimate P2P delivery only accounts for a small percentage of all P2P traffic, and its current revenue can’t stand up to other traditional or online distribution channels.

Yet, they’re both two of the hottest things going in Hollywood IT circles. Content owners love the opportunities social media tools deliver to engage their customers. And the big media companies are all facing the same challenge as they scale up their online video businesses of how can they lower their costs for distributing video.

The message these thoughts are trying to reinforce is that both social media and P2P applications are rapidly transitioning from being the hot new thing to valid and vital components of the Internet’s continuously expanding communications toolbox.

And because of their reliance on the upload capacity of their users, these applications wouldn’t be possible without the increases in the capacity and availability of broadband networks that have come about over the last few years.

So I’d suggest that we take a moment to step beyond bemoaning YouTube and BitTorrent’s role in enabling piracy and instead celebrate them as living proof of the innovation that investment in broadband infrastructure can help drive.

June 22, 2007 12:03 PM

Preview for Next Week: My Trip to UTOPIA

Just got back late last night from a whirlwind day-and-a-half tour of UTOPIA, the high-profile fiber-to-the-home build that unites 14 communities outside of Salt Lake City and delivers symmetrical Internet access of 15Mbps and above.

During my short stay I had a chance to meet with Roger Black, COO of UTOPIA; the team over at XMission, one of the service providers who rides UTOPIA's network; most of the upper management of DynamicCity, the company that helps UTOPIA and other communities design, deploy, and manage fiber networks; and some of the people over at iProvo, which serves the nearby community of Provo with a fiber network.

And scattered in between and during these meetings were visits to XMission's banks of servers located in a renovated historical building in downtown Salt Lake City, the underground concrete bunker facility UTOPIA utilizes near its headquarters in a business park in Lindon, and iProvo's headend which was situated in a residential neighborhood.

Next week, I'm devoting myself to writing up posts that originate from my experiences in UTOPIA in an effort to help give more insight into what's happening with this project, and how the lessons they're learning reflect larger issues regarding the Internet and the growing supply/demand for bandwidth.

I'll be approaching this from multiple angles, including what happens to bandwidth demands once the capacity of fiber is put in place, how open access networks work and where they fit into the bigger picture of the Internet's evolution, and what challenges both UTOPIA and iProvo have faced as they push hard to drive adoption of and innovation on their networks.

Look forward to the first post being ready for you to read first thing Monday morning. Until then, it's time for me to get outside and enjoy the beautiful weather here in DC!

June 26, 2007 12:59 PM

UTOPIA's Open Network: Unlimited Promise or Unrealized Potential

For the first writeup following my trip last week to the UTOPIA project, I want to take a moment to consider what is arguably its most revolutionary and controversial aspect: its open access network.

Rather than retailing services directly to end users, UTOPIA sells wholesale access to its network to service providers, who then handle the customer acquisition, billing, delivery of services, etc.

For a list of the service providers currently riding UTOPIA’s Community MetroNet, check out the table on this page.

Let’s pause to look at this concept from a historical perspective.

Before the Internet, we lived in a one pipe, one service world where cable offered cable TV and telcos offered telephone services over their respective wires.

Today we’re in the midst of an increasingly competitive multiple pipe, multiple service world where cable companies and telcos are racing to upgrade their networks and deploy triple play services, new last-mile access entrants like wireless and BPL can be seen on the horizon, and a broadband-powered Internet is churning out viable competitors to traditional telephone and TV services.

Open networks like UTOPIA take all this to another level by enabling a fiber-powered one pipe, many service world where competition relies less on the limitations of last-mile access technologies and more on the merits of the actual services.

Opening up access to private networks is something large network operators have historically resisted, most notably during their fight against local loop unbundling, which forces ILECs to sell discounted access to their networks to competitive DSL providers.

In many ways, I understand their reluctance. If I had spent a ton of money to build a road, I wouldn’t want to be forced to allow other companies to put up tollbooths that would compete with my own, especially if my whole business model was built on the premise that I’d be dealing directly with consumers rather than staying in the background as a backend wholesaler.

Yet there’s also another parallel to consider: what is the entire Internet if not a giant, decentralized open-access network through which a multitude of services compete over the same broadband pipe?

So what I see UTOPIA building is a network that combines the principles of openness on the Internet with the capacity of an all-fiber infrastructure. Through this, there’s a tremendous opportunity to establish a next-generation vision for how the Internet can operate, where applications—which can extend far beyond the traditional “services” of telephone and TV—come out of the cloud and into the network, thereby introducing a new paradigm for competition in communications services and new levels of quality of service.

But while this seems like a utopian vision for America’s broadband future, during my trip to Utah I also learned that it is one that’s still in a very early stage of development.

For one, the underlying premise of “if we build it, they will come” as it applies to service providers has not yet been fully realized. Other than the Internet services offered by AT&T, no other major service provider has deployed on UTOPIA’s network; the service providers who have come on board have focused primarily on the traditional triple play rather than on innovative new services that leverage the capabilities of this network; and there have not been any significant commercial deployments of cutting-edge, bandwidth-intensive broadband applications in-network.

Another challenge for UTOPIA has been getting consumers to understand the advantages of its open architecture and the ability to select from multiple service providers over the same pipe. Some of this has to do with the fact that it’s a rather radical new concept, though another part stems from UTOPIA’s decision to rely primarily on its service providers to market the network, and those providers don’t necessarily have a strong incentive to let people know they can switch to a competitor’s service whenever they want.

And UTOPIA must also face the realities of being a new model that has reached fewer than 50,000 homes, which limits the energy service providers and applications developers can devote to it when much larger markets are available through the major network operators and over the general Internet.

But these are all problems with potential solutions. Living proof of the innovation that open networks foster can be found in Västerås, Sweden, a city of 80,000 people where the deployment of an open fiber network has led to a vibrant marketplace of more than 100 different services. UTOPIA is starting to take a more active role in educating residents about the value of open fiber networks. And as UTOPIA continues its buildout across 14 cities, additional cities opt in to the UTOPIA consortium, and DynamicCity—the company that’s helped UTOPIA design and deploy its network—finds other communities willing to take the plunge, the viability of this new marketplace will only increase.

What will the future hold for open access networks? In all honesty, I have no idea. It’s an idea that holds a lot of promise, but it’s also one that simply cannot take hold overnight. So for the foreseeable future, all eyes will continue to be on UTOPIA as they strive to prove the viability and realize the promise of open fiber networks.

June 29, 2007 1:52 PM

Killer App Sighting in UTOPIA: Videocalling on Your TV

In a post earlier this week about my recent trip to UTOPIA I discussed their open fiber network, suggesting that while it offers a revolutionary alternative to closed private networks, the reality has not yet lived up to the promise.

Now, I want to dive into one area where UTOPIA’s full-fiber network—which enables symmetrical 15Mbps connectivity for $40 a month—has begun to realize its potential: establishing the community as a testbed for big-bandwidth applications.

UTOPIA’s network has played host to a number of companies interested in testing applications that can make use of the capacious bandwidth provided by fiber. Unfortunately, UTOPIA is under the strictest of NDAs with nearly all of them, preventing it from discussing any partner save one.

That one is TVBLOB, an Italian company that may be on the verge of launching a true Killer App for fiber.

On their website, TVBLOB describes themselves as “a technology development company seeking to radically alter what you can do with the common television set.”

Of course, any time you’re talking about transforming TV there will undoubtedly be a set-top box (STB) involved, and when I saw that was the case with TVBLOB I have to admit I nearly groaned out loud. I’ve seen box after box try to establish itself in the living room, and only two have succeeded in any real way: TiVo and Slingbox.

But I knew I needed to give this a shot as TVBLOB will soon be offering the first box I’ve seen built to enable high quality videoconferencing in the living room through the TV. So as I sat down for a conversation with Fabrizio Caffarelli, TVBLOB’s founder, and Lisa Morris, their VP of Sales and Marketing, I did so with a mix of anticipation and trepidation.

The TVBLOB box is smallish: bigger than an Apple TV but much smaller than a TiVo. In this case it was connected to what was at least a 40” LCD TV hung on the wall of a conference room in DynamicCity’s offices. A small camera was mounted on a tripod resting on a table underneath the TV, which also held an audio mixer of some sort that linked a pair of wireless mics.

In all honesty, it was hard to envision that setup in a living room as there were wires everywhere, making the scene somewhat unsightly. TVBLOB does have the advantage of being built to work with the technology you already have, though, and later in our conversation I learned that they are actively considering ways to adapt peripherals to better suit the living room environment, such as cameras that can mount on flat panel TVs and building a microphone into the remote control.

Initiating a call only took a couple of button pushes and we were live, opening a virtual window into their office on the other side of the ocean.

The quality of the video was impressive right from the get-go: a definite step up from desktop videocalling applications, as it filled the entire TV screen, and despite a lot of people moving around in the background, the picture held up with very little of the blocky pixelation you often see in online video.

It doesn’t quite reach the quality of video found in higher end, dedicated videoconferencing units produced by companies like Polycom, Cisco, and TANDBERG, though. It’s nowhere near DVD quality and is more akin to what you’d see on an older VHS tape.

But that’s OK because those products are thousands of dollars whereas TVBLOB’s pricing, while not yet finalized, will be more on the order of hundreds of dollars, or even less than that if done as tens of dollars a month.

In terms of the quality of the conversation we were able to have, it was quite respectable. The sound held up the whole way through with only a couple of dropped words, and the video never lost sync with the audio (audio that drifts out of sync with video is something I find horribly disorienting).

I did notice a bit of a lag, as I could hear my voice coming out of the speakers in Italy, and it made any attempt to interject a comment or question somewhat troublesome. But to quantify it, I’d say it was better than the lags I remember from the early days of VoIP, when I was calling from Madison, WI to Minnesota. And now that video’s entered the equation, communicating is much easier, as we could rely on gestures and expressions without depending solely on lagging audio.

All in all it was a very positive experience for me. I felt very at ease, and I could see how this holds the potential to really open up the use of videocalling to a much larger group of people.

And the best part is, it’s an application that loves bandwidth. Fabrizio said it could run on as little as 512Kbps, but his body language suggested he wasn’t particularly satisfied with what you could do with that amount of bandwidth. Where it really starts working well is at 1Mbps, and in Utah we were cruising along about where the box currently tops out, around 2Mbps.

Remember, that’s 2Mbps of symmetrical bandwidth. And because it involves real-time communication, it needs a steady connection; otherwise any drop in throughput will be very noticeable. This level of bandwidth doesn’t quite make it a fiber-only application, but it will push the upload capacity of almost all cable systems, and the overall capacity of a lot of DSL connections, to the limit.
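To give a feel for what a sustained symmetrical stream like that means in practice, here’s a rough sketch in Python (the bitrates are my own illustrative numbers and assumptions, not TVBLOB’s specs):

# A videocall pushes the same bitrate up and down for the entire call,
# so upload capacity is the binding constraint.
CALL_MBPS = 2.0      # where the TVBLOB box reportedly tops out
call_minutes = 60    # an hour-long call, for illustration

megabits_each_way = CALL_MBPS * call_minutes * 60
print(f"One hour at {CALL_MBPS} Mbps moves {megabits_each_way / 8 / 1000:.1f} GB "
      f"in each direction")  # -> 0.9 GB up AND 0.9 GB down

# Compare against a typical 2007-era cable upload cap of 512 Kbps
# (an assumption for illustration):
cable_upload_mbps = 0.512
print(f"That cable upload supplies only {cable_upload_mbps / CALL_MBPS:.0%} "
      f"of what the call needs")  # -> 26%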

What I also found compelling about their solution, which is set to launch sometime later this year, are their plans to build out the capabilities of their box to include the greatest hits of other STBs, including recording TV shows, placeshifting video, extending the reach of media from the PC to the TV screen, and providing a platform through which content can be delivered directly from the Internet.

This truly is a compelling vision for what the future of this product can be, and you’ll be able to read more about them over the coming weeks and months at KillerApp.com as we track their progress on the road to becoming the next Killer App.