
ABOUT

App-Rising.com covers the development and adoption of broadband applications, the deployment of and need for broadband networks, and the demands placed on policy to adapt to the revolutionary opportunities made possible by the Internet.

App-Rising.com is written by Geoff Daily, a DC-based technology journalist, broadband activist, marketing consultant, and Internet entrepreneur.

App-Rising.com is supported in part by AT&T; however, all views and opinions expressed herein are solely my own.


October 2007 Archives

October 1, 2007 10:47 AM

Halo 3 Causing Network Issues, Highlights Positive Side of Traffic Shaping

Last week I wrote about the launch of a videogame called Halo 3 and the perception that the flood of players trying to get online and play it could take down the Internet.

Today I came across a related post about a university that's been experiencing network issues between the hours of 8pm and midnight ever since September 25th, Halo 3's launch date.

It's important to note again that the demand for playing this game has not threatened the overall Internet; instead, this post relates to the impact of Halo 3 on one entity's network, in this case a university's.

Also, be forewarned that the post isn't much of an article but instead a short firsthand recounting of what one student and system admin has been dealing with.

Yet I still found it interesting, as in this post he mentions how they were able to solve the problem by instituting a form of packet shaping.

I won't try to explain what the problem was or the specifics of how they fixed it as that's a bit deeper into tech-ese than I'd like to go, but it's important to understand the basics of packet shaping.

The gist of it is that network operators use packet shaping to manage their networks: they identify what's running over their pipes and decide whether that traffic can be sped up or whether it needs to be slowed down or rerouted.
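To make that gist concrete, here's a minimal sketch of the classic token-bucket technique that shaping gear builds on. It's purely illustrative Python, not how the university's equipment actually works, and the traffic classes and rate caps are assumptions of my own:

```python
import time

class TokenBucket:
    """Token-bucket shaper: a packet may pass only when enough
    'tokens' (bytes of allowance) have accumulated in the bucket."""

    def __init__(self, rate_bytes_per_sec, burst_bytes):
        self.rate = rate_bytes_per_sec      # steady refill rate
        self.capacity = burst_bytes         # largest allowed burst
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, packet_bytes):
        now = time.monotonic()
        # Refill tokens for the time elapsed, capped at bucket capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True     # forward immediately
        return False        # hold or drop: this flow gets slowed down

# Hypothetical policy: cap identified game traffic at 2Mbps,
# leave everything else untouched.
game_shaper = TokenBucket(rate_bytes_per_sec=2_000_000 / 8, burst_bytes=64_000)

def handle_packet(traffic_class, size_bytes):
    if traffic_class == "game":   # e.g., matched by port or packet inspection
        return game_shaper.allow(size_bytes)
    return True
```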

In the case of this university, they had to employ traffic shaping in order to maintain the integrity and preserve the capacity of their network to ensure that students who were trying to do real work over the network had the resources they needed.

Packet shaping is often maligned as network operators overstepping their bounds, but it can also be a very useful tool. In this situation there was an obvious need to manage gaming traffic so that it didn't interfere with someone else on the network who might be researching a cure for cancer or putting together their master's thesis.

October 1, 2007 3:15 PM

The FTTH Conference Begins! A Thought on Fiber Discrimination

It's sunny and gorgeous out here in Orlando, FL, but I wouldn't know it as I'm in a hotel on my way out the door to the opening of the Expo Hall at the FTTH Conference!

As always happens, I've already missed some great content: a keynote address on the exaflood by Bret Swanson, a senior fellow at Seattle's Discovery Institute.

He's apparently the guy who coined the term "exaflood" and the big reason he's speaking at this show is his belief that fiber-to-the-home is the best answer to dealing with the oncoming demand for bandwidth.

I had to miss what I'm sure was a fantastic talk because I was helping one of my clients for whom I've done some marketing work pull together some last minute items on the show floor.

That client is a company called Global Online Solutions Network, or GOSN. I'll write them up in more detail in the next couple of days, but for now I wanted to share one thought with you that I picked up from a conversation with John Hughes, their founder and CEO.

He said to me during a powwow session that one of the keys to his company, which has developed a way to protect communities and homes through live, event-driven video monitoring, is that they don't discriminate. GOSN doesn't care who's trying to break into your house. It doesn't matter if they're black, white, young, old, male or female. They catch them all.

He then alluded to his belief that fiber should be the same way. The deployment of fiber should not discriminate; we need to be shooting for the goal where everyone can get it, without regard for whether you're rich or poor, in a city or in the country.

Of course, this goal is easier said than done: forcing mandatory buildout requirements on private operators is troublesome as they try to make the numbers work, and municipal broadband initiatives are still limited across the country, even though many are finding great success.

So in my mind that means we either need to get momentum behind widespread government deployment of fiber that can forgo the need to realize short-term gains, or find a way to make everyone an attractive potential customer for private deployers to go after.

Here's one thought I just had while writing this post: what if we provided some sort of financial incentive (tax breaks, loans, grants, etc.) that becomes more lucrative the further out a home is from a community? Additionally, what if we provided a broadband subsidy that increased the lower the income of a household?

I don't think focusing more attention on getting fiber to the less advantaged will harm or slow down the deployment of fiber to the more well off and urban. And perhaps this might be a more effective stimulant for private entities to pursue what are currently believed to be not all that attractive users.

I don't know if this is the answer, but it's an answer. What ideas do other people have? Is it even realistic to think we can ever reach the ultimate goal of 100% fiber penetration?

October 2, 2007 1:32 AM

Talk About a High Bandwidth Application!

Reporting from the FTTH Conference floor, I had an interesting experience with the realities of how much bandwidth next gen video applications actually demand.

As I mentioned yesterday, I've consulted for GOSN helping with their marketing, and while I didn't intend to mention them on consecutive days, another interesting thing happened: they proved themselves to be a truly high bandwidth application.

They were responsible for mounting four motion-sensitive cameras on the Home Networking Zone--a well-appointed faux four-room house--plus a pan/tilt/zoom camera on a lightpole and another hung from the ceiling, along with four other cameras showing an aerial view of the show floor.

You know how much bandwidth they needed to and from the Home Networking Zone alone? 27Mbps, and the cameras had bitrate and framerate left to fill if more bandwidth was available.

Luckily, that bandwidth should be available tomorrow so I'm excited to see what happens as we push the limits of what's possible in home security.

You might ask, how much would it take to truly max out the cameras' capacity? Over 100Mbps...symmetrical.

Talk about real-life evidence of the need for a 100Mbps Nation!
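For the curious, the arithmetic behind those numbers is simple. Here's a rough back-of-envelope sketch in Python; the per-camera bitrates are my own illustrative assumptions, not GOSN's actual specs:

```python
home_zone_cameras = 4
other_cameras = 6       # the PTZ on the lightpole, the ceiling camera,
                        # and the four aerial-view cameras
total_cameras = home_zone_cameras + other_cameras    # 10 in all

observed_home_zone_mbps = 27          # measured to/from the faux house
print(observed_home_zone_mbps / home_zone_cameras)   # ~6.8 Mbps per camera

assumed_max_mbps_per_camera = 10      # illustrative full-quality bitrate
print(total_cameras * assumed_max_mbps_per_camera)   # 100 Mbps
# And since the feeds are uploaded from the site and watched remotely,
# that demand is symmetrical.
```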

October 3, 2007 9:28 AM

A Trio of Articles About the Need for Bigger, Better Broadband

Came across a trio of articles highlighting different discussions about a similar topic: the importance and vulnerability of the Internet.

"The Internet wasn't designed for people to watch television," he says. "I know because I designed it."
This quote came from Larry Roberts, who oversaw the development of a direct ancestor of the Internet: a government network called ARPAnet.

I've often heard people talk about how the Internet wasn't designed to handle video, but this is probably the most direct statement of that I've ever read.

But for all the complaints, the panel agreed the Internet is a remarkable, essential development. Said Metcalfe: "The problem is our [expectations] are going up faster than the technology."
This quote came from a panel last week hosted by SRI, where a series of thought leaders convened to discuss the limitations of today's "sufficient" Internet.

I found this last quote to be rather compelling as it really cuts to the heart of the matter: the potential of the Internet is limitless, but the reality of the Internet is currently facing some real limitations.

"Even though we can't see the Internet, our economy is dependent upon broadband."
Michael Kleeman, a senior fellow at UC San Diego and at the Annenberg Center for Communication at USC, wrote this in a San Francisco Chronicle editorial on Monday.

This is another incredibly important point: the Internet generally works, and we're relying on it more and more every day, yet there's still little widespread understanding of how it works or even what it is.

All three quotes seem to me to point to the awakening of a real debate on national broadband policy in this country and, most encouragingly, to a growing awareness that we can't just sit back and let the Internet grow on its own. Instead, we must aggressively pursue avenues that ensure the Internet can continue to cement itself as a reliable cornerstone of our daily lives.

October 4, 2007 8:02 AM

Prioritized Traffic: Maybe Not Such a Bad Thing

Last GOSN sighting for the week, I promise, but I had one more thought to share that came out of my experiences witnessing them set up a working model of their high bandwidth community video protection solution at the FTTH Conference.

I wrote a post on Tuesday about how GOSN's SafetyBlanket is a truly high bandwidth application, demanding up to 100Mbps of symmetrical access, and how at the Home Networking Zone they were gobbling up 27 out of the 30Mbps coming into and out of that faux building.

In talking with one of the IPTV providers who was having to squeeze into that remaining 3Mbps, he expressed surprise over why and how GOSN was able to take over the majority of the bandwidth on that network.

But later that day I overheard a GOSN salesman talking with a conference attendee about how old school security captures the telephone line when it's set off. When a standard security system--sensors on doors and windows--goes off, it takes over the telephone line to ensure the alarm monitoring center receives the alert, though when this happens you lose your telephone service until the alarm is turned off.

While I can't confirm it at this moment, I think that was some of what was happening with GOSN taking over most of the bandwidth into the house. Simply put: they're a security product that needs priority access to the network in order to ensure their ability to effectively monitor homes through live and on-demand video.

And let's think about this for a moment: is it really such a bad thing for them to have priority access to the network over an IPTV signal? Should we treat that IPTV signal in the same exact way as a live video feed sent from a house being broken into over to the local authorities?

For me, this episode really drove home the point that all Internet traffic is not created equal. Some applications demand more bandwidth, less latency, and/or the ability to prioritize their bits over something that's less important, like security over entertainment. And I see no reason why we shouldn't give them that access.
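As a toy illustration of what "priority access" means mechanically, here's a minimal sketch of a strict-priority scheduler. The two class names and the always-drain-security-first policy are my own assumptions for illustration, not a description of any real deployment:

```python
from collections import deque

# One queue per traffic class; security always drains before entertainment.
queues = {"security": deque(), "entertainment": deque()}

def enqueue(traffic_class, packet):
    queues[traffic_class].append(packet)

def next_packet_to_send():
    # Strict priority: an IPTV packet only goes out when no
    # security-video packet is waiting.
    for traffic_class in ("security", "entertainment"):
        if queues[traffic_class]:
            return queues[traffic_class].popleft()
    return None
```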

October 8, 2007 8:44 AM

Thoughts from the KMB Conference

After a successful run at the FTTH Conference, I headed west to the Gulf of Mexico to attend the 40th edition of the KMB Conference.

The “MB” in KMB stands for Mike Beilis, a former AT&T executive and a driving force in the development of the tele-lecture, which leveraged the then-new technology of telephone conference calls to enable remote teaching in university and corporate settings.

His conference brings together a dynamic, diverse group of individuals representing everyone from state officials to broadband providers to think tankers to discuss the real problems and opportunities of this thing we call the Internet, and how broadband should or shouldn’t be regulated.

He’s been doing this for more than 20 years, while simultaneously producing an impressive stream of informative video interviews that have been praised by state legislators and regulators as key resources in their understanding of how the telecom industry works. Many of these videos are available for download on his website.

But back to the conference: it was a tremendous experience for me personally, and I believe for everyone else in the room as well.

The best thing about the event was its intimacy. Coming from the two-thousand-plus attendees of the FTTH Conference, spread across what seemed like miles of exhibit floor and session rooms, into an event with forty to fifty people all in the same room--with the added bonus of having most everyone’s name and title listed in the program--was a revelatory experience.

I got to know who everyone in the room was. I got to have real, meaningful conversations with at least half of the attendees. And we were able to have true conversations in this setting, with audience and panelists bouncing ideas back and forth in an environment where Mike encouraged everyone to “leave their ideologies at the door.”

Mike featured a jam-packed agenda over the 48-hour run of the conference, including sessions on telemedicine, public safety, and gaming, plus a number of angles on the central topic of how to get more broadband into underserved and unserved areas of the country.

My overarching feeling from attending this event? That everyone agrees broadband is a good thing, that there are a lot of smart people working really hard to try and figure it out, but that there are also a lot of unresolved questions as to what’s the best way to move forward.

I’m not sure if we came upon any hard and fast answers last week, but we did have a series of great conversations that the legislators/regulators I spoke with agreed had helped them explore these issues for themselves to a much greater degree, making them all glad they attended the event.

Now I just hope that my contributions were worthy of an invitation the next time around!

October 9, 2007 10:18 AM

Microsoft Enters Consumer Telemedicine Market, with Thoughts from KMB Conference

Last week Microsoft launched HealthVault, their online personal health record service.

Basically it’s a roll-your-own electronic medical record solution. Input your family’s personal health info, and then manage everything online and make it securely accessible to doctors. You can read more about it here.

The timing of this announcement was almost too apropos as I read about it mere moments after chatting with Dr. Jay Sanders about the manifold opportunities of telemedicine applications at the KMB Conference.

Dr. Sanders is president and CEO of The Global Telemedicine Group, through which he advises an impressive client roster on telemedicine-related issues. He’s also an adjunct professor of medicine at Johns Hopkins, and has studied and implemented telemedicine technologies and programs for more than 30 years. Needless to say, he’s now my telemedicine guru.

Talking with him confirmed my belief that the telemedicine space has a lot of untapped potential, with so many different ways in which the Internet can be used to improve and expand healthcare. But significant factors continue to hold this space back.

Arguably the biggest is changing people’s behaviors. Just because you have an electronic medical record system in place doesn’t mean doctors will trust it over pen and paper. Just because you have sophisticated robots for monitoring patients doesn’t mean anyone really knows how to use them properly.

And if telemedicine upsets established behaviors and shifts expectations too drastically, watch out: Dr. Sanders shared how the use of telemedicine in prisons to replace in-person doctor visits can reduce the cost of transferring prisoners by more than half. The only problem is that in doing so it takes away opportunities for prison guards to earn time and a half moving prisoners around, which has created some pushback, keeping these possibilities from becoming reality.

While this has been a real problem, there are some so-called issues in which Dr. Sanders sees opportunity rather than hurdle. As an example, the lack of proper reimbursement for telemedicine applications relative to their analog antecedents is regularly lamented. Simply put: there are some cases where a doctor who reads an X-Ray over the Internet can’t be reimbursed in the same way as a doctor who’s holding the X-Ray in their hands when they look at it.

But Dr. Sanders wondered aloud to me, why focus on reimbursements when there are opportunities to get people to pay for medical products and services directly out of their own pockets?

Just think about the possibilities for commercial products that help baby boomers better monitor their folks’ health as one broad example.

So needless to say, coming out of this conversation into reading news about Microsoft’s HealthVault, I couldn’t help but think that this was only an early drop in the oncoming deluge of products and services available to consumers who want to start realizing the benefits of telemedicine today instead of tomorrow.

October 10, 2007 1:34 PM

Great Video from NextGenWeb Featuring Dr. Jay Sanders

While working on my next post inspired by last week's KMB Conference, I took a break to peruse the archives of NextGenWeb.org.

They've done a great job of covering broadband related events around DC, in particular the APT's continuing series entitled "Broadband Changed My Life."

At their last brown bag luncheon, the NextGenWeb crew sat down for a conversation with Dr. Jay Sanders, who I wrote about yesterday.

Watching Dr. Sanders talk for just these few short minutes makes the promise of telemedicine so painfully obvious it hurts. Yet at the same time it's prompted me to cry even louder, "Why aren't we taking advantage of all these great opportunities yet?!"

October 10, 2007 2:53 PM

The Impossible Dream of Competitive Broadband Marketplaces In Unserved Areas

One of the biggest issues discussed at last week’s KMB Conference was how best to get broadband to areas that are underserved—where there’s no real competition in broadband—and in particular unserved—where there’s simply no broadband at all.

With representatives of Kansas, Iowa, Idaho, and Vermont all in attendance, there was strong interest in this subject.

There seemed to be universal agreement on the need for some reform of how the Universal Service Fund and the USDA’s RUS program work (or don’t work, depending on who you ask). But not necessarily a detailed consensus on what needs to be done and how those changes should be made.

One area where I sensed a general understanding is that more focus needs to be put on incentivizing the deployment of broadband in areas that are unserved, where no broadband provider exists unless you count satellite. I count satellite only as an option of last resort, not a true competitive force, as the latency in satellite “broadband” is too great for real-time two-way Internet communications like VoIP and videocalling.
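The physics behind that latency complaint are easy to check: a geostationary satellite sits roughly 35,786km up, so even at the speed of light the trip is long. Here's a quick sketch; the orbit and light-speed figures are textbook values, and real systems add processing delay on top:

```python
GEO_ALTITUDE_KM = 35_786            # geostationary orbit altitude
SPEED_OF_LIGHT_KM_S = 299_792

one_way_ms = 2 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S * 1000   # up + down
round_trip_ms = 2 * one_way_ms

print(f"One-way propagation: ~{one_way_ms:.0f} ms")    # ~239 ms
print(f"Round trip:          ~{round_trip_ms:.0f} ms") # ~477 ms
# Interactive voice is generally considered degraded once one-way
# delay passes ~150 ms, so VoIP over a GEO satellite blows that
# budget on propagation alone.
```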

But there are many thorny issues within all this, like whether the government should only subsidize the first provider to a market, or if they need to continue subsidizing new entrants to ensure a level playing field, or if they shouldn’t be subsidizing anything at all.

The most common argument for how to spur the deployment and improve the economics of broadband is that if the government stays out of the way competitive market forces will ensure the deployment of faster networks at a lower cost.

But where this theory comes up short is in situations where a competitive marketplace is impossible as there aren’t any competitors to compete.

When I spoke with state legislators and regulators from the aforementioned rural states, they couldn’t comprehend how competition is the answer when they can’t get one provider to deploy, let alone multiple competing entities, in many of their communities.

I think it’s important that we recognize this as we go about crafting our national broadband policy. Competition is great and should be encouraged, but it’s simply not the answer for every community if our ultimate goal is the speedy deployment of broadband connectivity to every home in America.

There’s just no way around the fact that in pockets and across wide swathes of America, a broadband monopoly is the best many communities can hope for, especially in rural areas, where even a monopoly can seem like an impossible dream.

The key thing to note is that competition is not the silver bullet for broadband deployment in the US. It is most certainly one of the most powerful tools to spur faster speeds and lower prices in well-served areas, but a purely competition-based regulatory approach is impractical for many other areas.

So as we draft legislation and set regulations, we must ensure that we maintain enough flexibility to accommodate both ends of the competitive spectrum, from encouraging fair competition in competitive markets to monitoring the actions of natural monopolies in areas where they’re unavoidable.

October 11, 2007 8:07 AM

Research Gets A Heavy Dose of Bandwidth

A couple of big announcements last week from the world of Internet-enabled research.

On Monday, Google and IBM teamed up to announce the launch of a program that will open up high powered computing clusters to universities.

These computing clusters are sometimes referred to as clouds and/or grid computing; I wrote a post about this a few weeks back. The gist of cloud/grid computing is that by using the Internet you can stitch together the processing power of multiple computers to process large data sets more quickly than any single computer could.
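To give a flavor of that stitching-together idea, here's a toy sketch. It uses local processes as stand-ins for the networked machines a real grid would dispatch work to, and the "analysis" is just a placeholder:

```python
from multiprocessing import Pool

def analyze_chunk(chunk):
    # Placeholder for real work: translation, gene matching, etc.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    dataset = list(range(1_000_000))
    chunk_size = 100_000
    # Split the big job into chunks...
    chunks = [dataset[i:i + chunk_size]
              for i in range(0, len(dataset), chunk_size)]

    # ...farm the chunks out to workers in parallel...
    with Pool(processes=4) as workers:
        partial_results = workers.map(analyze_chunk, chunks)

    # ...and combine the partial answers into one result.
    print("Combined result:", sum(partial_results))
```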

In the case of this Google/IBM program, 1600 computers are being made available across three locations.

The reason for making this computing power available is twofold: one, to give students an opportunity to learn firsthand how to program software that works in a grid computing environment; and two, to give researchers access to a computer grid that can help crunch all sorts of numbers.

For example, in Maryland, the cloud will be used “to create a system for automatically translating text in difficult foreign languages such as Chinese and Arabic,” according to this WashingtonPost.com article.

Networks that link physically distant computers and combine their processing power and connectivity are fascinating to me, because perhaps nowhere else can we find a more compelling reason for broadband connectivity: an opportunity to create the world’s most powerful virtual supercomputers and, in doing so, make possible incredible new tools in the drive to solve the world’s most pressing problems.

The other major news was Internet2’s announcement of their recently upgraded network.

Internet2 is a high speed network that links more than 200 college campuses with ultra-highspeed connections for the purposes of enabling cutting edge research on a host of topics. Some private businesses have access to and use the network as well, but it’s a largely academic focused endeavor.

Part of their announcement was the fact that Internet2 is now operating at 100Gbps, with the possibility of increasing that speed tenfold in the near future.

But perhaps even more notable is the ability for Internet2 members to now provision their own dedicated 10Gbps connections for limited periods of time on demand. That means if you’re on Internet2, you can now dial up a connection with another university and push a terabyte of data in less than 15 minutes.
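That terabyte figure checks out with simple arithmetic; here's the math, with a 1Mbps residential upload thrown in as an illustrative point of comparison:

```python
terabyte_bits = 8 * 10**12      # 1 TB = 8 trillion bits (decimal units)
internet2_bps = 10 * 10**9      # a dedicated 10Gbps connection

seconds = terabyte_bits / internet2_bps
print(f"{seconds:.0f} s = {seconds / 60:.1f} minutes")  # 800 s, ~13.3 min

# The same transfer over a typical ~1Mbps residential upload link:
residential_bps = 1 * 10**6
days = terabyte_bits / residential_bps / 86_400
print(f"~{days:.0f} days")                              # ~93 days
```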

Now that might seem just a bit excessive to the average home user, for whom the term “terabyte” is probably an utterly foreign and unfathomable concept.

But research nowadays generates a ton of data. As analytic technologies improve and computers get faster, the amount of bits and bytes that researchers are analyzing in order to more fully understand how the world works is increasing exponentially.

What I’m not sure about at this time, though, is whether these on-demand 10Gbps connections will be used for anything more than moving large datasets quickly. I’m curious to find out if there are any applications being run that require that much bandwidth to push data in real-time.

I may not know the answer now, but never fear! I’m on the hunt to find out.

October 15, 2007 11:58 AM

Bring on the Exaflood!

Over the weekend I began to yet again have new thoughts about the exaflood.

As loyal readers know, I’ve written in the past about my personal struggles coming to accept the exaflood messaging, wondering if the images it evokes are too negative.

Accusations of fear-mongering have begun percolating up with increasing intensity whenever discussion of the exaflood hits the front page. Some have argued that the exaflood messaging is nothing more than an attempt by big-time network operators to scare people into a deregulatory mindset.

And recent research has revealed that the growth in demand for bandwidth and the Internet has leveled off somewhat from its meteoric growth of the early years.

But what I don’t understand is the persistent undercurrent of doubt as to how high consumer demand for bandwidth may or may not rise.

In my mind it’s not a matter of if or when we’ll all start needing 100Mbps to the home, it’s about recognizing that if in five years we’re not all demanding that much bandwidth then we’ve failed to achieve everything that is now within our grasp through broadband and the Internet.

I say this primarily because most of the best things the Internet makes possible are distinctly and increasingly bandwidth-intensive applications.

I’m talking about a world of entertainment and education accessible through online libraries stocked with standard, high, and soon-to-be higher definition video ready to download, stream, or share.

I’m talking about real-time, two-way video communication: a concept first introduced in the early days of television that has since evolved into everything from desktop webcams using free software to dedicated hardware for in-home monitoring of patients to full-room telepresence installations that allow businesspeople to look each other in the eye as they make deals.

I’m talking about high quality live video monitoring of your home and neighborhood.

I’m talking about services that allow you to upload, manage, edit, and share your personal media.

I’m talking about live and on-demand webcasting of events like local government meetings.

These are just a small sample of the types of applications enabled by broadband that hold the potential to revolutionize our day to day lives. And these are the applications that once adopted by mass audiences will drive huge demand for bandwidth.

Because of this I don’t see the exaflood as something that should be shunned as unnecessary fear-mongering but instead as something that we should all be striving to achieve and encourage.

I say bring on the exaflood, and with it the myriad opportunities the Internet has to offer!

Now that bigger pipes are being laid, let’s see what we can do to open the spigots as wide as possible so we can push further ahead into a digital future that’s well within our reach not tomorrow but today.

October 16, 2007 10:35 AM

Casually Gaming the World

While taking a break from work last night, I headed over to a casual gaming site I often frequent called jayisgames.com.

Now to clarify, I define a “casual” game as one that I can pick up quickly, play for a few minutes, and be entertained. That can mean anything from card games to trivia games to platformers (think Mario) to simulations, and everything in between. Another important characteristic of a casual game, in my mind, is that I can play it inside my browser; in other words, I don’t have to download and install an application.

Generally speaking casual games are not all that bandwidth-intensive. For most, the game downloads through Flash in the browser, with file sizes ranging from a few hundred kilobytes to a few megabytes, so there’s little to no real-time data being pushed back and forth while playing.

But what caught my eye as I visited jayisgames.com last night was their latest game design competition.

They’ve done this a few times in the past: they pick a theme, set the rules, and solicit submissions by the gaming community in exchange for the chance at fame and, of course, fabulous prizes.

This time around the theme of the competition was “ball physics.” What this generally refers to is the use of simple shapes where the gameplay is based on whatever physics are created for that gaming environment.

Here are a few games that I played and enjoyed, with short explanations as to how they utilized the theme:

Asteroid Pilot – You’re a plane in a mine shaft on an asteroid. You’ve got cargo strapped to your tail that you must navigate safely to the top. The only controls are up/left/right. The trick is, as you try adjusting the direction of your plane, your cargo is swinging back and forth below you, making it increasingly difficult to keep the plane straight while not bashing your cargo into the rock walls.

Kaichou – Billed as an “experimental shooter,” in this game you’re a ball that can charge up and shoot other balls, which explode into multi-colored flowers when hit. Use the directional arrows to move, the mouse to aim, and the left mouse button to charge and shoot your weapon.

Space Kitteh – You’re in outer space. You can jump from planet to planet using your jetpack. You’re trying to save cosmic kitties. (Hey, I never said all my posts were going to be about how to save the world through broadband.)

These are all prime examples of what I consider to be “casual” games.

But the specific games weren’t what caught my attention so much as the scope of the competition.

Continue reading "Casually Gaming the World" »

October 17, 2007 10:31 AM

Students of the Facebook Generation

Read a terrific post by Ed Kohler of TechnologyEvangelist.com fame yesterday in which he cites a great new YouTube video describing today's educational environment from the perspective of students. He also links to a fascinating interview with an educator who primarily teaches in an online environment to students from across the country.

Rather than appropriate those links in my post, I want to encourage everyone to head over and read Ed's where you can view the video and click through to the interview. I'd hate to poach his hard work!

But I do have a few thoughts to share after digesting the content contained within his post.

First off, the video seems to suggest that our current educational system is, if not broken, at the very least severely strained. Students have more demands on their time than ever, and they're increasingly stuck in classroom situations where they're one student out of a hundred, where their teachers don't even know their names.

A telling quote near the beginning of the video laments the 18th century paradigm still dominant in the classroom environment, a quote made even more impactful when it's revealed that it was said by Marshall McLuhan back in the late '60s.

And really, what has changed in the classroom since then? I went to college from '98 to '02, a time that should've been revolutionary in the use of technology, but other than being able to access the occasional class notes online and generally using the Internet as a research tool, the old mode of lugging books to a classroom where you passively sit and listen to someone preach (I'm sorry, teach) was still by and large the status quo.

Yet at the same time, the Internet seems to provide the perfect solution to out-of-control class sizes, overburdened teachers, and under-stimulated students.

The Internet enables an environment where students are no longer constrained to the information presented during class; they're able to access the world's libraries from a single screen.

The Internet empowers teachers to better manage relationships with large student bodies; face-to-face is always best, but there are a number of new efficiencies that can be realized by more heavily relying on the Internet to facilitate teaching.

The Internet extends opportunities beyond the current capacities found within the brick-and-mortar walls of any one school; teachers and students no longer need to be in the same room to make learning opportunities possible.

It's stunning to me how many of the biggest challenges universities and colleges face today can find solutions online, and yet we're still only seeing these changes happening in drips.

I understand that change can be hard to realize, especially in established and change-resistant atmospheres like the classroom of a tenured professor, but as I've said before and I'll say again, and again, and again: we need to stop talking about the possibilities for change and start pursuing making these changes a reality. We need to stop preaching about the promise and start figuring out how to create a seismic shift in the use of broadband in education and beyond.

We can't keep patting ourselves on the back for small successes when what's needed is the widespread realization that through broadband and the Internet we have the opportunity to dramatically rework the way society runs. That this is all possible not at some point in the future but today.

And not to get too global on you all, but we mustn't forget that we're not alone in this world, and that the longer we wait to aggressively pursue initiatives that push us more fully into this Digital Age, the further behind we might fall in the race to maintain our dominance in the global economy.

October 18, 2007 9:56 AM

How Broadband's Growth Has Been Powered By Users

I had the great fortune to meet John B. Horrigan, associate director for research at the Pew Internet & American Life Project, a couple of weeks back at the KMB Conference.

He's been at this effort to better track and understand usage of the Internet for more than seven years; needless to say, he's one of the ones who really gets it.

This morning I came across an article he wrote that I highly recommend everyone go check out.

In it, he mentions a gathering back in 1993 entitled "Users: Who Needs Advanced Networks?" that postulated the greatest use of the Internet to be two-way video communication in educational, medical, and political arenas.

He then goes on to contrast that with what's really been driving demand for upload capacity: the rise of social media, where users and viewers are now becoming producers, uploading photos, music, videos, posting to blogs, and the like.

There are some interesting observations and concrete numbers to be found herein, though I can't help but bang my head against the wall over the fact that we almost seem to have been having more concrete discussions about how to use big-bandwidth networks to improve society 15 years ago than we are today.

October 18, 2007 12:00 PM

Anything You Can Do I Can Do Better...

Last week I wrote a post inspired by Microsoft's introduction of its HealthVault personal health information service.

Today, I read that Google's announced a personal health information records initiative of some sort.

There's a total lack of details beyond a vague allusion to their plans to help facilitate the "storage and movement" of people's health records.

It's not often you see Microsoft get the jump on Google, but despite Google not releasing any details about their plans I can't help but think this could be more significant news than Microsoft's.

I say this in part because Microsoft's initiative appears to be purely consumer-driven. In other words, individual users can upload their own information to HealthVault.

I'm sure this will be great for a certain segment of users, but I'm not sure if it will be all that helpful in the larger push to digitize all health records past, present, and future.

This is where Google may be coming in.

As many of you may know, one of Google's big projects is to try and digitize the world's libraries. It's been a contentious endeavor, but one that has seen them ramp up their capacity to scan the printed word into a digital format.

Combine this with their founding principle of wanting to organize the world's information, and I'm starting to sense that when they launch their personal health records product, it will likely be shooting to be not just a consumer product but perhaps that mystical silver bullet healthcare systems have been looking for as they try to find a straightforward path into the world of electronic medical records.

Now, I don't have any actual proof of this intent, and Google may end up only coming out with a me-too product analogous to what Microsoft offers, but I'm holding out hope that they're going to seize upon what seems like a massive opportunity: combining their renowned search capabilities, high-profile brand name, and newfound capacity to digitize paper documents to introduce a new era in electronic medical records.

But only time will tell!

October 22, 2007 9:32 AM

Considering the Role of Competition at an ITIF Event in DC

This past Friday morning I caught a cab and headed across the National Mall to attend a morning event put on by the Information Technology & Innovation Foundation entitled “Building the Broadband Economy and Society.”

The first session presented an interesting dialogue about the current state and necessity of competition in the broadband access market.

It sparked some thoughts I’ve been having about the future of competition, in particular the comments of Ev Ehrlich, who in a past life served as Under Secretary of Commerce for Economic Affairs, and Rob Atkinson, president of the ITIF.

Ehrlich pontificated about the nature of competition and how we can tell if it’s working in markets.

In his estimation, the fact that the telecommunications marketplace appears cognizant of pricing and value suggests the right forces are in place to spur innovation.

He explained that in oligopolies, or markets dominated by a small number of sellers, products are often static because the need to differentiate doesn’t exist.

He argues, though, that the broadband market is proving quite innovative, as evidenced by falling prices, increasing speeds, and improving services.

He cited the fact that old natural monopolies—i.e., the telephone companies offering telephone and the cable companies offering TV—are starting to compete as further evidence the current regulatory regime is working.

He sees within this new dynamic a duopoly that’s racing to figure out the best way to sell broadband so as to differentiate their services from other last mile access technologies.

Rob came at the issue from a different angle, questioning the often unchallenged need for more competition in telecommunications. “We’ve elevated competition to a level of sainthood,” he said. “Do we really want four pipes to every home?”

A big challenge in deploying wireline broadband (wireline equals cables in the ground or strung across poles) is the high upfront capital cost needed to install the infrastructure.

More pipes means less marketshare per provider, making it more difficult to amortize the costs, according to Atkinson. (Unless we can somehow get the 50% of Americans who don’t have broadband at home to sign up and start dramatically increasing the size of the consumer pie, of course.)

As such, more pipes do not necessarily mean lower prices. He believes that while we should never impose barriers to competition, at the same time artificially inducing competition may not be the best approach, especially as a default policy stance. (He’s gone into a ton more depth about these thoughts in a recently released paper about “The Role of Competition in a National Broadband Policy,” which I’m reviewing now and will write up more in-depth comments on shortly. You can find it on the ITIF website.)
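To see why amortization gets harder with each added pipe, here's a toy model. Every number in it is an assumption of my own for illustration, not a figure from Atkinson's paper:

```python
COST_PER_HOME_PASSED = 1000   # upfront build cost per home, dollars
HOMES = 10_000
TAKE_RATE = 0.5               # share of homes that subscribe to someone
MONTHLY_MARGIN = 30           # gross margin per subscriber per month

def payback_years(competing_pipes):
    # Each pipe must pass every home, but the subscriber pie gets split.
    subscribers = HOMES * TAKE_RATE / competing_pipes
    build_cost = HOMES * COST_PER_HOME_PASSED
    return build_cost / (subscribers * MONTHLY_MARGIN * 12)

for pipes in (1, 2, 4):
    print(f"{pipes} pipe(s): ~{payback_years(pipes):.0f} years to pay back")
# 1 pipe: ~6 years; 2 pipes: ~11 years; 4 pipes: ~22 years
```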

Rob’s questioning the need for four pipes to every home got me thinking about two questions I often pose to colleagues in discussions about these issues:

- How can we avoid a future where some homes are getting multiple fiber connections while others have none?

- Should we be focusing our attention on spurring competition based on the physical characteristics of last mile access technologies or instead on the services they deliver?

Continue reading "Considering the Role of Competition at an ITIF Event in DC" »

October 22, 2007 12:21 PM

A Missed Opportunity: Trumpeting the Benefits of Broadband to the Green Revolution

Moving from Alexandria, VA up into southwest DC has obviously scrambled my brain somewhat as I completely missed the opportunity to join the chorus of Blog Action Day. Luckily, my most loyal, honest reader--my mom--called me out on it, so here I am attempting to make amends.

Blog Action Day was an attempt to unite the blogosphere for one day to write about one issue in the hopes of affecting a large scale change in awareness of and attitude towards a particular issue. In the case of what occurred on October 15th, the topic was the environment.

I'm certainly not the first to say this (here's a good post on Cisco's blog doing just that), but it's worth repeating: through broadband we have incredible new opportunities to reduce our negative impact on the environment.

Yet we need to be realistic about these opportunities as well, recognizing that what can at first seem to be a blessed promise may ultimately end up harming rather than helping the cause.

A prime example of this is the long-promised paperless society computers were supposed to enable. As anyone who works in an office knows, we're using more paper now than ever.

Sure content can be created, delivered, and viewed digitally, but computers and the Internet have also opened up huge new stores of information, and many people are still more comfortable reading, annotating, and storing documents on paper rather than on a computer screen.

Ultimately I think what'll push us more firmly towards a paperless society will be the introduction of epaper, or digital display technologies that mimic the form factor of paper. Eventually this technology will be so cheap to print you'll see it on cereal boxes, but for the time being economics, form factors, and established behaviors prevent it from entering into mainstream use.

There's little doubt that the use of broadband and in particular broadband applications like videoconferencing and telepresence promise to reduce and even eliminate the need for travel, but again, there's a flipside to this promise.

Through the Internet, businesses can now reach customers located further afield than ever. New efficiencies mean they can make more sales calls, generating more business. So even if the Internet reduces the ratio of travel to business, the overall amount of travel may be on the rise.

And despite the many wondrous things made possible by computers and broadband, we mustn't forget that everything comes with a cost, even in the digital age. Computers require electricity, as does the network every time you send or receive a byte. And the influx of cool new technology means an increase in the waste that's created as consumers leave behind old gadgets for the latest and greatest.

Looking back on this post, I didn't intend for it to be so curmudgeonly about the positive environmental impact of broadband, but if you read closely you can still sense the limitless possibility. All these technologies can deliver what they promise, so long as we don't lose sight of the fact that, despite the wonders made possible by broadband and computers, they all still come with a cost.

So as we transition further into a digital society, let's not overlook this basic premise as we seek out new ways to utilize broadband and computers to ostensibly save the environment.

October 23, 2007 12:22 PM

Writing from TelcoTV - Considering the Opportunity of NetVideo for Network Operators

I’m on the road again, this time in Atlanta to attend the TelcoTV conference, where telephone companies talk about how to add TV to their stable of services.

Yesterday I attended their pre-conference NetVideo Summit, which covered the challenges and opportunities of Internet video from the perspective of network operators. The series of sessions offered an interesting juxtaposition of themes.

One oft-mentioned challenge was the struggles associated with obtaining rights to content from the studios, whether in order to deliver TV or Internet video experiences. A speaker from Avail Media mentioned a number of concessions they’ve had to make as they developed their Internet TV platform in order to satisfy the demands of content owners.

But Joe Trainor from Narrowstep struck a more contrarian note, suggesting that what’s driving the stubbornness of studios is not necessarily arrogance but instead the fact that they’re facing an uncertain future that they’re preparing for by circling their wagons and doing whatever they can to establish the highest possible value of and control over their content.

At the same time, he observed that studios are coming around on the fact that their approach towards digital storefronts should mirror that of brick-and-mortar stores: they need to cast their nets across as many outlets as possible in order to maximize reach, sales, and revenue.

Alongside these discussions was the general consensus that network operators are perfectly positioned to step into this space and leverage the fact that they have the customer, a trusted brand, and established billing relationships in order to help facilitate the marketing, discovery, and delivery of NetVideo.

In fact, this was really the overarching theme of this NetVideo Summit: that network operators should focus not just on IPTV but also on NetVideo as an opportunity to open up new revenue streams.

But I also sensed an undercurrent in the room that the telephone companies believe their most urgent need isn’t to pursue a NetVideo strategy but instead to stay focused on the speedy deployment of TV service so that they can compete directly with cable companies, who are poaching phone subscribers left and right.

And these constraints on time and resources stand even more starkly when set against the backdrop of the marginal success of high profile NetVideo plays like Comcast’s The Fan, as well as the general confusion as to how best to approach the NetVideo opportunity.

Yet the thing that network operators can’t forget is that the opportunity of NetVideo may be fleeting. Content owners are increasingly creating their own presences and establishing their own audiences online, without need for distribution partners.

If you’re like me, you believe that the NetVideo revolution is inevitable and will increasingly eat into the mindshare of TV. So it’s important for network operators to jump in as soon as possible as otherwise they may not only miss out on new revenue, but also have to deal with the double whammy of rising network costs as consumer demand for NetVideo, and therefore bandwidth, continues to increase.

So in summation, network operators shouldn’t lose sight of NetVideo as they pursue IPTV lest they end up having to face a future where content owners have established sufficient online audiences to bypass the need for traditional TV systems entirely.

October 24, 2007 10:00 AM

Comcast - Why Are You Adding Fuel to the Net Neutrality Fire?

Just when I sensed the net neutrality debate was cooling off and the opportunity for a rational dialogue finally seemed possible, Comcast had to go and stick its finger into the eye of the net neutrality camp.

Here's what's happened:

In August, Comcast denied it was filtering or shaping P2P traffic powered by BitTorrent.

Last week the Associated Press and the Electronic Frontier Foundation released results from nationwide testing they conducted to confirm whether or not Comcast is actively harming BitTorrent traffic.

The results? A resounding yes, and not just to BitTorrent but also other P2P protocols, and not just to P2P traffic but also other applications like Lotus Notes when someone tries sending too large of an attachment over the network.

What does Comcast have to say about all this? They're now admitting they are actively slowing down file sharing traffic.

The stated reason for these actions is to preserve the user experience for customers on their network, which Comcast claims is threatened by P2P file sharing networks.

Now, there may be some truth to this. P2P networks are notoriously bandwidth and network intensive. And with cable systems being shared assets, what one user is doing can affect another.

But I cannot express how disappointed I am in how all this played out.

What was one of the biggest arguments against enacting net neutrality legislation at this time? That there hadn't been many or any egregious examples of a network operator actively degrading a specific application's traffic.

Well, here you are: a specific example of one of the biggest network operators in the country actively degrading an application's traffic on their network.

The thing is, I'm very much in favor of a network operator's right to manage traffic on their network. The network is their asset and they should be able to do with it what they may.

Yet at the same time, how could Comcast be so shortsighted as to try and get away with this practice under the radar?

First off, there's an entire army of net neutrality supporters out there looking for an opportunity like this to identify these behaviors and trumpet them as evidence for their cause. So it's nigh impossible to get away with anything that even hints at degrading traffic without somebody finding out.

Secondly, if file sharing traffic truly is a significant problem on your network, then why not be more open about that fact, tell people what you're doing, and then try to work with the application developers and your customers to come to a mutually agreeable solution?

I just can't comprehend how this possibility was lost on Comcast. When I read through the series of articles listed above I literally smacked my forehead in exasperation.

And what makes this even more challenging is that while P2P networks are notoriously rife with illegal media distribution, they're increasingly being used for legitimate purposes. An example cited in one of the articles above is that someone was blocked from downloading a copy of the Bible from a P2P network, content that's in the public domain. So you can no longer hide behind the argument that all P2P traffic is evil and illegal, because that's simply not the case.

This is the single worst/best thing (depending on your perspective) that's happened to the net neutrality debate since Ed Whitacre's infamous comments that Internet applications shouldn't expect to continue getting a free ride on AT&T's network.

October 25, 2007 2:55 PM

Pondering Promoting Broadband Through Word-of-Mouth

I often preach to people about the need to stop thinking about broadband as a separate thing but instead something that can influence and be influenced by every aspect of society.

Through regular repetition of this mantra, I'm starting to find myself seeing new possibilities for the use and evolution of broadband wherever I look.

Case in point: on my flight back from TelcoTV yesterday I sat next to Ted Wright, managing partner of Fizz Corp., an Atlanta-based word of mouth marketing agency.

Fizz works with a wide range of companies, in particular beverage makers, to help them take advantage of the growing significance of word of mouth marketing in influencing consumer purchasing decisions.

I should've been taking notes as Ted rattled off a series of fascinating statistics, but the one that really caught my eye was his claim that 90% of purchasing decisions where the individual item is less than $1000 are driven by someone telling someone else how much they like a particular product.

He continued on to discuss how mass marketing used to work, where the more money you spent to have your brand appear in more places, the higher you could drive sales. But now, with the influx of so many new media channels and the over-saturation of advertising, consumers have largely tuned out these messages.

Instead what's driving a large part of our economy are a select group of influencers. Influencers are people who often know a lot of people, or at least a few of the right people, and like to talk about their experiences using different products.

Because of the geometric rate of word of mouth buzz--one person tells two people, who each tell two people, etc.--and the lifecycle of new announcements--which Wright says sustains its efficacy for eight generations--if you can get 1000 influencers to try and like a product, you can generate awareness and demand for that product among an audience of millions.
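Running Wright's numbers as given makes the dynamics easy to see. Here's a quick sketch; the strict tell-exactly-two assumption is mine, since real branching factors vary:

```python
seeds = 1000          # influencers who try and like the product
branching = 2         # each person tells two more
generations = 8       # Wright's stated lifecycle of the buzz

reach = seeds
newly_told = seeds
for _ in range(generations):
    newly_told *= branching
    reach += newly_told

print(f"Cumulative reach: {reach:,}")   # 511,000 under strict doubling
# The result is very sensitive to the branching factor: bump it from
# 2 to 3 and cumulative reach jumps to ~9.8 million, so "millions"
# only takes people telling slightly more than two others on average.
```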

In listening to all this it started to dawn on me: what possibilities might there be to leverage this new word of mouth marketing paradigm to drive adoption, awareness, and use of broadband applications?

How can we identify and engage the influencers within a community in order to get them excited about the possibilities of broadband and in turn initiate their spider webs of influence in order to create demand for what broadband has to offer?

Unfortunately, this is a brand new idea for me, so my thoughts are only half-formed as to specific action items for pursuing this line of attack, but know that I'm going to be following up with Ted to discuss more specifics about what he does and how he does it, as part of my quest to unearth the best strategies for accelerating our use of broadband.

In the meantime, does anyone have any thoughts for how we can leverage word of mouth marketing to promote broadband and broadband applications? Or any examples where this practice is already realizing success?

If so, throw your hat into the commenting ring and let us know what you think!

October 26, 2007 12:32 PM

AT&T – Missing the Boat or a Company on the Cutting Edge?

Another highlight of my journey to TelcoTV was a keynote address on Wednesday given by Peter Hill, VP of video and converged services for AT&T Labs.

He gave what was described as the first U-Verse demo before an audience of that size, one that included a variety of forward-looking technologies and possibilities not yet found in AT&T’s current IPTV deployment.

Rather than attempt to do my own rundown, I thought it easier to link to a couple of articles that have already scaled that molehill. My favorite was this Telephony Online article. Also enjoyable was this Multichannel article and this Telecompetitor article.

Overall there were a number of interesting initiatives Hill demonstrated, many of which were beyond anything I’ve seen from other major network operators and most of which sought to leverage existing web assets by bringing them to the TV through the U-Verse interface. More than anything, the presentation really drove home AT&T’s aggressive push to flesh out and build upon the IP portion of their IPTV offering.

The fact that AT&T is pushing the technological envelope on what’s possible over the Internet on a TV set may be surprising to many, but it arguably shouldn’t be.

Let’s take a moment to consider AT&T’s HomeZone product. The HomeZone set-top box combines a satellite receiver for broadcast TV with a DSL connection for on-demand. The interesting thing is that the on-demand content supplied by Movielink and Akimbo is pulled from the same servers as if you were to access their sites through a computer. So in many ways, other than the Xbox Live Marketplace—which allows Xbox owners to buy and download video through their gaming consoles—HomeZone is the most widely distributed bridge between the Internet and the TV out there.

Now they’re pushing the envelope again with this array of new IP services. Despite their reputation for being a slow-moving giant, they seem to be the operator who may be investing the most in the idea that the next major battlefield of the broadband wars will be fought based on value-added services and not purely on speed and cost.

Building off of these thoughts, I’ve been meaning to share with you all another revelation I had at the KMB conference a few weeks back when an AT&T representative discussed their fiber-to-the-node strategy.

After the presentation, an audience member began his comments by stating his belief that AT&T had made a colossal mistake in not pushing fiber all the way to the home, a potentially fatal one at that.

Before that presentation and subsequent comment, I fell into that same camp. I know you have to respect your shareholders, but FTTH is just such a no-brainer, necessary, future-proof investment in my mind that I couldn’t understand why they wouldn’t do it.

But something clicked as the speaker walked us through a series of slides showing how AT&T has been building out its fiber optic network closer and closer to homes over time, every generation getting one step closer.

Continue reading "AT&T – Missing the Boat or a Company on the Cutting Edge?" »

October 26, 2007 1:24 PM

Race to Web Apps Not Just About Google and Microsoft

Last week I came across this Reuters article, which reported from the Web 2.0 Summit on an interesting bit of news coming from the mouth of Adobe's CEO Bruce Chizen.

The news? That Adobe is working on bringing all of its software to the online environment.

Often when talking about the movement away from desktop applications to hosted applications, the two main topics are: what's Microsoft doing? and what's Google doing?

But we shouldn't forget to keep an eye on a number of other companies, in particular Adobe.

Adobe made its name by developing a rich portfolio of high-powered desktop applications for manipulating all sorts of media: video, audio, images, motion graphics, etc.

Since their acquisition of Macromedia in '05, they've increasingly turned their attention to the Web.

Witness, for example, their launch of a hosted Photoshop app and Adobe Remix, an online video-editing tool.

They're also stepping out beyond their core areas and into things like office productivity tools through the acquisition of a site called Buzzword, which lets you create documents through a slick interface in your browser. (I'm trying it out now and will be writing up some thoughts soon.)

Chizen admitted that this push online is a long-term goal rather than a near-term announcement, but considering the power and popularity of their desktop apps; the fact that they control the Flash, Flex, and AIR platforms (more on AIR soon); and their interest in branching out beyond their core areas of interest, it behooves anyone interested in web applications to keep a close eye on what they're doing and not get caught up in the hype that's built up around the battle between Google and Microsoft.

I know I'll be keeping up to date, and in turn keeping my readers up to date, as the race from the desktop to the Internet continues to heat up.

October 29, 2007 12:04 PM

More from the Comcast Brouhaha...

The fallout from the news that Comcast is officially interfering with P2P traffic continues to grow.

It's officially hit Congress now.

Here's an article in large part about a conversation the author had with Representative Rick Boucher, D-VA.

And an investigation has been called for by Sens. Byron Dorgan, D-N.D., and Olympia Snowe, R-Maine, which will look into whether Comcast's actions represent legitimate business practices or were in fact unfair and anti-competitive.

While Comcast denies blocking any P2P traffic, they do admit to delaying it, under the premise that doing so protects the user experience of non-P2P customers on the network.
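
To make the delaying-versus-blocking distinction concrete, here's a minimal sketch of a token-bucket shaper, the classic mechanism for slowing a class of traffic down to a target rate without dropping any of it. This is purely illustrative Python with invented numbers; it says nothing about what equipment or policies Comcast actually uses.

    import time

    class TokenBucket:
        """Delay traffic down to a target rate; nothing gets dropped."""
        def __init__(self, rate_bytes_per_sec, burst_bytes):
            self.rate = rate_bytes_per_sec    # long-run average rate allowed
            self.capacity = burst_bytes       # how large a burst we tolerate
            self.tokens = burst_bytes         # start with a full bucket
            self.last = time.monotonic()

        def send(self, packet_bytes):
            # Refill tokens for the time elapsed since the last packet.
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if packet_bytes > self.tokens:
                # Not enough tokens: wait until there are. The packet is
                # delayed, not blocked or discarded.
                time.sleep((packet_bytes - self.tokens) / self.rate)
                self.last = time.monotonic()
                self.tokens = 0.0
            else:
                self.tokens -= packet_bytes

    # Hypothetical policy: hold "P2P-classified" flows to 50 KB/s, 20 KB bursts.
    shaper = TokenBucket(rate_bytes_per_sec=50_000, burst_bytes=20_000)
    for _ in range(100):
        shaper.send(1500)    # every 1500-byte packet gets through, just later

Under a scheme like this, every packet still arrives, just later; blocking, by contrast, would drop the traffic or reset the connection outright. Which of those Comcast is actually doing is precisely what's in dispute.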

But I think there's something fundamentally off about their perspective on this matter, as evidenced by a chat log, found here, between a techie user trying to get his P2P working and a Comcast representative.

It's an interesting conversation, likely worth reading if you want to keep up on these matters (though be warned, it is a bit lengthy).

What really caught my eye was Comcast's attitude towards P2P, as evidenced in the following quotes:

"Considering most P2P usage is for acquiring illegal material, there is no harm in any possible traffic delaying. Any legit files that might be available on P2P software is usually available for standard download on the appropriate websites also..."

"If comcast is actually delaying P2P traffic to help deter illegal activities, I would think of that as a good thing. P2P was initially created for the sole purpose of illegal file distribution once napster was sued..."

"If someone decides to only use bit-torrent to distribute legit stuff they are probably only hurting themselves in regards to making that software widely known..."

"Comcast is able to do whatever they wish on their network. If there is traffic allocations setup to deter people from seeding within our network, then that is something which we can do on our network. Those people who are looking for such files can get them from other seeders anyways outside out network, where available..."

"As you mentioned you are still able to download whatever files you wish without issues. If our traffic allocation is not setup for inbound P2P at this time then at least you can still get the files you want from the P2P network..."

So let's parse this out for a minute.

First, they're trotting out the tired claims about P2P being all about distributing content illegally. While the majority of P2P traffic still does skirt the law, a growing share is being used for legal video and software distribution. Additionally, I'd dispute the claim that P2P was developed initially and solely for illegal activities: the fundamental roots of the Internet are peer-to-peer, and they long predate people trading songs and movies.

Second, they're claiming that since the user was able to download the files he wanted from another source, there is no problem, despite the user's inability to participate in a P2P network.

Third, along these lines, when the user pushes back with his desire to seed a P2P network, or become a node for uploading files in addition to downloading them, Comcast basically says: not on our network you don't, and besides, aren't there plenty of people on other networks who can do the seeding?
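
That last answer glosses over how swarms actually work: a downloader's speed is bounded by the combined upload capacity of everyone seeding, so removing every seeder on one big network shrinks the pie for the whole swarm. A toy back-of-the-envelope illustration in Python (all of the figures here are invented for the sake of the example):

    # Toy model: a swarm's aggregate download throughput can't exceed the
    # sum of its peers' upload capacities. All figures here are invented.
    UPLOAD_PER_SEEDER_KBPS = 384   # assumed 2007-era residential upload cap
    DOWNLOADERS = 100

    def speed_per_downloader(seeders):
        total_upload_kbps = seeders * UPLOAD_PER_SEEDER_KBPS
        return total_upload_kbps / DOWNLOADERS

    print(speed_per_downloader(100))   # all seeders present: 384.0 Kbps each
    print(speed_per_downloader(40))    # in-network seeders gone: 153.6 Kbps each

So yes, the files remain "available" from seeders elsewhere, as the representative says, but every large pool of seeders that gets walled off degrades performance for the entire swarm, which is exactly why the user was pushing back.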

What is all this saying? That Comcast has decided to disallow P2P networks on their system. That they will not recognize P2P traffic as a legitimate use of their network, no matter what content is being distributed.

Yet despite all this, when we return to the question of whether what they're doing constitutes unfair or unacceptable business practices, there's not necessarily a clear-cut answer.

What they're doing isn't precisely what net neutrality advocates have been warning about: Comcast is not slowing down one type of traffic so that another has a competitive advantage.

What they're arguing is that P2P is a threat to their network, that P2P rarely supports legitimate traffic, and that therefore they must restrict its use in order to ensure a continued high quality user experience for their users.

Now, if Comcast were to be slowing down one P2P protocol in favor of another, or slowing down P2P traffic that's directly competing with a business of their own, then we're firmly in the arena of Internet neutrality.

But what they're really doing has more to do with the issue of network neutrality and what rights they have to manage traffic on their network.

So it'll be interesting to watch how all this plays out; the more I think about it, the less clear it becomes who's legally in the right in this matter.

Yet I can say one thing: no matter who's right or wrong, we must do everything we can to find a path to reconciliation between network operators and P2P developers. P2P is a revolutionary category of protocols and technologies that holds tremendous potential to create new efficiencies in the distribution of content, in particular very large files.

Having one of the largest Internet providers in the country simply decide to stop allowing it on their network is extremely unfortunate, and I hope we're all able to work together to find a way through this quagmire in the very near future.

October 30, 2007 1:24 PM

Writing about Video on the Net from the Video on the Net Conference

Writing today from Boston, where I’m attending the Video on the Net conference.

Wanted to share a series of thoughts and observations I had during yesterday’s New Video Summit.

I got to the conference midway through a keynote presentation by Jeremy Allaire of Brightcove fame. Brightcove works with a number of high profile media brands, offering them a platform through which they can deliver video online.

He was discussing how, way back in ’05, major media companies were slamming doors in Brightcove’s face. At best, online video was being used as a marketing tool; at worst, it wasn’t being used at all.

But that reality has obviously shifted, he observed, and he credits the rise of piracy as a primary catalyst for convincing content owners to get in the game, lest they suffer the same fate as the music industry.

What I appreciated about the rest of his comments, though, was that he stayed away from the typical rhetoric about how explosive the growth of online video has been and will continue to be.

Instead, he spoke pragmatically about things like the fact that the “device divide” has taken longer to resolve than he expected. By this he means the push to bring Internet content to the TV and the lack of any dominant solution for that problem.

He also admitted that while the Internet has long promised a thriving marketplace of long tail content producers—or people who make content for niche audiences—there aren’t that many people building real businesses on niche content. It’s not like it’s not happening; it’s just not happening in a big way.

(As an aside, during the subsequent session a representative of OpenStage.com, a site that allows content producers to post their creations and have them ranked by the audience, talked about the central challenge of the Internet’s open distribution platform: anyone can share content, but 10 million people showed up to do so, and now consumers have no way of separating the worthwhile stuff from the bad.)

He touched on the reality that many of the initiatives built around web video are being financed based on projected future revenue, rather than proving themselves to be viable, profitable businesses given current demand for their content.

So putting this all together, he admitted that this is all taking longer than he originally thought it would, and he guesstimates that it’ll be 5-10 years before web video starts eating into viewership for TV in any significant way.

I’m actually still a bit more bullish on how soon that shift will happen, but I think Allaire hits upon a lot of good points. Arguably nothing has been more hyped on the Internet than the availability of all sorts of on-demand video, but we can’t forget how nascent all of this truly is.

I wanted to put particular emphasis on this reality because I think we sometimes get so caught up in the hype that we begin to believe there’s no way to guess where things are going next, and that since everything’s growing so fast, there’s no need to do anything to encourage that continued growth.

Instead, I’d argue that we can make some very good educated guesses about where things are going if we just pause for a moment and consider reality vs. the hype, and there’s likely still a lot that could be done to help spur this growth onwards and upwards.

Look for some of my guesses as to the future of this space, and how we might achieve these goals, later this week.

October 30, 2007 2:11 PM

My Favorite Pastime: Interface Hunting

As regular readers may have noted, I have a strong affinity for interface design. Computers offer untold opportunity for creating new ways to interact with information, a reality often pushed to its furthest reaches online through broadband.

Whether an interface introduces a new paradigm for accessing information or simply looks cool, I'm always excited to stumble upon examples of extreme interfaces in action.

Along these lines, earlier today I came across a fascinating list of 10 "Awe-Inspiring Interactive Websites".

Be forewarned that some of these sites take a while to load (or at least they do on my wireless card), and some of them contain content you may find disturbing. Also, I haven't clicked on every link, so I can't guarantee that none of it is offensive and/or not safe for work.

But if you're like me and enjoy being exposed to the infinite possibilities of broadband-enabled interfaces, then I encourage you to be bold, click through, and see what this new world has to offer.

October 31, 2007 11:25 AM

The Web Goes HD

Don't know how I missed this, but on Monday Akamai introduced a new portal showcasing HD video called The HD Web.

In talking with some folks out here at the Video on the Net conference, I learned that Akamai had long been trying to convince its customers that the HD opportunity was there waiting for them, but without much luck. So they decided to go ahead and create a portal stocked with HD content, to show people rather than just tell them.

In this NewTeeVee article, an Akamai representative admits that only 10-20% of their customers meet the technical requirements needed to view this HD video properly, but that's not the point: they just want to show it can be done.

So what do you need to enter into a world of HD Internet video powered by Akamai? An Internet connection of more than 7.5Mbps and a computer outfitted with a minimum of a 2.4GHz processor, 384MB of RAM, and a 64MB video card.

Not surprisingly, this initiative is sponsored in some fashion by Verizon FiOS. The fiber industry has long been searching for the killer app that will highlight the benefits of all that bandwidth, and perhaps HD video will be it.

Whether or not this initiative actually goes anywhere in the near term is an open question, considering how small the audience with this kind of connectivity is and the expense involved in delivering HD video (an easy way to think of it: the more bits you transfer, the more it costs to deliver).
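
To put rough numbers behind that rule of thumb, here's a quick back-of-the-envelope calculation in Python. The SD bitrate and the per-gigabyte price are my own illustrative assumptions, not Akamai's actual rates; the 7.5Mbps figure is the floor Akamai cites for its HD portal.

    # Rough delivery-cost comparison for a two-hour movie. The SD bitrate
    # and per-GB price are illustrative assumptions, not real pricing.
    SECONDS = 2 * 3600
    COST_PER_GB = 0.20              # hypothetical CDN price, dollars per GB

    def delivery(bitrate_mbps):
        gigabytes = bitrate_mbps * SECONDS / 8 / 1000   # megabits -> gigabytes
        return gigabytes, gigabytes * COST_PER_GB

    print(delivery(1.5))   # SD-ish stream: ~1.35 GB, ~$0.27 per viewer
    print(delivery(7.5))   # HD stream:     ~6.75 GB, ~$1.35 per viewer

Five times the bits means five times the delivery bill per viewer, which is why the economics matter just as much as how few people can actually receive the stream today.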

But it certainly is an interesting trend that bears watching over the coming months, especially in light of a recent announcement by Akamai competitor Limelight Networks, which unveiled high-definition content delivery for the web last week.