May 2008 Archives

In this edition of VidChat, I speak with Michael Curri, founder of Strategic Networks Group, a global consulting firm that helps quantify the economic impact of broadband and guides communities and businesses in making the most of what connectivity has to offer.

This is must-watch video for anyone interested in how broadband drives economic development, especially local community leaders.

There were a number of good points made by Michael during our conversation. Here are some of the highlights:

- Many of the benefits of broadband are off the balance sheet for the carriers. When a network operator is considering a deployment, they're only looking at the financial question of how they can make money. But there are many more benefits for communities when higher capacity networks are deployed and used. Balancing these two realities is one of the bigger challenges and opportunities of the broadband revolution.

- Michael reasserted the growing belief that broadband is not a field of dreams. You can't just build it and assume they will come; you need to think about how to use the networks as much as how to get them built.

- I like how Michael sums up the benefits of broadband to businesses: increase revenues, decrease costs, and improve customer service.

- He asks a very good question of communities: what kind of economy do you want to be? Where do you want to be in five years? It's vitally important to know where you want to be in order to know where to go and what to do. And this gets to the heart of what Strategic Networks Group does, helping communities figure out where they are and where they want to go in the use of broadband to drive economic development.

- Some of the reports Michael mentions being available on their site can be found here.

- This was an eye-opening number for me: according to Michael's research, on average every dollar spent on broadband infrastructure results in a tenfold multiplier impact on GDP. So if we were to put $100 billion into building out fiber across the US, we could expect to realize a trillion dollars of economic growth. Talk about a no brainer!

- When talking about the value of deploying broadband, the ability to attract new companies to the area is often cited. But according to Michael, 70-80% of the growth broadband helps realize comes from local companies finding greater success. I think this is great news as while there's nothing wrong with attracting new companies, there's also nothing better than being able to make existing businesses more profitable as they tend to have deeper roots in their communities.

- As an add-on to this, I found it interesting that it takes 2-3 years for businesses to realize the full value of broadband from when they start committing to using it heavily.

- One big point Michael emphasized towards the end of our conversation is how important increasing take rates is to improving the viability of networks. Basically he's pointing out the simple truth that if you can get 60% to subscribe instead of 30%, that enhances the economics dramatically for anyone trying to decide if they should deploy greater capacity. He even went so far as to say that if you can get 90%, you don't even have to pay for service. Now I'm not quite sure how that works, but I'm going to follow up with him to find out.

- I asked him about the impact of fiber vs. other broadband technologies on economic development. He admitted that while some businesses can really use that capacity, in general having higher speeds doesn't make a huge difference today. That said, he did cite the fact that in talking with businesses in fiber communities in the US he's learned that the most significant enabler isn't speed, it's reliability. They claimed that copper technologies would go out on a regular basis, whereas they can't remember when that's happened with fiber. Michael then makes the very salient point that as businesses come to rely more heavily on broadband, the importance of having reliable networks only increases.
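Michael's take-rate point lends itself to a quick back-of-the-envelope sketch. All the numbers below (homes passed, build cost per home) are my own made-up assumptions for illustration, not figures from the conversation:

```python
# Hypothetical sketch: a fixed network build cost spread across however
# many households actually subscribe. All figures are illustrative only.

def cost_per_subscriber(homes_passed, take_rate, cost_per_home=1500):
    """Build cost borne per paying subscriber at a given take rate."""
    return (homes_passed * cost_per_home) / (homes_passed * take_rate)

for rate in (0.3, 0.6, 0.9):
    print(f"take rate {rate:.0%}: ${cost_per_subscriber(10_000, rate):,.0f} per subscriber")
```

Doubling the take rate from 30% to 60% halves the build cost each subscriber has to cover, which is presumably the dramatic improvement in economics Michael is describing.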

I consider it a great fortune to have someone like Michael contributing to the mindshare on AppRising. He and I are making plans to dive into many specific areas around the general topic of how broadband can drive economic development over the coming months. If there are any issues in particular you'd like us to address, be sure to add them in a comment below.

Creating Learning Spaces in Second Life

| No Comments | No TrackBacks

Second Life is a virtual 3D world. Create your avatar, buy property, build a house, interact with others; you can even make money doing things like designing clothes or serving as a real estate broker. Corporations have been using it as a marketing platform as well as an enabler of collaboration between teams.

Second Life has come under criticism for not having as many active users as is often claimed, and for many initiatives based in this virtual world realizing only moderate success.

But even still it's a fascinating technology and use of broadband, and if you want to learn more about how to use it for the practical purposes of training and enabling e-learning, check out this tremendous list of resources related to creating learning spaces in Second Life.

Can You Spare Any FLOPs?

| No Comments | No TrackBacks

I've written about distributed computing before.

It combines the power of personal PCs connected through broadband to crunch large datasets for purposes ranging from finding aliens to curing cancer.

If you've ever thought about joining in and helping a cause by sharing your spare CPU cycles, then answer this call to arms and choose from more than a hundred different projects to contribute to.
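As a toy sketch of the distributed-computing model described above (the dataset and work-unit size are invented for illustration):

```python
# Toy sketch of distributed computing: a large dataset is split into
# independent work units that volunteers' PCs can crunch in parallel,
# after which the partial results are combined.

def split_into_work_units(data, unit_size):
    """Chop a big job into chunks that can be sent to different machines."""
    return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

dataset = list(range(1_000))              # stand-in for a big dataset
units = split_into_work_units(dataset, 100)
partials = [sum(unit) for unit in units]  # each "volunteer" sums one unit
print(len(units), sum(partials))          # 10 work units, combined result 499500
```

Real platforms like BOINC handle scheduling, verification, and result collection on top of this basic divide-and-conquer idea.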

Last week I had the opportunity to attend an Internet Innovation Alliance Symposium at the Waldorf-Astoria called "The Exaflood - Finding Solutions."

It consisted of a pair of panels stocked with industry luminaries discussing the reality and challenges of the exaflood. Going into the event I had some trepidation that it would be a one-sided defense of the exaflood concept along with a steady drumbeat against legislation, but that couldn't have been further from the truth.

Instead what I got was a lively, in-depth discussion that covered a number of interesting points, so many, in fact, that I've decided to split my coverage of the event into two parts.

For part one I'll dive into the first panel, which featured: Paul Mankiewich, CTO of Alcatel-Lucent; Andrew Odlyzko, who researched trends in Internet demand at the University of Minnesota; Nick Rockwell, CTO for MTV Networks; and Bret Swanson, senior fellow at the Progress & Freedom Foundation and director of the Center for Global Innovation.

I was most intrigued by the involvement of Andrew Odlyzko. He's been a leading voice tempering worries about the exaflood, as his research has shown that while Internet traffic was doubling year over year for its first decade, recently it has only been increasing by about 50% a year.

In his remarks and subsequent discussion, he alluded to the historical context that people were worried about the petaflood a few years ago but nothing bad happened. He even went so far as to say that with a 50% growth rate that he doesn't see the need for giant new investments in capacity.

At the same time, his perspective was quite nuanced. On the one hand, while citing the projections network operators have made for continued growth in demand, he lamented that if Qwest's projection of 19% growth proves accurate, that would likely be disastrous for the continued adoption of the Internet. On the other, he pointed to the fact that while South Korea has only one sixth the population of America, it generates the same amount of demand for bandwidth, and that it'll take us 5 years to catch up to where they are today, let alone where they'll be in the future.

Where Andrew sees the biggest challenges is in his observation that the real money isn't in delivering high bandwidth to consumers. And, in fact, he believes network operators are facing a bit of a dilemma: should they be limiting or encouraging growth in demand for bandwidth? By maintaining the status quo he feels they can support the current 50% growth rate without need for major investment, but if they try encouraging growth that could push the rate to 100% year over year and therefore demand greater investment.
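To put the 50%-versus-100% dilemma in perspective, here's a quick compounding sketch (the five-year horizon is my own choice for illustration):

```python
def traffic_after(years, annual_growth):
    """Relative traffic volume after compounding annual growth."""
    return (1 + annual_growth) ** years

print(traffic_after(5, 0.5))  # 50% annual growth: ~7.6x today's traffic in 5 years
print(traffic_after(5, 1.0))  # doubling every year: 32x in 5 years
```

The gap between roughly 8x and 32x traffic over five years is the difference between incremental upgrades and major new capacity investment.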

These ideas get to the heart of one of the bigger problems we face in getting more capacity deployed. Broadband providers have trouble making more money off of faster service, and because of the all-you-can-eat model prevalent in America the more the customers use their networks the tighter their margins get. I worry that what this all leads to is a system where the best scenario for network operators is 100% adoption and retention but 0% use and only enough investment to keep people placated.

The comments of Nick Rockwell added the interesting dimension of a content owner's perspective into the mix.

One thought he shared is that technologies like the Slingbox, which are driving significant demand for upstream bandwidth, are actually workarounds trying to overcome the limitations of the current distribution system. He even went so far as to say that in a perfect world the Slingbox wouldn't be necessary as that functionality would already be in place.

Nick gets credit for having the best line of the event. Playing off of the common theme of "if we build it they will come" he said, "If we don't build it they can't come." Continuing on he argued that there needs to be excessive capacity in order to drive prices down and innovation up. In his mind, no amount of bandwidth is enough, and as an innovator he'd love it if bandwidth capacity ran ahead of his ability to make use of it, which hasn't been the case anywhere until recently.

He then took that idea to a global level, suggesting that innovators are developing products now in South Korea that you just can't do in the US today with our current broadband infrastructure. And beyond that, until we have that level of infrastructure, innovators can't really even be thinking about what they could do with all that capacity as they have to focus on what's possible with what they have to work with today.

This was a sentiment echoed by Bret Swanson. He firmly stated that abundance must come first, that abundance leads to innovation, and that by deploying what seems like too much capacity now we will find new innovations.

I asked the panel how much bandwidth is enough, and Bret was the only one to give a specific answer: 50-100Mbps in the next 5-10 years. He also alluded to the fact that cable companies can reach this level and beyond as they start looking at moving to a switched network.

While none of this was necessarily new info to me, it was kind of startling to have these thoughts all presented at once: what people seem to be saying is that we're already at a competitive disadvantage to countries like South Korea when it comes to enabling an environment that fosters innovation. And not only that, because of the time and effort it takes to deploy new networks, every day we're not moving forward aggressively to catch up, we fall further behind.

To put it more succinctly: we're already five years behind in positioning our country to be leaders in the 21st century, and until we get our act together we're just going to keep falling further back as our position as a global Internet leader continues to be threatened.

Throughout this panel I found the contributions of Paul Mankiewich to be fascinating. He's a guy who's literally helping build the Internet, going so far as to refer to himself as a plumber rather than a tech guy.

While I've often heard that the Internet wasn't designed to support all the interactive rich media applications that have sprouted up, I've usually disregarded that sentiment. But it's a different matter entirely when the CTO of a company as significant as Alcatel-Lucent bluntly says that they didn't predict the explosion in content created by users and that the idea that people are pushing up content is something they're still struggling with.

This isn't giant network operators claiming the Internet isn't ready for this as cover for delivering less service or degrading the service they already provide; this is one of the guys who designs the underlying technology of the Internet saying they weren't expecting this remarkable transition as the Internet begins to realize its full potential as an interactive medium.

Now, to some degree I'm really disappointed to hear this. Why couldn't people have predicted that there'd be demand upstream as well as downstream? Hasn't the Internet always been touted as a two-way communications tool?

But at the same time I understand that the dominant paradigm of all mass media up until the Internet has been consumer as passive recipient rather than active user.

Even still, Paul helped point out that many of the challenges the Internet faces today aren't technological limitations but instead business processes like DRM that are preventing content from being stored in the network.

The final thought I'll leave you with today came from Nick, whose response to questions about the need for legislation to guide network management decisions was that we need to listen to the engineers. He believes that they've done a great job scaling the Internet to date, and that we should let the people who built the networks solve the problems.

While he didn't explicitly say so, I'm guessing that he includes in this also the input of engineers who make the applications that run on the networks.

I think too often we're allowing debates around issues like net neutrality to be driven by public interest groups that aren't taking into consideration the intricacies of network deployment and management. And on the flip side, the network operators aren't being open enough about the challenges they face.

If we could just get all these great engineering minds in the same room, from both the application and network sides, I feel like it wouldn't be all that difficult to find solutions to these problems.

Of course, the challenge is that we can't ignore the business side of these issues, and that many of the decisions being made about how to run networks technologically will have a huge impact on the businesses of everyone involved.

The gist of this first panel seemed to be that while yes, there is tremendous growth in demand for bandwidth, it is not yet a significant problem in terms of keeping networks up and running, but we do face tremendous challenges if we want to remain global leaders in Internet innovation.

More to come tomorrow from the second panel!

On Friday I sat down for a VidChat with Peter Csathy, CEO of SightSpeed.

SightSpeed is responsible for the videocalling application I use to create VidChats.

It’s a product I continue to be impressed with in terms of its ease of use and its ability to work well even in the absence of big bandwidth connectivity. In fact, during much of this call I was receiving less than 100Kbps from Peter and sending less than 200Kbps to him. While these aren’t ideal conditions and the picture quality would be much improved with more bandwidth, I still found it impressive, as despite the lack of bandwidth the quality of the conversation we were able to have didn’t suffer.

So sit back and enjoy learning a bit more about SightSpeed and what it’s going to take to encourage the rest of the world to start adopting videocalling.

(Be forewarned, I'm trying out different video hosting sites, and this one introduced a second of feedback right at the beginning.)

As happened last time and with every VidChat, here are followup notes and links to the topics discussed herein:

- To learn more about SightSpeed's MySpace widget, click here. If you want to actually use it you'll need to sign up for a SightSpeed account.

- Peter mentioned SIP on a couple of occasions. SIP stands for Session Initiation Protocol, and it enables SIP-equipped devices to talk to each other. To date those primarily consist of VoIP phones, though it portends a future where different SIP-enabled videocalling platforms may be able to talk to each other, which could be a boon to spurring the mainstream adoption of videocalling.

- He discusses how SightSpeed enables his employees to come into the office only twice a week, working the rest of the time from home. He specifically cites this as a tremendous way to reduce our dependence on and demand for foreign oil. I wrote about the environmental benefits of greater use of broadband here.

- The CODIE award Peter mentions was given to SightSpeed as the Best Communication Solution of 2008. The CODIEs are given out by the Software & Information Industry Association, a trade organization for the software and digital content industry. The CODIEs claim the title of the only "peer-recognition awards program of its kind in the industry." You can see all the 2008 winners here.

- I did a little digging and found that to realize the 640x480 picture Peter mentioned is possible through SightSpeed, you need 1.5Mbps on both ends, which is a bit more than most people have on the upload side, unless you live in a community with FTTH, of course!

- To learn more about SightSpeed's different pricing options, go check out their site here.
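The 640x480 figure above lends itself to a rough sketch of why video calling needs that much bandwidth even after compression. The frame rate and color depth below are my own assumed values for illustration, and real codecs vary widely:

```python
# Back-of-the-envelope math on 640x480 video: raw (uncompressed) video
# is enormous, which is why a codec has to squeeze it down to fit a
# 1.5Mbps connection. Frame rate and color depth are assumed values.
width, height, fps, bits_per_pixel = 640, 480, 30, 24

raw_bps = width * height * fps * bits_per_pixel
link_bps = 1.5e6

print(f"raw video: {raw_bps / 1e6:.1f} Mbps")            # uncompressed bitrate
print(f"compression needed: {raw_bps / link_bps:.0f}:1")
```

Even at 1.5Mbps, the codec is discarding well over 99% of the raw pixel data, which is why picture quality degrades so gracefully (or not) as bandwidth drops.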

The Potential of a Multicast-Enabled Future

| No Comments | No TrackBacks

During my travels to NYC this week I had the great fortune to meet and commune with Dom Robinson, a fellow contributor and head of GlobalMix, a London-based content delivery network focused primarily on enabling the delivery of live webcasts.

While his interests range far afield, his longtime passion has been for multicast.

First it's important to understand that we live in a primarily unicast world. If you want to stream a live event, you first send the video from the location of the event to a server, and then, as each new user shows up, you send out a new stream. Therefore with each additional user you need more server and bandwidth capacity. So if 100 people request a 1Mbps stream from the same server, the server needs to be powerful enough to handle delivering that much simultaneous traffic and you need more than 100Mbps of upstream connectivity to serve all those people. (I'm simplifying dramatically, as very few live streams run through a single server that everyone's logging on to, but it gets the point across.)

That's unicast. In a multicast world, things are remarkably different.

Multicasting is akin to broadcasting: you send out a single stream that an arbitrarily large audience can view. When a TV broadcaster sends video over the airwaves or a cable operator sends channels over copper, there's no additional cost or need for additional capacity to support additional viewers.
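The scaling difference between the two models can be sketched in a few lines (the stream bitrate and audience sizes below are illustrative):

```python
# Unicast: the source sends one copy of the stream per viewer, so
# upstream bandwidth grows linearly with the audience.
# Multicast (idealized): the source sends one copy; routers replicate
# it inside the network, so source bandwidth stays flat.

def unicast_source_mbps(viewers, stream_mbps=1.0):
    return viewers * stream_mbps

def multicast_source_mbps(viewers, stream_mbps=1.0):
    return stream_mbps

for audience in (100, 10_000, 1_000_000):
    print(f"{audience:>9} viewers: unicast {unicast_source_mbps(audience):>9,.0f} Mbps, "
          f"multicast {multicast_source_mbps(audience):.0f} Mbps")
```

In practice the replication work doesn't disappear; it moves into the routers, which is exactly why multicast needs buy-in from the network operators.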

The multicast concept is far from new, and I've long known about its similarities with broadcast TV, but in talking to Dom, he opened my eyes to what multicast can really mean. His vision is that some day you'll be able to turn on a camera from anywhere and deliver live video to an infinitely large audience without it costing much of anything to send.

Currently, to reach a large audience you need to either set up your own servers or, more typically, employ the services of a content delivery network, which hosts banks of servers around the world and promises to make scaling up to larger audiences painless. But there are still significant costs involved, which only go up as the size of the audience increases.

Multicasting could enable a world where you don't need to do any of that to deliver live video to a worldwide and expandable audience.

Despite being around for years, though, multicast has not made any significant headway, as it's not as simple as just turning on a camera. To work properly, multicast requires the involvement of pretty much everyone in the content delivery value chain, in particular the network operators, who need to enable their networks for multicasting.

I learned in talking with Dom that most networks already have the capability for multicasting; it just isn't turned on. But unfortunately it won't be as easy as flipping a switch: for multicast to really work, you need buy-in from every network operator, as otherwise a multicast stream stops working the moment it jumps from a multicast-enabled network to one that isn't.

Many in the streaming industry have given up on the possibilities of multicast because of the general lack of enthusiasm among network operators to implement it and the great challenges multicast faces in trying to get everyone on board.

But I can't ignore how exciting a future could be where I can start streaming live to an infinite audience at little to no cost.

To that end, I began discussing with Dom the possibilities of trying out some test deployments among a limited number of network operators. He couldn't have been more excited, so I'd like to put out an invitation to any network operators out there: if you want to show the value of your network, prove your commitment to enabling all that is possible online, and want to have some fun trying things out on the cutting edge of technology, drop a comment in to this article and let's see where we can take this.

One of the more interesting threads through my conversations with the streaming media elite at Streaming Media East these past few days has been reactions to discussions about bandwidth.

On the one hand, there was a steady stream (no pun intended) of sentiment lamenting the current state of broadband and the fact that many parts of the country don't have high speed access capable of supporting high bandwidth video.

At the same time, very few I spoke to seemed fired up about the possibilities of the rapidly growing fiber industry.

Some cited their belief that wireless is the future, and that once wireless networks are delivering DSL/cable speeds, that's what matters most, not having the ultra-high capacity of fiber.

Many, despite the sentiment that the telcos have not lived up to the promises made back in the 90s, seem perfectly fine sitting back and letting the market sort out deploying capacity. Their attitude is that demand will always fill supply, and that there's already a lot that can be done with today's networks.

Even those who were thinking about a future richer in bandwidth seemed to stop short of embracing the promise of fiber, citing the growing capacity of copper up into the 25-50Mbps range as being more than sufficient to support almost all of the applications available on the Internet today.

To some degree, I understand this attitude. These are all people building businesses and trying to make money off the current system. Sitting around and dreaming about a future where bandwidth is unlimited won't make them any money in the near-term, especially with the market of fiber-connected homes being so small relative to the overall market.

At the same time, I continue to be disappointed by the general lack of enthusiasm I get when broaching the subject of 100Mbps to the home. Either people don't see the value above and beyond current broadband, or they don't think it's realistic that people will have that much bandwidth any time soon, so they don't seem to want to be bothered by it.

Don't get me wrong, there are definitely some developers who share my enthusiasm about a full fiber future. But we've got to be realistic: if we can't capture the imagination of the thought leaders who are innovating and making all the wonderful things on the Internet possible, then we're going to be fighting for 100Mbps with one hand tied behind our back.

The reality is there are very, very few applications that demand 100Mbps. Of course, before there will be many there'll have to be more customers with the capacity to use them, but before that we must plant the seed into the heads of everyone working on building the Internet that a future where 100Mbps and beyond is ubiquitous is not only possible but should be the ultimate goal we're all working together towards. Only then will we start to see more ideas and more possibilities open up for demonstrating the value of full fiber networks.

I've been attending Streaming Media East in the Big Apple, and yesterday sat in on a session about user-generated video in education led by the venerable Paul Riismandel, fellow contributor and video guru at Northwestern University.

First I learned about the University of Toledo through the words of their tech coordinator Gary Powell. His comments focused on the use of video in their school of education, where a federal grant afforded them the opportunity to purchase videoconferencing units to be used to facilitate remote observations of student teachers as they taught classes, saving professors the need to physically be in the classroom.

Once they established this program they realized that since they're sending live video it'd be easy enough to record it, so they developed an e-portfolio initiative where students could take video of themselves teaching and combine it with lesson plans, PowerPoint slides, or other related media. This has proven especially effective as the teacher's license you earn in Ohio is valid in 48 states, so e-portfolios provide a way for prospective teachers to more easily reach a broader array of schools with a high-impact application for employment.

And further showing what's possible through broadband, they’re also working on creating LCOT, a Learning Community of Teachers, a program that provides an online support group, network, and information resource for teachers who have recently graduated. Its purpose is to reduce the high churn of half of all teachers leaving the profession within 5 years of graduating by giving them somewhere to turn when they have questions or need to vent frustrations.

We also heard from BigThink, a site where they’ve been interviewing hundreds of experts in fields ranging from metaphysical explorations of death to specific discussions of certain health conditions. Anyone can add their own comments to a discussion or start their own idea threads, with the intent of facilitating intellectual conversations and discovery through an online video platform.

In the education world, universities have started requesting their own branded pages where they can both link to BigThink content and add talks from their own faculty, and students have been using it as a research tool as they can get video and transcripts from some of the leading minds in the world to help flesh out or spark ideas for papers.

The final presenter put forth the idea that universities should be curators of the videos that relate to their respective institutions. His company's tool lets them create their own pages and then use its search engine to retrieve videos from across the web that are related to them. These include everything from advertising put out by the college's communications department, to student films, to any video that's tagged with the name of that particular institution.

In this way alumni can be kept in the loop as to what's happening at their alma mater, high school kids deciding where to go can see what life is like on campus from many different angles, and this tool can even be used to conduct research about a particular institution. Here's an example of this model in action with Skidmore College.

This was a fantastic discussion to attend, with some interesting new ideas presented that help further highlight the potential impact of broadband and online video in education.

Nothing epitomizes how nascent the broadband revolution is better than the all-too-common trend of bad news accompanying good news across all facets of the Internet.

For example, here's an article about how Gen Y is going to change the web, how that generation will embrace and extend its possibilities far beyond what's commonplace today.

Now here's the flip side: according to Parks Associates, a fifth of US households not only don't have Internet access, they haven't even sent an email yet.

So we've got one segment of the population zooming forward, while another even larger segment still hasn't bought into that most basic of Internet applications.

This same good-news-bad-news trend manifests itself elsewhere. Take this fascinating story about how researchers are using the virtual world Second Life as a platform for testing out artificial intelligence. What they've done is create characters who are guided by computer minds instead of human ones.

But here's the bad news: 90% of all initiatives by businesses to incorporate the use of virtual worlds to enable collaboration end up failing within 18 months. The primary reason cited is that too much emphasis is placed on the technology and not on how people use it.

For every step forward there seems to be at best a pause and at worst a step backwards.

Now, I know I can't be overly critical of this, as the Internet continues to grow unbelievably quickly from a historical perspective, and many of the benefits it potentially provides demand fundamentally altering different aspects of our lives, which isn't likely to happen quickly when we're talking about moving all, or at least most, of society.

But at the same time the issues that come up don't seem insurmountable so long as we make a concerted, coordinated effort to overcome them, to share information about best practices and how to inspire adoption, to understand that only together will we all realize the full benefits of broadband.

VidChat: MindTouch, Wikis, and 100Mbps

| 1 Comment | No TrackBacks

AppRising is proud to present the first edition of VidChat, a series of conversations with the thought leaders who are powering the broadband revolution.

Up first, Aaron Fulkerson, founder and CEO of MindTouch, who I’ve written about previously here.

Here’s how they describe themselves: “MindTouch, recognized the world over for innovation beyond open source wiki collaboration and content management, is delivering a leading edge application integration and development platform. MindTouch Deki Wiki, built with a Web Oriented Architecture, enables users to connect teams, enterprise systems, Web services and Web 2.0 applications with IT governance.”

Let’s hear what he has to say!

(Quick note: Please excuse the static on my mic. I’ll get it resolved before the next installment.)

I had a blast doing this interview, especially the informal chatting before and after I hit record on the call. Aaron’s great because he’s not only smart and has a great idea with Deki Wiki, but he’s also really enthusiastic and positive about what’s possible through broadband.

As a followup, here are links that relate to topics discussed during the call:

- Aaron made mention of Consumer Reports writing about the use of Deki Wiki in a neighborhood watch, which I found here.

- For more insight into how Deki Wiki is actually being used to make groups of people more efficient, check out these case studies.

- Aaron was right to invoke Metcalfe’s Law, the idea that a network becomes dramatically more valuable the more endpoints it has, since the number of possible connections grows roughly with the square of the number of nodes.

- If you want to give Deki Wiki a try without having to download and install anything, you can use their free hosted version to create your own wiki.

- Or download and install it on your own computer.

- The conference Aaron mentioned where MindTouch was going to be celebrating is OSCON, the Open Source Convention, happening July 21-25 in Portland, OR. Learn more here.

- While a little techie for most, the site Aaron mentioned “gets you the latest on what’s new and interesting with mashups, Web 2.0 APIs, and the Web as a Platform. It’s a directory, news source, a reference guide, a community.”

- Don’t know how I hadn’t encountered Live Mesh before, but it’s an application that redefines what your personal network means by getting all the pieces of your digital life to work together. I’ll write more as I learn more about it.

- The public housing project Aaron mentioned in San Francisco that’s getting 100Mbps is called Valencia Gardens Housing. It represents the first 240 units of an eventual 2,500 unit deployment to deliver not only fast but free Internet access. It’s an endeavor of the Internet Archive, “a non-profit organization dedicated to preserving a record of the Internet” and leverages the San Francisco municipal fiber network.

- I mentioned asking other applications developers about how having 100Mbps to every home would impact their businesses; well here’s the post I wrote about those experiences.
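Aaron's Metcalfe's Law point can be made concrete in a few lines of code: the number of possible pairwise connections among n endpoints is n(n-1)/2, so doubling the endpoints roughly quadruples the potential connections. A toy illustration of my own (nothing to do with MindTouch's software):

```python
def possible_connections(endpoints: int) -> int:
    """Number of distinct pairwise links among n endpoints: n*(n-1)/2."""
    return endpoints * (endpoints - 1) // 2

# Doubling the endpoints roughly quadruples the potential connections.
for n in (10, 20, 40):
    print(n, possible_connections(n))  # 10 -> 45, 20 -> 190, 40 -> 780
```

That quadratic growth is the whole argument for getting more people connected: each new endpoint adds value for everyone already on the network.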

If you have any questions about MindTouch or Deki Wiki be sure to add them in as a comment below.

On the next AppRising VidChat - Peter Csathy, CEO of SightSpeed, makers of the application that made this videocall not only possible but easy and enjoyable as well.

Twitter Keeping People Informed

| No Comments | No TrackBacks

I wanted to share an article I came across about Twitter.

It details how Twitter's been used in emergency situations like the California wildfires and the recent Chinese earthquake.

Many ended up relying on Twitter more than the mainstream news as it provided info that was more current and in some cases more accurate than what was being shown on TV.

Twitter was enabling the people on the ground, living through the emergency, to communicate with the world, in the process providing another example of the revolutionary opportunities the Internet opens up to make us a more connected world.

Even more exciting is that this is an example of how an application originally developed for the social media crowd to keep up to date with their friends has evolved into a new form of media and communication that holds the power to save lives as it may have done in California and China.

Twitter is a simple, little app, but one with big potential for positively impacting our lives.

The Changing Face of Interpersonal Communication

| No Comments | No TrackBacks

There's a lot of talk about how the Internet is changing the way we communicate, but I wanted to share a real-world example of what that means.

I've got a business colleague and friend who I've known for a few years. He's a great guy to know, both a good person and an absolute genius when it comes to all things related to digital media delivery.

But he can be a difficult guy to get a hold of. I can't tell you how many futile phone calls I've made and emails I've sent trying to reach him with questions large and small.

It's always been extremely frustrating, but at the same time understandable. He's currently CTO of an ad network, a job that requires him to put in massive hours and respond to an endless stream of requests for his time. He just doesn't have the time to answer all the calls and emails he receives, so he ends up having to ignore those that are not directly related to his day-to-day activities otherwise he'd be totally overwhelmed.

So for the longest time I gave up hope of getting quick answers from him, until I learned something: he's a huge user of Twitter and IM.

Now that I've discovered this, I'm following him on Twitter, a microblogging site where users post updates of up to 140 characters to describe their current status. By following him I now get regular updates as to where he is and what he's doing, helping me understand if I have any chance of reaching him at all.

I've also got myself back onto instant messaging (I jumped off that train a few years back when I found myself having multiple meaningless conversations about stuff that didn't matter). Now, if I see him online, I can send him a quick note and more times than not jump right into a conversation with him.

So I've gone from not being able to reach him at all to knowing where he is and being only a mouse-click away from chatting.

It's a remarkable turn of events and demonstrates both the power of these Internet applications and how the habits of the under-40 set are revolutionizing the way we use technology for interpersonal communication.

One of the Coolest Flickr Visualizations Around

| 1 Comment | No TrackBacks

To start with, Flickr is a website that lets you upload, organize, and share photos with friends, family, and the world.

Flickr is a Web 2.0 application in that the data on its site can easily be pulled into other sites, which, though not talked about as often, is a defining characteristic of Web 2.0.

So what that means is other sites can tap into Flickr's treasure trove of images to do different things with them.

Without going into an extended exploration of all the things this can mean I want to just focus on one instance, which I just discovered moments ago, that might be the coolest visualization tool for Flickr images I've ever seen.

It's called Tag Galaxy.

Click on the link and then enter a tag in the box. A tag is a word used to describe an image. So if you want to see pictures of puppies, type in puppies.

You're then shown what looks like planets in a solar system. The central one will have your tag, and any surrounding planets represent tags that are related. So in the case of "puppies" I got planets like "dogs" and "pets" and "cute".

Now, click on a planet. If you click on a related planet, you're taken to another solar system whose central planet contains both your original tag and the related tag you clicked. So if you inputted "puppies" and then click "cute", the central planet becomes "puppies+cute". This helps you refine your search so you're only finding the images you want.
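Conceptually, combining tags this way is just a set intersection over tagged photos: each added tag narrows the result to images carrying all of them. A toy sketch of the idea (illustrative only, not how Flickr or Tag Galaxy is actually implemented):

```python
# Map each tag to the set of photo IDs carrying that tag (made-up data).
photos_by_tag = {
    "puppies": {1, 2, 3, 5},
    "cute":    {2, 3, 4},
    "dogs":    {1, 2, 5},
}

def refine(*tags):
    """Photos matching ALL of the given tags: an intersection of tag sets."""
    sets = [photos_by_tag.get(t, set()) for t in tags]
    return set.intersection(*sets) if sets else set()

print(refine("puppies"))          # every puppy photo: {1, 2, 3, 5}
print(refine("puppies", "cute"))  # narrowed down to {2, 3}
```

Each planet you click just adds one more set to the intersection, which is why the results keep getting more specific.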

To see what the site's really all about, eventually you have to click on the central planet. Once you do, the solar system goes away and in its place is a single globe covered in tiles. Images then fly in from all angles to cover the tiles so that the globe is covered in images.

Click on an image once and it jumps out. Click on it again and it expands while also giving you the option to go to that image's original Flickr page.

Navigate around the globe by grabbing the space around it and moving in any direction.

Now to be honest, I'm not 100% sure this endeavor is adding much to society other than a liberal dash of cool. That said, if I were looking for a particular image this is certainly a fun way to do so, and may be more effective than hunting through pages of standard search results.

But in any event it's a prime example of the kinds of interfaces for finding information that only broadband and computers make possible. I highly recommend anyone reading this to go check it out and have some fun.

I read a tremendous article found in Jim Baller's regular email newsletter earlier this week that highlights a number of interesting and important points.

It details an initiative in which 150 teachers in Wyoming will be employed teaching South Koreans how to speak English.

Firstly, it's a tremendous example of the use of broadband as the teaching is conducted via videoconferencing.

Secondly, they specifically mention that what makes this possible is the fact that Powell, Wyoming, where the teachers will be located, is deploying a full fiber network with the capacity to enable high quality videoconferencing.

Thirdly, it's another example of how broadband enables the creation of new jobs that allow people to work from home.

Fourthly, it shows how there are businesses to be made catering to educational pursuits and not just entertainment related endeavors.

Fifthly, it shows how far ahead South Korea is in their use of broadband to enable better education.

Lastly, and unfortunately not necessarily a positive for the US, it highlights how aggressively South Koreans are pursuing applications that can be not only a good business but also a benefit to society, as the money behind this comes not from the US but from a South Korean venture capitalist.

Whew, that's a lot of points hit in an article that's not much longer than this post, but there's simply no denying how many relevant points it touches upon.

But what I think I like about it most is that even though it's being funded and driven by South Koreans, it's still creating new jobs here in the US. It's jobs like these that will help us reverse the trend of outsourcing so that other countries come to rely on the expertise, know-how, and hard work of the American people.

And it's important to never forget that this is all possible only through the power of broadband.

Check Out The Audiotool

| 1 Comment | No TrackBacks

Nothing’s better than clicking a link and discovering something new and cool, especially an app that is both visually impressive and functional. Well, here is just such a link.

It’s called Audiotool. It's an online virtual reproduction of audio equipment like drum machines and mixers.

And the best part is if you know how to use them you can make real music. (Or if you don’t you can watch this.)

When in Audiotool you get an overhead view of the equipment that takes up your whole browser. You connect inputs and outputs, adjust dials, press buttons. You can move pieces around to a configuration you prefer.

It can be a little obtuse if, like me, you’re generally unfamiliar with how to make music this way, but I still found it fascinating how one option in creating interfaces for broadband applications is to simply mimic analogous technologies from the real world. And more impressive is how well it appears they were able to pull it off.

They, in this case, being a German company called Hobnox that describes itself as a next-generation entertainment and rich-media publishing platform.

To check out a fully interactive demo of Audiotool, just click this link, accept the terms of usage, and you'll be able to see for yourself what I'm talking about: a use of broadband that's both cool and functional, a killer combination.

Competition Healthy in Wireless Broadband?

| 1 Comment | No TrackBacks

Much ado was made when the recent wireless spectrum auctions wrapped up and the two biggest winners were AT&T and Verizon. Pundits lamented the fact that the two biggest companies got the two biggest chunks of spectrum, fearing that an opportunity had been missed to introduce new competition into the wireless broadband market.

But now things have changed. Wireless broadband has a new entrant with some serious firepower behind it, namely a joint venture between Sprint, Clearwire, Google, Intel, Comcast, Time Warner, and Bright House Networks to deploy a nationwide WiMax network.

The consortium has set the goal of reaching 120-140 million people by 2010, offering speeds roughly comparable to DSL or cable broadband.

While ambitious in scope, the project appears feasible given the combined power of the respective companies.

Sprint and Clearwire are already in the wireless business and own a bunch of spectrum.

Intel and Google are intimately involved with developing hardware and software that takes advantage of this wireless connectivity.

And Comcast, Time Warner, and Bright House are all eager to be able to add wireless broadband to the services they sell customers.

And this initiative won't lack for cash initially as Intel, Google, Comcast, Time Warner, and Bright House have promised to invest a combined $3.2 billion into deploying this new capacity.

So within a couple of years we're going to have at least three major deployments of nationwide wireless broadband, not to mention the hundreds of regional public and private deployments.

Does that mean we can stop worrying about having sufficient competition in the wireless space? Can we declare mission accomplished?

I'm cautiously optimistic that we can. And in fact, quite frankly, I'm not sure we can realistically hope for anything better. There just aren't that many companies capable of the investment necessary for a nationwide wireless broadband deployment.

I'm also hopeful that this is a sign that competition is working, at least in the wireless world. There's enough demand to incentivize companies to invest billions in creating capacity despite a pretty competitive marketplace.

As Martha Stewart says: It's a good thing.

Presented by the International Academy of Digital Arts and Sciences for more than a decade, the Webbys are annual awards acknowledging excellence on the Internet.

While much of the focus is on the best advertising, website design, and interfaces rather than applications, some of the categories include socially beneficial things like Activism, Charitable Organizations Non-Profit, and Health.

You can see a Flash-based layout of the winners here or a straight list of entries sortable by category here.

If you're just looking to spend a couple of minutes randomly looking up cool websites, I recommend the first link. Otherwise, for more intensive study I strongly prefer the second link, as it's much clearer what things are. Flashier isn't always better when it comes to presenting information.

But regardless of which link you choose, there's definitely some good stuff to be found herein as these links represent some of the best web design out there.

In The End, The Users Always Pay

| No Comments | No TrackBacks

Here’s a simple truth about broadband that too often gets lost amidst the din: in the end, you and I are the ones who pay for broadband deployment.

It doesn’t matter who’s doing that deploying, be they private, public, or somewhere in between; it’s the users of those networks that ultimately pay to have them built.

Whether it’s private companies passing through costs and raising prices, or public entities spending tax dollars, or something in between, in the end the money’s coming out of our pockets.

It’s an important thing to remember as it reframes the public vs. private debate around deployment. Instead of defining one as good or bad, it suggests that if it’s our money driving this, then we need to consider two things first: how can we make the most of what we have, and what goals do we want to achieve?

I want to make sure my dollar’s being spent to give me the best possible service.

And I want my dollars to be invested with the long-term social benefits in mind not simply short-term profits.

Private guys are more efficient but not as interested in the public good, and public entities are all about the public good but notoriously inefficient.

Keeping in mind this basic tenet, that we're the ones paying, is important when considering just about any telecom legislation.

For example, last week I was lamenting how the net neutrality debate had taken what I feel is a wrong turn when Sen. Wyden began threatening network operators with higher taxes and more lawsuits. What was left out of his remarks is the reality that since users ultimately pay, penalizing network operators would likely trickle down to harm you and me through higher prices or lower quality service.

At the same time, if we're the ones paying for the network, and private companies aren't building networks with enough capacity or reach to satisfy what we feel we need, then it suggests we should be putting our money elsewhere.

But at the same time again, I don't want my dollars going to purely public endeavors for fear that they won't be able to innovate in delivering new and innovative services over these networks as that task is most often best left up to private companies so long as they exist in a competitive environment.

I don't claim to propose any answers in this post, but I would suggest that before we make any more decisions regarding broadband in this country, we must first remember that in the end it's the users (meaning you and me) paying to have it built, no matter who's actually doing the deploying, and it's the users who pay to keep it running, no matter who's actually delivering services.

Fold Proteins, Score Points, Cure Cancer

| No Comments | No TrackBacks

Using your computer to help cure cancer is nothing new; the Folding@Home project has been around for years, leveraging a distributed network of personal computers to crunch numbers related to folding proteins when they're not being used for regular purposes like word processing and web browsing.

But now digital do-gooders have a new opportunity to take a more proactive approach to helping fight disease through broadband: Foldit.

Foldit is a computer game created by the University of Washington. Download/install the application, and you're ready to start contributing to the cause.

To play you manipulate 3D proteins in order to find the best possible ways they could fold. (I have no interest in trying to explain the mechanics/purposes of folding proteins, so if you'd like to learn more about this, click here.)

The important thing to know is that there are limitless ways in which proteins can fold, and "Foldit attempts to predict the structure of a protein by taking advantage of humans' puzzle-solving intuitions and having people play competitively to fold the best proteins."

Things should continue to get even more interesting over the summer when the project plans on releasing the ability for users to not just identify proteins but even allow them to design new proteins that can help prevent or treat a host of diseases.

As a warning to anyone interested in giving this a try, while I didn't catch the precise file size, I do know it took a good ten minutes to download over my top-end Comcast connection. So while the game itself isn't all that bandwidth intensive, be prepared to wait if you're downloading it over a slower DSL line.

And in terms of the gameplay, while interesting and well-packaged, I didn't find it overly compelling. But that said, I've never had that strong of an interest in biology and I'm sure things get more interesting once you make it past the initial training stages.

The goal of this initiative is twofold: first, to see whether humans can do a better job than computers of identifying the best ways for proteins to fold, and second, to see whether we can then teach computers to think more like us.

It's a fascinating use of computers and the Internet to leverage the power of the masses to solve complex biological problems. And proof positive that the future continues to be filled with many wondrous ways in which to use technology to make our lives better.

I'm officially flabbergasted by the debate around net neutrality.

This article details a recent speech given by Sen. Ron Wyden from Oregon as he spoke at a Computer & Communications Industry Association conference in DC.

In it he delivered a passionate declaration of his support for net neutrality. And to his credit he went into greater detail than most about why he believes net neutrality is important and where the idea came from, stepping beyond simply equating it to free speech.

But at the same time, he spoke out directly against the possibility of network operators charging for higher tiers of service, something I see no reason to prohibit so long as those higher priority tiers don't slow down lower priority traffic and/or harm consumers' freedom to use broadband however they want.

Here's where things get interesting: in denouncing the possibility of selling higher tiers of service, Wyden basically threatened network operators with two consequences.

First, they may lose safe harbor. Safe harbor is a provision of the Communications Decency Act that frees network operators from any liability associated with content delivered through their network. If they were to lose safe harbor it potentially opens them up to a host of lawsuits covering everything from kiddie porn to piracy to online scams.

Second, they may lose the Internet Tax Freedom Act, which has kept most taxes from applying to Internet connections. If it's lifted, a number of states will almost certainly begin taxing broadband, as some have been wanting to do for a while.

So if network operators ignore net neutrality and start selling higher tiers of service, they'll be vulnerable to new lawsuits and have to pay more taxes. Hmmmm...has anyone thought through what this might mean to consumers?

Last I checked, network operators didn't like squeezing their profits to pay extra taxes, so isn't it likely that any tax increase would simply be passed on to consumers?

And if network operators have to face a wave of lawsuits due to the loss of safe harbor, wouldn't that also likely result in the costs of litigation being added on to the bills of subscribers?

The cherry on top of this misguided sundae is that if we threaten to take away network operators' money, doesn't that mean they'll have less money to invest in upgrading the capacity of their networks?

So unless I'm reading this wrong, Sen. Wyden just made the assertion that passing net neutrality will mean higher cost and lower capacity broadband.

The only way I can see this not being the case is if net neutrality is followed up by massive government re-regulation of telecom, making a major move back to treating them as monopolies, telling them what they can and can't do, and providing a guaranteed rate of return to incentivize them to deploy everywhere.

At this time I'm not trying to judge whether that would be a good or bad thing, but it certainly seems like an unlikely turn of events given that the last 15 years have been all about moving as far away from that mindset as possible.

What's frustrating in all this is that everyone has such a head of steam going about sticking it to the big evil broadband providers that it sometimes feels like all rational thought has gone out the window.

We can't punish network operators in such a way as to harm the interests of consumers. That's just not good policy.

What we need to do is set the goals for what we want broadband in this country to be, and then either find a way to entice incumbents into action to achieve them or establish a new alternative to the current market-driven approach to spurring broadband deployment.

And until we get to having that kind of a dialog, I couldn't be more concerned about how the unintended consequences of net neutrality may harm instead of help us.

Government Gets Wiki With It

| No Comments | No TrackBacks

Here's a tremendous article from Governing Magazine about the use of wikis in government.

A wiki, generally speaking, is a webpage or database whose users/readers can edit the content, letting many people pool their knowledge and collaborate toward the best possible answer/definition.

This article is great in that it not only defines what a wiki is and why it's a good thing, it also gives a series of specific examples of wikis in action.

It's a longer piece as it was their cover article, but it's a very worthwhile and informative read that I highly recommend anyone interested in the use of broadband in government go check out.

iProvo: Failure or Success for Muni-Fiber?

| 2 Comments | No TrackBacks

Big news in the world of fiber and municipal broadband. The biggest public fiber-to-the-home project in the country was just sold to a private company.

The name of the network is iProvo, located in Provo, Utah, a city of 100,000. The name of the company that bought it is Broadweave, which before this purchase focused primarily on delivering full fiber networks into new home communities.

While the network was successfully built out, it never attracted the customers it needed to pay the bills, suffering through a series of years losing money.

In total, a recent article itemized $8 million in losses over the last few years of operation. So even though public proclamations indicated iProvo would soldier on, it was really no surprise that something had to give.

To finance the build, the city went $39.5 million in debt. They sold iProvo to Broadweave for $40.5 million.

Many will point to this sale as a sign of failure, not only by iProvo but the entire premise of municipal broadband.

I see things differently.

Provo spent roughly $47.5 million in total ($39.5 million in construction debt plus about $8 million in operating losses) and sold for $40.5 million, so in the end the city’s out about $7 million.
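For the record, the arithmetic checks out; here's a trivial sanity check using the figures reported (the $39.5 million in debt, the roughly $8 million in itemized losses, and the $40.5 million sale price):

```python
debt = 39.5              # $M borrowed to finance the build
operating_losses = 8.0   # $M in losses itemized over the years of operation
sale_price = 40.5        # $M paid by Broadweave for the network

total_spent = debt + operating_losses  # 47.5
net_cost = total_spent - sale_price    # what the city is ultimately out

print(f"Total spent: ${total_spent}M, net cost to Provo: ${net_cost}M")
```

Seven million dollars for a citywide fiber plant is the number worth keeping in mind when judging whether iProvo was a failure.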

But you know what? The fiber’s still in the ground, the network’s still there.

Without the iProvo initiative it likely would’ve been years, possibly decades (they’re in Qwest territory), before anyone else laid that fiber and equipped the community with that much capacity.

Instead, for $7 million the community of Provo got wired with a full fiber network to every home and business that the city can continue using for public purposes. Is that really such a bad deal?

Now this equation is incomplete as it doesn’t put a value on the blood, sweat, and tears that went into giving iProvo a go. And the reality is that with less than a quarter of the population of Provo signed up for service, there’s still a significant hurdle that needs to be overcome if the network’s to ever become a viable business.

But in the end what matters to me is that the fiber got out there. So what some may see as municipal broadband’s greatest failure, I instead see as a success story that shows how even when a public initiative fails it can still find a way to succeed.

Pure Publicity For NASA, But I Like It

| No Comments | No TrackBacks

I don't often link to stuff that's pure advertising, but in this case I made an exception.

Go check out NASA @ Home and City.

Click on the link and you'll open a new window that showcases an impressive, interactive overview of the impact research at NASA has had on our day-to-day lives.

You can view by home or city, and then by room or part of the community. Once in a specific area, you're given the opportunity to click on and learn about innovations ranging from memory metals to edible toothpaste.

What I liked is that not only is the information interesting, it's also really well-presented. The controls are intuitive, and the animations between sections smooth.

This is a truly dynamic example of how information can be presented online.

So if you're interested in NASA, cool new technologies, or excellence in Flash interface design, go check this out. You won't be disappointed.

USF: Behind the Times

| No Comments | No TrackBacks

The Universal Service Fund, or USF, was created by the FCC in 1997, is funded by a small charge telecommunications operators add to your bill, and has the intended purpose of increasing the availability and affordability of telecom services in rural areas and for low-income people.

The USF has come under heavy criticism in recent years as its payouts have skyrocketed--enough so that the FCC recently had to put a cap on it--and almost all of its focus is still on subsidizing plain old telephone service.

For more about what USF was, is, and can be, check out this article.

The thing I wanted to add is how frustrating it is that we haven't been able to find a way to redirect the USF to focus on the deployment of broadband.

Now, this isn't a new idea, and the FCC has publicly stated its interest in working in that direction, but somehow we still don't have a solution for making it so.

Even more frustrating is that it's not like this is one of these issues where the incumbents are fighting any change. Everyone knows the USF is broken and needs fixing.

And it's not like we're still in a time where broadband is an unproven quantity with an uncertain future. Heck, a couple months ago the United States Telecommunications Association renamed itself the US Broadband Association, basically announcing that the future is in broadband not plain old phone service.

In the article linked to above, they suggest setting a digital transition date for phone service like the upcoming digital broadcast TV transition, mandating the use of voice over IP through broadband instead of plain old phone service.

I think this is a brilliant idea. And I'd suggest that if I could snap my fingers and make anything happen in the telecom world, one of the first things I'd do is find a way to transition every plain old phone customer into a broadband and VoIP customer.

Imagine how that would impact penetration rates. We'd finally have a way to blow past the mark of 50% of households having broadband!

In fact, I'd recommend that any major telephone company look at doing the same. They're already losing phone customers at an extraordinary rate to cable VoIP service. So why sit around and let the wound continue to bleed when they could potentially plug the hole by eating the short-term cost of equipping everyone with DSL modems and pricing a package at less than the cost of selling DSL and VoIP separately?

I'm not suggesting this would be a painless transition, but it does seem like it will eventually be a necessary one.

But back to USF reform: the simple truth is that until everyone in this country can access affordable broadband at home, there's still work to be done. Ultimately plain old phone service is a 20th century technology. We are now living in the 21st century and need to be considering the effectiveness of government programs and infrastructural needs of our country in that light.

Ultimately, the best and only solution is broadband everywhere. If we want to be a great country, the time to decide "if" we should do something has passed, now we must move aggressively to figure out the how, what, and when.

Using RSS When RSS Isn't Available

| No Comments | No TrackBacks

Last week I wrote about the wonders of RSS and encouraged everyone to set themselves up with an RSS reader in order to better keep up to date with the latest news from their favorite sites.

But there's one little problem: while most sites nowadays offer RSS feeds, not everyone does. So does that mean you're out of luck when faced with this situation? No!

Check out Page2RSS.

Simply input the URL of the page/site you want to follow, and whenever the content is updated you'll be notified in your RSS reader. It basically creates an RSS feed where there isn't one already.
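Under the hood, a service like this presumably works by periodically fetching the page, fingerprinting its content, and emitting a new feed item whenever the fingerprint changes. A minimal sketch of that idea (my own guess at the mechanism, not Page2RSS's actual implementation):

```python
import hashlib

def content_fingerprint(html: str) -> str:
    """Hash the page content so we can cheaply detect changes between polls."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def check_for_update(last_fingerprint, html):
    """One polling pass: return (new_fingerprint, changed_flag)."""
    fp = content_fingerprint(html)
    return fp, fp != last_fingerprint

# First poll: everything is "new"; later polls only fire on real changes.
fp, changed = check_for_update(None, "<html>version one</html>")
assert changed
fp, changed = check_for_update(fp, "<html>version one</html>")
assert not changed
fp, changed = check_for_update(fp, "<html>version two</html>")
assert changed
```

Each time `changed` comes back true, the service would append a fresh item to the feed it generates for that page.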

A tool like this won't change the world, and ultimately every site will have an RSS feed to subscribe to, but until then, if you end up getting into RSS as I have, this can be a useful tool, especially if the sites you frequent have great information but aren't all that up-to-date with the latest web delivery technologies.

What I often find to be one of the biggest cool factors of the Internet is its ability to introduce new ways to interact with and visualize information. There are just so many things that can only be done online, it's mind-boggling. Here are some more examples of my continuing exploration into new interfaces and online visualizations.

Top of Mt. Everest
I've linked to this website before, but wanted to do so again as I found this 360-degree panorama--which is their specialty--to be particularly striking. It basically shows what your view would look like if you scaled Mt. Everest. The image runs across the entire browser window and features pretty impressive clarity. For the highest impact, I recommend clicking on the fullscreen button in the lower righthand corner. It expands the view to take up your entire screen. This is the first time I've seen this done and it definitely ranks high on the cool factor.

Never Been - A Graphical Story
Here's something that's neat on multiple levels. It's a story told with illustrations instead of words. What makes it unique is its interface. To move forward you don't turn to a new page; instead you grab the image and drag it around to unveil the rest of the story. To add interest, it's not a straight line but instead takes turns. In total the original art was nine and a half meters long. Online, though, it can all fit inside your browser.

Typographical Madness
Have to admit, I'm not sure how functional or practical anything on this site is, but I still found it interesting to peruse this series of experiments in interactive text. In particular I liked the Description and Good News, Bad News sections best. One warning: if you're prone to motion sickness, this might not be the most pleasant experience.

What Is RSS? The Way To Stay Up-To-Date


Yesterday was RSS Awareness Day, a day on which bloggers across the world were asked to talk about and promote the value of RSS.

But what is RSS?

To start with, the acronym is most often cited as standing for Really Simple Syndication.

What that means is websites are able to syndicate their content through RSS feeds.

In practice, a website makes an RSS feed available, and then you and I subscribe to it using an RSS reader, which comes in all shapes and sizes. I like Google Reader because it's hosted, so I can access it from anywhere, but there are also desktop readers that offer additional functionality.

To subscribe to a feed, simply click on the orange icon with a white dot and two white semi-circles. You can see what it looks like in the upper righthand corner of this site. The icon also often appears in the URL address bar when available. Click on it, copy the link information, and input it into your RSS reader.
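Under the hood, that link you copy points at a plain XML file listing the site's latest items. As a rough illustration--the feed below is a made-up sample, not any real site's--here's how a reader might pull the title and link out of each item using nothing but Python's standard library:

```python
import xml.etree.ElementTree as ET

# A made-up RSS 2.0 feed, inlined for illustration; a real reader would
# download this XML from the URL behind the orange feed icon.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Site</title>
    <link>http://example.com/</link>
    <item>
      <title>A brand new post</title>
      <link>http://example.com/posts/1</link>
    </item>
    <item>
      <title>An older post</title>
      <link>http://example.com/posts/2</link>
    </item>
  </channel>
</rss>"""

def list_items(rss_xml):
    """Return a (title, link) pair for every item in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in list_items(SAMPLE_FEED):
    print(title, "->", link)
```

The reader's whole job is built on top of something this simple: fetch the XML, walk the items, and show you the ones you haven't seen.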

OK, this is all well and good, but what does RSS do?

RSS enables websites to offer a feed through which they can push out, or syndicate, new content when it becomes available.

For users, RSS means being able to compile feeds from all your favorite sites into an RSS reader. The reader automatically accepts fresh content pushed to it from these sites. So now instead of having to navigate to each site individually every day, you can see what's new on all of them from a single interface.
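That refresh cycle can be sketched roughly like this--`fetch_items` here is a hypothetical stand-in for downloading and parsing a feed, and the URLs are invented for the example:

```python
# Sketch of what an RSS reader does on each refresh: poll every
# subscribed feed and surface only items it hasn't shown before.

def fresh_items(subscriptions, seen, fetch_items):
    """Collect (feed, title, link) for items whose links aren't in `seen`."""
    new = []
    for feed_url in subscriptions:
        for title, link in fetch_items(feed_url):
            if link not in seen:  # an item counts as new if its link is unseen
                seen.add(link)
                new.append((feed_url, title, link))
    return new

# A stand-in fetcher with canned results, in place of real HTTP requests.
FEEDS = {
    "http://example.com/feed-a": [("Post 1", "http://example.com/a/1")],
    "http://example.com/feed-b": [("Post 2", "http://example.com/b/2")],
}

seen = set()
print(fresh_items(list(FEEDS), seen, FEEDS.get))  # first check: both items new
print(fresh_items(list(FEEDS), seen, FEEDS.get))  # second check: nothing new
```

The second call comes back empty because everything has already been seen, which is exactly why a reader only ever shows you what's fresh.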

There really is no better way to keep track of the dozens of sites you visit on a regular basis, though the downside is if you go a week or two without checking it, you can end up with a mountain of articles to go through. But without RSS I would've likely missed those articles completely.

Another aspect to RSS is its use for media delivery.

Podcasts are often delivered via RSS. You subscribe to the feed of a podcaster, and then whenever they post a new audio or video file, it'll be automatically pushed out to your RSS reader or media player, like iTunes.
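The media files ride along in the feed as `enclosure` entries, each pointing at the actual audio or video URL. A rough sketch of how a podcast client might pick those out, using an invented sample feed:

```python
import xml.etree.ElementTree as ET

# An invented podcast feed; each item carries an <enclosure> element
# that points at the media file a podcast client should download.
PODCAST_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Podcast</title>
    <item>
      <title>Episode 1</title>
      <enclosure url="http://example.com/ep1.mp3"
                 length="12345678" type="audio/mpeg"/>
    </item>
  </channel>
</rss>"""

def enclosures(rss_xml):
    """Return the media URL and MIME type from each item's enclosure."""
    root = ET.fromstring(rss_xml)
    return [(enc.get("url"), enc.get("type"))
            for enc in root.iter("enclosure")]

print(enclosures(PODCAST_FEED))
```

A client like iTunes watches the feed for new enclosures and downloads them automatically, which is all "subscribing to a podcast" really means.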

The fundamental premise is to provide an avenue through which content owners can push content out to their loyal audiences, rather than forcing them to navigate to a site and initiate a download on their own.

RSS also helps empower consumers: if a site abuses its RSS feed, sending the equivalent of spam for example, you can simply unsubscribe from that feed and it can no longer send you content.

RSS is a technology still in its infancy as the vast majority of Internet users have not yet adopted its use, but it introduces a new paradigm in content delivery that should play a significant role in the future of the Internet.

But for now I encourage everyone to find an RSS reader and start subscribing to feeds. If you're someone who likes to stay on top of a lot of sites, you'll soon wonder how you ever survived without it.

There tends to be a perception about the Internet that all the cool new applications are being developed by young people, even pimply-faced teens, pounding out code in their garages, often building fortunes rather than going to college.

Well, a new study by the Ewing Marion Kauffman Foundation has called that myth into question.

In their research, they found the majority of startup founders are middle-aged and college-educated. In fact, twice as many people in their 50s headed up startups as people in their early 20s.

To be honest, though, I didn't find these results surprising in the slightest.

From my own experience, I can say that the majority of application companies I encounter were founded by adults old enough to be my parents.

And that makes sense. They tend to have more experience running businesses, they've got better connections for raising money or money of their own to invest, and they've often developed greater expertise in a particular marketplace over their careers, potentially making them better prepared to develop an application that caters to its needs.

Now, this isn't to say that older people are more successful at startups, though I would say they tend to be more prepared to grow a startup into a real business. Not to rag on my fellow under-30 set, but in companies headed up by younger people, especially once they start finding some success, I often get a sense of barely controlled chaos. They're often learning on the fly what it means to run a business with dozens of employees, as opposed to older founders who may already have that experience.

At the same time, I've also noticed that older people have a greater tendency to create companies and let them linger, even if they have not yet and may never find any great level of success. It's a little sad to say, but in some cases it's because they're older that there's a greater sense that there may not be many more opportunities coming down the pipeline, which on the plus side motivates them to work harder, but on the negative side means they may be less willing to acknowledge their ship is sinking.

Younger people, on the other hand, seem much less worried about failure, which sometimes means being a bit more sloppy, but overall means they're constantly thinking about and open to new ideas, so often the greatest innovation comes from their fertile minds.

Ultimately the best situations tend to be when you're able to combine the experience, urgency, and doggedness of older people with the energy, creativity, and free-flowing nature of younger people.

Of course, these thoughts are all based to some degree on caricatures. There are certainly energetic, creative older people, as well as experienced, focused younger folks.

But it's still interesting to acknowledge that there is a vast swath of people working in that space in between Google and the garage, and it's my belief that within this group of people the most innovative and impactful use of broadband will emerge.

About this Archive

This page is an archive of entries from May 2008 listed from newest to oldest.

April 2008 is the previous archive.

June 2008 is the next archive.

Find recent content on the main index or look in the archives to find all content.