
ABOUT

App-Rising.com covers the development and adoption of broadband applications, the deployment of and need for broadband networks, and the demands placed on policy to adapt to the revolutionary opportunities made possible by the Internet.

App-Rising.com is written by Geoff Daily, a DC-based technology journalist, broadband activist, marketing consultant, and Internet entrepreneur.

App-Rising.com is supported in part by AT&T. However, all views and opinions expressed herein are solely my own.


July 2007 Archives

July 3, 2007 2:21 PM

Building a Richer Web

Read an interesting column this morning by Robert Scoble that appears in Fast Company Magazine about the race to build a richer web, and wanted to take a moment to boil this down even further.

By saying “richer web” what I’m referring to is the trend towards flashier websites that increasingly feature animations, video, interactive elements, and other rich internet applications.

To date, the vast majority of that flash has been powered by… well… Flash, the ubiquitous multimedia enabler owned by Adobe.

But that may be about to change: in April Microsoft launched the beta version of Silverlight, and in May Sun Microsystems announced JavaFX. Without getting into technical specifics, the gist of these two products is that they will compete with Adobe Flash in the development of a richer Internet experience.

It’s an exciting, evolving space that is driving the push to make the Web truly interactive and much more engaging (or distracting, depending on your point of view).

From a bandwidth perspective, most uses of Flash to date haven’t been all that taxing on the network. Animation files tend to be small, and the interactive elements rely more on your computer’s CPU than its network connection.

But the boundaries of what a richer Web entails are expanding rapidly.

Flash now powers collaborative workspaces and Web apps that mimic the functionality of desktop apps like word processors. Real-time collaboration with other people or interaction with an application located remotely on a server demands a low latency connection so the experience doesn’t stutter or lag.

Flash is also increasingly enabling the delivery of video, primarily on-demand but with new live capabilities on the horizon. More video means a greater need for more bandwidth.

And the richer Web, already bitten by the video bug, will only become more video-centric with the introduction of technologies like Microsoft’s Silverlight, which has been designed with the delivery of high quality video in mind.

In fact, to encourage the adoption of Silverlight, Microsoft is giving away up to 4GB of free hosting for video. With that storage space, anyone can upload DVD-quality clips of 10 minutes or less and then incorporate that video into a website or Silverlight application. While Silverlight’s in beta, users get unlimited streaming, and even once the product launches, Microsoft has stated its intention to continue offering up to a million minutes of streaming video for free to Silverlight users.
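Out of curiosity, here’s what that quota works out to, as a quick back-of-envelope sketch in Python. Note that the roughly 2Mbps figure for DVD-quality web video is my own assumption, not a number from Microsoft:

```python
# How many 10-minute "DVD-quality" clips fit in Silverlight's 4GB of
# free hosting? The ~2 Mbps encoding rate is an assumed figure for
# DVD-quality web video, not a number from Microsoft.
BITRATE_MBPS = 2.0     # assumed encoding bitrate
CLIP_MINUTES = 10      # Microsoft's stated clip-length cap
HOSTING_MB = 4000      # the 4GB quota, in decimal megabytes

clip_mb = BITRATE_MBPS * CLIP_MINUTES * 60 / 8  # megabits/sec * sec / 8 = MB
print(f"One clip: ~{clip_mb:.0f} MB")                  # ~150 MB
print(f"Clips that fit: ~{HOSTING_MB / clip_mb:.0f}")  # ~27 clips
```

So the free quota holds roughly two dozen maximum-length clips, plenty for a hobbyist but a real constraint for anyone building a video-heavy site.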

The richer the Web gets, the more bandwidth-intensive its demands become on the networks that deliver its traffic. And with the introduction of new competing technologies that help enable a richer Web, the rate of innovation in this space is only likely to accelerate.

July 5, 2007 11:59 AM

Video Thursdays: Innovative Integration in Fort Wayne

Digging back into materials from the Killer App Expo, wanted to share another video from the Technology Evangelist guys, this one an interview with Todd Plesko, CEO of Fort Wayne-based triPRACTIX.

The interview is pretty techy, so for context: triPRACTIX’s primary business is providing IT services, hardware, and applications to healthcare facilities, which can include EMR, data hosting, VoIP, etc.

They’re resellers of various products from companies like GE, Cisco, Microsoft, IBM, and Hewlett-Packard.

But they also develop some of their own applications in house, as evidenced by their Extension product discussed in the video above.

Now, I have to admit, this interview dives a bit deeper into tech speak than I feel comfortable with, but I still wanted to share it as it highlights a pair of interesting observations.

One, it’s great to see more companies creating solutions that make different technologies work together. One of the biggest things holding Internet applications back is the lack of interoperability: whether they’re EMRs or videocalling technologies, most applications developed independently don’t talk to each other, even when they provide analogous functionality.

Making independent technologies work together is going to be a key driver of the usability and efficiency of broadband applications, and I think we’ll continue to see more companies like triPRACTIX stepping up to try and fill that void with innovative solutions like Extension.

Two, when you think of Fort Wayne, IN you don’t exactly conjure up images of a vibrant hotbed of technology, yet here’s a prime example of a cutting edge broadband company that’s thriving in a smaller city.

I’ve been fortunate in my writings to converse with a number of entrepreneurs who have set up shop all the way from Winthrop, WA to Greer, SC. One of the greatest things about a broadband-powered Internet is that innovation can happen anywhere there’s sufficient connectivity, whether that’s Silicon Valley, Fort Wayne, or anywhere in between.

July 6, 2007 12:06 PM

Friday Fun: Broadband Powers Art

TGIF everybody! As we head into the weekend I wanted to take a moment to share some lighter fare with you, something fun to explore and get you ready for a weekend that's hopefully carefree and fun-filled.

Today I'm pointing you to an interesting implementation of Flash that combines thousands of pictures to create an interactive mosaic, which serves the purpose of promoting the first in a series of films called A Moment On Earth.

It takes a minute to load all the images, but once it's up and running you're able to scroll your cursor over the mosaic with what amounts to a magnifying glass. Click on any image that catches your fancy and it'll expand so you can get a better look. Click on the "caption" button and you're able to read a short blurb about the circumstances under which the photo was taken.

The images you're perusing were captured from all over the world, so in one sitting you're able to circumnavigate the globe.

This may not be the coolest app in the world, and it's not overly bandwidth intensive, but I do think it highlights how, through the broadband-powered Internet, we're seeing constant innovation in the creation of media and new forms of communication.

You couldn't make this in book form or put it on TV without it becoming a radically different experience. Only online could something like this be created. And really only through broadband is it feasible, due to the size of the file.

July 9, 2007 11:52 AM

The Global Hype of Live Earth's Audience

As the temperature outside began to soar, I decided to spend much of Saturday indoors, which gave me ample opportunity to enjoy some of the daylong global concert event that was Live Earth.

While the hours were filled with top-notch performances coordinated across all seven continents (yes, even Antarctica), I found myself nursing a growing annoyance at the oft-repeated claim that these concerts were reaching an audience of two billion people.

Let’s put that into perspective for a minute. The Super Bowl only reaches a potential global TV audience of a billion people, with an actual estimated viewership of closer to 200 million. The final match of the FIFA World Cup was watched by 715 million people. And to step back even further, 2 billion people represents nearly a third of the world’s population.

Now, I’m willing to concede that when you add up all of the TV coverage of these concerts around the world, the overall potential reach may hit upwards of 2 billion households. But there’s really no way for anyone to accurately say how many people were actually tuning in, as even the best guesses, like Nielsen television ratings, are rough approximations that only account for the US audience.

But since Live Earth was also simulcast online by MSN, perhaps by considering the audience that tuned in over the Internet we can better understand the reach of an event like Live Earth.

According to a press release put out by MSN on Saturday, “The over 10 million streams MSN has delivered so far today represents a milestone in live Internet broadcasting.”

Delivering 10 million streams of live video is a very significant achievement, but as streaming industry guru Dan Rayburn pointed out in his Business of Video blog on StreamingMedia.com: “The title of the [press] release says ‘Live Earth Global Concerts Reach More Than 10 Million Online at MSN…’ More than 10 million what, people? Apparently not.”

His point is that serving 10 million videos does not mean you reached 10 million people as the same person can request a stream multiple times. For example, say someone starts watching at home on their desktop and then continues watching on their laptop in a coffee shop later on in the day. There’s also the potential that they’re counting an additional stream served every time someone clicks to watch the London concert vs. the one in Tokyo.

And since MSN chose not to release the actual peak number of simultaneous viewers, other than to say that they had “the most simultaneous viewers of any online concert ever,” it’s nigh impossible to ferret out just how much reach this event produced.

What frustrates me is that after hearing over and over again how these concerts were reaching 2 billion people, I can’t help but feel underwhelmed upon learning that the online webcast of the event only served 10 million videos, especially since the Internet was the only place where you had any control over what you were watching.

And that’s really sad, as by many accounts watching Live Earth online was a great success, both in the quality of the viewing experience it enabled and in the fact that, no matter how many individual people it reached, serving more than 10 million live video streams in a single day is a tremendous accomplishment.

At the same time, this event again highlights the challenge of getting a large-scale audience to show up for a live event online: even if 10 million individual people had shown up, that still represents only half a percent of that over-hyped total potential audience of 2 billion.
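For anyone who wants to check my math, the share works out like this (and remember, streams overcount people, so the true share is likely even lower):

```python
# Streams served vs. the claimed audience. Streams are not unique
# viewers, so the real share of people is likely lower still.
claimed_audience = 2_000_000_000  # the oft-repeated "2 billion" claim
streams_served = 10_000_000       # streams MSN reported delivering

print(f"{streams_served / claimed_audience:.1%} of the claimed audience")
# -> 0.5%
```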

July 10, 2007 12:52 PM

Northern Ireland Promotes Itself as Center for Digital Content

Yesterday a press release crossed my desk that caught my eye, namely the launch of Crea8ivity.

Crea8ivity is an awareness campaign aimed at raising the profile of Northern Ireland as a center for the creation of digital content.

As part of a larger broadband content initiative, the Department of Enterprise, Trade, and Investment will be building “interactive communities at the leading edge of digital content,” gathering international market intelligence, developing a buyer database, and participating in “key overseas trade events and exhibitions in Europe, North America, and the Pacific Rim,” according to this press release.

Another aspect to this Crea8ivity campaign was a contest where more than 80 companies in Northern Ireland submitted proposals to be selected as exemplar projects that “demonstrate the creative and technical talent available in Northern Ireland.”

Out of that group, six projects were chosen. They include online training in music production, watercolors, and continuing professional development for medical and pharmaceutical professionals, as well as an online interactive aerial combat game, interactive services for golfers on the golf course, and “the world’s first totally interactive 24/7 comedy show.”

These six projects will receive £1.4 million in total, with up to £250,000 going to any one project.

While the press release and their website are somewhat short on details, on the surface this campaign seems like a great example of how government can step in and proactively foster the development of the digital economy.

This isn’t about tax breaks or streamlining cumbersome regulation, this is purely about promoting and supporting the efforts of creative entrepreneurs and small business owners, helping elevate their businesses to better compete on a global stage.

And as the world catches up to, and in some cases surpasses, America’s adoption of the Internet, it will become increasingly important for our broadband businesses to be able to gain traction for their content and/or applications in international markets. So I think it’s vital that we take note of the fact that the creative energy being put into developing new applications and content is one of our country’s most valuable resources and do whatever we can to support it.

I’ll be keeping a close eye on Crea8ivity’s development over the coming months to ensure that my readers are kept up to date on its happenings. I’ll also be contacting these guys to learn more about this endeavor with the goal of trying to glean knowledge from their experiences that perhaps can be applied to helping continue to foster the digital economy here in America.

In the meantime, here’s a short promotional video about Crea8ivity. Enjoy!

July 11, 2007 1:46 PM

Is Going In-Network the Future of the Internet?

Wanted to take a moment today to ponder my most recent KillerApp.com article, which details how video sharing/editing provider HomeMovie.com is evolving their business by moving in-network with their application.

Without rehashing too much of that article, the benefit of taking an application out of the cloud that is the Internet and bringing it into a network is simple: a higher level of quality of service.

In the case of an application like HomeMovie.com—which, as its name suggests, involves sharing home movies—that means instead of a 300-500Kbps stream, users can share video that’s up to 2.5Mbps, essentially the bitrate you experience when you pop a tape into your VCR.
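To make those bitrates concrete, here’s a rough sketch of what they mean in file-size terms. The five-minute clip length and the 0.4Mbps web-stream figure are illustrative assumptions on my part:

```python
# Approximate size of a 5-minute home movie at a typical web rate vs.
# the 2.5 Mbps in-network rate cited above. Both the clip length and
# the 0.4 Mbps web figure are illustrative assumptions.
MINUTES = 5
for label, mbps in [("Typical web stream (0.4 Mbps)", 0.4),
                    ("In-network stream (2.5 Mbps)", 2.5)]:
    megabytes = mbps * MINUTES * 60 / 8  # Mb/s * seconds / 8 = MB
    print(f"{label}: ~{megabytes:.0f} MB")  # ~15 MB vs. ~94 MB
```

Roughly six times the data for the same clip, which is exactly why the quality of service guarantees of an in-network deployment matter.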

Despite the Internet’s ability to destroy all barriers of distance, the further traffic has to flow over the Internet, the more likely it is to hit areas of congestion that slow it down and hinder performance. Simply put, the closer you are to the server you want to access, the better the application or piece of content will run.

That’s the underlying premise behind some content delivery networks, most prominently Akamai. What they’ve done through their EdgePlatform is partner with ISPs to drop more than 25,000 servers in more than 1,000 networks across the US and in 69 different countries, all with the goal of shortening the distance between server and end user. (As an aside, this is in contrast to most CDNs, which rely on a handful of large, centralized server farms.)

But Akamai is still primarily interested in serving general Web content to mass audiences. What I’m more excited about are opportunities like UTOPIA, as I discussed a few weeks ago in this AppRising post, where there’s space ready and waiting for applications to come in-network and start taking advantage of the huge capacity of a full fiber community.

In talking with network operators both large and small, I know they’re seeing tremendous opportunities to help enable this next generation of the Internet by bringing applications in-network, where they can leverage their ability to manage the network to deliver better quality of service, lower bandwidth costs, and new revenue streams from innovative Internet applications.

I truly believe that by encouraging and tracking this nascent trend of applications moving in-network, we will bear witness to the Internet finally living up to its potential, no longer restrained by the vagaries of insufficient bandwidth, particularly in fiber communities where the Internet’s bandwidth limitations essentially disappear.

July 12, 2007 10:18 AM

Crea8ivity Not Alone; and Video Interview with HomeMovie.com

Working on getting myself out the door for a weekend away with the wife to visit Lafayette, LA, so no major deep thoughts today. Instead, I wanted to add on a couple of items to posts I made earlier this week.

Two days ago I wrote about Crea8ivity, a campaign by the government of Northern Ireland to promote their country as a hub for digital content creation.

What I didn't realize is that at the same time I was posting that, an event was being held in San Francisco by KIICA Silicon Valley.

KIICA Silicon Valley is one of 8 "global IT Korea promotion organizations founded by the Ministry of Information and Communication of Republic of Korea. Our primary goal is to deliver value added localization services to Korea's emerging information technology companies seeking presence in US markets," according to their website.

These services include everything from legal education to investor relations to public relations to helping provide office and warehousing space.

They've been around since 2000 and seem to have found a fair amount of success, as they can boast more than 100 companies that have gained assistance from KIICA Silicon Valley in establishing a sustainable market presence in the US.

I'm going to continue learning more about these specific initiatives while trying to unearth other similar ones around the world as I think they hold great insight into how we can continue supporting our digital economy.

In the meantime, if anyone knows of any analogous programs in other countries or here in the US, please contact me at [email protected] so we can discuss further.

------------------------------------

One last thing: I just discovered that I missed an opportunity to include some video with yesterday's post about HomeMovie.com, namely an interview with their president, Steve Smallman.

It's another fantastic video by the Tech Evangelist crew, and it helps highlight some of the points I made in yesterday's post. Enjoy!

July 18, 2007 4:25 PM

Back from Lafayette and Pondering Municipal Broadband

After a long weekend in Lafayette, LA, I’m back in the saddle and ready to share stories from my first immersion in this Cajun community that’s on the verge of deploying a fiber-to-the-home (FTTH) network.

But before going too far into details about who I met and what I learned during my visit, I want to lay out my position on municipal FTTH initiatives like Lafayette and UTOPIA.

To frame this discussion, let’s start with a basic premise: the Internet is powered by fiber optics.

True, broadband is often defined by slower copper-based last-mile access networks like DSL and cable, but when you send data over the Internet, the vast majority of the miles it traverses are spent riding on fiber optic cable.

Because of this, I don’t see FTTH as something new and different so much as an effort to extend the full power of the Internet to your front door.

But I look at this truism in an even broader sense, acknowledging that while FTTH is the logical endgame it’s also worth supporting any and all efforts to push fiber further into the network, no matter if that fiber’s running all the way to the home, to the node, or even just in a loop around a community.

Simply put, the more fiber we have the better prepared we will be to support the oncoming exaflood.

Because of this belief, I’m a supporter of anyone who’s interested and able to lay that fiber closer and closer to homes and businesses, be they private, public, or somewhere in between.

Unfortunately, not everyone who’s interested is able, and many of those who are able aren’t interested.

Deploying fiber, and in particular FTTH, is an expensive, complicated, time-consuming, and therefore somewhat risky process.

Many believe it’s something better left for the private sector lest local government entities become overwhelmed by the technological and operational hurdles of deploying and running a fiber network.

And there’s no denying the huge risk associated with this process, as no one wants to see a community cripple its financial well-being in pursuit of a project bigger than it can handle.

But that doesn’t mean municipal broadband isn’t feasible as another means of enabling the deployment of fiber, especially when a community has the proper elements in place to make it happen, as seems to be the case in Lafayette.

The first of these elements is the Lafayette Utilities System (or LUS). LUS was created 100 years ago under circumstances remarkably similar to today’s: new networks (then for transmitting electricity) were being laid in the region but not to their city, so instead of waiting for private entities to deem their city worthy of investment, the community created LUS and built out its own electrical grid. As a result, Lafayette has been able to establish itself as a dynamic, progressive counterweight to Baton Rouge and New Orleans.

More recently, LUS has gained experience laying and operating fiber through an effort in the mid-90s to deploy a fiber ring around the city. So it’s not as if they’re walking into this blind to the many challenges before them.

The second important element is an engaged community, in this case one that doesn’t like to be told by outsiders what it does and does not need. This past Monday marked the second anniversary of a public vote that garnered more than 60% approval for the deployment of a FTTH network to every home in the city’s metro core.

Even with these two elements in place, though, Lafayette’s deployment of FTTH is still a risk, as evidenced by the $110 million in funds LUS recently had to obtain through bond sales to support this buildout. That’s a big number and there will be a lot of pressure on them to get the fiber in the ground quickly and efficiently in order to make good on their initial business model.

Will Lafayette be successful with its fiber build? It’s far too early to tell as they’re still 18 months away from serving their first customer. But can they succeed and should we continue to encourage all entities that are willing to take the risk to invest in next-gen advanced fiber networks? I say yes.

Now with that out of the way, you can look forward to posts tomorrow and Friday that highlight some of the innovation and applications I discovered during my travels in and around Lafayette.

So stay tuned!

July 20, 2007 11:17 AM

On the Prowl for Apps in Lafayette

While in Lafayette, my host, Abigail Ransonet, founder of Abacus Data Exchange, arranged for me to meet with a series of local entrepreneurs and leaders in her community.

I chatted with a lot of different people and learned about a number of different things related to the current and potential utilization of broadband in Lafayette and beyond. Here’s a rundown of some of the people I met and things I learned:

- On Tuesday during breakfast I met with Ray Abshire, a local entrepreneur who runs Magnolia Torque & Testing, a company that provides services to the oil industry, and who has begun to invest in the area’s digital economy.

He also happens to be an accomplished accordion player, noted for his efforts to keep the spirit of traditional Cajun music alive. While chatting over some sweet potato banana pecan pancakes he introduced me to a site he uses called SonicBids.com.

What this site enables is for musicians and music promoters to come together and exchange information online rather than through the mail. Artists can create electronic press kits complete with bio, audio, video, photos, and more, and promoters can search through them. Over 70,000 artists have signed up, along with more than 6,000 festivals, music conferences, and clubs from over 100 countries.

- My first meeting Monday morning was with Joe Abraham. Joe began his career as a medical doctor, but he found himself growing disillusioned with the fact that while he could cure patients’ bodies he couldn’t change their attitudes.

He had a strong desire to contribute to the world in a greater way, which led him to create a non-profit online bookstore called BooksXYZ. Their mission is to provide a way for consumers to not only find and buy great books, but then be able to direct a percentage of their purchases to the school of their choice.

They’ve made the effort to build a database of every secondary and post-secondary educational institution in the country. They offer more than 2 million books for sale. And to date they’ve been able to pass along more than half a million dollars to schools.

It’s a pretty straightforward site right now, but they’ve got a number of interesting community features being worked on in the background. And I love supporting projects like this where the focus is on the greater good rather than personal profit. So if you’re looking to buy a book online, go check out BooksXYZ.com first and see if it might be able to provide you with not only a great new book but also a greater sense of personal satisfaction.

- Lunch Monday was a highly tasty shrimp po’boy shared with Casey Deshotels, CEO of Teamwork Ad Group, a local multimedia ad agency. During this convo I learned about his interest in developing applications based on Flash Media Server that can replace traditional desktop applications. He did mention one in particular that they’d done, though I’m rather embarrassed to admit that I can’t remember the specifics as my attention was too diverted by devouring that fantastic po’boy. (Casey, if you read this, please drop me a line with the info I’m missing!)

But I do remember one other very interesting element of his plans for building his business. He’s nearing completion on the renovation of a 5,000 sq. ft. space he owns, equipping it as a sort of utopia for creative designers of all sorts. He plans to bring in creative types who may currently be running their businesses from home and have them locate in his space, in turn providing him with a dynamic cadre of talent to leverage for his ad business.

- Monday evening I had the opportunity to address a group of 15 people from the community who were interested in educating themselves further on how their community can be making the most of this new fiber network they’ll be laying. (To whet your appetite, next week I’ll be posting some video excerpts from my remarks.)

At this get-together, I had the opportunity to meet Logan McDaniel, CIO of the Lafayette Parish School District. He shared with me an interesting online service he’d encountered a year or so back that enables schools to receive multiple versions of the same article geared towards different levels of reading comprehension. These articles can include national news as well as local. Unfortunately, the name of this service escaped him and I have not yet been able to find it on Google, but it is compelling nonetheless.

- Earlier on Monday I sat down with Howard Chaney, IT Manager for Copy & Camera Technologies, a local reseller of Canon products (scanners/printers/copiers/etc.). When you think of scanners you may not think broadband, but we briefly discussed the possibilities of scanners that send data as they scan a document, rather than scanning, saving, and then sending. Doing this can be fairly bandwidth intensive, as it requires sending data as fast as it’s scanned. (I’ve sketched some rough numbers at the end of this rundown.)

There aren’t necessarily a lot of uses for this yet, and it may not end up being incredibly significant, but there may be some interesting possibilities for this technology to supplant the fax machine as the most trusted way of sending documents as it takes out of the equation the possibility of someone scanning, saving, manipulating and then sending.
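To put some rough numbers on that scan-as-you-send idea, here’s a quick sketch. The page size, resolution, compression ratio, and scan rate are all assumptions of mine:

```python
# Upstream bandwidth for scan-as-you-send, assuming a letter-size page
# at 300 dpi in 24-bit color, ~10:1 image compression, and one page
# scanned every 10 seconds. All figures are assumptions.
WIDTH_IN, HEIGHT_IN, DPI = 8.5, 11, 300
BYTES_PER_PIXEL = 3      # 24-bit color
COMPRESSION = 10         # assumed compression ratio
SECONDS_PER_PAGE = 10    # assumed scan speed

raw_bytes = (WIDTH_IN * DPI) * (HEIGHT_IN * DPI) * BYTES_PER_PIXEL
mbps = raw_bytes * 8 / COMPRESSION / SECONDS_PER_PAGE / 1e6
print(f"Raw page: ~{raw_bytes / 1e6:.0f} MB")   # ~25 MB
print(f"Sustained upstream: ~{mbps:.1f} Mbps")  # ~2 Mbps
```

Around 2Mbps of sustained upstream doesn’t sound like much until you remember how few of today’s connections offer that kind of upload speed.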

July 23, 2007 12:15 PM

Immersing Myself in Lafayette in 3D

One of the highlights of my trip to Lafayette was a visit to LITE, aka the Louisiana Immersive Technologies Enterprise.

LITE “is a state-of-the-art resource center encompassing the world’s first six-sided, digital virtual reality cube as well as the world’s largest digital 3-D auditorium. This leading-edge visualization environment is powered by one of the most powerful graphics supercomputers in the world,” according to the Lafayette Economic Development Authority’s website.

This 70,000 square foot, $27 million project opened in the fall and is unique not only for its facilities but also because they’re available for anyone to rent. The “cube” in particular is a technology that until now has existed only behind the walls of private corporations and research institutions.

During our tour, I tried my best to capture video of the experience, though the combination of 3D not translating well into 2D and suspect camera work on my part leaves something to be desired in the final product. In any event, here’s a 10-minute video that shows you what I saw during my tour:

The video starts in their large auditorium, featuring a 37-foot screen that can display up to three different sources simultaneously. It’s also 3D-enabled, so throw on a pair of stylish 3D glasses and the images pop off the screen.

The demo of a model for virtually tracking the growth of fires in a building is more proof-of-concept than finished product, but it was still compelling to see how an entire room full of people could share the same experience in 3D.

Next up we went into their Flex space, which is an open room that includes a large screen, control wand, and tracking system. Notice as the person with the wand moves around how the perspective of what’s on screen changes. Being able to draw something and then walk through it was fascinating. And it was so easy to use, my host’s 9-year-old granddaughter, Amelie, was able to pick it up and start interacting right away.

From here we moved on to the pièce de résistance, affectionately known as “the Egg”. There’s a brief clip in the video that shows how they built out the housing of the Egg into an eye-catching outgrowth of the building. Even though the space inside the six-sided “cave” isn’t incredibly large, it needs a large housing in order to accommodate the powerful projectors that display images on each side of the room.

My clips from inside the Egg don’t really do it justice as there was a definite sense of being in a vast, open space while moving around the chapel. I loved watching Amelie get excited over her sense that we should be falling while floating around the space.

These spaces and visualization technologies really felt futuristic, almost space age, though I have to admit a conflicting sense of being underwhelmed, driven in large part by expectations set up by years of dreaming about the holodeck on Star Trek, which we’re still a long way from achieving.

But we learned about a number of practical uses of spaces like these, including being able to walk through a shopping center before it’s built or scouting a deep sea dive in order to create a better dive plan. John Deere has also used similar technology in the past by creating a model of a tractor design that was having trouble keeping the cab at a consistent temperature; they were able to visualize the airflows therein and determine where the vortexes were that were causing this imbalance.

Near the end of the tour I asked about their use of broadband in association with these technologies. While there was another room (for which I didn’t grab video) that enables HD videoconferencing, they admitted that they have not yet had any great examples of leveraging the fiber optic connectivity available in the building to enable new forms of networked immersive environments.

But when you consider the amount of processing power and storage installed in their data center, and the amount of information necessary to power 3D images and environments, one can see how sometime down the road, technology like this could begin driving huge demands for bandwidth, especially as we get into virtual worlds where people can interact online and in 3D.

July 24, 2007 10:52 AM

Tracking Broadband Adoption in Lafayette and Beyond

Towards the end of my meetings last Monday in Lafayette I had the great fortune to sit down to chat with Andre Comeaux. Andre works in the area as a VP for Regions Insurance and, more pertinent to the matter at hand, he serves on the Greater Lafayette Parish Chamber of Commerce.

What we spent most of our time talking about was his efforts to champion a study of Lafayette’s current adoption and use of broadband.

In his mind, we can’t know where we’re going and/or how far we’ve gone without knowing where we came from, and in order to understand that we need to have a fuller understanding of how, and if, the Internet is being used today.

I think he’s spot on in his focus on this area, especially in a community like Lafayette that stands on the verge of making a major investment in its fiber infrastructure. I say this not only because such a study may help justify the cost of the fiber down the road, but also because of Andre’s savvy belief that if they can chart where they are today and then compare that to where they end up tomorrow, they’ll have hard data that can be used to spur government officials into action, either by championing the successes that have been realized or by stepping up to more fully support underachieving areas.

Andre’s not alone in understanding the need to get more information about how people are using the Internet today.

A couple of weeks ago RVA Market Research released the results of a survey it conducted of 100,000 people who live in fiber communities. You can download the PowerPoint slides of their results here.

While most of the questions in this survey dealt with issues like the speed, cost, and satisfaction of fiber vs. other last-mile technologies, one slide in particular caught my attention: lucky number 13 in the presentation.

I found this graph highly eye-opening. Looking at the numbers, the application with the highest adoption rate is downloading full-length movies, followed by gaming and video-on-demand services (I’m assuming these refer to everything from YouTube to TV shows).

The fact that these three were the highest isn’t surprising; what is surprising are the rates at which they’re being adopted. Only full-length movie downloads currently top 20%. Remember, that isn’t 20% of all Americans, or even 20% of everyone who’s online. Out of the people who have fiber optic connectivity, only a fifth of them are downloading movies. Less than a fifth are gaming or using video-on-demand services.

Looking further down the list, the numbers become even more startling. Personal videoconferencing at 6%; home security at a mere 2.5%.

Then you get to the bottom and discover less than 1% adoption of distance education and telemedicine, though with the caveat that the survey appears to have narrowed these topics down specifically to videoconferencing rather than leaving it open to the full scope of applications that help enable distance education or telemedicine.

(Also worth noting on a more positive side, though, is the fact that this survey found that FTTH communities tended to have more people working from home more days than prior to the deployment of fiber.)

There’s endless talk about the need for new advanced broadband networks with more capacity in order to support all the wonderful things the Internet can do, yet with numbers like these it makes me think we’re not doing enough to identify how people are actually using the Internet today.

Case in point, take a look at the Broadband Data Improvement Act (you can download a PDF of its contents here), which aims to increase the granularity of our understanding of how much broadband deployment we actually have in this country. Despite this bill’s many positive elements, and the promise of millions of dollars in federal funding on the way to fund this research, not one word in this bill is devoted to ferreting out more info about how people actually use the Internet.

That’s astonishing to me. Time after time when people talk broadband they’re only discussing the supply side of the equation. It’s high time we start paying closer attention to what’s driving demand for bandwidth and how we can create more of it in order for us to more fully realize the true potential of the Internet.

As I continue my own hunt for how broadband is having a positive impact on people’s lives, I’m headed out to a brown bag luncheon featuring the Alliance for Public Technology’s Broadband Changed My Life series, with this session focused on benefits for older and disabled people.

More on my experiences at this event tomorrow!

July 25, 2007 4:48 PM

Overcoming Disabilities Through Broadband

Had the great fortune yesterday of attending a brown bag luncheon focused on the topic of how broadband is benefiting older adults and people with disabilities as part of the Alliance for Public Technology’s ongoing Broadband Changed My Life Series.

Wanted to share a rundown with you all about the many things I learned, starting with the remarks made by Jenifer Simpson, senior director, telecommunications and technology policy for the American Association of People with Disabilities.

She began by pointing out that just because someone is disabled doesn’t mean all they do online is health related, rattling off a string of mainstream applications that have proven popular among the disabled, including online gaming, eBay, YouTube, and Facebook.

She then suggested that the simple reality of broadband being faster than dialup is having a profound effect on how the Internet is embraced by people with disabilities, as it allows them to surf the web and check their email more quickly and therefore not become frustrated waiting for pages to load.

The anonymity of the Internet is often maligned as a potential cover for ne’er-do-wells, but according to Jenifer it also levels the playing field, as it means people don’t have to reveal they have a disability when communicating with others online.

At the same time, she highlighted the fact that despite the many benefits of broadband, most people with disabilities are still not connected, citing a small growth between 2002 and 2006 from 26% to 30%, as compared to those without disabilities growing from 57% to 62%.

A key limiting factor she sees in this equation is the fact that adaptive hardware and software, which help make computers more usable for people with disabilities, are still expensive and not readily available, and it can be difficult to find qualified help to set them up.

(Later on in the discussion, a blind gentleman spoke to this point, stating that purchasing a screen reader for his computer can cost as much as the computer itself, with the same paradigm holding true for equivalent technology on mobile phones.)

Also worth mentioning here is a new organization called COAT (Coalition of Organizations for Accessible Technology), which unites more than 90 groups all focused on the widespread availability of IP-based technologies so that people with disabilities are not left behind. I’ll be sure to follow up on their efforts at a later date.

Jenifer then moved on to a series of areas where broadband is having a profound impact on the lives of people with disabilities.

She recounted an anecdote of a hospital that wired itself for broadband and made the effort to ensure patients had access, and as a result they found patients starting to communicate with other patients to a greater degree and becoming more proactive in learning about their own journey through the healthcare process.

She then cited the serious issue of jobs among the disabled, a group saddled with an unemployment rate of 53%. While she couldn’t necessarily point to a great number of people for whom this is true today, she acknowledged the tremendous potential for the disabled to establish work-from-home businesses because of broadband. Additionally, she mentioned a Goodwill program that trained people with disabilities to use the Internet to research and apply for jobs, an increasingly important skillset given the growing number of companies moving their job application procedures online.

She referred to a project called Having a Voice, where mental health patients were able to connect with counselors over the Internet, though I’m still working on trying to find more specific information about this. (A link to the program she may have been referring to can be found here.)

In general terms she mentioned the increasing use of video of people communicating in sign language and of audio streaming as ways in which the Internet is becoming more usable for deaf and blind people, respectively.

Getting into more specifics, she cited the use of video relay services, through which the deaf can conduct more real-time conversations by routing the call through a sign language interpreter. In many instances, this technology is replacing the use of TTY boxes, which leverage transcribers to turn voice into text, but which also tend to create long pauses in conversations.

Technology like this is something that’s very reliant on having a fast broadband pipe, both to minimize the time it takes to route messages through the interpreter, and because for the deaf person to understand the interpreter, the video has to keep up with their hand motions.

Video remote interpreting is a kissing cousin to video relay services, except its purpose is to provide access, anywhere with broadband, to an on-the-spot interpreter who can facilitate communication in instances where someone who’s deaf is trying to communicate with someone else in person.

While the deaf community has found this technology to be very useful, most still prefer having interpreters on-site, plus these services can be prohibitively expensive. As a result, their primary use to date has been in either rural or emergency situations.

All in all, it was a fascinating overview of how broadband is being used by the disabled community.

July 26, 2007 11:58 AM

Attempting to Open Debate Over Federal Broadband Legislation

I’ve been tuning in to the fascinating, though somewhat disappointing, experiment in social legislation going on over at OpenLeft.com.

On Tuesday, US Senator Dick Durbin sat in for an hour responding to comments and suggestions on ways the federal government can enact legislation that promotes the deployment of broadband, competition between services, and fairness in the treatment of Internet traffic.

Half of his responses were pretty generic politix, but you could also tell he was really trying to engage with the debate, and there were a fair number of quality posts made by the public.

Unfortunately, technological limitations hampered their ability to conduct a true debate. For example, you had to refresh your browser to get new posts (though there were helpful red “new” tags next to fresh content).

Last night, Ben Scott, Marc Rotenberg, and a couple of Durbin staffers manned the helm in a lively “debate” about net neutrality.

I put debate in quotation marks here as I’ve always believed that real debates demand a more vigorous effort by both sides of an issue to put forth new thoughts that support their cases rather than endlessly rattling the same old thematic sabers.

The Internet needs to be free and open. Someone has to pay to support new network buildout. The big telcos are evil. The Internet guys don’t understand the potential unintended consequences of preemptive legislation.

Yet in many ways this lack of a true debate is indicative of a larger problem in our political system: how the increasing polarization of politics has led us to a mentality of us vs. them, black vs. white, if I’m right then you must be wrong.

I couldn’t be more excited about the potential for what OpenLeft is starting, leveraging the Internet’s inherent democratization to open more direct lines of communication between the public and Congress.

But as we explore these new opportunities at what they have termed “Legislation 2.0”, all sides of a debate must recognize the need to continue driving the conversation forward with new ideas rather than just shouting back and forth at each other without listening to and seriously considering the other side’s point of view.

Next week I’ll be diving into a number of policy issues as they relate to broadband in the hopes of accomplishing just this feat: establishing positions that don’t adhere to one side or the other of an issue but instead attempt to mine new ground in that space in between, which is where I believe the best legislative answers will ultimately be found.

July 27, 2007 12:24 PM

The Tower of Babel for Telepresence?

Back in the first week of June, a new conference called Telepresence World held court, focusing, as its name suggests, on the emerging technology known as telepresence.

So what is “telepresence”? Generally speaking, it refers to videoconferencing technologies that make users feel as though the person they’re talking to were sitting right in front of them.

This space is evolving rapidly with a number of different, though similar, takes on this same concept from companies like Cisco, TANDBERG, Polycom, and Teliris.

What sparked me to dive into the topic of telepresence today was an interview I found on TechnologyEvangelist.com with Teliris’s CEO and CTO, Mark Trachtenberg, conducted by Howard Lichtman, founder of the Human Productivity Lab, one of telepresence’s biggest cheerleaders and a presenter at the Killer App Expo back in May.

Unfortunately, the interview appears to cut off prematurely, but even still there’s something quite interesting to pull out of this. Namely, the announcement Mark and Howard discuss regarding Teliris’s recent launch of technology that enables telepresence systems from different vendors to talk to each other.

While Mark stresses how this product is aimed at customers of Teliris’s own telepresence solutions rather than people who only own competing technologies, this announcement still holds a large amount of potential significance.

Telepresence solutions pretty much always run into the tens of thousands, and often hundreds of thousands, of dollars. Many big corporations have deemed the investment as worth it, installing not one but multiple instances of this technology throughout their distributed enterprise workforce.

But the usability of these products has been severely limited by the utter lack of interoperability. One corporation may spend millions with Cisco and another millions with TANDBERG, but in the end they can’t talk to each other, only within their own companies.

I’d imagine that if Teliris can live up to the promise of this new paradigm in the interoperability of telepresence technologies, it will not only make their products more enticing to potential customers but also ultimately force everyone else in the industry to take building interoperability into their products more seriously.

And as the interoperability paradigm takes hold, it will undoubtedly make the technology more enticing to more companies, increasing both the reach and impact of this cutting-edge and highly bandwidth-intensive application.

July 27, 2007 3:45 PM

Seeing the World in a New Light Through Online Map-Making

Found two separate but related articles I wanted to share with you all this afternoon.

First up, a great New York Times article about how web applications are reshaping the world of cartography, aka making maps.

It highlights a handful of the tools that are available for both finding and creating maps, while also delving into a discussion of how the old paradigm of maps not being easily accessible and taking too much effort to generate, especially for niche interests, is being upended online.

Serendipitously, today I also came across this listing on Mashable.com of more than 50 tools and resources for online maps.

In this list you’ll find an incredible array of things to do with online maps, including:

- Build custom Google maps to post on your website

- View and/or add to a user-editable map of the world

- Check out a map that reflects current traffic conditions in your area

- Plot out a custom running course

- Analyze a map showing the paths of tornadoes

And so much more.

One particularly interesting area of growth from a bandwidth perspective is the increasing number of ways to incorporate multimedia elements into a map, like video from a particular location or point in time. Like this site.

At first blush, maps may not seem to demand all that much bandwidth, but that’s just not the case anymore as they're gaining new functionality each and every day.

What I also started to think about while reading through this list and the New York Times article was a conversation I had during my trip to Lafayette, LA when I sat down with Keith Thibodeaux, the parish’s CIO.

During our talk, his GIS—or geographic information systems, which are essentially maps, though much more detailed and dynamic—guy came in to chat. He began discussing how the government is using GIS to do things like track government vehicles in real time and gain a better understanding of how the different pieces of the city fit together.

While most of the work they’re doing with GIS today involves running the application locally, only pulling in little bits of data from outside sources like the wireless sensors they’re installing on government vehicles, he shared with me that one of the primary files they use, which includes a high-res image of the parish layered with things like where underground cables are located, is 60GB.

That’s a huge file, and one that will only grow as additional information and higher resolution images are added. It’s also a file that at some point in the not too distant future will likely need to be pushed around over the network to remote offices and on-site personnel.
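To give a sense of scale, here’s how long pushing that 60GB file would take at a few illustrative line rates. The speeds are example figures of mine, and the math ignores protocol overhead:

```python
# Transfer time for a 60GB file at a few illustrative line rates,
# ignoring protocol overhead. The speeds are example figures.
FILE_GB = 60
for label, mbps in [("T1 (1.5 Mbps)", 1.5),
                    ("Cable/DSL (10 Mbps)", 10),
                    ("Fast fiber (100 Mbps)", 100),
                    ("Gigabit fiber (1,000 Mbps)", 1000)]:
    hours = FILE_GB * 8000 / mbps / 3600  # GB -> megabits -> seconds -> hours
    print(f"{label}: ~{hours:.1f} hours")
# -> roughly 89, 13.3, 1.3, and 0.1 hours respectively
```

In other words, moving that file is a multi-day affair on a T1 but under 10 minutes on gigabit fiber.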

So to make a long story short, mapmaking is undergoing a technological revolution, and as it continues to do so its demand for bandwidth is going nowhere but up, up, up.

July 30, 2007 12:58 PM

Down with Wireless, Long Live Wireless

There’s a big vote coming up at the FCC tomorrow regarding the upcoming auction of wireless spectrum.

You won’t find me writing much about wireless as I’m much more of a wireline kinda guy, but I wanted to share one important belief I have regarding this space.

One of my greatest frustrations in the debate around broadband deployment is the idea held by some that wireless is the answer to all of a community’s broadband needs. That some day everything will be wireless and we won’t have need for fiber.

That’s simply not the case, and it all comes down to one word: capacity.

The next big thing in wireless coming down the pipeline is WiMAX. Unlike Wi-Fi, which enables wireless broadband over small areas, like a coffee shop, WiMAX promises multi-megabit speeds over miles of coverage.

Let’s analyze this for a moment, though. WiMAX promises to enable up to 70Mbps per antenna at launch. But how many people will that antenna be serving, especially in urban areas?

Wireless has been touted as the answer to the speedy deployment of broadband in rural areas, but like DSL, its speed drops the further a user is from the nearest tower, with coverage topping out around 30 miles.

In either scenario, what happens when 100 people are trying to stream high quality video using the same antenna? Even if we’re only talking about video encoded at 1Mbps, the numbers just don’t add up.
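Here’s that arithmetic spelled out, using the best-case 70Mbps figure and the 1Mbps video example from above:

```python
# 100 users streaming 1 Mbps video against WiMAX's promised 70 Mbps
# best-case shared capacity per antenna.
ANTENNA_MBPS = 70
USERS = 100
STREAM_MBPS = 1.0

demand = USERS * STREAM_MBPS
print(f"Demand: {demand:.0f} Mbps vs. capacity: {ANTENNA_MBPS} Mbps")
print(f"Fair share per user: ~{ANTENNA_MBPS / USERS:.2f} Mbps")
# -> 0.70 Mbps each, well short of the 1 Mbps each stream needs
```

And remember, 70Mbps is the best case; real-world throughput drops with distance and interference, so the shortfall would likely be even worse.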

This isn’t to say we shouldn’t be deploying wireless broadband, though. There’s no denying that it’s faster and cheaper to deploy than wireline, and it enables anywhere, anytime access to the Internet, which will be vital for encouraging the continuing development of Internet-enabled handheld devices.

In the near-term wireless broadband could serve as a viable competitive force relative to DSL or cable, and down the road the opening up of spectrum caused by the transition from analog to digital TV broadcasts will enable new levels of broadband service.

But in the end, only the capacity of wireline, and in particular fiber, technologies will be sufficient for a world where HD video is being pushed around the Internet in all directions.

So fiber vs. wireless is not an either/or equation. In my mind, the answer is quite simply yes to both.

As a final thought, wireless is not some magical technology that does away with the need for wireline because ultimately those wireless antennas need to connect to the Internet somehow, and more often than not that will be through fiber optics.

July 31, 2007 1:28 PM

Microsoft To Get Connected; Hosted Applications Take Off

Last week, Microsoft CEO Steve Ballmer discussed the software giant’s plans for adopting a “software plus services” model over the coming years, which will push Microsoft’s traditional desktop software into the realm of hosted applications.

Generally speaking, hosted applications run on servers in the network or on the Internet instead of locally on a user’s computer. In this model, the majority of the computing work is done remotely, with the user’s computer acting as a terminal that collects input and displays images on the screen.

Microsoft’s stated plans don’t fully embrace this concept as they’d prefer to retain the importance of the desktop and running applications locally, but these remarks prove that they can no longer ignore the increasingly competitive hosted alternatives that can run through a standard web browser.

Salesforce.com is a leader in this space with their hosted CRM solutions. What I find most interesting about their solution is their AppExchange, which has created a virtual marketplace for applications that add onto their core functionality.

Google is another major force in this space as it continues to build out its hosted alternatives to Microsoft stalwarts like its Office suite.

There are also a host of exciting newcomers, including Iceberg on Demand, a startup currently in private beta that promises to enable anyone, regardless of their knowledge of coding, to create custom hosted business applications.

Hosted applications offer a number of advantages: they reduce the need for super-fast computers to run applications; they eliminate the need to continually update software locally; and they often provide anytime, anywhere access to applications, unleashing them from being tied to individual computers.

At the same time, they tend to rely entirely on broadband networks in order to work. Hosted apps can run great, but only as long as their Internet connection doesn’t go down.

While for the most part they’re not overly bandwidth intensive, they are very sensitive to lag. The idea of hosted applications is to make them feel as though they’re running locally, but that demands very low latency; otherwise you run into issues like your cursor failing to keep up with your mouse as you try to navigate an application.
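As a rough illustration of that latency budget, consider the sketch below. The round-trip times and the 100ms threshold are assumed examples, not measurements:

```python
# Every interaction with a hosted app pays at least one network round
# trip. RTT figures and the 100 ms threshold are rough assumptions.
FEELS_LOCAL_MS = 100  # rough point where a UI starts to feel laggy

for label, rtt_ms in [("In-network server", 15),
                      ("Cross-country server", 80),
                      ("Congested long-haul path", 250)]:
    verdict = "feels local" if rtt_ms <= FEELS_LOCAL_MS else "noticeably lags"
    print(f"{label}: {rtt_ms} ms round trip -> {verdict}")
```

Notice that raw bandwidth never enters into it; it’s the round trips that make or break the experience.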

Alongside all this growth is a push towards enabling hosted applications to also run locally, which can ensure they continue to work regardless of the available connectivity. The biggest player in this push is Adobe with their AIR platform (until recently known as Apollo; AIR stands for Adobe Integrated Runtime), which will enable Flash applications to be built so they can run locally while offline.

In some circles, there’s a belief that one day in the not too distant future we’ll be in a world where all applications are hosted and run over the network. I’m not willing to go that far, but it is a fascinating space that’s driving a tremendous amount of innovation and that will likely increase demand for reliable, low latency broadband connections both in the near and long term.