September 2007 Archives

Web Apps Hit Mainstream...Sort of


Just found this interesting report released yesterday by Rubicon Consulting about the adoption and use of web applications.

Not necessarily all that much new in these numbers: people are using apps like email and games more than they ever have, but they're not using other apps all that much.

The biggest reason they cite for this is the perception of a lack of security associated with these web applications, as well as a sense that there's no great advantage using an online document creator vs. a desktop application like Microsoft Office.

Interestingly, though, Rubicon alludes to the fact that the barriers to entry for web apps aren't incredibly high. In other words, it's easy for users to adopt technologies quickly if they see a reason to.

For the most part, they just haven't found a reason to yet. The primary breakdown seems to be that most web apps aren't solving practical, real-world problems, so there's no compelling reason for users to adopt them, apart from early adopters who'll start using technology simply because it's new.

While some of this certainly has to do with consumer adoption and education, I think we don't focus enough attention on the responsibility of application developers.

It never ceases to amaze me how many people are out there trying to create applications that are essentially iterations of existing apps. I come across way more apps that do analogous things to other apps than apps that attempt to do something new.

I think application developers need to stop fighting for bigger slices of the same-size pie and instead realize the opportunities that exist for creating applications that increase the size of the overall pie.

Only by doing this will we get more people engaged on the Internet, and in turn will we drive the Digital Economy to the heights it's capable of achieving.

More Cries for National Broadband Policy/Strategy


Apparently two FCC commissioners appeared at a US Senate hearing yesterday, making strong arguments for the implementation of a "national broadband strategy," according to this Ars Technica article.

Have to admit, I'm finding myself growing tired of the continual cries for a national broadband strategy.

You know why? Because I've heard a heckuva lot about how important it is and how badly we need it, yet very little about what it might entail and how we might accomplish the goals many have said we should be shooting for.

Universal broadband access is a big one on this front for me.

So everyone should have access to broadband...of course.

So we need to get broadband access to everyone...no doubt.

But how do we actually accomplish that?

How can we incentivize private companies to invest in less economically attractive rural areas? I hope there are better answers than the RUS fund, which gives government handouts to those companies that can navigate the paperwork and regulatory hurdles to qualify.

How can we get private companies to invest in a full-fiber infrastructure? Let's cut the crap and acknowledge that full fiber is the logical end game for Internet access. If you want evidence of this, just notice that every major wireline broadband provider touts its fiber optic network, no matter how close that network actually runs to consumers' houses. And only through full fiber will we be able to not only reach the goal of a 100Mbps Nation but surpass it.

How do we balance public vs. private deployment? Some call for publicly owned infrastructure, but I've spoken with many local governments who have no interest in entering the telecom game. And forcing mandatory buildout requirements on fiber deployers can be a troublesome proposition. Plus it's not like we can make the private operators just give up their networks and go away.

Yes there's a problem. Yes we need more investment in infrastructure. Yes we need a national broadband strategy. But now we need to be figuring out what that actually means and identifying ways to accomplish these goals.

We should also understand that unless the federal government's willing to step up and invest hundreds of billions of dollars in upgrading the country's broadband infrastructure, we'll need to rely on private interests to some degree to get where we need to go.

But for now, let's give ourselves somewhere to start:

- We need to know where broadband is deployed so we can identify underserved areas.

- We need to redefine "broadband," as 200Kbps is not sufficient to support the vast majority of cool new applications the Internet enables.

- We need to protect consumers by helping them better understand what they're buying when they sign up for broadband and what they can do with it once subscribed.

- We need to support all deployers of fiber, whether private or public, making sure we don't put artificial barriers that slow investment in front of either.

What else should we add to this list? And what thoughts do people have on how we should pursue accomplishing these goals?

I'll be sharing a series of thoughts on these and related matters next week, along with my experiences writing from the FTTH Conference.

It should be an exciting week!

Watch Out Internet: Halo's Here


Thousands of people standing in line at midnight across the country. A million pre-orders waiting to be shipped. A cultural phenomenon has launched in the gaming world: the release of Halo 3.

For my non-gaming readers, Halo is an incredibly popular gaming franchise for the Xbox and Xbox 360 that follows the adventures of a futuristic soldier named Master Chief as he tries to save the world from evil aliens. The gameplay is that of a first-person shooter, where players run around, grabbing guns and shooting stuff. (My apologies to any Halo fanatics for the lack of nuance in this description.)

Halo has really made a name for itself, though, not for the single player story mode but the intense multi-player battles, where players from around the world fight with each other over the Internet.

I wanted to bring this story up for a pair of reasons:

1 - Friday night I was flipping through channels and paused on G4TV, a TV channel focused heavily on gaming news, as they were having a discussion about the pending release of Halo. At the end of a conversation about just how big an event Halo's launch would be, the host made an offhand remark that when everyone tries logging on to play their new copy of Halo this week, there's no way the Internet will be able to support it. He even went so far as to predict the Internet crashing under the weight of thousands of gamers trying to get online to play.

I wrote last week about one researcher's belief that warning cries about the exaflood were overblown, but to have someone on national TV talking about the Internet crashing from too much use suggests that maybe the specter of the exaflood is not a boogeyman but a real-world problem.

At the same time, we mustn't be imprecise in how we describe things: in the case of Halo, if players aren't able to log on, it will likely have little to do with the larger Internet and mostly stem from overloaded gaming servers.

I think this point gets muddled too often. The servers on which things like games run are most certainly a key part of the Internet, but to refer to them as the Internet obscures the real issue: a lack of capacity on the part of whoever is offering the online service.

This is a matter of needing better planning and/or management of incoming users rather than an inherent lack of capacity on the Internet.

2 - If ever there was evidence of the Internet's ability to attract mass audiences quickly, it's the release of a game like Halo. By the time the Christmas shopping season is over, there may be more than 10 million copies of Halo in use in the US. And with online multiplayer a core element of the game, that means potentially 10 million people wanting to head onto the Internet to do the same thing, likely at similar times.

Now, games like Halo aren't typically all that bandwidth intensive, as the only data being pushed over the network are bits and bytes describing where things are positioned in the game, rather than video of the game being played.
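To put rough numbers on that, here's a minimal sketch, in Python, of how small a per-player position update can be. The message format here is purely illustrative, an assumption for the example, not Halo's actual network protocol:

    import struct

    # Hypothetical position update: a player id plus x/y/z coordinates and a
    # heading, packed into a fixed-size binary message. Illustrative only;
    # this is not Halo's actual wire format.
    UPDATE_FORMAT = "<I4f"  # 4-byte id + four 32-bit floats = 20 bytes

    def pack_update(player_id, x, y, z, heading):
        return struct.pack(UPDATE_FORMAT, player_id, x, y, z, heading)

    update = pack_update(42, 103.5, 87.2, 12.0, 1.57)
    print(len(update), "bytes per update")  # 20 bytes

    # Even at 30 updates per second for a 16-player match, that's under 10KB/s:
    print(20 * 30 * 16 / 1024, "KB/s")  # ~9.4 KB/s

Streaming actual video of the same match would consume many times more bandwidth, which is exactly why multiplayer games are light on the pipes but heavy on the servers.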

And in some ways this 10 million number is somewhat less consequential as most of those users are already online for other reasons, so it's not like this game is expanding the pie of broadband users all that much.

Even still, it's a significant happening that highlights the reality of the exaflood, the need for better understanding of how the Internet works, and the fact that a singular event like the launch of Halo can drive millions of people to use the Internet for the same purpose.

Where's My Coffee? Virtually Speaking...


I was on a plane ride back home last week chatting with the gentleman next to me about the book he was reading, "The 4-Hour Workweek" by Timothy Ferriss.

He raved about the information found therein, which covers a range of strategies and resources for enabling people to work more effectively virtually, and I'm looking forward to picking up a copy myself.

One thing in particular my seatmate mentioned was the suggestion to use a virtual assistant, which essentially means outsourcing your secretarial/personal assistant work to someone somewhere over the Internet.

I thought it was an interesting idea, and one that I could personally make good use of, but I shelved it, thinking I'd have to wait until I read the book to know how to pursue it for my own life as a freelancer.

Then, in one of those wonderful moments of serendipity, a colleague passed along this ABC News story to me, which details the benefits of entering the workforce as a virtual assistant from a woman's perspective.

Needless to say, virtual assistants are a prime example of the Internet's ability to create new opportunities for people that break through the limitations imposed by physical distance.

I'm not yet sure if I'm ready to take the plunge and hire my own virtual assistant, but rest assured that when (if) I do, I'll share my experiences here on AppRising.

How Patent Reform Can Help the Internet


One of the hottest debates in DC over the past few months has been the contentious issue of patent reform. Last week the latest salvo in this battle was launched as a group of 20 noted inventors and US company executives took to the Hill to plead their case against the current legislative frontrunner, a bill that recently passed through the House and is up for debate in the Senate.

Instead of going into detail about the nuances of this proposed legislation, and to avoid falling too far down the rabbit hole of explaining how the patent system works (or doesn’t work, depending on who you ask), I wanted to chime in with what I see as the two areas most in need of reform from the perspective of enabling the continued growth of the online industry.

The first is to find a way to make the patent office more effective and efficient.

There’s little doubt the current patent system is broken. I know of at least one broadband application developer who has been waiting for his patent application to work its way through the system for more than 5 years, and he’s still got hundreds of thousands of applications left ahead of him.

The Internet is such a fast-paced environment that these lags can be devastating: by the time patents are granted, the technologies they describe and the terminology they use may be either outdated or already surpassed, making the considerable investment needed to file for a patent a complete loss.

Much has been made of the oncoming exaflood of Internet traffic, which threatens to overwhelm broadband networks, but at least one pundit is wondering what all the fuss is about.

Andrew Odlyzko, director of the University of Minnesota's Digital Technology Center, has been studying data from 100 public data traffic hubs around the world. His research shows that while Internet traffic may have been doubling every year in the Internet's early days, recently it has only been growing about 50% per year.
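The gap between doubling and 50% annual growth compounds dramatically, which is the crux of the disagreement. A quick sketch of the arithmetic:

    # Cumulative traffic growth: doubling every year vs. growing 50% per year.
    for years in (1, 3, 5, 10):
        print(f"after {years:2d} years: doubling = {2 ** years:4d}x, "
              f"50%/yr = {1.5 ** years:6.1f}x")

After five years that's 32x total growth versus roughly 7.6x, so which rate you believe makes an enormous difference to how much network capacity needs to be built.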

In this Star Tribune article he suggests that the warning cries of an overwhelmed web are overblown. (If you want to get into the specific numbers, check out his results here.)

This belief leads him to suggest that network operators' efforts to explore new business models for delivering Internet traffic are not preparation for overwhelming demand but merely an attempt to exert more control over the Internet.

While his research is both interesting and valuable, I have a few issues with his conclusions.

First off, Odlyzko's claim that the lack of overwhelming demand equates to there being no need for managing bandwidth allocation--essentially an argument in favor of net neutrality--highlights a pervasive misunderstanding of how this issue, and government regulation in general, tends to work: passing net neutrality means infringing on the freedom of network operators to run their businesses, just as not passing it holds the threat of doing the same to Internet entrepreneurs.

As with most polarized debates around government policy, it's my belief that the ultimate answer isn't picking between an either/or situation but finding the best solution for encouraging the freedom and growth of both sides.

Secondly, while I don't dispute his numbers, I do take issue with his claim that since Internet traffic hasn't been doubling, there is no longer any problem with the exaflood. In fact, he even contradicts this thought in the same article: "If everybody rushed to download video from the Internet, it couldn't be done," he said.

I had thought we were all shooting for the same goal of eventually moving all forms of data transmission onto the Internet. If that's the case, then how can we not have a capacity issue when even today we can't support the delivery of video-on-demand to everyone, let alone all the other broadband applications that demand bandwidth, not to mention the millions of people who have not yet discovered the Internet but hopefully will in the near future?

All this being said, I do heartily agree with two other assertions he makes.

More News on Comcast's High Bandwidth Policies


Just found this Consumerist post about Comcast's policies towards its heaviest users.

They claim to have talked with a former Comcast employee who says Comcast's invisible bandwidth limit is actually 200GB, rather than the 100GB I postulated earlier this week.

Also, this limit is only applicable to certain parts of Comcast's network, which are often aging and therefore have less capacity.

(As an aside, in an article I read earlier this week there was a comment by someone identifying themselves as a Comcast employee, who said the reason Comcast doesn't give a hard number for its cap is that there isn't one. Instead, what triggers Comcast to cut off a heavy user is that particular user consuming more than 10% of the capacity of the node they're connected to. Though like this other post, it's all speculation at this point.)

What I found even more interesting in this Consumerist post is the suggestion that Comcast does have a mechanism ready to deploy that would prompt heavy users with the option to purchase more capacity if they exceed this invisible limit.

According to this unnamed source, what's holding them back is hesitation over being the first to implement this policy and potentially angering their customers.

The thing I don't get, though: how would it be worse to offer heavy users a reasonable way to pay more to continue their service than to cut them off with little to no warning?

Grid Computing Leverages Broadband to Fight Disease


Came across an interesting story on grid computing this morning.

To step back for a second, grid computing is a network of computers that combine their processing power to form virtual supercomputers. Rather than have one super-powerful computer trying to crunch huge datasets, grid computing splits the datasets into small chunks and sends them over the network to less powerful computers, leveraging their idle processing power. By doing this, grid computing can aggregate more processing power than the most powerful of supercomputers.
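As a rough sketch of the pattern, in Python, here's the coordinator's job boiled down to chunking and dispatching. A local process pool stands in for the networked volunteer machines; real grids also handle scheduling, result validation, and node failures:

    from concurrent.futures import ProcessPoolExecutor

    def analyze(chunk):
        # Stand-in for a real work unit, e.g. scoring candidate molecules.
        return sum(x * x for x in chunk)

    def split(dataset, chunk_size):
        # Carve the big dataset into small chunks for individual workers.
        for i in range(0, len(dataset), chunk_size):
            yield dataset[i:i + chunk_size]

    if __name__ == "__main__":
        dataset = list(range(1_000_000))
        with ProcessPoolExecutor() as workers:
            partial_results = workers.map(analyze, split(dataset, 100_000))
        print("combined result:", sum(partial_results))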

The story that caught my eye was an Ars Technica article about the World Community Grid's latest project: a search for molecules that block the reproduction of flaviviruses, a family of viruses that includes West Nile, yellow fever, and hepatitis C.

I hadn't encountered the World Community Grid before, but am glad to have done so now.

Their goal is to create the "world's largest public computing grid to tackle projects that benefit humanity."

Sponsored by IBM, the World Community Grid only provides its services to public and not-for-profit entities in need of massive computing power.

The Grid is currently being used for projects like developing more accurate climate models of specific regions of Africa, finding a cure for muscular dystrophy, and designing more effective drugs for the treatment of AIDS.

I find this whole space to be utterly fascinating. By using nothing more than average computers--even game consoles like the PS3 have contributed to grid computing efforts--we're able to analyze massive amounts of data in order to realize positive change for society.

And what makes this all possible? Broadband.

While I think these systems may work over lesser connections, as the datasets often aren't overly large, the important part of broadband for grid computing is its always-on connection, which allows computers on the grid to keep pulling new data and pushing their results back out 24 hours a day.

Grid computing is a prime example of an application that relies heavily on readily available connectivity in order to do much more than allow you to watch Britney Spears' latest performance.

Considering the rancor associated with most any telecom policy debate over the last year, it’s refreshing when an issue comes up that bridges those divides to garner support from players in multiple corners of the online arena.

The issue in question here is the Internet Tax Moratorium.

Instituted in ’98, it’s been renewed twice over the years and is set to expire yet again November 1st.

This post by Larry Irving on the Internet Innovation Alliance blog sums up nicely the history of the moratorium while stating an eloquent case for extending it permanently.

This post on the Google public policy blog details what the moratorium currently does.

Support for extending the moratorium cuts across some interesting lines, as evidenced by the supporter list of a group called Don’t Tax Our Web, which has come together to push this issue.

For example, Verizon, Google, Amazon, and AT&T are all listed as supporters.

Considering the debate we’ve seen between companies like these on other telecom-related matters which shall remain nameless, it’s a relief to find an issue on which they can find common ground.

More Applications Get Hosted


Came across two articles describing further moves by big players into the hosted application space.

Today, Google launched Google Presentations, an online alternative to PowerPoint. You can use it to create, organize, collaborate on, and deliver presentations, and more.

Last week, details got out about Adobe’s free, ad-supported online version of Photoshop—Photoshop Express—which will launch later this year, giving consumers another option for editing photos online.

Both companies have been making aggressive moves into the hosted application environment.

Earlier this year, Google launched Google Apps for Business, which includes document creation, email, calendaring, and more.

Also earlier this year, Adobe launched Premiere Express, an online video editing tool.

Microsoft has lagged behind all this, only recently launching Windows Live, which allows for online blogging and a couple of other things.

No one’s yet claiming that these hosted alternatives can match the feature set of a desktop application suite like Office, but they’re getting closer and closer. Plus, many of these hosted applications offer functionality not found on a desktop, in particular as it relates to collaboration and having anytime/anywhere access to your files.

With all the different people moving into this space, we’ve reached the point where it’s becoming feasible to transition over to an online-only world where the only desktop application you’re running is your browser.

And as this transition continues, what that undoubtedly means is more people spending more time online, relying more heavily on broadband to enable them to do business.

In fact, the trend of hosted applications puts an even higher premium on reliable broadband, as without Internet access most of these applications simply can’t run.

In the latest news on Comcast’s moves to cut off service to its heaviest users, they’ve made an official, though somewhat opaque, statement about their policy towards “excessive use.”

In this article, a spokesperson for Comcast defines “excessive use” as a customer who downloads the equivalent of 30,000 songs, 250,000 pictures, or 13 million emails in a month.

I found it kind of odd how they phrased this. Rather than give a specific number, they allude to stats that seem to suggest that anyone who’s downloading more than 30,000 songs must be abusing the network. I mean, why would anyone ever need to download that much data?

But let’s break this down a little further.

The average size of an MP3 file is around 3-3.5MB. So according to the back of my napkin, that puts Comcast’s bandwidth cap at around 100GB per month. (30,000 songs times 3-3.5MB equals about 100,000MB, or 100GB.)

That does sound like a lot of data, but is it really?

Not when you look at it from the perspective of the next big thing: HD movies.

HD movies today are delivered on discs that utilize next-gen blue lasers to carry 15-50GB of data, as opposed to red laser DVDs that can only handle 5-10GB.

An average full-length HD movie delivered on these new discs is 20-40GB.

So by my estimation, a 100GB bandwidth cap equates to roughly 3-5 HD movies if they were delivered over the network, which isn’t yet happening en masse but also isn’t that far off in the future.
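For anyone who wants to check my napkin, here's the arithmetic, using the figures above:

    # Back-of-napkin math behind the numbers above.
    songs = 30_000
    cap_gb = [songs * mb / 1000 for mb in (3.0, 3.5)]  # MP3s at 3-3.5MB each
    print("implied cap: %.0f-%.0f GB/month" % tuple(cap_gb))  # 90-105 GB

    movies = [100 / gb for gb in (40, 20)]  # HD movies at 20-40GB each
    print("HD movies per 100GB: %.1f-%.1f" % tuple(movies))  # 2.5-5.0

So 100GB is a reasonable round number for the cap, and somewhere between two and five HD movies would exhaust it.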

I wrote a couple of weeks ago about Home and Community Options receiving a LightSpeed grant from the Blandin Foundation, which will help support the development of a remote monitoring broadband application for use in its residences for people with disabilities.

I’ve now come across a post on Blandin’s blog that includes an interview with Bill Coleman about this program.

I think he sums up the intent of this LightSpeed program nicely:

“Some advocates romanticize instantaneous adoption of advanced technologies throughout the community. In fact, once connectivity is in place, other deployment challenges rise to the top, like specialized equipment, software, and end-user training.

The LightSpeed program provides funding to overcome these challenges and encourages the adoption of new broadband-intensive applications, especially in the education and healthcare areas.”

Additional recipients of these LightSpeed grants are:

- Lakewood Hospital in Staples, which will apply the grant to increasing its use of video to monitor home care and hospice patients in their homes

- Windom Schools, which will be building a video-ready classroom to facilitate video production classes as well as new video-based home learning opportunities like a video homework hotline

- Little Crow Telemedia Network, a distance learning cooperative of K-12 school districts, which will be implementing a system for delivering on-demand video of content that’s currently only available during live broadcasts.

Programs like Blandin’s LightSpeed grants are key for fostering the next generation of the Internet, where its promise is not just talked about but actually realized in real-world scenarios having a real impact on people’s lives.

If anyone knows of other similar programs that are getting started or have already found success, write a comment to this post and share your knowledge! I sometimes think we don't give enough credit for the work being done on projects like this.

And that does everyone a disservice: the people who might have applied but didn't know about it, as well as others who have the capacity to set similar projects in motion, if only they knew how much good was being done through broadband.

Connecting the Nation Starts in Kentucky


Had a colleague pass along a nice article on the ConnectKentucky project earlier today.

ConnectKentucky’s been getting a lot of press recently for their efforts to map broadband in their state. One of their primary goals is to make affordable broadband accessible to all.

At the Broadband Properties Summit I had the great fortune to listen to and chat with Joe Mefford, the program’s statewide broadband director, and I came away very impressed.

What I found most significant in what they’re doing is how they’ve taken something that’s historically been seen as radioactive to network operators—divulging where their networks reach and customers are—and turned it into something that has inspired these same network operators to invest more money in infrastructure across the state (over $600 million).

ConnectKentucky accomplished this because their mapping process unearthed numerous underserved areas, many of which the incumbents seemingly never knew existed. And now that these areas have been identified, there’s a rush to serve them as the incumbents try to expand their customer bases.

Also noteworthy is the fact that mapping broadband, while the most widely touted aspect of what they’re doing, is actually only a small sliver of the overall ConnectKentucky effort.

They’ve set and achieved the goal of equipping counties with websites, believing this to be a crucial step in engaging citizens with broadband and creating stronger bonds of awareness within communities.

They’ve set up teams across all facets of communities—like education, healthcare, and government—to address and pursue opportunities to encourage the use and adoption of broadband.

And they’re working on spearheading statewide projects like the implementation of an electronic medical records (EMR) system.

They truly understand that having the networks in place is not the last step but the first.

It will be interesting to watch their progress as they continue to realize greater successes in Kentucky. And their influence is already reaching beyond their home state through the creation of Connected Nation, which is helping other states interested in following this model resolve their own broadband issues.

Meet Rich - Verizon's 100Mbps Man


Don’t know how I missed this: earlier this month Verizon allowed well-regarded tech columnist Om Malik to send questions to a Verizon employee named Rich, who’s been wired for 100Mbps access at his home, to discuss how it’s changed his life.

The full transcript of the questions and answers can be found on Verizon’s Policy Blog, and Om’s writeup about it is on GigaOm.

Long story short: Having 100Mbps doesn’t make all that much difference as nothing on the Web is optimized to take advantage of all that capacity.

Truth be told, though, Rich probably wasn’t the best test subject. All he used that capacity for was watching some video, doing a little VPN, and other basic web surfing.

What I want to know is what someone from a younger demographic would think about having it: the ones who are gaming, downloading multiple movies via BitTorrent, making video calls, and so on.

Also, while Verizon’s tests are notable and prove their commitment to pushing the bandwidth envelope, they’re far from alone in pursuing the FTTH Council’s goal of a 100Mbps Nation.

I’ve spoken with Paul Morris, head of UTOPIA, on many occasions about how he uses the 100Mbps connection he has at home. He’s a big proponent of videocalling technologies like VSee and TVBlob. And he’s an avid user of Slingbox while on the road. In fact, the use of applications like these has resulted in him now uploading more bits than he’s downloading.

And 100Mbps isn’t even the ceiling anymore. At the Broadband Properties Summit I had the opportunity to catch up with Phillip Clark, head of Paxio, an FTTH provider that’s offering 1Gbps connectivity to its customers.

Open Access Picking Up Steam?


Following my trip to UTOPIA, I wrote a post about the as-of-yet unrealized promise of open access networks, or networks where multiple service providers compete over the same pipes.

This concept is one I first encountered at my first Broadband Properties Summit in 2005. It was then I heard a presentation by Robert Kjellberg of Vasteras, Sweden, where their open access network, powered by PacketFront’s technology, enables users to pick between multiple Internet, voice, and TV providers as well as a host of other services and applications, and self-provision these services through a common portal interface.

From the first time I learned of this, it piqued my interest. I like the idea of focusing competition in telecommunications on the merits of services rather than the physical characteristics of last-mile access technologies (cable vs. DSL vs. fiber, etc.). And ultimately the Internet is one massive open access network where multiple services are being provided over one broadband network anyway.

But the market hadn’t warmed up to this. Everyone I talked to, even if they agreed with the idea philosophically, didn’t think it was feasible, practical, or realistic. Many pointed to the uphill battle UTOPIA has faced in attracting service providers to its network as proof that the model was doomed.

But the winds of opinion appear to have shifted this year at the Broadband Properties Summit.

On Tuesday in talking with both Glen Lang, CEO of Connexion, and Diane Kruse, CEO of ZoomyCo—two leading competitors in the greenfield FTTH space—I heard different renditions of the same theme: they want to own the last-mile access infrastructure and then allow multiple service providers to come and compete over their network.

Now I can’t say whether or not their respective business models are fully open access, but they weren’t the only ones there talking about the possibilities of multiple service providers competing over the same pipe. In fact, PacketFront’s booth was swamped throughout the event, as seemingly every attendee wanted to at least learn more, if not start talking about how they could pursue this model.

It was a fascinating contrast from years past to see so much excitement over open access networks that enable competitive service environments.

Because of my traveling travails I missed the many interesting sessions on Monday, but luckily the Broadband Properties Summit had a lot more great content in store for me!

The first session I attended was a Tuesday morning blue ribbon panel on the exaflood, which included Dave McClure, president of the USIIA; Larry Irving, co-chair of the IIA; and Bob Whitman from Corning.

I’ve written about the exaflood previously and some of the challenges I saw with the name, how it’s positioned, and why I’ve come around to embrace the concept.

During this session, I heard a number of other data points that build a strong case that the exaflood is real and oncoming:

- There are 100 million people in the US currently between the ages of 3 and 24. Everyone knows this range holds many of the heaviest Internet users, as well as the most demanding when it comes to things like slowdowns. As older generations are replaced with newer ones, bandwidth consumption will inevitably increase.

- The average size of a webpage used to be less than 100KB; now it’s 1.5MB, a roughly 15-fold increase in the amount of data transferred every time you click on a link (see the sketch below for what that means in load times).
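As a rough illustration of what a 1.5MB average page means in practice, here's the load-time arithmetic at a few connection speeds, ignoring compression and protocol overhead:

    # Time to load a 1.5MB page at various connection speeds.
    page_bits = 1.5 * 1_000_000 * 8
    for name, kbps in [("200Kbps", 200), ("1.5Mbps DSL", 1_500), ("100Mbps fiber", 100_000)]:
        print("%s: %.2f seconds" % (name, page_bits / (kbps * 1000)))

At 200Kbps, the FCC's current threshold for broadband, today's average page takes a full minute to load.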

My apologies for the silent Monday. 11 hours spent on a plane ride from DC to Dallas to reach the Broadband Properties Summit didn’t leave much time for blogging.

But today’s another day, and I’m excited to see what it has to offer!

Until there’s more to report from Texas, here’s a handful of links from last week to tickle your bandwidth fancies:

Telescopes Unite Around the World Through Broadband
In a first-of-its-kind attempt, telescopes in China, Australia, and across Europe combined their power to form a global superscope that produced images clearer than have been obtained by the famed Hubble. How’d they do it? By pushing 256Mbps streams of data around on superfast fiber optics. http://www.csiro.au/news/TelescopeNetworks.html

Comcast’s Penalizing Practices Hit Front Page
Last week I wrote a post on Comcast’s alleged practice of cutting off its heaviest users. At that time it was a topic primarily discussed on message boards and tech sites. Comcast officially denied the practice in this News.com article. But now the story’s hit the Washington Post. This is a big story that’s only going to get bigger as the battle over who controls the Internet rages on.

Worm Creates Botnet of Epic Proportions
A worm is a piece of self-replicating malware that can infect your computer and cause it to do things beyond your control, often without your knowledge. Like sending out spam, a lot of it, often to the people closest to you. Well, there’s a big one out there digging its way through the Internet, and according to this article, the combined computing power of its infected network surpasses that of the most powerful supercomputers on earth.

Mapping the Web


Ran across an article about the efforts of a Japanese firm called Information Architects to pull together a map of the biggest things on the Internet, using a subway map as its inspiration.

To access the map, click on the image on the left and a new window will pop up.

Notice the key in the lower righthand corner. It tells you how to read the map.

Each colored line represents a different thread that links common sites (like sharing, community, or news).

In all honesty, I'm not sure if I find much utility in this map. It's hard to read and nigh impossible to get anything out of at a quick glance.

That said, it is kind of fun to pick a line and track it through all its different stops. Many of the sites cited are big names that are immediately recognizable, but they've done a good job of incorporating a fair number of fringe players with significant offerings.

I'd encourage you to look through this map with another browser open, probably pointed at Google, so as you encounter sites you haven't heard of you can quickly look them up.

I do wish they'd built this dynamically so that each name would link to the site, though.

Future Web Trends - But What About the Applications?


Fantastic article on Read/Write Web about future web trends written by Richard McManus.

In it he cites ten trends he believes will be most noteworthy over the next decade.

Included in these are: virtual worlds, websites as web services, and online video/internet TV.

And what do all three demand? A whole lot of bandwidth.

Virtual worlds, especially for gaming, are becoming increasingly graphics-intensive and reliant on ultra-low latency connections. If you're going to slay the Great Dragon of Gargoth your flame-throwing centaur better be able to react quickly!

Websites as web services are threatening to leverage the browser to dethrone the operating system as the place where apps are run. In fact, it's possible today to forgo all desktop applications and only work through your browser and still have close to the same functionality.

The rising tide of Internet video needs no introduction. You can get movies, TV shows, music videos, user-generated content, original online shows, and more. You can stream it live or download it on demand. BitTorrent continues to drive bandwidth usage higher and higher each day.

The one thing that disappointed me about this list is the lack of any mention of applications like videocalling, webcasting, or security. They're almost alluded to in a reference to "rich Internet applications," but in reading that description you quickly realize he's referring to the trend towards hybrid desktop/web apps, not the many apps too often left unmentioned.

Maybe these things just aren't sexy and new enough; videoconferencing's been around for decades, as have various forms of video security, and even webcasting isn't really new compared to what's been done on TV.

But even still, I'm continually amazed at how little attention this whole world of applications gets among both the techno-elite and the unwashed masses.

Now This is Telepresence...


Found this great article yesterday about a unique take on enabling more effective telecommuting: a mobile, remote-controlled robot with a camera and screen attached that the telecommuter can use to navigate the office while working from home.

While the technology doesn't appear to be all that cutting edge, and the article doesn't mention any plans to commercialize the product, I still found this to be an ingenious solution to the problem of small-group collaboration for teleworkers trying to interact with people at the office.

And this is especially interesting because it's more than an idea: it's an actual deployment, one that has proven successful enough that other employees at this company--iAnywhere--are clamoring for their own IvanAnywhere (the name of this creation, referring to its primary user, Ivan Bowman).

While it may sound hokey, I wouldn't be surprised at all if we started seeing this concept gain some momentum and eventually become an actual product. Only time will tell!

Another highlight from my trip to Winona last week was the opportunity to sit down for lunch with Peter Walsh of Home and Community Options.

Home and Community Options provides services to people with disabilities in the area, in particular residential care at 20 different homes/buildings in and around Winona.

This visit was precipitated by the recently received news that Home and Community Options had been awarded one of the first LightSpeed grants by the Blandin Foundation, an organization focused on supporting the growth of Minnesota communities.

This LightSpeed grant will provide matching funds for Home and Community Options to upgrade its facilities to be better prepared to take advantage of the bandwidth of fiber, which Hiawatha will be laying for them in the near future.

But this grant is about much more than improving the networking capacity of a few buildings.

The impetus behind applying for the grant was to support the continued efforts of Home and Community Options to develop a cutting edge broadband application of its own.

Earlier this year a group called the New7Wonders Foundation held an online contest to construct a new list of wonders, seeing as how the only remaining structure from the original roster is the Great Pyramid of Giza.

On July 7, 2007, they announced the winners: the Great Wall of China, the Colosseum in Rome, Petra, the Taj Mahal, Machu Picchu, Chichen Itza, and the Christ the Redeemer statue in Rio de Janeiro.

Just came across a site called Panoramas.dk, which specializes in creating virtual panoramas that allow users to feel like they're standing in the middle of somewhere far from home.

They've put up a series of these panoramas highlighting all 7 Modern Wonders of the World.

I've certainly seen virtual panoramas like this before--their use is on the rise in the rental and real estate markets, for one--but I thought these were especially cool, as the picture quality is better than most I've seen, plus their subject matter is so fascinating and on such a grand scale.

Back to School with Broadband Applications


While a decade removed from the last time Labor Day meant the beginning of my school year, I can’t help but feel nostalgic around this date.

Of course, things are much different for students of today, as evidenced by two applications/services I stumbled across that leverage broadband to help support the educational aspirations of students.

The first is a site called Tutor.com. As its name suggests, this site provides tutoring services over the Internet.

The way it works is pretty straightforward: set your kid up with an account; then, when they run into trouble with their homework, they can initiate a session with a tutor, no scheduling needed.

Effective tutoring is enabled by tools like chat, an interactive whiteboard, and file sharing, not to mention the simple fact that since this is all done online, students can pull in other resources from the Web.

Sessions are recorded for students to review at a later time, and parents can gain access to transcripts to ensure everything’s on the up and up.

To date, Tutor.com has hosted more than a million tutoring sessions in subjects ranging from English to arithmetic to chemistry.
