October 2010 Archives

One of the biggest stories in communications today is the ongoing standoff between Fox and Cablevision over the rates Cablevision has to pay to carry Fox content. Cablevision refuses to pay what Fox wants to charge and therefore millions of Cablevision subscribers have missed out on programming like the MLB playoffs over the last few weeks.

While there's been much gnashing of teeth over the need for the FCC to step in forcefully to resolve this situation, there's only one permanent solution to problems like these: a la carte TV.

"Ala carte TV" generally refers to the ability for customers to decide what individuals channels they want to subscriber to, rather than having to pay for a package of channels determined by their cable provider.

Though far from a new idea, it's most often discussed in terms of its potential for maximizing consumer value: what customers get for what they pay. But I want to point out today how valuable a la carte TV could be for ridding us of these silly disagreements between content providers and cable operators.

To start with, it's important to understand the basic model of how TV works.

To deliver TV service, cable operators must pay content providers a monthly per-channel fee. This fee can vary dramatically based on the number of subscribers and how many additional channels are bundled from a particular content provider. The cable operator compiles a package of channels and then charges customers enough to cover these costs and hopefully turn a profit.
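To make that model concrete, here's a minimal sketch of the economics. All channel names, fees, subscriber counts, and prices are made-up figures for illustration, not actual rates from any provider.

```python
# Illustrative sketch of the basic cable TV economics described above.
# Every number here is invented for the example, not a real-world rate.

channels = {
    "SportsNet": 4.00,    # per-subscriber monthly fee paid to the content provider
    "NewsChannel": 0.75,
    "MovieHub": 1.25,
    "KidsTV": 0.50,
}

subscribers = 100_000

# Monthly cost the cable operator pays content providers for this package
monthly_content_cost = sum(channels.values()) * subscribers

# The operator prices the package above cost to (hopefully) turn a profit
package_price = 9.99
monthly_revenue = package_price * subscribers
monthly_margin = monthly_revenue - monthly_content_cost

print(f"Content cost: ${monthly_content_cost:,.2f}")
print(f"Revenue:      ${monthly_revenue:,.2f}")
print(f"Margin:       ${monthly_margin:,.2f}")
```

Notice that if any single channel in the package raises its per-subscriber fee, the operator's only choices are a thinner margin or a higher package price for everyone, which is exactly the leverage problem described below.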

Where this system starts breaking down is during the yearly negotiations when content providers try to get cable operators to pay more to carry their channels. While you can't blame content providers for trying to maximize revenue, especially those that boast the best ratings, the challenge for cable operators is they have little leverage.

A cable operator can't be competitive without key channels, like ESPN for example. So if ESPN wants to jack up their rates, there isn't necessarily a whole lot cable operators can do other than eat the costs or pass them on to their customers.

It's this dynamic that leads to an annual hike in cable TV prices as well as the tension that creates situations like the current standoff between Fox and Cablevision. Cablevision decided it didn't want to pay what Fox is demanding, so Fox cut off the spigot of their content.

So how can a la carte solve these issues?

It boils down to the simple fact that the current paradigm of TV doesn't allow for true market dynamics to work their magic on keeping value high for consumers.

Sure a channel's viewership has some impact on pricing, but there's no direct relationship between content provider and consumer. Instead, everything's run through the middleman of the cable company.

What I'm postulating is that in an a la carte world where consumers pay directly for the channels they want, content providers will have to deliver enough value to justify charging higher prices; otherwise they'll risk losing viewers.

Right now content providers can raise rates and either get the cable companies to pay them or they take the battle to the court of public opinion, casting cable companies as the bad guy for keeping content away from viewers. Cable companies, then, have been increasingly trying to demonize the content providers for constantly raising rates.

But what if, instead of fighting that battle, they just got out of the way and, instead of being the middleman, became the facilitator for establishing direct relationships between content providers and subscribers?

This way, instead of having to manage ever-higher rates from content providers, they could establish a model whereby they take their cut of whatever revenue is generated through their system and leave it up to the marketplace of content providers and consumers to determine a fair price for any given channel.

Of course consumers stand to benefit by having greater control over what they pay for, but more importantly we could establish a new market dynamic for TV content that could solve issues like Fox-Cablevision forever.

Now, I'm not so naive as to think this is really going to be that simple. For starters, I know the contracts content providers have with cable operators are really convoluted, and pricing's based on maximum distribution across a system. Also, cable operators have been reluctant to cede their middleman position and embrace the a la carte paradigm because they think they need to play this role to maximize profit.

But I say let's take a hard look at the current paradigm, realize that it's broken, and that there's a better way. Most importantly we have to come to grips with the fact that until we can establish a true marketplace for content, where the interests of consumers are what's directly determining what content providers can charge, we'll continue to have issues like Fox-Cablevision popping up.

So if the FCC really wants to resolve this situation, let's not take the easy way out and just try to fix the symptom. It's time we acknowledge that what's best for consumers is trying to cure the disease of a broken marketplace that's divorced from demand and driven too much by supply.

With all this talk about the need for more spectrum to keep up with demand for wireless broadband, I can't help but revisit a thought that's stuck with me for a while: why haven't we been reinvesting the proceeds from our previous spectrum auctions into building up our nation's broadband infrastructure?

For those who don't know, every so often the FCC auctions off a slice of our country's airwaves. This spectrum is used to enable all manner of wireless technologies, from cell phones to mobile broadband.

I did a quick rough review and it looks like the FCC has raised about $50 billion over the last decade selling off our country's spectrum assets.

So where does that money go? Back into the federal treasury, never to be heard from again.

But why are we doing that? The most common refrain in debates around how to overcome our country's broadband limitations is the lack of federal dollars to make a real difference. We've been told that we've got the $7 billion a year in USF to reform and rededicate to broadband, plus the one-time $7 billion of the stimulus, and that's about it.

But everyone knows reforming USF is going to be a mess that'll likely take years to resolve before those funds are being invested in broadband. The sad reality is that we've spent $70 billion over the last decade largely subsidizing non-broadband communications, which is really pretty silly given that this represented the first ten years of the 21st century. But I digress.

And at the same time, everyone acknowledges that the broadband stimulus is but a drop in the bucket relative to what our country's needs are.

So I have to ask again: why haven't we been reinvesting these spectrum auction proceeds into building up our country's broadband infrastructure?

Imagine if instead of just dumping that money back into the treasury we'd instead had an ongoing comprehensive plan to build up an open fiber network connecting every school in the country. We probably could have that built by now for $50 billion.

That'd mean that today all students would have access to robust Internet in the classroom, instead of the reality we're in where a huge percentage of schools are stuck paying too much for not enough bandwidth.

With that universal fiber network in place, we'd have backhaul available to make it more affordable for any private or public entity that wanted to build off of it to deliver last-mile service directly to end users.

With all that open fiber everywhere, businesses could locate anywhere in the US and know that they'll have access to the kind of connectivity they need to be successful.

Unfortunately at this point I worry that this opportunity may have passed us by. I'm not sure how many more big spectrum auctions there are left to hold. And despite the commonsense nature of this proposal, I'm not sure I can see this government being so bold as to retroactively allocate all or some of that $50 billion back to broadband buildout.

So for now all I can hope is that by pointing out this issue we realize that there absolutely is money for continued broadband buildout. We just need to recognize opportunities when we see them.

And moving forward, if we're selling off communications-related public assets, let's reinvest those proceeds in building up our broadband infrastructure. Not to be too dramatic, but our future depends on it.

Tonight we start an exploration of what the future of healthcare can look like in Lafayette, LA. The new non-profit I've founded, called FiberCorps, is producing an event called CampFiber | Healthcare.

It builds on the foundation of the first two CampFibers I've held in Lafayette, which were general-interest affairs with an audience primarily of app developers, with the goal of facilitating open discussions about what's possible when every home is on a 100Mbps LAN.

Now we're taking the concept to the next level, focusing the conversation on a specific industry vertical and more purposefully showcasing local innovators who are making the future of fiber-powered healthcare happen today.

The purpose of CampFiber | Healthcare is simple: to have innovators share stories about what they're doing to spark discussions among a diverse audience of industry professionals, entrepreneurs, community leaders, students, and technologists.

We chose healthcare for the first of this new series because it's an area of real strength for Lafayette, which serves as both a regional hub for all manner of healthcare services and home to some of the top healthcare services companies in the country.

To show how rich the area is in healthcare innovation, in less than three weeks of planning we were able to find ten qualified speakers, and that's only scratching the surface as there are still so many innovators in the area.

The agenda is below. It features presentations on everything from advanced video-based telemedicine applications to health information exchanges to interactive online learning to the next generation of workplace health monitoring.

The event will be held at LITE (537 Cajundome Blvd), Lafayette's 3D visualization center and technology accelerator. Doors open at 5pm with a beer and wine reception. The program starts at 5:30pm and runs until 7:30pm.

For those not lucky enough to live in Lafayette and attend in person, the event will be webcast at www.fibercorps.com, and the video will be recorded for on-demand playback.

These CampFiber events are an opportunity for the community of Lafayette to come together to engage in a conversation about what the future of fiber-powered healthcare can look like in their area. But the issues we're dealing with down here are relevant to every community in the country.

So look forward to hearing more from Lafayette as this community in the heart of Cajun country continues to show the way, moving boldly into the 21st century and embracing all that fiber has to offer to drive economic development and improve our quality of life.

Tom Winchell
LSU Health Sciences Center

Bob Miller
LEDA Accelerator

Sean Thompson
Henry Safety Systems

Blaine McManus
Advanced Telemedical Services

Brenda Ikerd
Louisiana HealthCare Quality Forum

Jay Pierret
Acadian Ambulance--National EMS Academy

Ramesh Kolluru
UL Center for Business and Information Technology

Doug Menefee
The Schumacher Group

Wendt Withers
Apex Innovations

Barbara Lamont
Calls Plus

As this DSLReports article notes, there's an ongoing argument between the FCC and the cable companies about the gap between the broadband speeds subscribers are advertised and the speeds they actually realize.

The FCC claims that gap is as high as 50%; in other words, broadband subscribers are only realizing half the bandwidth they're paying for.

The cable companies cite data from Ookla that shows that their subscribers are realizing 93% of what they're paying for.

I say they're both right and they're both wrong; what matters is how you look at the problem.

If you average the speeds a consumer realizes across the course of a day it wouldn't surprise me if that 93% is close to accurate. The reason for this is that during periods of low usage many broadband subscribers are actually getting higher speeds than what they're paying for, at least in bursts.

But the reality is that the gap between advertised and actual speeds is mostly a matter of what time of day you're trying to use your broadband connection. During periods of high usage in a neighborhood, like when all the kids come home from school, there's no denying that the performance of many broadband connections suffers: there just isn't enough capacity to go around. I've heard many anecdotes from colleagues who could surf the Internet just fine from their home office until 4pm, when their speed fell off a cliff, offering only a fraction of the bandwidth they're paying for. So in this way the FCC's right.
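A toy calculation shows how both numbers can be "right" at once: averaged over a full day, a connection can deliver roughly 93% of its advertised speed even while delivering only half of it during the peak hours. The hourly fractions below are invented for illustration, not measured data.

```python
# Toy illustration of how a daily average can mask peak-hour congestion.
# The hourly "realized fraction of advertised speed" numbers are made up.

advertised_mbps = 10.0

# Hypothetical realized speed as a fraction of advertised, by hour of day:
# overnight bursts above the advertised rate, a late-afternoon dip.
hourly_fraction = (
    [1.1] * 8     # midnight-8am: lightly loaded, bursting above advertised
    + [1.0] * 8   # 8am-4pm: roughly as advertised
    + [0.5] * 4   # 4pm-8pm: peak congestion, half of advertised
    + [0.9] * 4   # 8pm-midnight: recovering
)

daily_average = sum(hourly_fraction) / len(hourly_fraction)
worst_hour = min(hourly_fraction)

print(f"Daily average: {daily_average:.0%} of advertised")  # 93%
print(f"Worst hours:   {worst_hour:.0%} of advertised")     # 50%
```

This is why a single headline number is so misleading: the same network produces the cable companies' figure and the FCC's figure depending on whether you average across the whole day or look only at the hours people most want to be online.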

Where they're both wrong is in how they're framing these numbers.

For the cable companies to claim that their customers can expect to see 93% of whatever they paid for is disingenuous. They know their networks' performance suffers during busy times; if enough people try to go online at the same time, they simply don't have enough capacity to support limitless simultaneous usage.

For the FCC, the issue is that it's not focusing enough on the real problem: how much the performance of various broadband networks can vary based on usage.

The problems with this are twofold. First, by speaking too generally about the 50% number, they risk losing credibility and having their argument go unheard. Second, by painting ISPs with too broad a brush, they gloss over the clear differences in the capabilities of various broadband technologies, as well as the decisions ISPs make in how they manage their networks and how those relate to end-user performance.

As such, I think it's a dangerous mistake to try to boil down this issue of the gap between advertised and actual bandwidth to a single number. What we need is more granular data about the performance of broadband networks throughout the course of a day. Sure, the networks may work great at two in the morning, but what about at four in the afternoon?

This is an extremely important question as it relates to our efforts to optimize society through the use of broadband-enabled technologies. If we end up with a broadband infrastructure that doesn't work as well at certain times of the day, what impact will or should that have on how we use the Internet? Does that mean we can only use videoconferencing technologies at certain times of the day? Do we want to live in a world constrained by the bandwidth consumption of our neighbors, or a world where we can use what we want when we want it?

If the FCC wants to make a difference on this issue, these are the kinds of questions it needs to be addressing. I hope that they're looking at this kind of information in the field tests they're conducting now.

And if the cable companies want to move up in the consumer satisfaction rankings, then I think it behooves them to be more honest with their subscribers. Don't try to claim your networks work 100% all the time when your users know that's not the case. Why not go ahead and fess up that at certain times of day network performance may be affected? By doing this they can set better expectations for their customers while potentially nudging some users to shift their bandwidth-heavy usage to off hours.

There are solutions to these problems, but only if we correctly acknowledge what the problem is in the first place. And in this area, the problem is the varying gap between advertised and actual bandwidth.

How Fiber Hurricane-Proofs Lafayette


One of the realities of living down in Cajun Country is the threat of hurricanes. While we haven't had to suffer through one yet, I know at some point we will and that means most likely losing power for some period of time, on average a few days after a bad storm.

But there's also something reassuring about living in Lafayette, namely that just because we don't have electricity doesn't mean we won't be able to still get online.

The reason for this is simple: the LUSFiber network that I'm subscribed to is a passive optical network (or PON for short). This means there are no powered electronics in the network other than at the headend and on the side of my house.

What this means is that even if the power went out across the whole city, so long as the headend is still powered (which it always should be, since it's located alongside the utility system) and I have a generator to power my own house, I can still get online.

That's a huge distinction from other forms of broadband, and one I think is really neat. Most wireline broadband technologies require powered electronics throughout the network to move traffic. So when the power goes out in your neighborhood, you're only going to be able to get online for as long as the battery backup for those electronics lasts, which is usually a matter of hours. This is even true in some kinds of full fiber networks.

But in a PON network, the light that's carrying your data moves around the network without any additional assistance.

That delivers peace of mind on so many levels. It means that if we take the plunge and install a permanent generator fed from our natural gas line, we'll be able to stay online indefinitely: to get information about the storm and cleanup, to continue collaborating both inside and outside the city, and to get information about any health problems the storm may have exacerbated.

And think about the profound impact this could have on people who have really serious medical conditions, for whom a blackout can be a life-threatening situation. Now they'll still be able to stay online and connected with their loved ones and healthcare professionals.

Basically it means that, at least in theory, the entire city can stay connected and operational even with its electric grid down.

In many ways this mirrors the importance of landlines, which have often served as a last means of communication during emergencies. Yet now we're talking about being able to do so much more than just talk to one person.

Of course, the reality of this doesn't quite live up to the promise. For example, we don't yet have a generator of any sort at our house, and if we don't have one installed ahead of time, they're likely to be sold out everywhere as soon as a hurricane starts heading our direction. It's also more likely we'll start off with a portable generator, limited by how much gas we keep on hand, rather than sinking the dollars it'd take to install a permanent one.

But I don't think we've spent enough time considering what a PON network like LUSFiber can mean to a community in the middle of a crisis, whether it's a hurricane, another natural disaster, or one that's man-made.

Because with fiber, our communities can achieve a whole new level of emergency preparedness, based not just on the capacity of this 21st century infrastructure but on its reliability.

Sometimes politicians have this way of making my blood boil when it comes to how their rhetoric around broadband doesn't match up with reality.

The latest instance of this is a passing quote from President Obama where he claims that he's learned that there's no such thing as a shovel-ready project. In particular he was referring to the economic stimulus package and the speed at which federal dollars have turned into actual job creation.

Well, Mr. President, I'm here today to tell you that at least when it comes to broadband there absolutely were and are shovel-ready projects all across this country. What was missing wasn't the projects, it was having a shovel-ready government.

Let's quickly review the timeline of the broadband stimulus. Legislation was passed in February; the first awards weren't given out until December. Still to this day I'm not sure what percentage of awarded funds have actually been distributed, but I do know there've been a lot more awards than distributions.

The White House was insistent that these funds be routed through existing agencies and prioritize shovel-ready projects, and yet the speed at which government moved was completely at odds with this. The simple reality is that existing agencies weren't ready to handle this monumental task, so the whole process took longer than it needed to. One outcome of this pace, combined with the esoteric application process, is that the pipeline of "shovel-ready" applicants got clogged with entities racing to throw together projects.

So we ended up in a situation where we had slow-moving, overwhelmed government agencies struggling to figure out how to separate the wheat from the chaff. What this led to is that many of the best truly shovel-ready projects got left by the wayside, overrun by the scramble for free government money by entities with more money, more influence, and more expertise applying for federal dollars.

In reality there were, and still are, all sorts of shovel-ready broadband projects. There are rural networks across the country that have laid out plans for expansion, where the only thing holding them back has been access to capital. I consider myself fortunate that I can call some of the best rural fiber deployers in the country my friends, and I know for a fact that if someone handed them a check today they could be deploying tomorrow.

And yet unfortunately many of them missed out, their projects ignored because of technicalities or because they weren't able or willing to play the politics to get themselves to the head of the line.

Looking back, if government really wanted to support shovel-ready projects and get money out the door as quickly as possible, they should've just set up a first-come-first-served system. They could've started accepting applications before they even set up the rules. By organizing a crack team of experts, they could've quickly determined whether projects passed the smell test, and if so, awarded funds immediately.

This way, rather than spending all that time crafting rules that were ultimately misguided, then having to manage an overwhelming number of applications all coming in at the same deadline, then having to wade through those stacks of applications in a reasonably timely manner, government could've been turning dollars into deployment sooner rather than later.

I know what I'm suggesting here may be way outside the norm and likely runs afoul of some rules associated with how government doles out money, but I want to point out that there were alternatives, and that shovel-ready broadband projects absolutely existed and still exist to this day. So the issue isn't whether or not there were shovel-ready broadband projects; the problems were all associated with not having a shovel-ready government.

Which brings us back to President Obama. I hope that somehow these thoughts make it to your ears, as it's high time government faced up to its limitations, admitted its mistakes, and started looking at doing business a different way. If you allow yourself to paint in such broad strokes, then you're going to be increasingly divorced from the realities of this great nation you lead. The mistakes that were made in the broadband stimulus were preventable, but only if we learn from them.

How Healthcare Reform Could Stunt/Kill Telehealth


Recently I met Jamey Hopper, president of Dexcomm, a local call center service in Lafayette that answers calls for doctors, funeral homes, and others.

While what they do isn't all that bandwidth intensive, given that they handle voice and not video, what's interesting is that until recently they were growing their workforce more at home than at the office.

The economics for them were simple: why pay for office space when employees can do just as well working from their own homes?

And the overarching impact of this trend on society is profound: imagine what happens when workers can spend more time at home and less in transit, and as a result we have fewer cars filling up roads and polluting the air.

It seems like such a natural, no-brainer thing to be doing for a business like Dexcomm's, and yet they've recently had to reverse course and start pulling that at-home workforce back into the office.

Why? Because of the recently passed healthcare reform.

In that reform are new rules aimed at protecting the privacy of patient health information. While well-intentioned, their result is that in order to keep workers out of the office, Dexcomm would have to spend a significant amount of money upgrading the security of the systems its at-home workforce uses to gather information.

The problem is that this expense is significant enough that it makes more economic sense to bring them back into the office than upgrade these systems.

So because of government regulation, Dexcomm is having to halt its transition into a 21st century virtual business and go back to maintaining a large office.

What's especially chilling about this example is how little data these operators are actually exposed to. It's not like they're pulling up full patient records. All they're doing is answering the phone when a patient calls in and their doctor isn't on duty, taking a bit of information down about the patient's current condition, and then finding the right medical professional to connect them with.

What stopped me in my tracks was wondering what this means for the potential of a new class of at-home healthcare workers.

Ten years into the 21st century, I'd think we'd want to see a wave of nurses, for example, able to work from home as a first line of defense, identifying ailments in the home through HD video. This could create tons of new jobs that afford unbelievable flexibility and convenience to the nurses, better care for patients, and benefits to society as a whole by reducing the number of cars burning fossil fuels. And yet how is any of this going to be possible if government regulations either prevent it or overly burden these initiatives with red tape?

While I'll admit I don't know the healthcare reform legislation well enough to assess the necessity of the specific privacy language that's affecting Dexcomm's business, anecdotally I know that healthcare privacy rules can get a bit wacky.

The example I use to illustrate this is from when I had an allergy test, forgot the paper results at the doctor's office, and, when I asked to have them sent to me via email or fax, was told they weren't allowed to do that. Mind you, I was trying to give them permission to do so, going so far as to say that I didn't care if the whole world knew the results. It didn't matter, though; their hands were tied by government regulations.

I wanted to point these examples out to highlight that despite the limitless promise broadband holds for reforming healthcare, we risk unnecessarily hamstringing ourselves if we allow government regulation to become a barrier rather than an enabler of these possibilities.

Regardless of the intentions of these regulations, they've obviously gone awry when businesses can't grow in the ways the 21st century makes possible. And if we're not careful, government's going to kill telehealth at the precise moment it's claiming a desire to support it.

About this Archive

This page is an archive of entries from October 2010 listed from newest to oldest.
