April 2009 Archives

Goodness gracious, the broadband stimulus is starting to get absurd. Check out this article about WildBlue trying to position itself for stimulus dollars.

They want money from the BTOP program to "develop" a new satellite broadband service. Not "deploy" but "develop." Even better, they admit that even if they get this government funding it'll take "about" three years to translate that support into improved broadband services for customers. And I can't help but think that really means "at least" three years if everything goes right, and potentially longer if anything goes awry, as it has in the company's past with the first satellites it put into space.

Are you freaking kidding me? Have they totally missed the point of the stimulus package?

The BTOP program is supposed to be about spurring deployment now, not in a few years. There are even specific provisions saying that the money's supposed to be spent within two years of receiving it.

This story gets fishier the more you consider the details. For example, WildBlue claims that the two satellites it currently operates can serve about 750,000 subscribers at current speeds. And yet earlier in the very same article they admit that "the take-rate of its current service has reached a point where it's starting to experience capacity constraints. The company serves 400,000 customers..." Notice a bit of a disconnect there?

Another issue is the speed they're delivering now versus what they want to deliver. Their current top-end package only goes up to 1.5Mbps down and 256Kbps up, and it costs $80 a month. They claim their new and improved satellite will be able to deliver up to 18Mbps down, but they make no mention of how much upstream capacity that includes. They've even admitted in this article that the 18Mbps figure is a pipe dream, with speeds topping out at 10Mbps being more likely. And who knows how much that top-end service will even cost.

So let me see if I'm understanding this correctly: WildBlue wants tens if not hundreds of millions in government handouts to do R&D for a service that may or may not be ready to start serving customers in three years, that may or may not deliver download speeds up to 18Mbps, and whose upstream capacity and pricing they haven't even disclosed.

To be brutally honest, that sounds like an even more foolish way to spend limited broadband stimulus dollars than investing in BPL (broadband over power lines). And at least satellite's been proven to work on a large scale whereas BPL has not.

My point in calling out WildBlue is not to denigrate satellite technology or even BPL. It's to shine a light on the fact that we need to be focusing government support on technologies that can get rural residents connected sooner rather than later, and that we shouldn't be risking these resources on unproven technology plays that don't at least put us on a path towards a bigger, better tomorrow.

I think that's the thing that gets me the most: these technologies have clear limitations. Even if this WildBlue project or a BPL project like the one IBM's pushing works, the best we can hope for is that they get people online at the speeds that are available today. But what about the speeds and capacity that will be needed by tomorrow's generation of applications? How can we be satisfied with good-enough connectivity instead of bringing the best networks to all Americans?

That's not broadband I can believe in. And these are not the kinds of projects that should receive stimulus dollars.

Broadband Speed Is About Latency, Not Bandwidth


Whenever anyone's talking about broadband they invariably refer to its speed, asking questions like how fast your service is. The answers are then given in terms of Kbps, Mbps, and Gbps.

But in reality these measurements have nothing to do with speed. Instead they refer to bandwidth, which is more accurately described as carrying capacity. Think of it this way: Kbps equals a motorcycle, Mbps equals a car, and Gbps equals a semi-truck. All these vehicles can travel at roughly the same speed, but they each can carry a different amount of stuff.

The heavier the files you need to transfer, the more bandwidth matters. Video files, which are typically the heaviest, require the most bandwidth. And if your vehicle can't carry the whole thing in one trip, it has to move it in pieces, which is why you sometimes have to wait for a video to download before it starts playing.

So bandwidth does not equal speed, it equals capacity.
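To make the capacity point concrete, here's a minimal sketch of the arithmetic: transfer time is just file size divided by carrying capacity. The 700 MB file size is a hypothetical example; the bandwidth tiers mirror the speeds discussed above.

```python
# Hypothetical illustration: how long a 700 MB video takes to transfer
# at different bandwidths, ignoring latency and protocol overhead.

def transfer_seconds(file_megabytes: float, bandwidth_mbps: float) -> float:
    """Seconds to move a file: size in megabits / capacity in Mbps."""
    file_megabits = file_megabytes * 8  # 1 byte = 8 bits
    return file_megabits / bandwidth_mbps

for label, mbps in [("256 Kbps", 0.256), ("1.5 Mbps", 1.5), ("18 Mbps", 18)]:
    print(f"{label}: {transfer_seconds(700, mbps) / 60:.0f} minutes")
# -> roughly 365, 62, and 5 minutes respectively
```

More bandwidth shrinks the wait for the same file, but notice that nothing in this calculation says anything about how quickly an individual packet gets across the network; that's where latency comes in.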

But the speed of broadband does make a difference, and it's quantified more directly by latency, which measures how long it takes for a user's actions to make it through the network.

There are multiple instances where the latency of a broadband connection matters.

The first is in online gaming. In this case the lower the latency of a user's connection the shorter the time between when they try to move their character and when the character actually moves. Put another way, the lower your latency the faster you're able to move, so whoever has the lowest latency playing a game like World of Warcraft will have a clear advantage over those playing on higher latency connections.

The second is remote desktop applications. Here the same principle applies in that lower latency narrows the gap between when you move your mouse and the mouse actually moves on the remote application. So low latency is vitally important for creating a desktop-like experience that responds to user input in real-time.

The third is live two-way video communications. The more latency in a network the longer the pauses between speakers and therefore the more unnatural the conversation. If you've ever used a videocalling app over a mediocre broadband connection you know what I mean as you can see how long it takes for what you're saying to actually make it over the network. This high latency prevents conversations from flowing naturally, with users often talking over each other. So the lower the latency the better the conversation that can be had.
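For readers who want to see latency for themselves, here's a minimal sketch of measuring it from Python by timing how long a TCP handshake takes, which is a rough proxy for the round-trip times that ping tools report. The host and port are placeholders for whatever server you want to test against.

```python
# A minimal sketch, assuming you have a reachable host to test against:
# time a TCP handshake as a rough stand-in for round-trip latency.

import socket
import time

def tcp_latency_ms(host: str, port: int = 80, timeout: float = 2.0) -> float:
    """Return the milliseconds needed to complete a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; we only care how long that took
    return (time.perf_counter() - start) * 1000.0
```

Calling something like `tcp_latency_ms("example.com")` a few times and averaging gives a crude but serviceable picture of how responsive a connection is, independent of its bandwidth.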

So now that we've laid out how the speed of broadband refers more to latency than bandwidth, and that low latency is vitally important to a wide range of applications, it's important to note that not only does fiber have exponentially more bandwidth than other broadband technologies, it also features far lower latency.

Just take a look at this graph:

[Graph: round-trip latency over time, showing the drop after switching from Cox to LUSFiber]

I got this from Nicholas Istre, who's a proud new customer of LUSFiber down in Lafayette, LA. This graph shows the latency of his broadband connection. The first half of it is when he was on Cox, with latency averaging 10ms and spiking up over 20ms. Then you'll notice his latency dropped down below 2ms and held steady; that's when he switched over to LUSFiber.

While the difference between 2ms and 10ms (ms = millisecond) may not seem like much, imagine if everything you did were ever so slightly delayed, whether that's hitting the brakes in your car, catching a baseball, or anything else. And making matters worse, these delays can be cumulative, so the longer you're trying to use real-time apps on a high-latency network, the worse the experience gets.
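The cumulative effect is easy to quantify: many applications need a series of sequential round trips (a DNS lookup, a TCP handshake, then one request per resource), so per-trip latency multiplies. The 30-round-trip figure below is a hypothetical example; the per-trip numbers mirror the Cox and LUSFiber averages from the graph.

```python
# Hypothetical illustration: sequential round trips multiply latency.

def total_delay_ms(round_trips: int, latency_ms: float) -> float:
    """Total waiting time when each round trip must finish before the next."""
    return round_trips * latency_ms

print(total_delay_ms(30, 10))  # 300 ms of waiting at 10ms per trip
print(total_delay_ms(30, 2))   # 60 ms at 2ms per trip
```

An 8ms difference per trip becomes a quarter-second difference over a single page load, and real-time applications rack up round trips continuously.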

The reason I'm taking the time to point out this issue is that it's vitally important to understand in terms of how broadband is used yet it's a technical enough topic that most average users and policymakers have no understanding of what this is or why it matters. That's why we so often hear people (myself included) conflating speed with bandwidth.

Yet not only is low latency important to making real-time applications run better, it also adds another layer to our case for why we need fiber, beyond fiber's near-infinite capacity.

So I encourage everyone who cares about fiber to start talking about latency so that it becomes as well-recognized an issue as bandwidth.

Was hoping I wouldn't have to say this, but I think so far the broadband stimulus is doing more harm than good.

There are two primary reasons for this.

The first is that from a policy perspective there's a whole lot more discussion going on about how to distribute these limited BTOP dollars than about how to craft a national broadband strategy. Given that we have less than a year to create and build consensus around that strategy, we can't afford to have the stimulus distract us from pushing this larger dialog forward.

The second, and much bigger concern, is that as things currently stand the stimulus is doing more to slow down deployment than speed it up. I've now heard from multiple people of projects that could already be deploying but instead are waiting to see if they can leverage stimulus dollars to help fund their projects.

On the one hand, I don't blame anyone for doing this. Why put in your own money when the federal government might pick up the tab for you?

But on the other this is an unbelievably bad unintended consequence of making government dollars available. Now we don't just have to deal with whether or not government can distribute its dollars quickly and fairly; we also have to worry about the entire broadband industry slowing its pace of deployment in the hopes of snagging some subsidies.

Of course, it's foolish for any project that could already have been deploying to wait around with its hand out, as the legislation specifically states that this money should only go to projects that would not be deployed otherwise. But that hasn't been enough to stop anyone from thinking they might qualify, so we're still facing the potential of slower non-subsidized deployment as well as more stimulus applications muddying the picture for NTIA and RUS as they decide who gets what.

And the longer it takes for NTIA and RUS to get their rules out, the worse this situation will get, as now all new projects will also be looking to the stimulus for support. That runs counter to what I want to see: projects that are viable without government writing checks.

One other double-edged unintended consequence is that now every community without robust broadband is trying to get their act together around a plan to get stimulus dollars.

On the one side this is great as it's spurring an interest in deployment that many communities just didn't have before.

But on the other it's troubling both because we're again muddying this issue with projects that aren't truly shovel-ready fighting for position over those that are, plus we don't have a support structure in place to help these communities properly prepare themselves.

What this last bit opens up is the possibility of a lot of bad projects getting created. As communities try to figure out what to do they're exposing themselves to a lot of charlatans who claim they can solve their broadband problems but aren't actually qualified to do so. If every community wants to now get broadband, how do they go about figuring out how best to do so?

Previously municipal broadband efforts were driven by an ad hoc network whereby a community interested in broadband went and talked to those who have already done it. But that ad hoc network can't support the demands of every community needing information as they will quickly overwhelm the capacity of successful broadband communities to respond.

That's why our rural fiber group was initially advocating that some of these government dollars be given out as matching grants to cover preplanning costs. In this way we don't just create demand for broadband, we support communities' ability to create plans to spur deployment.

All in all I'm not against the broadband stimulus. I just wish we'd done more to think through the unintended consequences of making a big pot of government money available without any real direction. It would've been great if instead of tossing the hot potato to NTIA and RUS Congress would've been more prescriptive in its legislation.

In lieu of that we need these agencies moving quickly and giving clear directives as to who's going to be eligible to get what. Otherwise if we ignore the reality of how this legislation is influencing the market, the stimulus could end up doing more harm than good.

Why I Don't Trust Wireless Networks Anymore


In January I joined the iPhone Nation. And for the most part it's been a life-changing experience. Prior to this my cellphones had just been cellphones, so having access to email, Google Maps, and webpages while on the go has been a revelation, one I now wonder how I ever lived without.

But I'm forced to say "for the most part" due to a series of experiences that highlight why I'm wary of anyone claiming that wireless is the ultimate future of broadband.

Over the last three months I've had at least three scenarios where my iPhone didn't work, not because of anything wrong with the phone itself but because of the failure of AT&T's network to deliver data.

The first has been an ongoing struggle to get it to work while on the Hill during my guerrilla lobbying runs. In particular, while in Senate and House office buildings, data transfer gets very sluggish. While it hasn't crapped out on me entirely, it's often unusable. And that slowdown sometimes extends to when I'm outside, so it's not just about wireless signals having trouble getting through the marble walls of power.

The second was my first acute network failure. I attended the opening game for the Washington Nationals, and my iPhone stopped working entirely. It wasn't just that I couldn't check my email; I couldn't send text messages or make phone calls. Sure, I could still play checkers, but anything that relied on the network stopped working. I eventually found a corner of the stadium where I could get some connectivity, but even then it was hit or miss. Occasionally stuff would get through, other times not.

The third and most recent instance was a series of failures while enjoying myself at Festival International de Louisiane down in Lafayette, LA last week. Multiple times in different areas around the festival (which included five stages of music spread across downtown) I ran into the same network failure. I could still make phone calls, but I couldn't send text messages, couldn't access email, and couldn't get online.

During one of these episodes I was sitting beside David Isenberg, and this experience led him to write a tremendous blog post entitled "When is normal use a DOS attack?"

He makes a number of highly salient points in this post, chief among them that there's something wrong when normal usage takes down a network the same way a coordinated denial-of-service attack would.

Now I know some may argue that an extremely high density of people all trying to use the same wireless network at the same time doesn't qualify as "normal usage," but let's consider that a bit further.

In the Capitol Hill example, that's very much an everyday issue. Sluggish connectivity has hit me on multiple days in multiple locations in and around the Hill.

In the Nationals example, sure the stadium was much fuller than normal since it was Opening Day, but network operators should know that having tens of thousands of people all wanting to get online in the same location is going to happen at least 80 days a year for baseball games alone so it's not like they couldn't plan for it.

And in the Lafayette example, Festival is by far the city's biggest event of the year. It's understandable that a network operator may not want to invest in enough capacity to handle a spike in usage that only lasts a few days, but at the same time it's not like they didn't know it was coming.

These instances all highlight the same undeniable fact: wireless fails when too many people in the same location are trying to get online at the same time. It doesn't just get slower; it can outright stop working.

Also, these examples show how it's difficult to build wireless networks that can support large spikes in simultaneous usage.

And remember, these spikes in usage were driven almost entirely by mobile phone usage. Imagine what would be happening if everyone were also trying to use wireless broadband to connect their desktops and laptops as well!

If wireless can't even support the demands of the iPhone, how can we responsibly rely on it to handle all Internet traffic?

Finally, as a note to AT&T, from a customer perspective these experiences were so absurd as to be surreal. Here I had the most advanced phone on the planet in my hand, only I couldn't use it as a phone. Here I had the most advanced mobile computer, and yet I couldn't check my email. It honestly made me want to throw my iPhone into oncoming traffic in frustration. But I wasn't frustrated with Apple or the phone itself, as I know these issues weren't their fault. The blame from my perspective as a consumer fell solely on the shoulders of AT&T.

And that's really bad news for AT&T. While they're the only ones who can offer the iPhone in the US, most iPhone users I know love the phone but don't particularly like AT&T. This suggests that if AT&T were to lose its exclusive iPhone deal and/or if another phone were to come along that could be a true competitor to the iPhone, these network failures could cause AT&T Wireless customers to flee in droves.

And on a higher level, I can't see these network failures on the Hill doing anything to engender good will among policymakers towards AT&T.

So I encourage AT&T to work diligently and speedily to resolve these issues. If you're going to build your business on mobile access, then those mobile networks need to be resilient and reliable.

And to policymakers, take note of what's happening with these wireless networks as demand for the iPhone grows. There are lessons to be learned here about the limitations of wireless access in general as no matter the wireless technology they all suffer from the same issues if too many people in the same area try to get online at the same time.

And I can't accept that what's best for America is a broadband future that can't be relied on.

No State Can Afford To Outlaw Municipal Broadband


While not a new trend, there's recently been an explosion of bills in state legislatures across the country that would impede or outlaw municipal broadband projects.

I'm here to say outright that no state can afford to embark on any effort to stop municipal broadband.

My reasoning is simple: we face too big a challenge in wiring our country to force any players to the sidelines.

Imagine if we'd left the building of roads or electric lines purely to the private sector. How much longer would it have taken to get the job done? This is particularly true in less economically attractive areas, like rural communities and the urban poor.

In tackling the challenge of getting every last American connected to the best possible broadband, outlawing municipal broadband will force us to fight with one hand behind our back.

And it's important to note that one key commonality between every country currently considered a world leader in the availability of broadband is that government played a strong, active role in the deployment of networks. So taking the option of municipal networks off the table puts us at a disadvantage in the global economy.

This all being said, I completely understand why private operators would want to quash all municipal broadband efforts. If I ran an incumbent network I wouldn't want government coming in, overbuilding my market, and competing with me. I'd look at the many advantages a government-run network has and feel like not only is this a new competitor but that they're not operating on a level playing field with me.

And I'm a strong supporter of privately owned and operated networks. Whenever possible if the market can drive a continuous cycle of investment to support the best broadband, I'm all for that.

But to ignore the reality that in many markets competition is not driving this investment in the best broadband--and not just in rural but also urban and even some suburban areas too--is a dangerous thing to do as it sets the stage for these communities to be permanently left behind in the digital revolution.

In some areas, the only way they're going to get the best broadband is through a locally-driven municipal effort. Without the ability to build their own networks, these communities will never be able to compete with neighboring cities and their counterparts elsewhere in the world.

Because of this I firmly believe that any state that outlaws or significantly impedes municipal broadband efforts is making the wrong decision by taking the fate of their future competitiveness in the digital economy out of their hands. Any state saying no to municipal broadband is also likely saying no to the possibility of every one of its citizens ever getting equal affordable access to the best broadband.

And if private providers care about anything beyond their own bottom line, if the future of the country that's supported their growth matters at all to them, they will back off from these efforts to ban municipal broadband. Instead I'd rather see them expending their energy upgrading their networks and improving their services.

If they want to stave off government-sponsored competition then all they need to do is deliver the kind of connectivity communities are demanding. There are hardly any municipal broadband initiatives in places that already have a private full fiber network. So get the networks people want deployed and you can head off municipal competition at the pass.

And if private providers are going to focus on anything legislatively I'd encourage them to instead advocate for municipal networks to be built in an open way so that private providers can offer their services over them. I'm not saying we should be mandating the wholesale-only model as it hasn't been proven to work as of yet. But if I'm an incumbent provider I'd welcome a municipality building a cutting edge fiber network that I can run on as it saves me a huge amount of capital costs that instead of putting into the ground I can invest in developing new services that take advantage of the capacity of fiber.

So while I can understand where private providers are coming from, I believe strongly that it's irresponsible for any state to seriously consider outlawing or impeding municipal broadband. The challenges we face are too great to leave anyone who can help solve them on the sidelines.

In talking with some of my fiber-deploying friends up north, I'm learning that spring has arrived and they're ready to start deploying, only they can't. Why? Because the dollars they need to bring the best broadband to rural America are still locked up in the gears of government.

In no way, shape, or form am I trying to begrudge our hard-working administrators and officials at NTIA and RUS for taking their time to think through and develop the rules that will determine who can get what in terms of subsidies.

Instead I first feel obligated to reiterate that if we want to see any serious BTOP-stimulated deployment this year in northern states the clock is officially ticking.

Cold can set in as early as September up there--and I should know being born and raised in Minnesota--so the window of opportunity to get networks built is much narrower than in southern states, some of which can continue deploying throughout the year.

Yet if you look at the pace things appear to be going at, with the rules coming out the beginning of June and then grants not going out the door likely until August at the earliest, you can see how we run the risk of little to nothing getting done to connect the underserved this year in rural states.

This is especially troubling as the intent of the overall stimulus package was to get money into the system quickly. To create as many jobs as fast as possible.

So the question becomes: is this the best we can do? Can government not find new ways to get money out the door in a timelier manner? Can government move at the speed of the market?

The answer that I've been advocating for is a resounding "Yes!" And the way to accomplish this goal is to establish a fast-track partial loan guarantee program.

It's possible that RUS in particular could institute a program like this as early as the beginning of May. By mid-May they could be distributing guarantees, and that could then get private capital flowing by June.

So in other words, by the time government's ready to start accepting grant applications, these guarantees could already be stimulating deployment.

I urge everyone reading this, especially our friends in government, to not accept that the best we can do is get money out by the end of the summer, because doing so essentially means we're not going to stimulate any deployment in northern states this year.

Instead we must strongly consider all options to get these subsidies transformed into deployment ASAP. That way we can live up to the goals of the overall stimulus while ensuring northern states can benefit from it just as much as southern states.

Witnessing the Future of Education in Lafayette


Writing from Lafayette, LA, over the last few days I've witnessed an amazing confluence of events that suggest the future of education is being built down here in Cajun country.

On Friday and Saturday I attended the Digital Workforce Conference at LITE (short for Louisiana Immersive Technologies Enterprise, which is a fantastic 3D visualization facility), which is put on by 3D Squared, a non-profit focused on workforce development for the digital media and gaming industries.

Friday was the culmination of what had been an 8-day intensive workshop where high school kids formed teams tasked with conceptualizing, designing, and building a videogame. The final step then was for them to present their games to a panel of industry experts as if they were pitching them to publishers.

The games they created were built on a new gaming platform that's currently in closed beta called Metaplace, which enables the creation of virtual worlds. And the kids' presentations helped showcase the multifaceted process it took to create these games, including developing the concept and game mechanics, creating the graphical style, recording sound and voiceovers in many cases, building the world using the Metaplace tools, and even some actual programming to make the worlds act in different ways.

On Saturday a series of panel discussions and presentations helped reinforce the fact that by teaching kids how to create games you're teaching them to think critically and creatively in a lot of different ways while also keeping them engaged with a process that's fun.

But I should also say that just because the kids were having fun doesn't mean they weren't taking it seriously. In fact I got a chance to poke my nose into one team's meeting and was amazed at the professionalism with which the students conducted themselves and the level of the discussion they were holding. And I learned that in order to create these games they were putting in 12+ hour days, working effectively in teams, and being respectful of their peers and others.

While I'm not sure if teaching videogame development is the panacea for all students and to fix all our educational problems, this event helped showcase how it can foster learning and why we should seriously consider supporting efforts like these as we work to redefine what education means in the 21st century.

Moving forward in my trip to last night I had the good fortune to be invited by Kit Becnel to attend an experiment in using online technologies to connect students on different sides of the country.

The setup was at a local public library, with cameras on the students in Lafayette and a screen showing video from students in San Francisco.

The application they used was called Vievo, which enabled both videoconferencing as well as the ability to view what was playing on computer screens remotely.

The agenda started with the San Francisco kids taking turns showcasing the work they'd done creating 3D models using software called Cinema4D. They'd worked on two projects: the first was a short animation of moving text that said, "My name is X, and in ten years I want to be a Y." The second was to model what they think their bedrooms will look like in ten years.

Their teacher stressed that the students built all their 3D models from scratch, and there was some really impressive work going on in there.

Then on the Lafayette side they showed off the Metaplace world that one of Kit's students had helped create at the previous week's Digital Workforce Conference.

Throughout the students were asking each other questions.

This experiment really started showing the potential of what's possible by establishing collaborative relationships between classes in different cities. I could see the students on both sides being not just intrigued but also inspired by the work their counterparts had done.

That being said, I should also admit that this experiment highlighted how far we still have to go to make this a reality. For example, despite both endpoints having plenty of bandwidth, somewhere out on the Internet the stream was getting choked, causing significant problems with the video. And both sides experienced local technical issues that caused the stream to break.

Luckily Kit had some great help from Kris Wotipka, who volunteers his time and technical expertise to the FiberKids project, and Adam Melancon, Lafayette's library system network guru. So these were only minor bumps, and overall I think everyone involved felt like this was a really positive first step towards establishing a collaborative relationship between these classes using fiber networks.

As a quick reminder, the FiberKids initiative has grown out of Kit Becnel's Academy of Information Technology program at Carencro High School, and it's setting out to enable a framework through which Lafayette students can become a testing ground for educational applications, in particular those that leverage the capacity of their fiber network. It's a tremendous program with unlimited potential, and you can look forward to hearing a lot more about that in the near future.

To that end, while I can't provide all the details as of yet, I can say that the final big event regarding the future of education being built here in Lafayette was a presentation and discussion I was able to participate in earlier on Tuesday about establishing a series of pilots in Lafayette to showcase a new model for 21st century classrooms that leverage the capacity of fiber networks.

I'm not yet at liberty to go into more details about this initiative, but do know that there's some significant energy behind doing this, that it should be moving within the next few months, and that it's not just going to involve Lafayette but also a global network of classrooms. I wish I could say more, but you'll just have to wait a little bit longer.

Until then, know this: the future of education's being built in Lafayette, LA.

That future means a world where students aren't limited to what's in textbooks, where they can collaborate not just with their classmates but with their peers around the world, where they're able to not just regurgitate data points but synthesize and create new things, where the technologies that capture their attention out in the world are brought into their classroom to facilitate teaching that's more interactive and engaging.

It's a future that won't be easy to achieve and still requires a lot of work, but fixing our education system is a challenge that we should all want to see tackled. And I know I for one am going to be doing what I can to help Lafayette and other communities bring about this new paradigm in how we educate future generations.

If you want to step up and help be a part of this process, add a comment below and I'll contact you so we can continue this dialog!

To Create Demand We Need More Local Content Like PEG


The aspect of the broadband stimulus that's getting almost no attention is the up to $250 million set aside to spur demand for broadband. It's shocking to me how little people seem to be talking about it despite the fact that increasing the market for broadband may have the most profound effect on spurring deployment by creating more customers.

Admittedly I've been guilty of focusing more on deployment than demand in recent weeks as well, so as a first step towards remedying that let's dive into one of the potentially best ways to increase that demand: by fostering the creation of more locally relevant content.

If you want people to go online you need to give them a reason to do so. If we want to reach the 50% or so of households without broadband access, we can't get there by talking bitrate and bandwidth. We need compelling local content that they either can't get anywhere else or that's at least more convenient to get online.

That content could be anything, from local sports to local government meetings to local music to local healthcare information and beyond. The idea is having content that's relevant to someone living in a particular geographic area, and then making that content available online on-demand.

Because of this, it's my belief that NTIA should seriously consider any proposals from PEG stations to use BTOP demand funds to facilitate the creation of more local content.

PEG stations are already producing a lot of this local content and these funds could help them not only expand their productions but also enhance their existing and/or establish new online distribution platforms.

These funds could be transformative in enabling PEG stations to move out of a 20th century TV mindset into a 21st century online mode where all content is available everywhere at any time. And in some cases these funds could help save from extinction PEG stations whose funding is threatened by state video franchising, turning these dark days into new opportunities to make PEG more robust and relevant.

Exactly what kinds of PEG projects NTIA should fund is a bit up in the air. I think any eligible proposal has to include an online distribution component. And I'd think it makes sense to mandate that these funds be used either to create new content or to make existing content more accessible online and to help raise awareness about it.

Many have suggested that the BTOP deployment dollars will be used to fund pilot projects from which data can be collected as to their efficacy; I think the same mindset should apply to PEG.

We don't just want more of the same. We should be encouraging outside-of-the-box thinking, fostering new ideas for how to make local content production more sustainable, how to better reach local audiences, and how to create the kinds of compelling locally relevant experiences that will create a lot of demand for broadband.

While we can't focus all of our attention on this given that many communities have contentious relationships with incumbents, wherever possible we should support efforts to get local content producers, stakeholders, and network operators all working together in new ways. In this way we can start to showcase how PEG can enhance a network operator's business and not be a burden, which too many network operators currently consider their PEG obligations to be.

If we can help fund some new models for PEG content creation and distribution that cause more people to demand broadband then PEG should quickly become network operators' best friend as that means more customers for them. But in order to prove this argument we also need to make sure we're collecting hard data on usage so we can track various programs' success.

And if we can establish new paradigms for how PEG stations and network operators can work together in harmony then we can secure PEG's future while simultaneously bolstering the business of broadband providers and most importantly of all get people more engaged with what's happening in their communities and more interested in using broadband for other purposes.

For these reasons and more I'm a firm believer that funding progressive PEG stations could have a profound effect on increasing demand for broadband. So I encourage NTIA to seriously consider applications from these corners.

Is it just me or are all the discussions about how the BTOP program should distribute its dollars way off track?

During the debate leading up to the stimulus passing, the new administration was quite clear that their intention for this legislation was to avoid policy debates and focus on how to get money into the system stimulating the economy as quickly as possible.

But ever since the stimulus passed our industry has been embroiled in a series of policy debates around things like defining the terms unserved and underserved. Getting dollars out the door quickly to stimulate the economy has taken a backseat to discussions of who deserves what.

We're now sidetracked by the scrum of communities and deployers all trying to position themselves as most worthy of government support and as most in need.

While in general these are worthwhile discussions, as it relates to the BTOP program I think we're missing the point.

BTOP is not about getting everyone online; it's about spurring deployment to stimulate the economy as quickly as possible.

With this in mind, determining who should get what at this initial juncture should be a lot easier.

What we should be doing is rounding up all of the projects that are truly shovel-ready, those that would've already been deploying if it hadn't been for lack of available capital, and funding them to get moving.

For this first round of subsidies let's not get too caught up in unserved vs. underserved definitions and let's not welcome in all applications equally. In the same way, let's not allow ourselves to settle for lesser forms of broadband.

Instead let's focus on those shovel-ready projects that bring connectivity with the capacity to serve a community for decades to come, that pledge to strive for operating open networks, and that have been developed from the perspective of the public good rather than solely for private profit.

As I've suggested before, if we end up not getting enough applications that fit these characteristics then we can always expand eligibility in future rounds.

But for now it's vitally important to remember what the point of the BTOP program is supposed to be: stimulating broadband deployment, not completing it.

Time Warner's Higher Speeds And Lower Caps Collide


Through all the consternation surrounding Time Warner's new bandwidth caps their limitations are always discussed relative to downloading high quality movies. While this is a practical way of thinking about these caps, there's another angle that's even more striking to consider by comparing these caps against the increased speeds of Time Warner's service.

Let's start with some numbers: 100GB and 50Mbps. The first is their top-level cap, and the second is the fastest speed they offer.

Now let's consider some time in the not too distant future where someone invents an application that requires 50Mbps of constant throughput. Let's put aside the fact that there are no apps like that today, and that cable networks have trouble sustaining their maximum throughput. Instead let's assume an app with these requirements exists.

How long would it take to use up 100GB?

First let's convert GB into Gb by multiplying 100 by 8 (there are 8 bits in a byte). So we have 800Gb, or 800,000Mb.

Now divide 800,000Mb by 50Mbps and you see that it'd take 16,000 seconds of transfer at full speed to hit the cap.

Now divide 16,000 by 60 to get 266.67 minutes. Then divide that by 60 again to get a bit under 4.5 hours.

So tying this all back together, if you were to turn on this 50Mbps application, you'd eat through your monthly allotment of bandwidth in less than 4.5 hours.

This is an absurdly short amount of time.

Even if you assume there isn't an app that requires a constant 50Mbps, and instead that a home would use that much capacity for only 10 minutes a day, you still wouldn't get to the end of the month without going over the cap.
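For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch (using decimal units, i.e. 1GB = 8,000Mb; the cap and speed figures are the ones quoted above):

```python
# How long does Time Warner's 100GB cap last at its top 50Mbps speed?

CAP_GB = 100        # top-level monthly cap
SPEED_MBPS = 50     # fastest advertised downstream speed

cap_megabits = CAP_GB * 8 * 1000              # 100GB -> 800Gb -> 800,000Mb
seconds_to_cap = cap_megabits / SPEED_MBPS    # 16,000 seconds at full speed
hours_to_cap = seconds_to_cap / 3600          # a bit under 4.5 hours

# The lighter scenario: full speed for only 10 minutes a day
daily_megabits = SPEED_MBPS * 10 * 60         # 30,000Mb (3.75GB) per day
days_to_cap = cap_megabits / daily_megabits   # under 27 days -- over the cap

print(f"Constant use: cap hit in {hours_to_cap:.2f} hours")
print(f"10 min/day: cap hit after {days_to_cap:.1f} days")
```

So constant use exhausts the cap in about 4.4 hours, and even the 10-minutes-a-day scenario blows through it before day 27 of the month.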

I should say at this time that I'm not necessarily against bandwidth caps. I know a number of fiber operators who think that caps are essential to protecting their ability to deliver quality service and to ensuring they can afford to pay off their networks. Some of these are even municipal networks, so there's no profit motive driving these beliefs.

But there's an inherent tension that exists between introducing caps at the same time you're increasing speeds. It'd be like introducing cars that can go 200MPH but only for 200 miles. Sure it's faster but you can't really use all that speed.

It's worth noting that other network operators have set higher caps, like Comcast at 250GB, but even these will be overwhelmed as faster networks beget more bandwidth-intensive applications.

Unfortunately our current broadband paradigm doesn't necessarily have an answer for this dilemma. Network operators either need to cap everyone, rate limit the heavy downloaders, or have the cost of their usage eat into their profitability.

The only clear path forward I see to this problem is to push forward aggressively with deploying more networks with more capacity and lower interconnection fees, which ultimately means full fiber networks peered up with each other.

Because without a whole lot more capacity we'll still be stuck in an era of bandwidth scarcity, which means having to manage a limited resource. But if we can get fiber everywhere and fiber networks peered up and acting as a nationwide LAN, then we can have infinitely more bandwidth without it being infinitely more expensive.

The takeaway from this post is not that what Time Warner's doing is wrong and evil but that this all highlights the limitations of the current broadband paradigm, which will only be exacerbated over time as networks become faster and applications hungrier for bandwidth. And the only way to overcome this is to push towards the goal of realizing a Full Fiber Nation.

Two Good Events Tomorrow On Broadband Stimulus


Quick note to anyone interested in learning more about the broadband stimulus, there are a couple of events for you to take note of.

First is a webinar featuring Craig Settles and Tom DiFiore at 2pm Eastern. I've met Craig and have long been impressed with his work. So while I don't know Tom, I do know that if you're trying to get as much insight into what's happened and happening with the stimulus then this should be an hour well-spent. Especially since you can't beat the cost: free!

Register to attend here: https://www2.gotomeeting.com/register/720675119

Second, there's the latest edition of the Broadband Breakfast Club at which there'll be a discussion about states' role in allocating BTOP funds. While not free you'll get some good food and be able to listen to some fantastic speakers, in particular Karen Jackson, who heads up Virginia's telework program, among many other hats.

Register and pay to attend here: http://broadbandbreakfast.eventbrite.com/

Dear RUS/NTIA: There Will Be Losers And That's OK


Just read through some of the other comments being filed for the BTOP program, and can't help but say this to our friends at RUS and NTIA: there will be losers in this process and that's OK.

I continue to be amazed at how many people, companies, and entities are trying to claim that it'd be a horrible thing if not every network operator could apply, and that we should want everyone and their mother putting in an application.

Even more galling as I've discussed before is how absurd it is to suggest that we shouldn't be requiring minimum speed levels just so that we can extend our arms wide enough to welcome all applications.

This is the absolute wrong way to go about thinking of these things.

First off, there's not enough money for everyone to get some, so it's unavoidable that there will be losers in this process.

Secondly, by setting the bar too low and allowing marginal applications the same opportunity as the best applications we're making life harder for RUS and NTIA as this creates more work for them to vet everything and more hassle weighing the relative merits of projects. But by raising the bar we can cut out a lot of the noise and allow them to focus on truly worthy projects.

Third, while some incumbents most definitely deserve subsidies to keep doing the good work they've already been doing, not every incumbent deserves subsidies. Put more bluntly, I can't see the sense in rewarding incumbent inaction with taxpayer-funded checks to keep doing business as usual.

Fourth, not every broadband technology deserves funding. By going overboard trying to be technologically neutral we risk funding dead-end technologies with no great hope for future development and giving subsidies to technologies that claim to deliver a certain speed but can't actually do so in a reliable way. Plus there's the likelihood that some of these dollars will go towards saddling unserved areas with last-generation technology, which will cement their position as second-class citizens in the Digital Age.

Fifth, not every municipality deserves funding, at least not right now. I'm not denigrating municipal broadband in general or any community in particular, instead I'm pointing out that there are a lot of places who haven't become serious about broadband until BTOP came along and therefore aren't likely ready to move forward quickly with deployment. Instead we should be rewarding those communities that already have their act together and that already have been working towards finding ways to wire themselves.

Now I should say that I'm a pragmatist at heart, and I do realize that by adding restrictions we reduce the pool of applicants which could potentially mean not having enough viable projects to fund. But I think it's irresponsible to assume that that's absolutely the case and therefore lower the bar for the quality of projects BTOP funds.

Instead I'd rather see us set a high bar that respects the public interest and see what applications show up. My guess is no matter how high it's set there'll still be plenty of projects to fund. But if not we can always lower that bar at a later date.

Getting back to the point of this post, we can't allow ourselves to set policy based on the lowest common denominator here. We can't feel sorry for those applicants who may lose out on eligibility because their projects aren't good enough. We can't worry about protecting private interests over elevating the public good.

It's unavoidable that there will be losers in this process. So instead of wasting our time making sure that everyone can play on their terms, let's focus on making sure that these limited funds go to the best possible projects that don't just deliver what private companies want to deploy but instead bring about the kind of connectivity that the public needs.

[Note: These are the comments I just filed to the BTOP program on behalf of the Rural Fiber Alliance. They build on the arguments I've made on this blog previously about the many benefits of loan guarantees, and they lay out a specific action plan for how both RUS and NTIA can implement a fast-track partial loan guarantee program to make more money available more quickly than a straight grants-only approach.]

BTOP Fast-Track Partial Loan Guarantee Proposal
  The Rural Fiber Alliance represents rural fiber deployers and their supporters who believe that all Americans deserve equal access to the global gold standard of broadband: full fiber networks. RFA members are pragmatic practitioners focused on creating self-sustaining networks that strive to maximize the impact of every dollar spent on deployment, be it public money or private.

The biggest obstacle to laying fiber to every last building in rural America today is NOT the economic viability of rural fiber or project sustainability. It is the lack of sufficient private capital, a reality exacerbated by the current credit crunch. In fact many RFA members have rural fiber projects that would already have been funded and deploying if it weren't for the private capital markets seizing up in the fall of 2008.

To that end, the RFA has developed the following proposal to transform a reasonable portion of the BTOP budget at RUS and NTIA into fast-track partial loan guarantees that can both free up billions in private capital and do so in a matter of weeks.

Benefits
Unlike grants that only deliver a dollar-for-dollar worth of deployment, or loans that require government to take on all the risk, partial loan guarantees use public dollars to leverage private investment to make more capital available while lowering government risk by sharing it with private investors.  And because guarantees require private investment, approval can be fast-tracked, relying on the private investors who must assume real risk to vet projects. So guarantees deliver more money with less risk at the speed of the market instead of the speed of government.
 
Action Plan
RUS: Carve $500 million out of its $2.5 billion BTOP budget to create a fast-track partial loan guarantee program that will enable the distribution of at least $5 billion in guarantees.
 
Approve projects within 30 days of receiving a qualified application using a simple checklist to confirm  the project has the pieces in place to be market ready, including a business plan, financing plan, proven management team, etc.
 
Attach a 90-day shot clock to these guarantees to ensure that once awarded they're used quickly to raise funds and start deploying.
 
Change the current guarantee from an 80/20 split on all losses to a ratio that better incentivizes private investors to assume risk.  RFA proposes that the federal guarantee instead cover 100% of losses from dollar one but only up to a maximum of 50-60% of the total value of the loan. Most projects would be guaranteed at 50% of a loan's value.  But the program should offer higher guarantees for less dense projects on a sliding scale up to a 60% guarantee. Making these changes will lower the interest rates of guaranteed loans and increase lenders' interest in funding projects.
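To make the proposed structure concrete, here's a small sketch of how that sliding scale might work. The 50% and 60% endpoints come from the proposal above; the density measure (homes per mile) and the thresholds are purely illustrative assumptions, since the proposal doesn't specify them:

```python
def guarantee_fraction(homes_per_mile, dense=20.0, sparse=5.0):
    """Share of a loan's value the federal guarantee would cover:
    50% for denser projects, sliding up to 60% for the least dense.
    The density thresholds here are hypothetical placeholders."""
    if homes_per_mile >= dense:
        return 0.50
    if homes_per_mile <= sparse:
        return 0.60
    # Linear interpolation between the 50% and 60% endpoints
    t = (dense - homes_per_mile) / (dense - sparse)
    return 0.50 + 0.10 * t

# Under the proposed terms the guarantee covers 100% of losses from
# dollar one, but only up to this fraction of the loan's total value.
loan_value = 10_000_000
max_federal_exposure = loan_value * guarantee_fraction(8.0)  # 58% guarantee
```

The key difference from the existing 80/20 structure is that the lender knows its first losses are fully covered, which should translate into lower interest rates, while the government's worst-case exposure stays capped at 50-60% of the loan.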
 
The loan guarantee should be available for the full amount of the project capital costs.  The willingness of the private sector to capitalize the project, accepting 100% of the 40-50% residual risk, is the appropriate test for whether a project is eligible for a guarantee.  There should be no arbitrary restrictions on the exact form of the private capital participation.
 
Open up these guarantees to all comers on a first-come-first-served basis. If multiple applications come in for the same community at the same time preference should be given to previous RUS borrowers, but only if they're delivering comparable connectivity, otherwise whichever project is delivering the best broadband should get priority.
 
The intent of these guarantees is to support shovel-ready projects that would've already been deploying absent the credit crunch.  RUS (and NTIA) should accept the presumption that pre-packaged private financing that satisfies the above criteria and was not funded because of the credit crunch is eligible for the government guarantee.  The presumption can be challenged if the agency believes there is sufficient reason to doubt the claims of the applicant.
 
To help protect against liar's loans, the applicant must provide sworn affidavits from the project manager and from the financing institution that sufficient due diligence has been performed on the financial viability of the project to allow the financing to go forward and that the financing will be paid back under the terms and conditions outlined in the financing package.
 
Let these guarantees subsidize all broadband technologies so long as the speeds they're delivering are higher than what's currently available in an area and/or the networks they're building already have a significant amount of local support/demand.
 
Still analyze the financial viability of projects but do so after the guarantee is granted and applied to a loan, working under the assumption that if a project can raise private capital that it's worth subsidizing. This after-the-fact vetting would be used to gauge the risk of a project for purposes of determining how to score it against the budget.
 
And the RFA calls on Congress to take action in legislating a waiver for municipal projects so that they don't lose their tax-exempt status by using a federal guarantee.
 
NTIA: The NTIA has two options for leveraging its dollars to realize the benefits of guarantees:
 
1 - Set aside a pool of funds dedicated to the same fast-track partial loan guarantee program as RUS but make it available to non-rural areas.
 
2 - Include specific language authorizing that NTIA grants can be used to purchase letters of credit that can serve as partial guarantees for larger loans, and put the applications for these letter-of-credit guarantees on a fast-track, first-come-first-served basis.
 
To ensure that Option 2 isn't abused, NTIA should require appropriate safeguards be put into place such as requiring applicants to have lenders apply alongside them to prove the lender's interest/commitment, and/or have the government buy the letter of credit directly on behalf of the applicant to ensure funds are spent appropriately.
 
In either case, by dedicating between $500 million and $1 billion of budget authority towards this loan guarantee approach, NTIA can free up billions in additional capital to spur deployment.
 
Both: Not only do these guarantees maximize leverage, lower exposure, and speed up the approval process, but implementing a guarantee program can be done in such a way as to retain flexibility in how those dollars are spent. If these guarantees aren't sufficient to free up private capital and there's still money left in these programs after the rest of the BTOP dollars are spent, then NTIA and RUS can convert the budget authority back to grants.
 
In this way instead of NTIA/RUS leaving dollars to sit around waiting for projects to be thoroughly vetted and grants to be awarded, NTIA/RUS can support immediate deployment of shovel-ready projects on an ongoing on-demand basis as market-ready projects get to the point where they're prepared to go to the private capital markets. 
 

www.RuralFiberAlliance.org

All Nations Agree: Fiber's The Global Standard


Previously I've argued that everyone agrees fiber is the gold standard of broadband. The only open questions are whether other technologies can deliver the same capacity, whether we need that much capacity, and whether we can afford to bring the gold standard to everyone. (With the answers being: no, fiber's by far the best; yes, video beyond HD requires fiber; and yes, in fact we can't afford not to bring fiber to everyone.)

But now let's take this conversation a step further and acknowledge that full fiber networks have become the global standard for broadband.

Just look at ITIF's 2008 Broadband Rankings (link is to a PDF).

Notice which countries have the highest average broadband speeds with the lowest price per Mbps. They all have one thing in common: they've all committed themselves to full fiber networks.

So there's a direct relationship between building out fiber and not only realizing higher speeds but also lower prices, and that's in spite of fiber's higher upfront costs.

Now let's take a look at some global broadband-related headlines from the last two weeks:

Singapore ready for full-scale FTTH deployment

New Zealand Unveils High-Speed Broadband Network Proposal

Australia Announces $31 Billion Fiber Network

That's three countries in two weeks announcing major national broadband initiatives all of which are based on deploying full fiber networks.

Quite frankly, you just don't hear the same kind of buzz about any other broadband technology. You don't hear countries proudly committing themselves to a broadband future that doesn't include full fiber networks.

And we can't deny that the common trait held by all of the countries considered to be global broadband leaders is that they've all committed themselves to a full fiber future.

Bringing this back to the US, with this context I don't see how we can responsibly set our national goals at anything less than the global standard of broadband, which means full fiber networks.

That doesn't mean we shouldn't support further DSL, cable, and wireless development and deployment. And in fact if we can realize a future where these other technologies have evolved at a rapid enough pace to keep up with fiber so that there is robust intermodal competition then that'd be great.

But what we can't afford to do is wait around for these other technologies to prove themselves before setting the aspirational goal of a Full Fiber Nation.

So let's not shy away from achieving great things. Let's accept the reality that the rest of the world has chosen fiber as the global broadband standard and start working towards bringing the best broadband to every last home in our great nation.

Well America's refusal to embrace a Full Fiber Nation as our ultimate goal just got even more absurd: Australia announced they're laying fiber everywhere.

The reason I use the word "absurd" is threefold.

1 - Everyone always says America's too big, too thinly spread, and too rural to lay fiber everywhere, yet Australia has more than two-thirds the land mass but less than one-tenth the number of people. So if they can do it, why can't we?

2 - Everyone always says it's too expensive to lay fiber everywhere, yet if you take the $30 billion that's to be invested in Australia and scale it by the ratio of our two countries' GDPs, you see that an equivalent investment by America would be $600 billion. So again, if they can do it, why can't we?

3 - Everyone always says that we shouldn't pick a broadband technology, yet the rest of the world has already embraced the fact that fiber is our future and are moving full steam ahead on plans to achieve the goal of creating Full Fiber Nations. So one more time, if they can do it, why can't we?

When will America wake up and realize it's being left behind by continuing down the path of technological neutrality and focusing all attention on facilities-based competition?

Will this be the wakeup call that we need to stop our squabbling and start focusing on an action plan to get us where we need to go?

I sure hope so because every day we're not moving forward aggressively in a united way is another day we're falling further behind. And we have to acknowledge that unless we commit ourselves to the same standard as the rest of the world, then we'll always be behind, following the leaders in other countries as they push the envelope on developing the next-generation of the Internet.

If I sound alarmist then you're reading my words correctly. What will it mean to our economy in the 21st century if we're no longer the leaders in broadband innovation?

I for one don't want to find out.

Everyone's favorite advocate for municipal fiber is back. Chris and I sat down for a rousing discussion of the broadband stimulus, how it relates to municipal networks, and which technologies should be subsidized. Enjoy!

As always, Chris makes a lot of good points:

- Municipal networks hire local talent and add to the local tax base.

- Many municipal networks are too new to deem failures; need to give them time.

- We often forget that Verizon's lost a ton of money in its fiber deployment too; that's the nature of building new broadband networks, which require lots of upfront capital before revenue starts flowing.

- Municipal networks could pay off their debt more quickly if they didn't have as many customer support people and technicians, but they'd rather have quality service than pay off their debt a little faster.

- "Evil" is absolutely the wrong term to describe big private companies. For the most part it's the nature of private enterprise to maximize revenue, though there are some private deployers who see their mission above all else being to serve their communities.

- Taking this a step further, it's not the job of the big private deployers to look out for local community interests. In fact, they shouldn't have to worry about wiring every last small community. Because of this, very large companies aren't the best equipped to address connectivity issues in small rural towns.

- The goal of this money is to spur the deployment of networks that weren't going to be built without government subsidies.

- There's too much focus on the stimulus; this new administration has more up its sleeve.

- We should require subsidy recipients to operate their networks under conditions of common carriage, at least for this first chunk of grants. Then if that proves to be too burdensome a requirement the agencies can loosen the rules in future rounds.

- Fiber's the only technology that makes sense to invest in now. Last century government had to pick a technology as it relates to roads and we built highways everywhere; we didn't wait for flying cars to show up.

- Throughout history we've had to make policy decisions that locked in a technology, and that has worked out pretty well when we've done it intelligently.

NTIA/RUS/FCC Leaders Need To Be Confirmed ASAP


So we've got a few more major pieces of the broadband stimulus puzzle in place, namely nominees to head up NTIA (Larry Strickling) and RUS (Jonathan Adelstein) to go alongside the FCC's new chairman (Julius Genachowski).

But there's a very large fly in this ointment: none of them are yet confirmed.

Under normal circumstances this might not be a big deal, but at this moment it's incredibly important that they're able to take office sooner rather than later. The reason is that NTIA and RUS in particular are deep in the middle of their rulemaking, which will set out the who and what of how broadband stimulus grants will be awarded.

The problem is that there's a chance our new leaders won't be confirmed until after this process concludes, which would mean they'd be saddled with rules they don't necessarily agree with. And while it's my understanding that these rules can be tweaked somewhat over time, in large part whatever crystallizes out of this rulemaking will frame who gets what moving forward.

I for one would not want to be stuck administering someone else's rules. And it's especially crucial we don't let this happen now, as we'd miss the opportunity for these new leaders to put their respective stamps on these organizations and to make sure we don't just accept business as usual but instead pursue new, outside-of-the-box ideas.

And yet at various times some have suggested that we might not see these confirmations finalized for weeks if not months. Though on the flipside I've also heard that once we come out of this spring recess that things could move quickly.

So I implore the federal government to not drag its feet but instead recognize that these are extraordinary times that require extraordinary leadership.

President Obama has already taken the first big step of nominating extraordinary leaders, so let's now move with all due haste to get them confirmed and in place before these rules are finalized.

Something interesting happened this week: Cox Communications launched its first DOCSIS 3.0 service tier, offering speeds up to 50Mbps down and 5Mbps up.

But what's interesting isn't that Cox did this, it's where they did it: Lafayette, LA.

The same Lafayette whose utility is currently in the midst of building a full fiber network with a top-end speed of 50Mbps symmetrical for less than $60 a month.

What this says is that municipal fiber deployment doesn't just bring the best broadband to citizens, it also introduces competition that spurs investment by incumbent providers to upgrade their networks.

And in fact the citizens had already been reaping the rewards of their municipal fiber project before it even went live. After the fiber initiative started, Cox stopped raising its rates for cable TV in Lafayette even as it kept raising them everywhere else.

So even though the city had to fight through a lengthy and expensive court battle to protect its right to build this community asset, it's been calculated that the citizens of Lafayette saved roughly as much in lower cable fees as the city paid in legal fees.

And the cost savings seem to be continuing today: while Cox's top-tier Internet service is supposed to cost $130/month, in Lafayette it'll only cost $90. Though that has been described as a launch price, so it may rise back up to $130 over time.

Though frankly I'm not sure Cox will be able to do that. If anything, they may have to lower that price even further. How can they expect to sell one tenth the service on the upload side for, at full price, more than twice the cost? Even at the lower launch price, look at how they compare:

Cox - 50Mbps down/5Mbps up for $90/month
LUS - 50Mbps down/50Mbps up for $57.95/month
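To make the gap concrete, here's a quick back-of-the-envelope comparison of what each dollar buys on these tiers. This is just my own illustrative arithmetic on the prices and speeds cited above, not anything from Cox's or LUS's marketing, and it naively treats download and upload capacity as equally valuable:

```python
# Price-per-Mbps comparison of the tiers cited above (April 2009 figures).
# "Total capacity" here simply adds download and upload speeds, which is a
# simplification: it weights upstream and downstream Mbps equally.

tiers = {
    "Cox (launch price)":   {"down": 50, "up": 5,  "price": 90.00},
    "Cox (standard price)": {"down": 50, "up": 5,  "price": 130.00},
    "LUS":                  {"down": 50, "up": 50, "price": 57.95},
}

for name, t in tiers.items():
    per_mbps = t["price"] / (t["down"] + t["up"])
    print(f"{name}: {t['down']}/{t['up']} Mbps at ${t['price']:.2f}"
          f" -> ${per_mbps:.2f} per Mbps")
```

Even at the $90 launch price, Cox comes out around $1.64 per Mbps of total capacity versus roughly $0.58 for LUS, and at the standard $130 price the gap is about four to one.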

It wouldn't surprise me if, to stay relevant, AT&T also deploys U-Verse in Lafayette in the not-too-distant future; otherwise they could be pushed out of this arms race entirely.

To me what this is showing is that competition really can work. The point isn't that this is a municipal network, it's that there's one competitor who's made the decision to invest heavily and now the competition is having to react and increase their investment. The same thing is happening in markets where Verizon's deploying FiOS.

Everyone agrees we need more competition. So this suggests that especially in markets where the incumbents aren't investing, we should be encouraging any and all overbuilders to come in and get that dynamic cycle of competitively driven continuous investment started, whether they're public, private, or somewhere in between.

I just got a chance to read the first couple pages of the FTTH Council's comments to the RUS on how they should spend their stimulus dollars and I ran across this troubling statement:

"...grants can be awarded more expeditiously, requiring fewer resources and expenditures by the applicants and the agency...because the agency incurs the risk of default with a loan, it must engage in substantial, time consuming, and costly due diligence in advance of any award (as well as engaging in strict post-award audits and reporting over the life of the loan)."

My question isn't "Does the system work this way?" it's "Why does the system work this way?"

Why is it easier/faster to vet and award a $100 million grant than a $100 million loan?

This doesn't make sense to me. Why should incurring $100 million in risk be worse, and require more due diligence, than writing a $100 million check and walking away?

If anything I'd think the opposite would be true. In the case of a grant, don't we want to do everything possible to ensure that the dollars are spent wisely, that they're given to qualified projects, and that they don't lead to unjust enrichment?

Of course the variable in this equation is that grants are easy to score, since they count dollar-for-dollar against the budget, whereas scoring a loan requires gauging how risky the project is.

But consider this: what if you just counted loans dollar-for-dollar against the budget too? Then the only difference between a loan and a grant would be that you'd have a chance of eventually getting the money back.

And guarantees are even better, since the government doesn't actually have to write a check to get capital flowing, and so long as the guaranteed loans don't default, guarantees won't cost anything.
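The budget math above can be sketched in a few lines. To be clear, the 10% default rate below is a made-up illustrative number, not an actual RUS figure, and real federal credit scoring is considerably more involved; the point is only to show why a dollar of grant costs the budget more, in expectation, than a dollar of loan or guarantee:

```python
# Rough expected-cost sketch of the three mechanisms discussed above.
# The default rate is hypothetical, purely for illustration.

AWARD = 100_000_000   # the $100 million example from above
DEFAULT_RATE = 0.10   # assumed share of loan dollars never repaid

grant_cost = AWARD                     # check written, nothing comes back
loan_cost = AWARD * DEFAULT_RATE       # expected loss on the loan portfolio
guarantee_cost = AWARD * DEFAULT_RATE  # no check up front; pay only on default

for name, cost in [("grant", grant_cost), ("loan", loan_cost),
                   ("guarantee", guarantee_cost)]:
    print(f"{name}: expected budget cost ${cost:,.0f} per ${AWARD:,.0f}")
```

Under these assumptions the grant costs the full $100 million while the loan and guarantee each cost an expected $10 million, with the guarantee having the added advantage of requiring no up-front outlay at all.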

Let me reiterate that this is not an anti-grant screed. I think grants are an essential tool in our toolbox for wiring all of America. But so are loans and loan guarantees. Each mechanism has its own pros and cons.

So because of this I think it's a mistake to take an approach that only uses one mechanism over another. It'd be like trying to build a house using only a hammer and leaving the saw and screwdriver in the toolbox.

Today around 12:30pm EST I'll be a guest on the MyTechnologyLawyer.com radio show. I've been invited to discuss the formation of the Rural Fiber Alliance, why it's important that we focus on bringing the best broadband to all Americans, and how government can help make this dream a reality for rural communities.

If you go to this site there's a link that'll let you listen in live online.

And if you want a real treat tune in a bit earlier to listen to Jerry Baxley of Optical Networks discuss his plans for Alabama. You won't regret it!

