
Op-ed: Salvaging Google Fiber’s achievements

Google Fiber hasn't changed the world—yet.

Boxes of equipment needed to install Google Fiber's broadband network sit on a couch at the home of customer Becki Sherwood in Kansas City, Kansas.
Benoît Felten is a broadband expert and the chief research officer at Diffraction Analysis. His views do not necessarily represent those of Ars Technica.

In the wake of Google Access CEO Craig Barratt’s “goodbye Access” post on the Google Fiber blog yesterday, papers left, right, and center are predicting the end of Google Fiber. Barratt tries to sound upbeat, but in essence he’s announcing that Google Fiber won’t be expanding further (pending a strategic reevaluation), that people will be made redundant, and that he’s leaving. I don’t know Craig and can’t really comment on his tenure as Access CEO, but that doesn’t exactly sound like good news.

For analysts like myself, Google Fiber is a complicated project to track because Google has disclosed so little. Any standalone infrastructure venture on the scale of Google Fiber as currently announced would have had to publish numbers by now: Wall Street would have asked for take rates, average revenue per user (ARPU), and all kinds of other metrics to evaluate the validity of the investment. Before the Alphabet restructuring, however, this was just another Google "pie in the sky" project. Now that Access is its own company, we might have expected these numbers to finally come out. So far, though, Google isn’t talking.

So we’re left to speculate. I’ve been thinking about this not just for the last few weeks but for a couple of years at least, and I finally want to share these thoughts now that Google Fiber seems to be at a turning point. Just to be clear: this is not me sharing inside information; this is me analyzing and speculating on the basis of what little we know, and trying to think through how what’s been achieved might be salvaged and expanded upon.

Laying the groundwork

Without going too far back, my impression has always been that Google Fiber was a schizophrenic project. At the very beginning, it felt like those Google decision makers who were more interested in achieving important policy goals wanted Google Fiber to be a catalyst, something that would shift the market with a bang and then be a shared experience for others (public or private) to take over. The idea of a "blueprint" was floated in the early days.

But there were also those who seemed to think that Google Fiber could become a new business for the company, something not just aimed at shifting market perceptions and shaking the complacency of telco and cable incumbents but a profitable business line in its own right. That has always seemed to me an unlikely proposition. I am confronted daily with the paradox of short-term-focused telecom operators weighing long-term fiber investments, but even their short term is longer than the short term of Google’s core business by an order of magnitude. Unless there was a long-game plan to view this as the “pension fund” arm of Google’s finances, it didn’t really make sense to me.

And if I’m honest, I didn’t much care about that second proposition anyway. The US market is already plagued by a lack of competition in fixed-line broadband. Replacing one closed monopoly with another (no matter how much sexier) didn’t seem to me like a particularly desirable goal. So while I fully wanted to believe in Google Fiber’s potential to kick the telecom anthill, I wasn’t so convinced by the relevance and likelihood of it becoming a fiber business like every other.

Now that it looks like there’s at least some serious soul-searching around the second proposition, I think it’s time to consider whether the first has worked and how things could go from there.

Infrastructure is expensive

My take is that there was one fundamental flaw going into this, one that’s probably still floating around: Google believed it could revolutionize the laying of fiber. It didn’t just think it could offer a kickass service; it thought it could deploy much cheaper and much faster than anybody else had ever done. That’s fully in line with the Google mindset, but unfortunately it ignores the fact that hundreds of companies had been deploying wireline access infrastructure for years by the time Google Fiber decided to give it a go.

I’d suggest we’re now seeing the fallout from that misguided assumption: Google is finally admitting (in a roundabout way) that despite all the clever people it has on hand, the company hasn't revolutionized fiber deployment. It still takes time to do the planning properly, to work with local authorities effectively, to do the outside plant layout efficiently. Did Google manage to do things cheaper than others did? Probably, but not by a wide margin. And as it decided to scale beyond Kansas City, it realized that the efficiencies it might have been able to find in KC didn’t scale well elsewhere because a lot of those things are down to local specificities and relationships.

So Google is deploying fiber in the access network, and it's doing it well, but it's not doing it so well that this is hugely more profitable for it than it would be for anyone else. In other words, the cost side of the equation is roughly on par with industry norms (again, my speculation). What about the revenue side?

Revenue perspective

On the revenue side, the two key metrics are take-up and ARPU—and the first one is much more important than the second. Google understood that and went with a frankly very cool product at a very affordable price point. I’ve never been really convinced by the need to have a linear TV play, but that’s beside the point; if Google wanted a chance at high take-up, it needed a low price point and a kickass product. That’s not always enough, though: incumbents respond by lowering their prices locally, migration is a painful process for the customer, and so on. There are many reasons for inertia in customer acquisition, even with a good product, a fantastic brand, and a collaborative local community.

My bet is that Google’s take-up is not that great in the markets it has started commercializing. Note that it may be very good by industry norms, but again, remember that Google expected to blow industry norms away from the get-go. If I had to guess a number, I’d say Google Fiber is in the 30-40 percent take-up range in areas that have been open for service for three years. The industry average is about 7 percent per year, the last time we looked into it, so that would be very good, but probably still not enough by Google’s standards.
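To put those speculative numbers side by side, here is a minimal back-of-the-envelope sketch; the 7 percent annual figure and the 30-40 percent band are my guesses from above, not anything Google has reported.

```python
# Back-of-the-envelope comparison of cumulative take-up after three years of
# commercialization. All inputs are the speculative figures from the text,
# not numbers reported by Google.

industry_rate_per_year = 0.07   # rough industry norm: ~7% of homes passed per year
years_open = 3

industry_takeup = industry_rate_per_year * years_open    # simple linear accumulation
google_takeup_guess = (0.30, 0.40)                        # speculated Google Fiber range

print(f"Industry-norm take-up after {years_open} years: {industry_takeup:.0%}")
print(f"Speculated Google Fiber take-up: {google_takeup_guess[0]:.0%}-{google_takeup_guess[1]:.0%}")
# Roughly 21% vs. 30-40%: clearly better than the norm, but not the blowout
# the business case presumably assumed.
```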

Pre-sales in Kansas City were astounding. When that data was still publicly available, we scraped the website and analyzed it. Some areas had over 100 percent pre-subscription, and, if I recall, the average pre-subscription rate was already in that 30-40 percent bracket before Google had even started deploying. The problem is that these people want you to connect them now, when in fact it’s going to take months, if not years, to get to them. By the time you actually get there, they may have moved out, they may have finally gotten a good offer from their cable operator, or they may just be pissed off at you for taking so long to serve them.
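For what it's worth, the analysis we ran on that scraped data amounted to little more than the sketch below; the names and figures here are hypothetical placeholders, since the original Kansas City pre-registration data is no longer public.

```python
# Illustrative only: compute a pre-subscription rate per "fiberhood" from
# scraped pre-registration counts. Names and values are hypothetical
# placeholders; the original Kansas City data is no longer public.

fiberhoods = [
    # (fiberhood, pre-registrations, estimated households)
    ("Fiberhood A", 620, 540),    # >100% is possible if businesses or
    ("Fiberhood B", 310, 900),    # out-of-area sign-ups get counted
    ("Fiberhood C", 450, 1200),
]

for name, preregs, households in fiberhoods:
    print(f"{name:12s} {preregs / households:6.1%}")

average = sum(p / h for _, p, h in fiberhoods) / len(fiberhoods)
print(f"{'average':12s} {average:6.1%}")
```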

So that (in my opinion) is what’s happening at Google Access right now: costs are higher than planned (even though lower than industry norms would suggest) and take-up is lower than expected (even though higher than industry norms would suggest). Since Google isn’t really looking at this as an infrastructure player would, it’s time to reconsider.

The air up there

In parallel to that, wireless is starting to look like a potential solution to some of these problems. Don’t believe the hype about residential fixed service being replaced by wireless access any time soon, at least not in most urban geographies. There are promising technologies ahead, but they’re far from mature yet.

Google’s acquisition of Webpass, however, is interesting. Few journalists took the time to try to understand what Webpass does. Webpass uses wireless solutions for urban aggregation, not access. In other words, it doesn’t connect homes wirelessly, it connects multi-tenant units wirelessly and uses existing in-building wiring to connect the homes from the rooftop antenna.

It’s a clever approach that solves two fundamental deployment issues:

– it eliminates the need to string fiber along street poles or bury ducts in the pavement to run fiber along the streets, which is both costly and time-consuming;

– it eliminates the need to deploy fiber inside the home, which is also expensive and time-consuming, by reusing the existing in-building wiring.

However, that approach does not seem to me to be so universal as to be usable in any deployment scenario. There are a number of potential issues that I see with it:

  • First, you need to target multi-tenant buildings to make the economics work. I suspect (again, not knowing the exact costs of the solution) that the equipment necessary to install this on single homes would make the price point too high. Furthermore, you need line of sight between rooftops, which is comparatively easy when people live in tall downtown MDUs and much harder when they live in detached homes (see the rough link-clearance sketch after this list).
  • Second, you need to be able to reuse the existing cabling in the house. I haven’t had time to look into the specific regulatory aspects of this (and particularly to see if this varies from state to state or county to county in the US), but my bet is you can’t always bank on being able to reuse the cabling, especially if it has been deployed by an incumbent or a cable operator. I may be wrong here, and I will be doing my homework on this, but I’m flagging it as a risk.
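To give a sense of what "line of sight" actually demands, here is a rough sketch of the clearance a rooftop-to-rooftop link needs; the 60 GHz band and the 1 km hop are my own assumptions for illustration, not parameters Webpass has disclosed.

```python
import math

# Rough first-Fresnel-zone clearance for a point-to-point rooftop link.
# The frequency and hop length below are illustrative assumptions, not
# Webpass's actual link parameters.

def fresnel_radius_m(distance_km: float, freq_ghz: float) -> float:
    """First Fresnel zone radius at the midpoint of the path, in meters."""
    return 8.657 * math.sqrt(distance_km / freq_ghz)

hop_km = 1.0      # assumed rooftop-to-rooftop distance
freq_ghz = 60.0   # assumed millimeter-wave band

print(f"Clearance needed around the direct path: ~{fresnel_radius_m(hop_km, freq_ghz):.1f} m")
# About a meter of clear air around the straight line between the two
# antennas: trivial between downtown high-rises, much harder across blocks
# of trees and two-story detached homes.
```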
