How Google's disdain for utilities could ruin its Nest acquisition


A few years back, Google tried to get into the home energy management space. It initially tried to team with utilities, but quickly decided that utilities were too balkanized (making it too hard to aggregate a large market) and too clueless about consumer preferences.

 

But this disdain for utilities and their knowledge set may come back to bite Google, as you will read in this guest editorial from Tom Osterhus. I hope you have time to read the whole piece to understand Tom's logic. In essence, he argues that Google is focused on value streams from home appliances and energy services, but utilities have access to other value streams. One consists of what he calls "grid-side savings." Another could come from moving away from average cost to serve to true cost to serve, based on the locational differences between premises.

(By the way, I think there's a fourth value stream – combining energy usage data with all the other data Google knows about consumers to create ultra-targeted marketing.)

 

Tom also has some fascinating insights into storage (including the penetration rate tipping point that could flatten load curves and do away with price spikes). And into time-of-use pricing and why it will NOT help Google. – Jesse Berst

 

By Tom Osterhus

 

The Google Nest deal: What value was captured? What value was missed?

After Google's acquisition of Nest, I heard from utilities and consultants asking, "What could Google possibly be thinking, paying $3 billion for Nest?" I have no interactions with Google, but I offer my opinion in the piece that follows. We can certainly forecast some of how this will play out. It starts with the fact that the players don't understand how the other side makes money.

 

Google rarely interacts with utilities. Utilities tend to resist innovations from Google, Apple, Microsoft and a host of others (e.g., AT&T, Comcast, Verizon, etc.). Their business models, risks, margins and opportunities do not behave the same way. Failure to appreciate these differences is likely to depress the earnings potential on both sides.

 

What Google thinks it knows

Google's actions suggest the following:

 

First, they learned early that utilities are not keen on sharing customer usage data with third parties. This is an understandable, classic utility stance.

 

Second, Google learned with PowerMeter that even if they do get the data, customers are largely uninterested. Sure, some are curious about it, view it once or twice, and then move on with their lives. There was no groundswell of consumer passion, no barrage of clever apps, and so the music stopped. It was clear that, to change the industry, someone would need to change customer behavior. And to be consequential, the change would need to deliver more than the 1% to 2% savings of voluntary customer reductions, and could not depend on the persistence of customer vigilance.

 

Enter a host of other home energy management systems to fill that void, with automated dispatching of end uses. Hundreds of pilots and offerings over the past few years are plodding along at the typical utility pace.

 

Too risky for non-regulated firms

We saw very few firms interested in approaching this task from the non-regulated side of the utility business, due to the significant hardware investment costs (i.e., relays, sub-metering, dispatching systems, installers, customer service). For the typical non-regulated energy provider, it is too risky to chase a customer who can switch providers at will.

 

Google was no different. They opted to wait and placed a bet on Nest, instead.

 

In the meantime, the ISOs are trying to help, with increased focus on service markets. But the pace is still slow, and the ISOs primarily value the commodity side, and even then only above the bus, leaving the bulk of the grid-side savings to the regulated utilities. Utilities are well aware of the significant value of those grid-side savings, but they are very concerned about reliability issues with large-scale rollouts. This slows the adoption pace and instills fears about modifying traditional margins.

 

Google watches and waits

In most industries, commodity margins converge to very low numbers. Google seems to believe the energy space is similar. It is therefore opting to forego participation in the commodity business in favor of value-added comfort and convenience marketing.

 

However, the energy space is unique in that we have no storage, no silo, no warehouse in which the commodity can be reasonably held. This is why prices spike: supply and demand must be balanced every second.

 

Even branding stalwarts like Coke (where the commodity is sugar water) and Nike (textiles, rubber) know that some level of backward vertical integration is necessary, despite having access to inventory and storage buffers. With energy, the need is even greater. In the energy space, there is 2X to 5X more margin to be had by using near real-time dispatching analytics, or by arbitraging against ISO prices, wind forecasts, thermal inertia in AC and water heating, and other areas. Only those competitors that master these nuances will win, irrespective of the glitz or appeal of one's thermostat. At some point, the money saved rules the market, or at least makes competition much more intense.

Which utilities will survive (or even thrive)

Those utilities that do understand the difference, and work toward preserving the customers where margin exists, are likely to survive, and perhaps thrive, despite the threat of a death spiral in their franchise. But during this process, there will also be increasing pressure, likely driven by the regulators, to accept the true cost to serve for their customers.

 

Does Google appreciate this? It remains to be seen. Utilities will be forced to move from class level pricing to individual pricing. It is a truer and more accurate cost to serve (though some cross-subsidy is likely to continue). Thus, the ability of new entrants to cherry-pick customers will evaporate. Whoever gets there first wins.

 

Smart meter usage profiles are already in place, and it is very easy for the regulators to establish an exact cost to serve per home, versus the current model of "settlement shapes," which apply an average load shape to all customers. If a death spiral seems to be in the offing, regulators will likely accept a utility move to individual cost-to-serve methods in place of the current averages.
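To make the settlement-shape point concrete, here is a toy calculation. All prices and load shapes below are hypothetical, chosen only to illustrate the cross-subsidy that averaging creates:

```python
# Two hypothetical homes, each using 30 kWh/day, priced against a
# hypothetical hourly energy price with an evening spike.
price = [0.04] * 16 + [0.25] * 4 + [0.04] * 4   # $/kWh, hours 0-23

flat = [30 / 24] * 24                           # home B: level usage all day
peaky = [0.5] * 16 + [4.0] * 4 + [1.5] * 4      # home A: evening-heavy, same 30 kWh

def cost(load):
    """True cost to serve: each hour's usage times that hour's price."""
    return sum(l * p for l, p in zip(load, price))

# "Settlement shape": bill both homes as if they had the class-average shape.
avg_shape = [(a + b) / 2 for a, b in zip(peaky, flat)]
settled = cost(avg_shape)

print(f"true cost, peaky home:  ${cost(peaky):.2f}")
print(f"true cost, flat home:   ${cost(flat):.2f}")
print(f"settled (average) cost: ${settled:.2f}")
```

Under the average shape both homes are billed identically, so the flat-usage home overpays while the peaky home underpays; pricing each home on its own meter profile removes that cross-subsidy.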

 

I doubt Google is ready for that or appreciates the potential threat this poses to the Nest business case.

 

Why time-of-use pricing won't help Google

Time of use (TOU) pricing (upon which many programmable thermostats build their business case) won’t resolve the matter, either. Google may know that the current market acceptance of TOU is low. But they need to realize that it will always be low. Active TOU participation rarely exceeds 5% to 10%, and many of these active participants are largely “free riders” -- those who already use little energy on peak.

 

The utilities know this, but the regulators have not quite caught on. Most regulatory agencies and evaluation teams have a staff of well-trained economists, not marketers. While an economist may trust the rationality of the consumer, a marketer knows that non-price options generally win the day when commodities are comparably priced. Indeed, the TOU pilots where we see the greatest participation are those that beef up the non-price factors, such as TV and radio advertising (e.g., California's Flex Alert campaigns during its pricing pilots).

 

Google's continued avoidance of utility partnerships might suggest that they intend to push TOU-based transactive signals for Nest to receive. At the end of the day, most consumers are unlikely to respond to these prices on their own, without a third party doing it for them. And the value of virtual storage can only be achieved via tightly linked, optimized end-use dispatching in a dynamic fashion.

 

But Google may be thinking along these (TOU) lines. What better way to unhinge the utility from its customer? However, Google may be leaving value on the table by under-appreciating local variation and the value of virtual storage. The maximum cost savings are achieved by optimally arbitraging end uses directly, versus hoping for a customer response to a transactive signal. In the end, the response will be automated by third parties, leaving consumers to live their lives without worrying about saving pennies each day by watching an hourly TOU signal.

 

Enter locational targeting

Enter locational targeting and locational cost-effectiveness. In late 2013, California began calling for more attention to the specific locations where EE, DR, solar and smart grid activity might yield a more significant efficiency bang for the buck. The regulators are currently determining how utilities should be required to respond to this -- I get questions about it every month.

 

This much is clear. We know that the cost to serve a home can often change dramatically if we pick it up and move it over a few streets. We know that local congestion on the network will spike local LMPs (locational marginal prices). What Google does not yet seem to appreciate is that this financial impact, due to location, can alter margins up to tenfold, and can be larger than the commodity cost alone.

 

I suspect that the Nest business models are based on spot market energy and capacity credits in the range of $80 per kW-yr. What Google probably doesn't know is that there are very specific locations where the total avoided costs can be 2X to 10X larger than their averaged value assumptions.

 

Did Google overpay? By a bunch?

Or maybe they do know this, and that is why they paid about $3,000 per customer for Nest. We often see average avoided cost savings in this range when we value the savings from all of the utility value buckets jointly (e.g., supply, transmission, distribution, ancillary services, voltage, losses, asset deferral, etc.). The normal range for the commodity margin of a utility customer is loosely $100 per year (residential), or perhaps $700 NPV -- still way less than $3,000. We only see $3,000+ valuations where we identify high cost-to-serve customers in specific locations. There, the value can climb quite a bit higher than $3,000.
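The back-of-the-envelope figures above are easy to reproduce. A sketch, assuming a hypothetical 7% discount rate over a 10-year horizon (the article supplies only the $100/yr margin, the rough $700 NPV, and the 2X-10X locational range; the 5X multiplier is an arbitrary midpoint):

```python
def npv(annual_margin, rate=0.07, years=10):
    """Present value of a level annual margin; rate and horizon are assumptions."""
    return sum(annual_margin / (1 + rate) ** t for t in range(1, years + 1))

# Commodity margin alone: roughly the article's $700 figure, far below $3,000.
print(f"commodity-margin NPV per customer: ${npv(100):,.0f}")

# A high cost-to-serve premise in a constrained location (5X chosen here,
# within the article's 2X-10X range) clears the ~$3,000-per-customer price.
print(f"NPV at a 5X locational premium:    ${npv(500):,.0f}")
```

The point of the arithmetic: on averaged commodity margins alone the acquisition looks overpriced, but at plausible locational multipliers the per-customer value exceeds the purchase price.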

 

All of this assumes the current price volatility and grid costs persist -- and they will, until 20% of the load participates in virtual storage. But my guess is that Google is not valuing the utility cost savings or margin, but rather the value-added service markets and appliance offerings that can come after a Nest sale. In this sense, Google gets a great deal from the Nest purchase, and Nest has short-changed its investors. Most utilities don't want to be in the value-added services side of the business. But neither does Google see the money it is leaving on the table by ignoring the large value to the utility of specifically targeted customers.

 

But what about high-value customers?

You might think that more of these high value customers just give Nest and Google more upside. Right?  Wrong.

 

We have performed analyses showing that, once a certain number of MWs of smart grid savings is available for a specific minimum number of hours per year, these smart grid "resources" can become the price setter in the market, as opposed to the iron in the ground.

 

If those resources were controlled by an unscrupulous third-party vendor using Enron-like gaming tactics (pre-cooling, pre-heating, charging EVs in the morning), the resource could be used to drive prices up in the morning to make a better business case for reducing load during afternoon peaking hours -- in effect gaming the utility's DR incentives.

 

Many regulators have already decided that EV charging does not require utility-level oversight. It's not a stretch to believe they will likewise fail to anticipate the need to regulate Nest and other Google services. To guard against that risk, utilities can ask to retain control of, or secure, sufficient MWs of DR behind their own customers' meters to mitigate the potential gaming.

 

The key point: there is an ideal set of market segments one would pursue to ensure competitive sustainability. Whoever secures these groups first, especially in specific locations, stands to gain the most by offering more money, value, and innovation. We have conducted several modeling projects showing that a utility can realistically flatten the load curve, and reduce marginal prices significantly, using only 20% of its customers.
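The load-flattening claim can be illustrated with a toy water-filling model. Everything below is hypothetical (the load curve, the convex merit-order price function) except the 20% dispatchable share, which is the article's figure:

```python
def marginal_price(load_mw):
    """Hypothetical convex merit-order curve, $/MWh: steep at high load."""
    return 20 + 0.0002 * load_mw ** 2

# Hypothetical 24-hour system load, MW.
load = [700, 650, 620, 600, 600, 650, 800, 950, 1000, 1020, 1050, 1080,
        1100, 1120, 1150, 1200, 1300, 1400, 1350, 1250, 1100, 950, 850, 750]

firm = [0.8 * l for l in load]      # 80% of load is uncontrolled
flex_energy = 0.2 * sum(load)       # 20% is dispatchable (storage, thermal inertia)

# Water-filling: pour the flexible energy into the valleys of the firm curve,
# keeping total daily energy unchanged.
lo, hi = min(firm), max(firm) + flex_energy
for _ in range(60):                 # bisect on the fill level
    level = (lo + hi) / 2
    poured = sum(max(level - f, 0) for f in firm)
    lo, hi = (level, hi) if poured < flex_energy else (lo, level)
flat = [max(f, level) for f in firm]

print(f"peak before: {max(load):.0f} MW @ ${marginal_price(max(load)):.0f}/MWh")
print(f"peak after:  {max(flat):.0f} MW @ ${marginal_price(max(flat)):.0f}/MWh")
```

Because the price curve is convex, shaving the peak cuts the marginal price far more than proportionally, which is why dispatching a modest flexible share can do away with price spikes.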

 

So, it is the best 20% that will win the day, in the long run. All other contenders will be priced out of the markets. Prices will drop, and business models will be rendered moot for many current technologies.

 

The importance of tapping avoided cost to serve

Sure, glitzy consumer innovations may secure market share in small niches, but the ones that tap into the true utility avoided cost to serve will stand to benefit dramatically from the very large locational differences from street to street, town to town.

 

Will it be Google or others who secure demand reductions in the $20 to $40 per kW-yr range? Where they can target the right customers in the right locations, year after year, they stand to dictate the market. And in this scenario, if I only have to acquire 20% of the right customers to set the price in the market, paying $3,000 per customer is a bargain-basement price. But to capture this value, Google will need to grow its utility analytics quite a bit, assure continued ISO cooperation, and attempt a more conciliatory relationship with both utilities and state regulators. It is clearly possible. But it is far from inevitable.

 

And what about grid reliability?

Google may not understand the significance of grid reliability. A recent Wall Street Journal report says that the loss of 10 or so key substations would bring down the grid for a month. Disaster ensues. Regulators cannot allow this. The ISOs have sensed this for some time, increasingly investigating how to forecast local loads for use in 10- to 20-day LMP forecasts. All they have today are next-day LMPs.

 

But the rapid growth of large grid-scale solar, wind and storage is placing increasing strain on system operators (e.g., Texas, California), to the point where I think ISOs may be worried about the long-run reliability of the grid. The way to address this is to migrate from city-wide load forecasting (the current state) to local, acre-level forecasting. And the way to quantify this, ironically, is to use a combination of econometric modeling and Google Earth-style satellite imagery.

 

When we do this, we can identify very specific locations where smart grid resources yield the biggest bang for the buck. Moreover, we can accurately forecast local LMPs by bus for 10 to 20 years out, inclusive of large-scale solar, wind or storage. So the utility, the ISO, and third parties like Google could know exactly where the placement of micro resources returns the most value. In essence, the ability to improve reliability (and uncover large commodity savings) at a fraction of the currently projected costs exists today, and we will be seeing a lot of it in the months ahead.

 

To the best tools go the spoils

There is a race to dominate this market. The winners will have a sophisticated set of tools. Some tools will use advanced analytics to value marginal customers instead of average customers, and do it locationally. Some tools will target and secure the optimal set of market segments that will survive the coming competition.

 

Has Google thought through these two issues fully? I don't see much evidence for it. Do they care? They should. If they don't, someone else will, and Google will lose the bigger market opportunity. Should they partner with utilities or ignore them? Ignore the utilities, and Google potentially loses out on value buckets that might be greater than what value-added services and gadgets alone can deliver. Gadgets are replicable. Being first to market, to the right segments, year by year, wins the day, in my opinion. And so, let the race begin.

 

Tom Osterhus is the CEO of Integral Analytics, a software provider with over 20 analytics products focused on the smart grid space, including DSMore, LoadSEER, and IDROP. He holds a BA from Dartmouth, and an MBA, MS in Statistics, and PhD from the University of Cincinnati.
