At what point do hamburgers reach cost parity with salad? Assume for a moment that this is a serious question and try to figure out how you’d answer it. What is the relevant metric of comparison? Cost per pound? Cost per calorie? Outside of a few rabid vegans, no one seriously tries to do that math, for self-evident reasons. But every time another story comes out about renewables nearing cost parity with fossil sources, that’s exactly what we do.

The problem is the metric. Competing power generation technologies are typically compared on a dollar-per-megawatt-hour ($/MWh) basis, but — like the cost per pound of your lunch — the fact that this number can be calculated doesn’t make it meaningful.

Grid management 101

Assume that you are a grid manager, tasked with providing the most reliable power at the lowest possible cost. To meet that goal, you need a diversity of sources connected to a grid of sufficient size and interconnectedness, so that when your customer decides to do laundry, there is cheap, available electricity in the dryer socket rather than a neighborhood blackout.

Since you cannot know in advance when that dryer is about to turn on, you have to build and manage your system to ensure that you have (a) a diversity of generation sources with uncorrelated failure modes, and (b) a robust distribution network that can move the generated electricity between any two nodes in the system.

The key point is that your needs do not depend primarily on the all-in price per MWh. Rather, they depend on a series of prior capital investments in assets that, in aggregate, are operating at less than full capacity, with the ability to ramp up on a moment’s notice. Moreover, while you want to have a supply of cheap MWh at the ready, you especially want them at the ready in locations where the grid is constrained, and at times that are coincident with system peak demand.

From a cost perspective, you need to invest capital in assets that may not run at all but can when called on. The investment of that capital has value even if the asset isn’t generating any MWh. That’s why $/MWh is a fundamentally flawed metric.
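To make that concrete, here is a back-of-the-envelope sketch in Python. Every figure in it is invented for illustration; the point is only that the effective $/MWh of a reliability asset balloons as its capacity factor falls, even though rarely running is precisely the service it sells.

```python
# Why $/MWh misleads for reliability assets: spread a fixed cost over
# fewer delivered MWh and the unit cost balloons. All figures are invented.

annual_fixed_cost = 6_000_000   # $/yr to keep a hypothetical 100 MW peaker ready
capacity_mw = 100
hours_per_year = 8760

for capacity_factor in (0.50, 0.10, 0.02):
    mwh = capacity_mw * hours_per_year * capacity_factor
    print(f"{capacity_factor:4.0%} capacity factor -> ${annual_fixed_cost / mwh:,.0f}/MWh")
# 50% -> $14/MWh, 10% -> $68/MWh, 2% -> $342/MWh
```

By the $/MWh yardstick, the plant looks worse the better it does its actual job.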

Typical non-renewable contracts

None of this is news outside the renewable industry: conventional power contracts typically include differential time-of-use pricing per MWh, a capacity payment tied to actual availability during peak periods, and additional payments for a host of “ancillary services,” from voltage control to “spinning reserve” for units willing to sit in “hot standby,” ready to ramp on a moment’s notice.

But contracts for intermittent renewable sources are typically denominated only in $/MWh, rarely even with time-of-use adjustments, much less capacity payments, for the simple reason that intermittent generators can’t be counted on to be there when most needed, so neither seller nor buyer will commit to a contract that requires them to be. (Note that biomass and geothermal contracts, whose output is dispatchable, often include these more sophisticated structures.)
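As a rough side-by-side of the two contract shapes just described, here is a minimal Python sketch; every price in it is a hypothetical placeholder, not a real tariff.

```python
# Hypothetical sketch of the payment streams in the two contract types
# described above. All prices are invented placeholders.

def conventional_revenue(peak_mwh, offpeak_mwh, peak_availability, spin_hours):
    energy    = 95 * peak_mwh + 45 * offpeak_mwh  # time-of-use $/MWh prices
    capacity  = 5_000 * peak_availability         # $/month, scaled by availability in [0, 1]
    ancillary = 12 * spin_hours                   # spinning-reserve payment, $/hour
    return energy + capacity + ancillary

def intermittent_revenue(mwh):
    return 100 * mwh  # a flat $/MWh is the entire contract
```

The intermittent contract collapses to the energy term alone; everything the grid values besides raw MWh simply has no line item.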

That isn’t inherently bad or good; it just is. But it does mean that comparing the $/MWh cost of a wind turbine, with its high capital costs, low operating costs, and intermittent generation, to the $/MWh cost of a gas turbine, with its low capital costs, high marginal costs, and ability to instantaneously ramp output in response to system needs, is meaningless. The two generators are providing fundamentally different services. Even if broccoli were free, I’d still pay for the occasional hamburger and a bowl of crème brûlée.

So now let’s put your grid manager hat back on, and suppose you manage a system with two power plants. One is renewable, one is fossil-fueled, and each can generate 100 MWh/month. The renewable helped you meet your renewable portfolio standard (RPS) targets, so you negotiated a contract that pays it $100 per delivered MWh, encouraging it to maximize output. The fossil plant, on the other hand, helped you maximize system reliability, so you negotiated a contract that pays it $5,000/month in exchange for guaranteed availability during system peak, plus $70/MWh of delivered power. If both generators operate at full capacity all month, the renewable facility will cost $10,000 ($100/MWh x 100 MWh) while the fossil facility will cost $12,000 ($5,000 + $70/MWh x 100 MWh). So on an all-in basis, the renewable is cheaper.
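Here is that arithmetic as a tiny Python function, using exactly the contract terms from the example (the function name is mine):

```python
def monthly_cost(energy_mwh, energy_price, capacity_payment=0.0):
    """All-in monthly cost: a fixed capacity payment plus delivered energy."""
    return capacity_payment + energy_price * energy_mwh

renewable = monthly_cost(100, energy_price=100)                         # $10,000
fossil    = monthly_cost(100, energy_price=70, capacity_payment=5_000)  # $12,000
print(f"renewable ${renewable:,.0f} vs. fossil ${fossil:,.0f}")
# renewable $10,000 vs. fossil $12,000
```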

But that has absolutely no relevance to how you operate your system, as you adjust to moment-to-moment fluctuations in demand. If, in the next hour, demand goes up by one megawatt, you can’t pull any more power out of the renewable plant unless it has spare capacity, so you may be forced to pull from the fossil facility. However, even if both have spare capacity, the fossil plant is still the lower marginal cost source ($70/MWh vs. $100/MWh). So if your goal is to minimize total costs, you run the plant that is more expensive on an all-in basis — proving the flaw in all-in cost comparisons.
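The dispatch logic, in the same sketch style: rank plants by marginal cost only, and ignore the sunk capacity payment. The spare-capacity numbers below are invented for illustration.

```python
# Merit-order dispatch sketch: serve the next MW from the cheapest
# available plant, ranked by marginal cost. The fossil plant's $5,000
# capacity payment is sunk and plays no role in this decision.

plants = [
    {"name": "renewable", "marginal_usd_per_mwh": 100, "spare_mw": 20},
    {"name": "fossil",    "marginal_usd_per_mwh": 70,  "spare_mw": 30},
]

def next_mw_source(plants):
    available = [p for p in plants if p["spare_mw"] > 0]
    return min(available, key=lambda p: p["marginal_usd_per_mwh"])

print(next_mw_source(plants)["name"])  # -> 'fossil', despite its higher all-in cost
```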

This isn’t to suggest that the falling price of renewable electricity isn’t a good thing. Of course it is. But it can’t take the place of a balanced (grid) diet.