Here’s the standard story about the U.S. power grid: It gets baseload supply from hydro, nuclear, and coal (in that order), using natural gas (and the occasional oil plant) as a swing producer to meet peak demands. Renewables play on the margin, but are neither big nor reliable enough to matter from a grid planning perspective.
On average, that story is true. In recent years, however, a steadily larger portion of total U.S. power supply comes from sources that we historically think of as “intermittent” — namely, natural gas and renewables. Is that the beginning of a new paradigm (the dream of both the renewable and natural gas communities) or an unsustainable deviation from the average (the dream of the coal industry)? To answer that question, we must review a bit of history.
The chart below shows the percentage of total U.S. power that came from the “baseload triumvirate” of coal, nuclear, and hydro (C-N-H) from 1949 to 2010, plus the percentages that came from natural gas, oil, and other renewables over the same period.
Several observations, working backward in time:
First, notice that since 1985, we’ve fallen from relying on C-N-H for 85 percent of our electricity to just 70 percent. That’s a direct result of the fact that nuclear and coal power plants are cost-prohibitive to build in a post-Clean Air Act, post-Seabrook world. And so we haven’t built any — we’ve just maxed out the ones we already have. This, in a nutshell, is why the coal industry is so grumpy lately.
Second, virtually all of the recent fall-off in C-N-H production has been matched by an increase in natural gas-fired power. Gas, not wind, has been the biggest beneficiary. However, it’s worth noting that the growth in gas generation has been the result of lots and lots of new natural gas power plants getting built. The natural gas fleet has gotten much bigger, but in terms of capacity factor — that is, the number of MWh the fleet generates in a given year relative to its theoretical maximum — the fleet struggles to stay above 25 percent, which interestingly enough is about the same as the wind fleet:
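The capacity-factor arithmetic described above is simple enough to sketch in a few lines of Python. The fleet figures here are hypothetical, chosen only to land near the roughly 25 percent level mentioned in the paragraph; they are not drawn from the article's underlying data.

```python
# Capacity factor: actual annual generation relative to the theoretical
# maximum if every plant in the fleet ran flat-out all year.

HOURS_PER_YEAR = 8760

def capacity_factor(generation_mwh: float, capacity_mw: float) -> float:
    """Fleet-wide capacity factor for a single year."""
    return generation_mwh / (capacity_mw * HOURS_PER_YEAR)

# Hypothetical gas fleet: 400 GW of capacity producing ~900 TWh in a year
cf = capacity_factor(900e6, 400e3)
print(f"{cf:.0%}")  # prints 26%, in the neighborhood the article describes
```

A fleet can grow its total output year after year, as the gas fleet did, while its capacity factor stays flat, so long as new capacity is being added at least as fast as generation rises.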
Now compare the 20-year period from 1990 to 2010 with the 20-year period from 1950 to 1970. They look remarkably similar, with C-N-H falling from ~80 percent to ~70 percent and natural gas rising from ~10 percent to ~20 percent. Indeed, the only significant distinction between the two eras is in oil-fired generation, and within that story there are lessons for the current gas boom.
1950-1970 was the era of cheap oil. OPEC price shocks hadn’t happened yet and — hard as it is to imagine today — many utilities were actively converting their coal plants to oil in the name of cost savings. (See, for example, here, or this history of National Grid.) Needless to say, that turned out to be a bad idea when oil prices spiked in the ’70s. Lots of oil plants were converted back to coal and, luckily for electricity costs, lots of nuclear plants were coming online. Nuclear didn’t yet have its (soon to come) history of cost overruns to prevent regulators from approving further investments. The result was a fairly rapid transition away from oil and gas, increasing our dependency on C-N-H.
Now, fast-forward to the present. Much like in the 1960s, we’re becoming ever less dependent on C-N-H and ever more dependent on natural gas to meet our electricity demands. Where we once believed that oil would forever be cheap, we now believe that natural gas will forever be cheap. That seems unlikely.
Perhaps the biggest difference between the two eras is that, while the 1960s saw a steady addition to the U.S. baseload C-N-H fleet, our recent additions have been only to historically non-baseload sources:
As a result, when economics drove us away from oil in the 1970s, we had the option to fall back on lower (marginal) cost power generation sources. If history repeats itself, we won’t have that luxury this time. In other words, don’t get complacent about the recent downturn in natural gas and electric prices. It can’t last.