Cost of adding tons of renewables to U.S. electric grid has been almost zero, say studies
Many states have "renewable portfolio standards" mandating that a certain percentage of their electricity come from renewable sources. Libertarian and Tea Party-ish critics of these standards have said they will cause electricity rates to "skyrocket."
But the numbers are in, and guess what? Science says the critics are wrong. Utility companies’ experiences vary a bit, but on the whole energy costs are barely being nudged by renewable standards, reports Midwest Energy News. In fact, far from being a financial drain, wind power is the most economic option in some places, such as the portion of the Dakotas and western Minnesota served by Otter Tail Power.
In the worst-case scenario, Great River Energy of Minnesota and Wisconsin saw a 1.8 percent increase in rates, or about $18 per year for its average customer. On the flip side, Xcel Energy, which serves many Western states, projects a rate increase of just $0.003 per kilowatt-hour by 2025. A 2008 study by Lawrence Berkeley National Laboratory found that renewable standards had increased the cost of electricity by a fraction of a percent in most states, and by just over 1 percent in Connecticut and Massachusetts. That study is due to be updated this year — after many quarters of explosive growth in wind power production — and its authors say they don't expect the results to be much different.
So why are renewables — primarily wind — coming out so cheap? Critics would argue that we haven't yet integrated enough of them to be forced to cope with their intermittency, and there's some truth to that. But utilities say any such costs are more than offset by the fact that wind reduces their exposure to volatile natural gas prices and spares them the headache of upgrading old coal-fired power plants to comply with new clean-air rules.
Source: "Are renewable standards driving up utility rates?", Midwest Energy News