Paul Krugman has a blog post about one of my favorite economists, Marty Weitzman. He has the central point right, which is that “on any sort of expected-welfare calculation, the small probability of catastrophe dominates the expected loss.”
But Krugman’s general lack of understanding of global warming — and his willingness to believe anything Bjørn Lomborg says — undermines his entire analysis:
Bjorn Lomborg … says that climate change will reduce world GDP by less than 0.5%, so it’s not worth spending a lot on mitigation.
Weitzman’s point is, first, that we don’t actually know that: a small loss may be the most likely outcome given what we know now, but there’s some chance that things will be much worse. (Marty surveys the existing climate models, and suggests that they give about a 1% probability to truly catastrophic change, say a 20-degree centigrade rise in average temperature.)
… Suppose that there’s a 99% chance that Lomborg is right, but a 1% chance that catastrophic climate change will reduce world GDP by 90%. You might be tempted to disregard that small chance — but if you’re even moderately risk averse (say, relative risk aversion of 2 — econowonks know what I mean), you quickly find that the expected loss of welfare isn’t 0.5% of GDP, it’s 10% or more of GDP.
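Krugman's arithmetic can be checked directly. A minimal sketch in Python, using his round numbers (99% chance of a 0.5% GDP loss, 1% chance of a 90% loss) and CRRA utility with relative risk aversion of 2, i.e. u(c) = -1/c:

```python
def certainty_equivalent_loss(outcomes):
    """GDP loss that, taken with certainty, yields the same expected
    welfare as the risky outcomes, under CRRA utility with gamma = 2.
    `outcomes` is a list of (probability, remaining-GDP) pairs."""
    expected_inverse = sum(p / c for p, c in outcomes)  # E[1/c] = -E[u(c)]
    return 1.0 - 1.0 / expected_inverse                 # 1 minus certainty equivalent

# Krugman's example: 99% chance GDP falls to 99.5%, 1% chance it falls to 10%
loss = certainty_equivalent_loss([(0.99, 0.995), (0.01, 0.10)])
print(f"welfare-equivalent GDP loss: {loss:.1%}")  # roughly 9%, not 0.5%
```

With these round numbers the certainty-equivalent loss comes out near 9 percent of GDP, the same order as Krugman's "10% or more" and well over an order of magnitude above Lomborg's 0.5 percent.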
Well, “yes,” on the final point, but “no” on every other point.
Indeed, a 20°C rise in average global temperature — which translates to perhaps 50°F warming over much of the inland U.S. — is “James Lovelock” territory where “the Earth’s population will be culled from today’s 6.6 billion to as few as 500 million.” Catastrophic climate change is anything significantly over 3°C, which is not a 1 percent chance, but a near certainty if we don’t reverse greenhouse gas emissions sharply and soon.
Lomborg, of course, does not have anywhere near a 99 percent chance of being right that “climate change will reduce world GDP by less than 0.5%.” Indeed, if we actually followed Lomborg’s do-nothing prescription, then he has precisely a zero chance of being right. He is a pure disinformer.
Weitzman’s analysis is, however, very important for traditional economists — and everyone else — to understand, so let me reprint my September post, Harvard economist disses most climate cost-benefit analyses, below:
Harvard economist Martin Weitzman [PDF] has a new paper [PDF] in which he points out that the vast majority of conventional economic analyses of climate change should carry the following label:
Warning: to be used only for cost-benefit analysis of non-extreme climate change possibilities. Not intended to cover welfare evaluation of extreme tail possibilities, for which a complete accounting might produce arbitrarily different welfare outcomes.
In short, if you don’t factor in plausible worst-case scenarios — and the vast majority of economic analyses don’t (this means you, William Nordhaus [PDF], and you, too, Bjørn Lomborg) — your analysis is useless. Pretty strong stuff for a Harvard economist!
The extreme, or fat tail, of the damage function (see figure above) represents what Weitzman calls “rare climate disasters,” although as we’ll see, they probably aren’t that rare. For Weitzman, disaster is a temperature change of greater than 6°C (11°F) in a century, as he explains in an earlier paper [PDF] on the Stern Review on the economics of climate change:
With roughly 3% IPCC-4 probability, we will consume a terra incognita biosphere within a hundred years whose mass species extinctions, radical alterations of natural environments, and other extreme outdoor consequences of a different planet will have been triggered by a geologically-instantaneous temperature change that is significantly larger than what separates us now from past ice ages.
Weitzman says the IPCC Fourth Assessment gives the probability of such an “extreme” temperature change as 3 percent, and that “to ignore or suppress the significance of rare tail disasters is to ignore or suppress what economic theory is telling us loudly and clearly is potentially the most important part of the analysis” — more important than the discount rate.
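Why the tail dominates can be illustrated with a toy Monte Carlo. This is not Weitzman's model: the thin- and fat-tailed warming distributions, the cubic damage function, and the 99% damage cap below are all assumptions invented for illustration. The point is that two warming distributions with the same central value but different tails produce very different certainty-equivalent losses under the same risk-averse (CRRA, gamma = 2) utility:

```python
import math
import random

random.seed(42)
N = 200_000

def student_t(df):
    """Draw from a Student-t via normal / sqrt(chi-square / df)."""
    z = random.gauss(0.0, 1.0)
    v = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return z / math.sqrt(v / df)

def damage(t_change):
    """Toy convex damage function: fraction of GDP lost, capped at 99%."""
    if t_change <= 0:
        return 0.0
    return min(0.99, 0.003 * t_change ** 3)

def ce_loss(temps):
    """Certainty-equivalent GDP loss under CRRA utility with gamma = 2."""
    inv = [1.0 / (1.0 - damage(t)) for t in temps]      # 1/c = -u(c)
    return 1.0 - 1.0 / (sum(inv) / len(inv))

# Same central warming (~3 C), different tails
thin = [random.gauss(3.0, 1.5) for _ in range(N)]       # thin-tailed (normal)
fat = [3.0 + 1.5 * student_t(3) for _ in range(N)]      # fat-tailed (Student-t, df=3)

print(f"thin-tail CE loss: {ce_loss(thin):.1%}")
print(f"fat-tail  CE loss: {ce_loss(fat):.1%}")
print(f"fat-tail mean damage: {sum(damage(t) for t in fat) / N:.1%}")
```

Under the fat-tailed distribution, the rare draws of extreme warming drive the certainty-equivalent loss far above both the thin-tailed result and the fat-tailed mean damage itself — the tail, not the most likely outcome, carries the welfare calculation.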
For me, what is especially alarming about Weitzman’s analysis is that I have argued there is a far greater chance than 3 percent that we will have a total warming of 6°C or more in a century or so if we don’t reverse emissions trends soon. That’s because failure to act quickly means carbon cycle feedbacks will kick in by mid-century, escalating greenhouse-gas concentrations and temperatures well beyond standard IPCC projections. Put another way, if we don’t stabilize atmospheric carbon dioxide below 500 ppm (we are at 380 ppm today and were at 280 preindustrial), we will probably soar to at least 800 ppm in a century, if not 1000 ppm or more. Losing either the permafrost or the Amazon is sufficient to take us to 1000.
Weitzman’s paper [PDF], “Structural Uncertainty and the Value of Statistical Life in the Economics of Catastrophic Climate Change”, is not for the general reader. His discussion of the Stern Review [PDF], however, covers many of the same points and is, I think, accessible to anyone who took an economics class or two in college, especially if you first read John Quiggin (here and here [PDF]).
It is worth noting that while Weitzman is critical of how Stern chose the key discount rate parameters, he still thinks that Stern is mostly right for the “wrong reasons” — because “the implications of large consequences with small probabilities” — like the many scenarios of catastrophic climate change (ice sheet instability, tundra melting) — matter more than the choice of discount rate.
That said, the mainstream economic policy think tank — Resources for the Future (RFF) — wrote a major report [PDF], “An Even Sterner Review,” that concluded, “we find no strong objections to the discounting assumptions adopted in the Stern Review” (a point I have also made, based on Quiggin). It also concluded Stern could have used “rising relative prices” from future scarcity to get the same result. The RFF report pointed out:
If we were to combine the low discount rates in the Stern Review with rising relative prices, the conclusions would favor even higher levels of abatement. This would in fact lead us to consider some of the levels of carbon content that Stern deems unrealistic, that is, aiming for a target of less than 450 ppm CO2 equivalents.
Now what I would like to see is a cost-benefit analysis combining a moderate discount rate with RFF’s rising relative prices and Weitzman’s “extreme climate change possibilities.”
I’m sure that such a comprehensive economic analysis would vindicate Stern again and drive us toward a target of 450 ppm or lower — which means we must peak in global emissions by 2020. The time to act is now. Economics demands it.
Update: Let me be clear that a 3°C to 4°C total warming from preindustrial levels — which takes us to the same temperature the planet had the last time sea levels were 80 feet higher — would be an unmitigated catastrophe for the planet — that is Hansen’s point. My point in this post is just that if we get that warm, the feedbacks will probably take us to 6°C warming a few decades later.
Update 2: Weitzman has toned down the piece in his next draft, so you won’t see his strong warning. Oh well. I guess I’ll have to dis Lomborg and Nordhaus myself! Stay tuned.
Hat tip to Mark Shapiro.
This post was created for ClimateProgress.org, a project of the Center for American Progress Action Fund.