The good news: We can avoid multi-meter sea-level rise, the loss of the inland glaciers that provide water to a billion people, rapid expansion of the subtropical deserts, and mass extinctions — each of which is all but inevitable on our current path of unrestrained greenhouse gas emissions.
The not-so-good news: We will probably need an ultimate target of 350 ppm (or lower) for atmospheric carbon dioxide — if you accept the analysis of ten leading climate scientists from around the world.
And yes, the authors of “Target Atmospheric CO2: Where Should Humanity Aim?” in The Open Atmospheric Science Journal are painfully aware we’re already at 385 ppm and rising 2 ppm a year. That is why they propose the self-described “Herculean” task of phasing out coal use that does not capture CO2 “over the next 20-25 years.” And that requires a global CO2 emissions profile that looks something like this:
(Note to Hansen et al: Big pet peeve — I think you confuse the general reader by labeling your y-axis “CO2 Emissions” while expressing the units in billion metric tons of carbon. This helps foster errors in the media and elsewhere.)
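For a sense of the timescale, the concentration numbers are easy to check. A minimal sketch, assuming the current ~2 ppm/year rise simply continues at a constant rate (in reality the rate is accelerating, so these are lower bounds):

```python
# Back-of-envelope CO2 concentration projection, assuming a constant
# 2 ppm/year rise from today's 385 ppm (an optimistic assumption --
# the actual rate of rise is accelerating).
START_PPM = 385.0   # concentration at time of writing
RATE = 2.0          # ppm per year

def years_to_reach(target_ppm, start_ppm=START_PPM, rate=RATE):
    """Years until atmospheric CO2 reaches target_ppm at a constant rate."""
    return (target_ppm - start_ppm) / rate

print(years_to_reach(450))  # 32.5 -> the mid-2040s
print(years_to_reach(550))  # 82.5 -> around 2090
```

On this arithmetic alone, 450 ppm arrives in roughly three decades of business as usual, which is why the paper’s 20-25 year coal phase-out timetable is so tight.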
Actually, even the ultra-sharp emissions cuts depicted in the figure won’t do the trick. We would still need “reforestation of degraded land and improved agricultural practices that retain soil carbon” (aka biochar to the rescue) to “lower atmospheric CO2 by as much as 50 ppm.”
More not-so-good news: That kind of emission reduction isn’t going to happen, not even under President Obama, not even close. Heck, I doubt it would happen under a President Hansen. We just are not going to see 350 ppm this century. Unfortunately, the authors “infer from the Cenozoic data that CO2 was the dominant Cenozoic forcing, that CO2 was only ~450 ppm when Antarctica glaciated, and that glaciation is reversible.”
That is, if we stabilize at 450 ppm (or higher) we risk returning the planet to conditions when it was largely ice free, when sea levels were higher by 70 meters — more than 200 feet! Yet, “Equilibrium sea-level rise for today’s 385 ppm CO2 is at least several meters, judging from paleoclimate history.” Equally worrisome:
Theory and models indicate that subtropical regions expand poleward with global warming. Data reveal a 4-degree latitudinal shift already, larger than model predictions, yielding increased aridity in southern United States, the Mediterranean region, Australia and parts of Africa. Impacts of this climate shift support the conclusion that 385 ppm CO2 is already deleterious.
Some slightly good news: The paper does suffer from one inherent analytical weakness that makes it (a tad) less dire than it appears.
Even if you accept their analysis (which many will not), the authors do not know how long we can safely overshoot 350: the risk depends not just on the duration of the overshoot but on its magnitude (i.e., how high concentrations go). They write:
This target [350 ppm] must be pursued on a timescale of decades, as paleoclimate and ongoing changes, and the ocean response time, suggest that it would be foolhardy to allow CO2 to stay in the dangerous zone for centuries.
Well, of course it would be foolhardy, but “centuries” is a long time. The ill-defined difference between decades and centuries is key here.
What if we could keep the peak at or below 450 ppm, start concentrations declining by 2100 (which would almost certainly require near-zero if not net-negative global emissions), get back to near 350 ppm by, say, 2150, and then go even lower by 2200? Would that be good enough? With a World War II-scale effort for the next few decades, we could stay below 450. My takeaway from this paper is that we would need to keep up that level of effort well into the next century to get back below current levels.
But in some sense whether the ultimate target is 350, 400, or 450 doesn’t matter as much as some people seem to think. You can’t hit any of those targets without strong and relentless action starting January 20, 2009. Further delay risks catastrophe:
Humanity’s task of moderating human-caused global climate change is urgent. Ocean and ice sheet inertias provide a buffer delaying full response by centuries, but there is a danger that human-made forcings could drive the climate system beyond tipping points such that change proceeds out of our control.
That, of course, is a central point of this blog.
The authors note that “it appears still feasible to avert catastrophic climate change,” but their final warning deserves notice:
Present policies, with continued construction of coal-fired power plants without CO2 capture, suggest that decision-makers do not appreciate the gravity of the situation. [Note to Hansen et al: That is the understatement of the year.] We must begin to move now toward the era beyond fossil fuels. Continued growth of greenhouse gas emissions, for just another decade, practically eliminates the possibility of near-term return of atmospheric composition beneath the tipping level for catastrophic effects.
The most difficult task, phase-out over the next 20-25 years of coal use that does not capture CO2, is herculean, yet feasible when compared with the efforts that went into World War II. The stakes, for all life on the planet, surpass those of any previous crisis. The greatest danger is continued ignorance and denial, which could make tragic consequences unavoidable.
OK, so we should have listened to Hansen two decades ago. The time to act is yesterday.
One final point. Much of the analysis in this paper is a refinement of Hansen’s earlier analysis arguing that the real-world or long-term climate sensitivity of the planet to doubled CO2 [550 ppm] is 6°C — twice the short-term or fast-feedback-only climate sensitivity used by the IPCC. [This post is a bit clearer on the difference between the two sensitivities.] Some people think Hansen is wrong about this issue (start here). If I am reading that criticism correctly, then I think Hansen responds to it in his new paper.
Also, if I am reading Hansen’s paper correctly, then I think he may be mostly right for a different reason than he thinks, which is to say, I think the carbon-cycle feedbacks (including the tundra melting, the peatlands drying out, and sink saturation) act as the equivalent of the amplifiers that he models: “Additional warming, due to slow climate feedbacks including loss of ice and spread of flora over the vast high-latitude land area in the Northern Hemisphere, approximately doubles equilibrium climate sensitivity.”
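To put the two sensitivities in degrees, here is a minimal sketch using the standard logarithmic approximation ΔT = S × log2(C/C0). The formula and the 280 ppm pre-industrial baseline are textbook values I am supplying for illustration, not numbers taken from Hansen’s paper (which pegs doubling at 550 ppm, i.e. a slightly lower baseline):

```python
import math

PREINDUSTRIAL_PPM = 280.0  # assumed pre-industrial CO2 baseline

def equilibrium_warming(ppm, sensitivity_per_doubling):
    """Equilibrium warming (deg C) at a given CO2 level, using the
    standard logarithmic approximation dT = S * log2(C / C0)."""
    return sensitivity_per_doubling * math.log2(ppm / PREINDUSTRIAL_PPM)

for ppm in (385, 450, 560):
    fast = equilibrium_warming(ppm, 3.0)  # fast-feedback (IPCC) sensitivity
    slow = equilibrium_warming(ppm, 6.0)  # long-term (Hansen) sensitivity
    print(f"{ppm} ppm: {fast:.1f} C fast-feedback, {slow:.1f} C long-term")
```

The point of the exercise: at 450 ppm, the fast-feedback answer is a shade over 2°C, while the long-term sensitivity doubles that to roughly 4°C — enough, per the paper, to eventually deglaciate Antarctica.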
In short, if you get near 450 ppm and stay there for any length of time, you will shoot up to 700 to 1000 ppm, which certainly gets you an ice-free planet and other unimaginably catastrophic impacts.
Another way to put this is that the IPCC was right when it wrote last year:
Climate-carbon cycle coupling is expected to add carbon dioxide to the atmosphere as the climate system warms, but the magnitude of this feedback is uncertain. This increases the uncertainty in the trajectory of carbon dioxide emissions required to achieve a particular stabilisation level of atmospheric carbon dioxide concentration. Based on current understanding of climate-carbon cycle feedback, model studies suggest that to stabilise at 450 ppm carbon dioxide could require that cumulative emissions over the 21st century be reduced from an average of approximately 670 [630 to 710] GtC to approximately 490 [375 to 600] GtC. Similarly, to stabilise at 1000 ppm this feedback could require that cumulative emissions be reduced from a model average of approximately 1415 [1340 to 1490] GtC to approximately 1100 [980 to 1250] GtC.
We’re at more than 8 GtC/yr (billion metric tons of carbon per year) and rising 3 percent annually. We need to average below 5 GtC/yr — and maybe considerably less — for the whole century to avert catastrophe. We need to be near zero or below by 2100.
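A back-of-envelope check of those numbers: a minimal sketch, assuming emissions start at 8 GtC/yr, comparing a business-as-usual 3 percent growth path against a flat 5 GtC/yr path and the IPCC’s feedback-adjusted 490 GtC budget for 450 ppm (figures are illustrative, not a carbon-cycle model):

```python
# Rough cumulative 21st-century emissions under two paths, vs the
# IPCC's feedback-adjusted budget for stabilizing at 450 ppm.
START_GTC = 8.0      # current annual emissions, GtC/yr
GROWTH = 0.03        # ~3 percent annual growth on the current path
BUDGET_450 = 490.0   # IPCC model-average budget for 450 ppm, GtC

# Sum of a geometric series: 8 GtC/yr growing 3%/yr for 100 years.
business_as_usual = sum(START_GTC * (1 + GROWTH) ** year for year in range(100))
flat_5 = 5.0 * 100   # averaging a flat 5 GtC/yr for the whole century

print(f"3%/yr growth for a century: {business_as_usual:.0f} GtC")
print(f"Flat 5 GtC/yr:              {flat_5:.0f} GtC")
print(f"450 ppm budget:             {BUDGET_450:.0f} GtC")
```

Unrestrained 3 percent growth overshoots the 490 GtC budget nearly tenfold; even a flat 5 GtC/yr totals 500 GtC, slightly over budget — which is why the century-long average has to come in below 5 GtC/yr.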
My Bottom Line: Let’s start working now toward stabilizing below 450 ppm, while climate scientists figure out if in fact we need to ultimately get below 350.
You can read the Yale University press release on the Hansen et al paper here.