OK, the Washington Post’s circulation will probably keep declining even in the unlikely event its coverage of global warming improves. But my headline is at least as scientific as the WP’s latest climate piece, “Scientists’ use of computer models to predict climate change is under attack.”
Memo to WashPost: Scientists’ use of computer models to predict/project climate change has been under attack for a long, long time by the anti-scientific disinformers. That ain’t news. The real news, which you almost completely ignore, is:
- The models have made accurate projections (see NASA: “We conclude that global temperature continued to rise rapidly in the past decade” and “that there has been no reduction in the global warming trend of 0.15-0.20°C/decade that began in the late 1970s”).
- When the models have gone awry, it is primarily in underestimating how fast the climate would change.
- Staying anywhere near our current emissions path — i.e. listening to the disinformers and doing nothing significant to restrict emissions — removes most uncertainty about the future climate impacts and leads with high probability to human misery on a scale never seen before.
But what do you expect from an article that begins this way:
The Washington Nationals will win 74 games this year. The Democrats will lose five Senate seats in November. The high Tuesday will be 86 degrees, but it will feel like 84.
And, depending on how much greenhouse gas emissions increase, the world’s average temperature will rise between 2 and 11.5 degrees by 2100.
The computer models used to predict climate change are far more sophisticated than the ones that forecast the weather, elections, or sporting results.
Uhh, it’s not really that the climate models are more sophisticated. It’s that the climate is considerably easier to forecast than any of those other three.
Climate has always been easier to predict than the weather: We know with incredibly high certainty that July of this year (or any year) will be hotter than January of this year (or any year) — and we know with high certainty the 2020s will be hotter than the 2000s — but it is basically a coin toss as to whether July 15, 2010 will be hotter than July 15, 2009. As NASA notes, “When we talk about climate change, we talk about changes in long-term averages of daily weather.” Long-term averages simply don’t change as rapidly as the weather and are inherently easier to project.
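To make that concrete, here is a toy numerical sketch, with every number invented for illustration — it is not real data or a climate model, just a large seasonal cycle plus a small warming trend plus big day-to-day weather noise. Any single calendar day is essentially a coin toss year to year, but averages over a month or a decade wash the noise out:

```python
# Toy sketch only -- invented numbers, not real data or a climate model.
# Daily temperature = baseline + seasonal cycle + slow warming trend + weather noise.
import numpy as np

rng = np.random.default_rng(42)

days_per_year, n_years = 365, 30
day = np.arange(days_per_year * n_years)
year = day // days_per_year
doy = day % days_per_year                                   # day of year

seasonal = -10.0 * np.cos(2 * np.pi * doy / days_per_year)  # ~ -10 C in January, +10 C in July
trend = 0.02 * (day / days_per_year)                        # assumed 0.2 C per decade warming
weather = rng.normal(0.0, 3.0, day.size)                    # day-to-day noise dwarfs the daily trend
temp = 15.0 + seasonal + trend + weather

# A single calendar day compared year over year: dominated by the noise
july15 = temp[doy == 195]
print("P(this July 15 hotter than last July 15):",
      round(float(np.mean(july15[1:] > july15[:-1])), 2))

# Monthly averages: July vs January is never in doubt
print("Mean July vs mean January:",
      round(float(temp[(doy >= 181) & (doy < 212)].mean()), 1), "vs",
      round(float(temp[doy < 31].mean()), 1))

# Decadal averages: the warming trend emerges cleanly from the noise
print("2000s mean vs 2020s mean:",
      round(float(temp[year < 10].mean()), 2), "vs",
      round(float(temp[year >= 20].mean()), 2))
```

The point is just the arithmetic of averaging: the weather noise on a decade-long mean is tiny compared with the noise on any single day, so the long-term signal is far easier to call.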
The analogies to sporting events and elections are simply inane. They involve human behavior and so can’t be modeled from basic laws of physics the way the climate system can. They are apparently included in the article simply to amuse and confuse.
The piece is a long litany of mostly irrelevant information and disinformer talking points:
Climate scientists admit that some models overestimated how much the Earth would warm in the past decade. But they say this might just be natural variation in weather, not a disproof of their methods.
Uhh, “some models”? So some unnamed models may not have gotten it right. Or maybe it was just that some of the groups doing the measuring lowballed the actual warming. The U.K.’s Met Office — which many scientists have said has underestimated recent warming — posted an analysis in December which concluded, “The global temperature rise calculated by the Met Office’s HadCRUT record is at the lower end of likely warming.”
In fact, NASA’s analysis makes clear that warming continues just as the models had projected. Indeed, the WashPost buries this central point, which by itself renders the entire article mostly moot:
Put in the conditions on Earth more than 20,000 years ago: they produce an Ice Age, NASA’s Schmidt said. Put in the conditions from 1991, when a volcanic eruption filled the earth’s atmosphere with a sun-shade of dust. The models produce cooling temperatures and shifts in wind patterns, Schmidt said, just like the real world did.
If the models are as flawed as critics say, Schmidt said, “You have to ask yourself, ‘How come they work?’”
The models were actually used to accurately predict the cooling from the Pinatubo eruption.
The Washington Post entirely misses the even more important point that the models used for the 2007 IPCC report consistently underestimated recent climate changes (and emissions trends):
- “The recent [Arctic] sea-ice retreat is larger than in any of the (19) IPCC [climate] models” — and that was a Norwegian expert in 2005. The retreat has accelerated since 2005, especially in volume.
- The ice sheets appear to be shrinking “100 years ahead of schedule.” That was Penn State climatologist Richard Alley in March 2006. In 2001, the IPCC thought that neither Greenland nor Antarctica would lose significant mass by 2100. They both already are.
- Sea-level rise from 1993 to 2006 — 3.3 millimetres per year as measured by satellites — was higher than the IPCC climate models predicted.
- The subtropics are expanding faster than the models project.
- Since 2000, carbon dioxide emissions have grown faster than any IPCC model had projected.
Needless to say, the Post never talks about the paleoclimate record, which provides both support for the climate models and more evidence that they lowball likely future impacts (see Science: CO2 levels haven’t been this high for 15 million years, when it was 5° to 10°F warmer and seas were 75 to 120 feet higher — “We have shown that this dramatic rise in sea level is associated with an increase in CO2 levels of about 100 ppm”).
The models’ biggest flaw is that they ignore most of the major amplifying carbon-cycle feedbacks (see “An illustrated guide to the latest climate science”). But rather than explaining even once that the necessarily imperfect models almost certainly underestimate future impacts, the Post chooses to repeat without explanation this misleading point:
All the major climate models seem to show that greenhouse gases are causing warming, climate scientists say, although they don’t agree about how much. A 2007 United Nations report cited a range of estimates from 2 to 11.5 degrees over the next century.
Now this appears to willfully conflate two very different issues. It seems to imply that the climate models don’t agree on how much warming we’ll see — by a factor of nearly 6! But in fact much of that disparity is due to the use of very different scenarios of how much emissions will grow this century.
As I’ve noted many times, the IPCC wastes a huge amount of time and effort modeling countless low emissions scenarios that have no basis in reality. Now if you take a low climate sensitivity (warming caused by a doubling of CO2 concentrations) and multiply it by a low emissions scenario, you get a low total warming. The anti-science crowd then gloms onto that low number as evidence global warming won’t have serious consequences. (And the media gloms onto that number and compares it to the high emissions, high sensitivity case as evidence the IPCC modelers “don’t agree” by a wide amount.)
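A back-of-the-envelope sketch makes that multiplication explicit. This is not how the IPCC produces its projections (those come from full coupled models, and transient warming lags the equilibrium value used here); the sensitivity range and 2100 concentrations below are round numbers I am assuming for illustration. The point is simply that crossing a low or high sensitivity with a low or high emissions scenario spans a several-fold range all by itself, with no model “disagreeing” about the physics:

```python
# Rough illustration only: equilibrium warming from the standard logarithmic
# CO2 approximation, dT = S * log2(C / C0). Not the IPCC's actual method.
import math

C0 = 280.0                                      # preindustrial CO2, ppm
sensitivities = {"low sensitivity": 2.0,        # deg C per CO2 doubling
                 "high sensitivity": 4.5}
scenarios = {"strong mitigation": 450.0,        # assumed CO2 in 2100, ppm
             "business as usual": 900.0}

for s_name, S in sensitivities.items():
    for c_name, C in scenarios.items():
        dT_c = S * math.log2(C / C0)
        print(f"{s_name:16s} x {c_name:17s}: ~{dT_c:3.1f} C ({dT_c * 9 / 5:4.1f} F)")
```

Even this crude calculation spans roughly 2.5°F to 13°F, comparable to the 2 to 11.5°F range the Post presents as model “disagreement,” and most of the spread comes from the emissions scenario, which is a policy choice, not a scientific uncertainty.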
But the IPCC has never clearly explained that all of the low emissions scenarios presuppose we ignore the anti-science crowd’s plea to do nothing and instead take very strong action to reduce emissions.
On the other hand, the IPCC has explained it is far more likely that the climate sensitivity is quite high than it is quite low — but very few people in the media follow the science closely enough to realize that.
And so what the scientific literature and climate models tell us today with increasing certainty is that if we take no serious action, catastrophic change might best be considered business as usual, i.e. highly likely (see “M.I.T. doubles its 2095 warming projection to 10°F — with 866 ppm and Arctic warming of 20°F” and “Our hellish future: Definitive NOAA-led report on U.S. climate impacts warns of scorching 9 to 11°F warming over most of inland U.S. by 2090 with Kansas above 90°F some 120 days a year” — and that isn’t the worst case, it’s business as usual!).
But the media and opinion makers and most economists have been led to believe those scenarios are the extreme worst case and very unlikely, when in fact they are simply what is projected to happen if we keep doing nothing.
The true plausible worst case — which combines staying on our current high-emissions trend with a more accurate attempt to model carbon-cycle feedbacks — is far, far worse: U.K. Met Office: Catastrophic climate change, 13-18°F over most of U.S. and 27°F in the Arctic, could happen in 50 years, but “we do have time to stop it if we cut greenhouse gas emissions soon.”
But you won’t learn any of that crucial information from the Washington Post. So why not join hundreds of thousands of others and stop reading it entirely!
UPDATE: MIT’s Joint Program on the Science and Policy of Climate Change has a very useful figure based on its 2009 peer-reviewed paper, which makes the point with more probabilistic detail:
Here is how MIT describes what it calls the “Greenhouse Gamble” in “an attempt to better convey the uncertainty in climate change prediction”:
Depicted as a roulette wheel, the image portrays the MIT Program’s estimations of climate change probability, or the likelihood of potential (global average surface) temperature change over the next hundred years, under different possible scenarios. Estimates of the risks of climate change are based on the best available information at the time the estimates are made, and thus as continued observations are made and scientific investigation proceeds the likelihood estimates that underlie these wheels must be updated.
Based on new research we provide updated estimates of the likelihood of different amounts of global warming over the century under a reference case, in which it is assumed “no policy” action is taken to try to curb the global emissions of greenhouse gases, and a “policy case” that limits cumulative emissions of greenhouse gases over the century to 4.2 trillion metric tons of greenhouse gases (GHGs) measured in CO2-equivalent.
The notion is that as humans allow global emissions of greenhouse gases to continue to increase, the roulette wheel continues to spin. We can control emissions — the policy case represents one choice for cumulative allowed emissions over the century — and by doing so we can limit risk. Uncertainties in the Earth system response to increasing emissions are given by nature; we can learn more about these responses but we can not directly control them. The results show much higher likelihood of higher temperature increases than for the previous wheels.
On our current emissions path, using the “best available information,” MIT projects a 9 percent chance of an incomprehensibly catastrophic warming of 7°C by century’s end, but less than a 1 percent chance of warming under 3°C. As one MIT professor put it:
“The take home message from the new greenhouse gamble wheels is that if we do little or nothing about lowering greenhouse gas emissions that the dangers are much greater than we thought three or four years ago,” said Ronald G. Prinn, professor of atmospheric chemistry at MIT. “It is making the impetus for serious policy much more urgent than we previously thought.”
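To make the “wheel” metaphor concrete, here is a toy sketch of how slice probabilities like those are read off a probability distribution of projected 2100 warming. The lognormal below is a placeholder I tuned only so that its tail probabilities land near the 9 percent and under-1-percent figures quoted above; MIT’s actual estimates come from large ensembles of model runs, not from any fitted lognormal.

```python
# Illustration only: the lognormal parameters are invented placeholders,
# NOT the MIT Joint Program's actual no-policy distribution.
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical distribution of global average warming by 2100 (deg C), no-policy case
warming = rng.lognormal(mean=np.log(5.2), sigma=0.22, size=1_000_000)

print("P(warming >= 7 C):", round(float(np.mean(warming >= 7.0)), 3))   # ~0.09
print("P(warming <  3 C):", round(float(np.mean(warming < 3.0)), 3))    # under 0.01

# Each "slice" of the wheel is just the probability mass in a temperature bin
edges = [0.0, 3.0, 4.0, 5.0, 6.0, 7.0, 20.0]
counts, _ = np.histogram(warming, bins=edges)
for lo, hi, frac in zip(edges[:-1], edges[1:], counts / warming.size):
    print(f"{lo:.0f}-{hi:.0f} C slice: {frac:.1%}")
```

Shrink the distribution’s upper tail, as in the policy case, and the high-temperature slices collapse; that, in a nutshell, is what the two wheels are comparing.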
The time to act was quite some time ago, but now is far better than later!