All due respect to Paul Krugman, but the Weitzman thesis [PDF] has always made me a little uncomfortable. The idea is that it’s human nature to disregard unlikely risks, but if the unlikely risks are catastrophic enough, legislators should build policy around them anyway. If there’s, say, a 2 percent chance that global warming could substantially wipe out human civilization, we should do whatever we can to avert that risk.
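(For the record, the back-of-the-envelope arithmetic behind that intuition goes something like this; it’s a rough sketch, not Weitzman’s actual model, and the numbers are purely illustrative:

$$
\mathbb{E}[\text{loss}] = p \cdot L = 0.02\,L \;>\; C \quad \text{whenever } L > 50\,C,
$$

where $p$ is the probability of catastrophe, $L$ the size of the loss, and $C$ the cost of prevention. As I understand it, the argument is that when $L$ is treated as effectively unbounded, prevention wins that comparison for any positive $p$, no matter how small.)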

But how do you keep that genie in the bottle? If there were a 1 percent chance Saddam was going to slip a nuke to a terrorist group that would use it on an American city, should we have invaded Iraq? If there’s a 2 percent chance Iran will launch WWIII, should we do everything in our power, up to and including invading, to prevent them from establishing a nuclear program? If there’s a 3 percent chance that a supervirus will emerge from Asia to kill a big chunk of the human population this century, should we preemptively wipe out the monkey and bird populations there?

I’m far from an expert on Weitzman’s work, so all you smart readers should educate me here. What’s to stop us from lunging this way and that based on low-probability, high-impact risks?

(FYI, Jim Manzi has a much, much longer critique of Weitzman here.)