Can we trust carbon labeling?
About a year ago, I was cautiously bullish on British supermarket giant Tesco’s pledge to start putting carbon labels on its food. But I think that their progress so far — which I’ll get to in a minute — suggests an important lesson about the policy risks of treating a fuzzy exercise as if it were completely reliable.
Tesco’s idea was that the chain and its suppliers would pay for objective, comprehensive reviews of the greenhouse-gas emissions from the foods on the store’s shelves. The analyses would cover all major steps in bringing food from farms to the checkout line — everything from running farm machinery, to food processing, to transportation, to refrigeration. Then, each item in the store would be labeled with the climate-warming emissions that could be traced to that particular product.
This sort of exercise is called “life cycle analysis,” and it’s been used to great effect for decades to shed light on all sorts of questions: paper vs. plastic (for bags), cloth vs. disposable (for diapers), hybrids vs. hydrogen (for cars), and a host of others.
Last week, a nifty article by Michael Specter in The New Yorker reported on Tesco’s progress so far. The results? There’s still only one product on the shelves with a carbon label — a single brand of potato chips, or “crisps” in British parlance.
You see, as it turns out, life cycle analysis can be really, really difficult. And to make matters worse, it may be that the whole enterprise is chock full of uncertainty.
Where carbon is concerned, it can be hard to trust the label.
From the article, a brief description of the life cycle analytical process:
In order to develop the label for [the crisps], researchers had to calculate the amount of energy required to plant seeds for the ingredients (sunflower oil and potatoes), as well as to make the fertilizers and pesticides used on those potatoes. Next, they factored in the energy required for diesel tractors to collect the potatoes, then the effects of chopping, cleaning, storing, and bagging them. The packaging and printing processes also emit carbon dioxide and other greenhouse gases, as does the petroleum used to deliver those crisps to stores. Finally, the research team assessed the impact of throwing the empty bags in the trash, collecting the garbage in a truck, driving to a landfill, and burying them. In the end, the researchers — from the Carbon Trust — found that seventy-five grams of greenhouse gases are expended in the production of every individual-size bag of potato chips.
“Crisps are easy,” Murlis had told me. “They have only one important ingredient, and the potatoes are often harvested near the factory.” [Emphasis added.]
That’s right — despite all the steps involved, crisps are easy! Imagine foods that aren’t produced locally, or that have dozens of ingredients. The effort required to get an accurate result is just staggering.
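At bottom, the tallying the Carbon Trust describes is simple arithmetic: each life-cycle stage contributes some grams of greenhouse gases per bag, and the label is just the sum. Here’s a toy sketch of that bookkeeping — the stage names follow the quoted passage, but the individual numbers are invented for illustration; only the 75-gram total comes from the article.

```python
# Toy life-cycle tally for one bag of crisps. Per-stage numbers are
# hypothetical; only the 75 g total matches the figure in the article.
STAGE_EMISSIONS_G = {
    "farming (seed, fertilizer, pesticide, diesel tractors)": 44.0,
    "processing (chopping, cleaning, storing, bagging)": 14.0,
    "packaging and printing": 8.0,
    "distribution to stores": 6.0,
    "disposal (collection, transport, landfill)": 3.0,
}

def label_grams(stages):
    """Total grams of CO2-equivalent per bag -- the number on the label."""
    return sum(stages.values())

print(label_grams(STAGE_EMISSIONS_G))  # 75.0 with these illustrative numbers
```

The arithmetic is trivial; the hard part, as the rest of this post argues, is deciding which stages belong in the dictionary at all, and how confident you can be in each entry.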
This is not, I repeat, not a critique of Tesco’s efforts. They’ve gone out on a limb, and I think they deserve a lot of credit for what they’re trying to do.
It’s just that, well, life cycle analysis can be viciously complicated. Worse — and I hesitate to say this — I think it’s inherently uncertain. A single number on a label gives the appearance of certainty and accuracy; yet tallying life cycle emissions requires a series of fuzzy judgment calls. Put simply: different people, using slightly different sets of methods, data, and assumptions, can wind up with wildly different estimates of emissions.
Of course, some things are pretty straightforward. The energy required to run farm machinery clearly counts toward the crisps’ life cycle emissions tally; and it’s not all that complicated to figure out.
But what about the emissions from producing, drying, and shipping potato seeds? That might not seem like a big deal, but I’ve read that seed production can be pretty energy intensive. Figuring out just how intensive requires, well, examining the practices of an entire industry; yet doing all that work might increase life cycle emissions from the crisps by a truly minuscule amount, probably not even enough to affect the label.
The complications multiply. The energy used to, say, manufacture farm machinery ought to be included in a tally of emissions, right? But that requires an analysis of the manufacturing sector — an undertaking that’s breathtaking in its scope, but once again would probably add only marginal precision to the estimate of food emissions.
In the end, there are dozens or even hundreds of near-invisible activities that go into making a bag of crisps — insurance industry operations, medical care for workers, driving to and from work — that are simply impossible in practice to measure. Taken together, these “hidden” emissions can really add up; but in many analyses, they’re ignored, or perhaps simply hinted at, rather than directly measured.
Ultimately, a system of comprehensive and reliable emissions measurements is a mirage. Tesco, along with everyone else practicing the craft, has to take shortcuts. For life cycle assessments relying on direct measurement of emissions, the first and most obvious shortcut is to establish a boundary around the analysis. (See more here.) The boundary determines what you’re going to include, and what you’re going to leave out. Ideally, a well-drawn boundary will still cover most of the emissions attributable to a product. And (hopefully) the stuff outside the boundary won’t vary too much from product to product. So if all goes well, the unmeasured emissions from a bag of potato chips will be a small portion of the total emissions; and, moreover, the unmeasured emissions will be roughly the same for potato chips as for popcorn, cookies, or any other snack food.
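To make the boundary idea concrete, here’s a minimal sketch: only the stages inside the boundary get measured and counted toward the label, while everything outside it goes untallied. All of the stage names and numbers below are hypothetical.

```python
# Hypothetical per-stage emissions (grams CO2e per bag). The last two
# stages fall outside the analysis boundary and never reach the label.
ALL_STAGES = {
    "farming": 44.0,
    "processing": 14.0,
    "packaging": 8.0,
    "distribution": 6.0,
    "disposal": 3.0,
    "machinery manufacture": 5.0,  # outside the boundary
    "worker commuting": 2.0,       # outside the boundary
}
BOUNDARY = {"farming", "processing", "packaging", "distribution", "disposal"}

measured = sum(v for k, v in ALL_STAGES.items() if k in BOUNDARY)
unmeasured = sum(v for k, v in ALL_STAGES.items() if k not in BOUNDARY)
print(measured, unmeasured)  # only `measured` appears on the label
```

The hope is that `unmeasured` is both small and roughly constant across competing products — which is exactly the assumption that fails when something important lands outside the boundary.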
The problem, though, is that sometimes the stuff you leave out is every bit as important as the stuff you keep in. Boundaries are often drawn based on what data is available, rather than what’s genuinely important; sometimes there’s important stuff that’s simply too hard to measure.
We ran into this very issue with our assessment of greenhouse-gas emissions from expanding highway lanes. We found that emissions from new traffic on an expanded highway vastly outweighed any emissions benefits from temporary congestion relief. But — as I saw it, at least — our estimates of highway-related emissions were lower than they should have been, since we couldn’t accurately capture new driving that would occur off the highway. (After all, people need to drive to and from a highway, right? And what about all that new development on the edge of town?) So in our analysis, we included modest, low-ball estimates for off-highway driving — estimates that almost certainly put highway expansion in too favorable a light.
In this case, we chose a boundary — driving on the highway itself — for the sake of convenience. We included what we could measure. But as a result, I think we significantly understated the emissions that would result from highway expansion.
Similarly — and apropos of Eric’s biofuels post — I recently ran across another puzzler in this technical analysis of a low-carbon fuel standard. The authors looked at two different models to estimate the life cycle emissions from different kinds of transportation fuels — one that focuses fairly narrowly on fuel production and extraction, and the other more broadly (but perhaps less precisely) on fuel plus land use effects for biofuels. In some cases, the models agreed pretty closely. But they were never identical. And for some fuel types, the differences were simply massive; there was a seven-fold difference in net emissions for soybean biodiesel, for example. Take a look at the chart from page 13:
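The kind of divergence described above is easy to state numerically: two models score the same fuel, and the ratio between their estimates measures how far apart they are. The numbers below are invented purely to show a seven-fold gap like the one reported for soybean biodiesel; they are not from the study.

```python
# Hypothetical estimates (grams CO2e per MJ of fuel) from two models
# scoring the same fuel. A model that counts land-use effects can report
# several times the emissions of one that stops at fuel production.
narrow_model = {"soy biodiesel": 20.0}   # fuel production/extraction only
broad_model = {"soy biodiesel": 140.0}   # fuel plus land-use effects

for fuel in narrow_model:
    ratio = broad_model[fuel] / narrow_model[fuel]
    print(f"{fuel}: {ratio:.0f}x difference between models")
```

Same fuel, same question, defensible methods on both sides — and a seven-fold spread in the answer. That spread, not either single number, is the real finding.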
Just so, the life cycle emissions from corn ethanol have been debated endlessly — yet the answer to whether it’s good or bad for the climate still depends on which set of assumptions, data, and boundaries you include in your analysis. In the end, I’ve come to this resting place: there’s no single “right” answer about how much carbon gets released from producing corn ethanol. There are simply too many uncertainties, and too many counterfactuals, for a diverse group of experts — no matter how fair-minded — to come to agreement on a precise figure.
I’m not trying to say that life cycle analysis is a hopeless enterprise. It can be really helpful, and quite illuminating. It’s just that it’s fallible, and any result, no matter how well researched, can be open to debate and interpretation.
And in the end, I think Tesco’s approach is exactly the right one: come up with some consistent methods, do your best, and present the information to the public, letting them decide how to interpret it. That should give consumers more information than they have now. And hopefully, it will get the broad brush strokes close enough to be of use in shaping consumers’ choices.
That said, I think it’s unwise to base broad swaths of climate policy on the hope that life cycle analysis will produce unfailingly correct results. To me, that’s a recipe for endless squabbling — and, quite possibly, some really bad policy outcomes.