Six thousand years ago, Egyptian Sahara. The desert is moving. We know this to be true from carbon dating, from archaeological investigations, from satellite imagery; from geographers, climatologists, and anthropologists. For at least 6,000 years, the Sahara Desert — or what would become the Sahara Desert — has been expanding and contracting. Mostly it has been inching southward.
For just as long, people scurry in its wake. Dynasties rise and fall. Pastoralists lead their cattle down away from the desert. Populations swell. The sands of time, too, do their inching.
In the last decades of the 20th century, the multi-millennial creep of the desert is accompanied by an increased frequency of drought and heat waves that has little to do with its previous flux. The Saharan expansion of yestercentury is largely thought to have been spurred by slight changes in Earth’s orbit. The new desertification narrative is different. Alarmingly so.
In the eastern Sahel, there are mouths to feed. Pastures are overgrazed; trees are slashed for firewood; fields are wrung dry with exhaustive cropping. In Sudan, in four decades, average precipitation falls by 30 percent. Farmers fence their land. Herders cannot graze their animals. Colonial borders do no favors. There is not enough water for all the mouths. It is very hot. And still the Sahara, ever hungry, marches south.
2003, Sudan’s Darfur region. Genocide.
“Almost invariably, we discuss Darfur in a convenient military and political shorthand — an ethnic conflict pitting Arab militias against black rebels and farmers,” writes U.N. Secretary-General Ban Ki-moon in the Washington Post in 2007. “Look to its roots, though, and you discover a more complex dynamic. Amid the diverse social and political causes, the Darfur conflict began as an ecological crisis, arising at least in part from climate change.”
An ecological crisis.
2010, N’Djamena, Chad. Leaders from 11 African nations sign a pledge to build a Great Green Wall: a stopgap defense against desertification. 4,700 miles long, 9 miles deep, all trees.
2014, the Sahel. Population: 100 million. Great Green miles planted: 300.
When the sand shifts below your feet, either you migrate, you fight for your livelihood, or you try to convince the desert to stop moving.
What if the desert isn’t listening?
There’s bacon in the skillet, hissing and popping like an old record. It is a hot July morning in Seattle, and I’m watching, sans air conditioner, sans sleeves, the bacon do its wiggly bacon thing. Some of the strips are starting to curl and are ready for another flip. Fork to pan, a lazy flick, a little flop. And in the final moments of turning the last strip, a small disaster: The bacon slaps the pan with a little extra gusto, and a splash of hot grease ends up on my forearm. Pain, sink, water, towel off.
And then, for no good reason, I punch the refrigerator, where there is now a sad dent.
Chalk it up to changes. I was going through a good handful of them: finishing grad school, moving back to the United States, figuring out how to make rent and get on one of those health insurance exchanges, trying out a city where I’d never been and didn’t know anyone, trying out Digital Journalism and a Nine-To-Five; sleeping, alone, on an air mattress. The apartment smelled unfamiliar, like dentistry. And it was just so hot.
I’ve been thinking a lot about that heat lately.
A few months before I moved to Seattle, a Stanford economist named Marshall Burke and two of his colleagues published an article in the Annual Review of Economics about the relationships between climate and conflict. Surveying dozens of independent studies, the team showed how deviations in temperature or rainfall patterns — whether at the national, state, or local scale — were associated with increases in both large-scale violence and individual violent crime. The story fits neatly with Ban Ki-moon’s hedging on Darfur.
As someone who has worked in four neuroscience labs and finagled two degrees in the brain stuff, I spend a good amount of time thinking about the neurobiological roots of behavior. When Burke and his colleagues claim that temperature changes are related to violence, I start to wonder which tangle of biological pathways might be at work — which strands, if any, can be separated from the economics and the sociology — and what this mess might mean for a warming world. Climate change is always out there, in that other place and future time. If droughts and rising temperatures are doing something to people now, in here, the noggin, even just a little, it seems worth asking how and why.
When I call up Burke a few months after denting my fridge, he begins by telling me that I don’t need to go to Darfur or Damascus to learn about the connections between heat and hot-headedness. “We see this everywhere in the U.S. — at the city level, at the county level — really any way you want to look at it.”
He cites papers that documented the relationship in Phoenix in 1986; in Charlotte in 1988; in Minneapolis and Dallas, between 1997 and 2005. It was there in St. Louis in 2013. Nationwide, 2014. Forget drought: When temperatures alone rise, violence does the same.
Burke and his colleagues estimate that a single standard-deviation increase in temperature elevates the risk of person-to-person violence by about 2.5 percent and the risk of group-to-group violence by about 11 percent. These are events with low underlying likelihoods in the first place — most countries aren’t constantly on the brink of civil war, and most people aren’t constantly on the brink of aggravated assault — but in a warming world, the economists argue, the effects are appreciable. Some parts of the world are expected to warm by two to four standard deviations by 2050.
“If future civil conflicts remain as deadly as those that took place recently,” his team writes, projections suggest that the “increase in conflict would result in 393,000 additional battle deaths by 2030” — in Africa alone.
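For readers who want to see the arithmetic behind those figures, here is a back-of-the-envelope sketch. It assumes the per-standard-deviation effects compound multiplicatively across degrees of warming — an illustrative simplification, not the authors’ actual statistical model, and the function name is my own.

```python
def risk_multiplier(per_sd_increase, sds_of_warming):
    """Relative change in violence risk after warming by `sds_of_warming`
    standard deviations, given a fractional per-SD risk increase."""
    return (1 + per_sd_increase) ** sds_of_warming

# Burke et al.'s rough estimates: ~2.5% per SD for interpersonal violence,
# ~11% per SD for intergroup conflict. Take the high end of warming: 4 SDs.
interpersonal = risk_multiplier(0.025, 4)
intergroup = risk_multiplier(0.11, 4)

print(f"interpersonal risk: x{interpersonal:.2f}")  # roughly x1.10
print(f"intergroup risk:    x{intergroup:.2f}")     # roughly x1.52
```

Under those assumptions, four standard deviations of warming would raise interpersonal-violence risk by about 10 percent and intergroup-conflict risk by about half again — small per-step percentages that stack up quickly.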
I’m not really sure what to do with that number. In 2016, environmental apocalypse scenarios are somehow simultaneously alarmist and desensitizing. If climate change is making people kill one another, we might as well give up and ride this one out until it’s Mad Max out there.
Then again, that’s not exactly what Secretary Ban wrote about Darfur. The “began as” and “at least in part” are important qualifiers. It’s the same logic that the Department of Defense uses when it describes climate change as a “threat multiplier” — never the cause of a conflict, always a lens through which societies are focused and refracted; bent over, eventually, some nameless, violent tipping point. Arable land isn’t something you can import, and when livelihoods depend on crop yields, an increased frequency of drought can make or break families.
But that still doesn’t explain why, say, domestic violence spikes in Australia on hotter days. It certainly doesn’t explain why pitchers are more likely to intentionally hit batters at higher temperatures.
“A lot of people naturally gravitate to the income effect story, but in the U.S., we’re usually looking at daily-scale data,” Burke told me. “We’re comparing a hot day to a normal day — and temperature variation on that kind of time scale just does not, or is very unlikely to, induce variation in income.”
In other words, it’s fairly obvious that a creeping desert and its accompanying heat and drought could dry out a wallet — and that, on average, this financial stress could feasibly influence the likelihood of someone committing a violent act — but it makes less sense that rising temperatures on their own should have anything to do with violence. That hot days should result in more refrigerator punches than normal days. And yet decades of research suggest they do.
Burke put it bluntly: “You can piss people off and get them to be mean to each other if you put them in a room and crank up the heat.”
I wondered if anyone had tried anything like that lately.
It turns out the idea of boiling blood is an old one. Since at least the late 1800s, the descendants of natural philosophers have debated the links between temperature, aggression, and violence. It wouldn’t be until the latter half of the 20th century, though, that psychologists started packing people into rooms and cranking up the heat to see what happened.
Craig Anderson was one of those psychologists. “Even as a child I was interested in aggression,” the Iowa State researcher writes in his contribution to a forthcoming essay collection, ambitiously titled Scientists Making a Difference: The Greatest Living Behavioral and Brain Scientists Talk about Their Most Important Contributions.
What had begun as a childhood fascination with Westerns and World War II shows would develop into a meticulous scholarly pursuit, he writes. As a young Stanford graduate student in the late 1970s, Anderson published his first professional paper, demonstrating a linear relationship between temperature and the likelihood of civil riots in the United States. Pinning down a psychological explanation, though, was easier said than done.
When I called him up, Anderson explained the line of questioning that had fueled his early work: “If this effect is real, what’s going on? Are people just more irritable? Can you replicate this in a lab?”
The problem with that line of experimentation is that people tend to be suspicious of psychologists. “If your research participants know what you’re doing, they start to behave pretty weirdly,” said Anderson. But where previous psychologists had made the game too obvious — “a kerosene heater sitting in the lab,” he recalled — Anderson and his colleagues were a bit more covert. Their solution: Use a shitty office.
“With a set of crappy cubicles in an old building, participants could reasonably believe it was just the building’s heating system acting up,” he said, reflecting on a set of studies his team conducted in the early 1980s. “They couldn’t believe that we had any control over the temperature.” And when the psychologists ran experiments testing for aggressive thinking and hostile feelings in the aging cubicles, lo and behold, the observations held. People were more aggressive at higher temperatures.
Throughout the ’80s and ’90s, Anderson and his collaborators would continue to probe the temperature–aggression link for explanations. As the experimenters plodded on, a theory began to take shape.
It seemed that many of the upticks in aggression came down to over-interpreting a situation. It wasn’t that Anderson’s subjects were automatically more aggressive at higher temperatures — though feelings of hostility often did increase — it was that higher temperatures more easily lent themselves to escalation. In hotter rooms, for example, people would perceive a video as depicting a more aggressive interaction. “That’s what we think is happening in the real-world data at a psychological level,” he said. People over-interpret and overreact. “That’s how bar fights turn into shootings in the parking lot.”
Intuitively speaking, this has the taste of truth. “Hot-headedness” is called what it is for a reason. I know I, for one, am probably more likely to tense up on a crowded bus if it’s a hot day. There’s a fine line between discomfort and a hair trigger.
But the psychological explanation has that twisty tautological rub that comes with a lot of psychological explanations. It might offer clues as to why an increase in heat corresponds with an increase in aggression — because of over-interpretation, perhaps — but it fails to answer why an increase in heat corresponds with an increase in over-interpretation.
Answering that would require zooming the lens in a bit.
So what about the people with the pipettes and the electrodes?
“I think the exact neurophysiology here isn’t that well understood,” said Burke, the Stanford economist. “As far as we can tell from our economist chairs, it doesn’t appear definitive.”
He’s right: It isn’t, and it’s not.
To dive into the state of the relevant brain science, I reached out to the authors of a handful of articles on the neurobiology of aggression. Dongju Seo, a Yale clinical neuroscientist, told me that she wasn’t aware of any evidence related to — or neuroscientists studying the relationships between — temperature and aggression. “It seems to be an interesting topic,” she wrote in an email, “but certainly a less studied area.”
There simply aren’t many people asking these questions. “For some reason,” offered U.C. Davis behavioral neuroscientist Brian Trainor, “funding agencies in the U.S. have decided that aggression is not a priority.” This switch in funding priorities was the “major reason” that Trainor, who now studies behavioral responses to stress, shifted away from aggression research. To a certain extent, trends in federal funding will always dictate researchers’ priorities.
Piecing together the state of neuroscience research on temperature and aggression is a bit like attempting to assemble a jigsaw puzzle without a photo. If there’s any relevant research out there, it’s tucked away outside the purview of researchers like Seo and Trainor, the authors of journal articles titled “Role of serotonin and dopamine system interactions in the neurobiology of impulsive aggression and its comorbidity with other clinical disorders” and “Neural mechanisms of aggression,” respectively.
But let’s see what we can cobble together.
Leave temperature aside for a moment. It’s not entirely clear how aggression manifests itself at the level of the brain, but if you’re a neuroscientist (really, any kind of biologist), one good way to get to the physiological heart of a bodily unknown is to see what happens to it when you add some drugs to the mix. If you understand a given drug’s effects — that is, which proteins it acts on; which chemical cascades it initiates — and if you observe some behavioral difference upon delivery of the drug, it’s usually fair game to assume that the behavior has something to do with the biological tidbits upon which the drug acted.
In the case of aggression, 50 years of pharmacology have taught us plenty, but not enough. We know that people tend to be less aggressive when they take things like chlorpromazine and haloperidol — classic antipsychotic drugs — but that’s mostly because these drugs have a sedative effect. More recently, drugs like risperidone have been shown to be effective in reducing aggression and violence in people with schizophrenia, but risperidone often also causes significant weight gain, metabolic problems, and irritability.
Broadly speaking, though, there’s a clue here. All of the above drugs act on the brain’s dopamine and serotonin signaling systems. And while these (all-too-easy to sensationalize) molecules regulate and color vast arrays of behavior, aggression research in the 1980s and ’90s tended to home in on them, as well. If we were forced to construct a quasi-credible neurobiology of temperature-sensitive aggression, those molecules would be two obvious corner pieces to the puzzle.
We’d also need some hunks of brain matter.
Here, the neuroscience points to the hypothalamus and the limbic system. The hypothalamus, a tiny patch of neural tissue buried deep within the brain, is responsible in part for things like hunger, sleep, and the workings of our hormones. The limbic system is a conglomeration of other cogs of neural machinery — buzzword areas like the amygdala and the hippocampus — thought to be at work when we feel emotions and react to social situations. (The hypothalamus is considered by some neuroscientists to be a part of the limbic system, but we’ll leave that line drawn in the neural sand for now.)
By the turn of the millennium or so, standard neuroscience dogma said that increased levels of serotonin (and related molecules like the parodically named 5-hydroxyindoleacetic acid) in and around these areas were associated with lower levels of aggression. Conversely, increased levels of brain dopamine were associated with higher levels of aggression. And since low serotonin activity was shown to result in dopamine overactivity, there was a reasonably clean link between serotonin, dopamine, and aggression as a whole.
So there: serotonin, dopamine, the hypothalamus, and the limbic system. Four corners to frame the jigsaw picture. Which is something.
But this is about the point where it starts to feel like the puzzle we’ve been handed is missing some pieces, that the box contains pieces to more than one picture, and that several crucial scenes may have been eaten.
Over the past two decades, the serotonin story, like much of neuroscience, has gotten a lot more complicated. While some researchers have continued to link higher serotonin activity with lower levels of aggression, others have demonstrated no effect or even shown that serotonin can increase impulsive aggression. It’s also not just the molecules that matter: It’s the machinery that handles them. It turns out that serotonin receptors — the baseball mitts that catch serotonin molecules when they’re tossed from one neuron to another — are a lot more diverse than we expected.
But even if the first generation of research was right — that higher serotonin levels and lower aggression levels go hand in hand — throwing temperature into the mix mucks things up. Serotonin signaling is closely related to how brains and bodies react to and regulate temperature. And brain serotonin levels peak in the warm, sunny days of summer. If that’s true, why are those the same days we see spikes in violent crime?
Ahmad Hariri, a cognitive neuroscientist at Duke, wrote me that he suspected any apparent correlations between temperature and serotonin in people were probably “confounded by differences in light.” That is, it’s the amount of time the sun spends in the sky in summer that’s the real factor here, not the heat it produces.
Hariri’s former PhD student, Patrick Fisher, now a neuroscientist at Copenhagen University Hospital, corroborated Hariri’s cautionary tales, pointing to other potential confounds like seasonal changes in eating habits. He also passed along a study — conducted on people — that failed to find a relationship between temperature and biological levels of a protein relevant to serotonin signaling. If that relationship isn’t there, the logic goes, then how can temperature exert any kind of action on violence through the serotonergic system?
Nobody knows. It might not.
And as far as I can tell, no neuroscientist is asking these questions. Whether it’s a lack of funding, the generally siloed nature of academia, fear of sensationalization, or some other barrier, the temperature–violence connection hasn’t hit the contemporary neuroscience benches. Conversation after conversation and a slew of relevant Google Scholar and PubMed searches turn up a fat donut’s worth of results.
This is usually around the time one upends the jigsaw table.
That’s not quite the right reflex, though. Neuroscience is almost never a tale of brain juice X + brain area Y = behavior, and this was never going to be a story of “this is your brain on climate change.” What we do know is that rising temperatures alone appear to be associated with increased violence. And that’s reason enough to question the simplicity of the resource scarcity story. It’s also reason enough to take a changing global climate seriously.
It’s one of those things that’s there and isn’t, heat, one of those sticky clouds wedged between a noun and a verb. Noun but not object; verb, but a wall you can feel. Defined in a negation, heat: energy that is not work. That’s the thing about things that are there and aren’t — you’re never quite sure when they’re acting. You never know when you can blame them. And when causality gets a little squishy, it’s easy to pretend the effect doesn’t exist. Can’t stand the heat? Get out of the kitchen.
Back in July, I’m out of the kitchen, on the floor because I don’t yet own chairs, in front of a fan because it is 92 degrees, eating bacon I’m not supposed to be eating because I’m ideologically inconsistent when it comes to climate change and Judaism and vegetarianism. My burn and my knuckles hurt a bit.
I’m still wondering why I punched the fridge.
Marshall Burke might tell me that I was that July day’s data point; a random sample of the distribution that describes the kinds of things that happen at these kinds of temperatures. Craig Anderson might tell me that the heat had primed me toward anger, which in turn had primed me toward retaliation. I might read the papers of people like Dongju Seo and Brian Trainor and tell myself that there was something kooky going on with my limbic serotonin metabolism.
Of course, it’s all of these things and it’s none of them. There’s no way of knowing that, of all the refrigerator punches in all the towns in all the world, the heat had anything to do with this one. I’m mostly still chalking it up to changes. Certainly, the lack of health insurance and that dentistry smell didn’t help.
But blame is one of the closest cousins of responsibility, and responsibility remains the cousin that’s always showing up late to the climate action dinner table. The more blame we can realistically assign, the better. The more evidence that rising temperatures are of import now, not in 2100, the closer the problem hits to home. Don’t upend the puzzle; find a bigger table. The economists have snapped in a few pieces. The psychologists have fiddled with that funny dog-eared piece and found where it fits. Maybe it’s time for a neuroscientist to pull up a chair.
And maybe there’s no relevant brain science here after all. But we should be jumping at every opportunity we have to recenter the climate debate on the individual. It’s only possible to care about tiny shifts in the average — tiny increases in crime rates, for example, or in the risk of civil conflict — if you’re actually convinced you could be part of that shift.
Admitting that the climate might have something to do with the way people behave — might already be exerting that influence, in places like the Sahel, in places like Syria, in places like St. Louis — means recognizing that people are at risk of losing far more than coastal property or crops, and that maybe we really should be throwing ourselves at solutions. The alternative is throwing ourselves at each other: out with a bang and not a whimper after all.