The scientist Paul Crutzen grew tired of the Holocene 24 years ago. The geologic epoch had reigned for 11,700 years, ever since the sprawling ice sheets covering North America and Europe began melting rapidly, and Crutzen thought its time was up. The atmospheric chemist had won a Nobel Prize in 1995 for demonstrating how humanity was destroying the ozone layer, just one of the many ways people had radically altered the planet, from urbanization to the release of greenhouse gases. After repeatedly hearing mention of the Holocene at a scientific conference in Mexico, Crutzen lost his temper.
Interrupting a speaker, he announced that the world had already entered a new age: the Anthropocene. The notion, which a few people had pondered as early as the dawn of the Industrial Revolution, implied that human activity was a powerful geological force, one rivaling volcanoes and asteroids. “My remark had a major impact on the audience,” Crutzen, who died in 2021, recalled years later. “First there was a silence, then people started to discuss this.”
All these years later, they’re still discussing it. Since Crutzen’s outburst, the term “Anthropocene” has proliferated through the sciences, humanities, and pop culture. (The musician Grimes named one of her albums Miss Anthropocene.) The task of formalizing the proposed epoch fell to the International Commission on Stratigraphy, the scientific organization that establishes global standards for such things. The Earth system scientists who became part of the commission’s Anthropocene Working Group, created in 2009, found themselves at the center of a contentious debate as the world looked to them for a definitive answer: Are we living in the Anthropocene?
As differences came to a head, heated discussions, votes, and resignations ensued. The latest drama came last week, when The New York Times reported that, after 15 years of debate, a committee of scholars had rejected the working group’s proposal to declare the Anthropocene an official epoch. Some panelists disputed the report of the leaked vote. A press release shared by the committee’s chair said the alleged vote violated the group’s rules and that an inquiry into annulling it was being launched.
Epochs are typically stretches of several million years designated by clues left in the soil, rocks, and fossils. Even as some headlines suggested that the decision meant Earth hasn’t entered the “age of humans,” geologists and other experts say the vote shouldn’t change how people talk about humanity’s influence on the planet. Since the Industrial Revolution, climate change has warmed the globe by about 2.2 degrees Fahrenheit (1.2 degrees Celsius) and counting. The Anthropocene might still be considered a geological “event,” a more flexible term that would put it on par with major transformations such as the Great Oxidation Event, when oxygen became a major component of Earth’s atmosphere more than 2 billion years ago.
“Nobody is saying that global change caused by people is not significant,” said Erle C. Ellis, an environmental scientist at the University of Maryland, Baltimore County, and a former member of the working group. “It’s just about whether we should narrow down the Anthropocene definition to an epoch beginning in 1952.”
The trouble with pinning down a date stems from the fact that humans have been changing the planet for a long time. The comparatively calm, warm conditions of the Holocene encouraged the development of agriculture. People planted crops and built cities, expanding civilization until they started remaking Earth itself. Somewhere in the midst of all this activity — the clearing of the world’s forests, the rampant burning of fossil fuels, and the testing of nuclear bombs — scientists say the planet entered the Anthropocene.
Last year, the working group settled on 1952 after identifying Crawford Lake in Ontario, Canada, as the best place to find proof of this new age. Nearly 80 feet under the glassy surface, the layers of mud at the bottom almost perfectly preserve hints of human history, such as maize pollen from nearby Indigenous settlements of the 13th century and charcoal from a local logging mill in the 19th. A layer of radioactive plutonium from mid-century nuclear weapons testing was selected as a potential “golden spike,” the term for a unique geological signature that typically marks the start of a new epoch.
The group found “overwhelming evidence that there was a fundamental change in how the planet works around that time,” said Francine McCarthy, an Earth scientist at Brock University in Ontario, who led a team of 60 researchers collecting and examining sediment samples at Crawford Lake. For McCarthy, the controversial vote is a frustrating end to many years of work. “I don’t care what other people feel should be the beginning of the Anthropocene,” she said. “We spent 15 years, millions of euros to answer the question.”
The awareness that humans might be irrevocably changing their home dates to the dawn of the Industrial Revolution, when steam engines started chugging and factories proliferated. George Perkins Marsh, an early American conservationist, declared that humans were reshaping Earth — for the worse — in his influential 1864 book Man and Nature. In 1873, Antonio Stoppani, an Italian priest and geologist, proclaimed that the “Anthropozoic era” had begun, with no end in sight.
Scholars started reckoning with the concept in earnest after World War II, when radiocarbon from nuclear bomb blasts settled into the rock record. In 1955, 70 researchers from around the world met in Princeton, New Jersey, for a symposium on “Man’s Role in Changing the Face of the Earth,” the first large-scale discussion of how humans had transformed the environment. But it wasn’t until 2000 that the idea really took off. That year, Crutzen, along with the diatom researcher Eugene F. Stoermer, proposed the term “Anthropocene” for the current geological epoch in a newsletter article.
Since then, the word’s use has steadily increased, and Google search interest has followed a similar pattern. By 2014, it had landed in the Oxford English Dictionary. In a surprising twist, the obscure-sounding term had escaped academic journals and entered the domain of popular culture, inspiring artists, novelists, and musicians. The scientists in the Anthropocene Working Group even became stars in a documentary.
For Dipesh Chakrabarty, a historian at the University of Chicago, hearing the term for the first time was career-changing. Branching out from his work on South Asian history, he began to write about climate change, eventually publishing a handful of books on the subject. “I was very struck that geologists were describing humanity as a geological force,” he said. “That is something much bigger than what I used to imagine humans to be.”
Not all scholars love the term, though. Some argue that it spreads the blame for climate change and other environmental problems too broadly, implicating everyone instead of the countries and industries most responsible. The discussion has spawned a host of bizarre-sounding spin-offs: the Capitalocene (blaming capitalism), the Plantationocene (blaming plantation agriculture), and the Occidentalocene (blaming rich, industrialized countries).
Ellis used to think that formalizing the Anthropocene as an epoch was a good idea, but last summer, after 14 years of heated discussion and a dispute over narrowing the start date to the early 1950s, he resigned from the Anthropocene Working Group. Now he thinks that restricting the term to a technical definition may only confuse the public. The quest to establish the Anthropocene as an epoch was “an experiment in how to communicate the science of humans changing the planet,” Ellis said. “And it is a failed experiment.”
“We can still use the term as we always have,” he said, in the wake of the commission’s vote. “We just don’t need an epoch to attach to it.”