The “Epic Row” Over a New Epoch

A few months into the third millennium, a group called the International Geosphere-Biosphere Programme (I.G.B.P.) held a meeting in Cuernavaca, Mexico. Among the researchers in attendance was Paul Crutzen, an atmospheric chemist best known for his research on ozone-depleting chemicals, such as chlorofluorocarbons. For this work, Crutzen, a Dutchman living in Germany, had received a Nobel Prize, in 1995. In his Nobel lecture, he noted that, given humanity’s heedlessness, it had got off lightly. Millions of pounds of CFCs had been released into the air before anyone had considered the possible consequences. As a result of the chemicals’ behavior in the stratosphere, a “hole” had opened up in the ozone layer over Antarctica. But, if CFCs had turned out to behave just slightly differently, the hole would have stretched from pole to pole before scientists had even had the tools to measure it.

“I can only conclude that mankind has been extremely lucky,” Crutzen said.

At the I.G.B.P. meeting in Cuernavaca, Crutzen found himself growing agitated. His colleagues kept referring to the Holocene, the geological epoch that began at the close of the last ice age, about twelve thousand years ago. At the dawn of the Holocene, the global population was maybe four million—barely enough to fill a city like Sydney or St. Petersburg. By the time of the meeting in Mexico, there were more than six billion people on the planet, and human activity was fundamentally altering such basic Earth processes as the carbon cycle.

“Stop using the word ‘Holocene,’ ” Crutzen blurted out. “We’re not in the Holocene any more. We’re in the . . . ” He paused, searching for the right word. “We’re in the Anthropocene!” During the next coffee break, Crutzen’s neologism was the main topic of conversation. Someone suggested that he copyright the term.

As it turned out, the Anthropocene wasn’t Crutzen’s to claim. Eugene Stoermer, a biologist at the University of Michigan, had coined the word back in the nineteen-eighties, out of much the same frustration. Crutzen got in touch with Stoermer, and the two wrote an essay for the I.G.B.P. newsletter, laying out their case for a new age. Human activities, the pair argued, were altering the planet faster and more dramatically than the geological forces that had shaped it for most of its history.

“It seems to us more than appropriate to emphasize the central role of mankind” by using “the term ‘anthropocene’ for the current geological epoch,” the pair wrote. Not many people read the I.G.B.P. newsletter, so in 2002 Crutzen refashioned the essay for the journal Nature. He listed some of the ways that humans were altering the planet: chopping down rain forests, messing with the climate, and manufacturing novel chemicals, such as CFCs. Once again, Crutzen stressed how fortunate humanity had been so far. Had the ozone layer sustained more damage, large parts of the world could have been rendered uninhabitable. “More by luck than by wisdom, this catastrophic situation did not develop,” he observed.

Many researchers found Crutzen and Stoermer’s term useful. Soon the word “Anthropocene” began popping up in scientific papers. This, in turn, piqued the interest of stratigraphers—the subset of geologists who maintain the planet’s official timetable, the International Chronostratigraphic Chart. Had the Earth really entered a new epoch, in the stratigraphic sense of the term? And, if so, when? The International Commission on Stratigraphy (I.C.S.) set up the Anthropocene Working Group (A.W.G.) to look into the matter. It was still working away last month, when, in a vote that one group member described to me as “Putinesque,” a subcommittee of the I.C.S. decided against adding the Anthropocene to the timetable. The vote might have marked the end of the story, were it not that it was probably just the beginning. As another geologist put it to me, “Voting down the Anthropocene is a bit like trying to vote down plate tectonics. It’s real, it’s there, and we are going to have to deal with it.”

Stratigraphers are used to thinking in vast stretches of time. The International Chronostratigraphic Chart starts with the Hadean eon, which began with the birth of the planet, 4.5 billion years ago. The Hadean lasted five hundred million years and was succeeded by the Archean eon, which went on (and on and on) for 1.5 billion years. The Permian period spanned nearly fifty million years, the Cretaceous period eighty million. Within these periods there were many sub-periods—technically known as epochs—which also lasted a long time. The Cisuralian epoch of the Permian, for example, stretched over twenty-six million years.

But, the closer the chart gets to the present, the narrower the divisions become. The second most recent geological period, the Neogene, lasted just twenty million years. The current period, the Quaternary, began with the start of the ice ages, a mere 2.58 million years ago. The Quaternary is further divided into two epochs—the Pleistocene, which spanned 2.57 million years, and the Holocene, which, for now, is still ongoing.

To mark the boundaries between the various epochs and periods, the I.C.S. relies on what are formally called “global boundary stratotype sections and points” and informally known as “golden spikes.” For the most part, golden spikes are layers of rock that contain evidence of some notable shift in Earth’s history—a reversal of the planet’s magnetic poles, say, or the disappearance of a fossilized species. The golden spike for the start of the Triassic period, for example, is a layer of rock found in Meishan, China, and the shift it records is a mass extinction that killed off something like ninety per cent of all species on Earth. (The Chinese have set up a park in Meishan, where visitors can view the two-hundred-and-fifty-million-year-old rock layer in an exposed cliffside.) With golden spikes, again, the closer you get to the present, the more the present intrudes. In the case of the Holocene, the golden spike is a layer in an ice core from Greenland that’s stored in a freezer in Copenhagen. The layer consists of the compressed remains of snow that fell eleven thousand seven hundred years ago, which corresponds to the end of a cold snap known as the Younger Dryas.

With the exception of the Holocene, the start dates for geological ages have been determined millions of years after the fact. This means that whatever signal is being used to set them has withstood the test of time. The rocks of the Anthropocene, of course, do not yet exist. When the Anthropocene Working Group was formed, in 2009, its first task was to decide whether human impacts on the planet would still be discernible millions of years from now.

After several years of study, the group decided that the answer was yes. The carbon emissions from burning fossil fuels will leave a permanent signature in the rocks of the future, as will the fallout from nuclear testing. Novel ecosystems that people have created by moving plants and animals around the world will produce novel fossil assemblages. Meanwhile, traces of some of the trillions of tons of stuff humans have generated, from transistors to tanker ships, will be preserved, meaning that a whole new class of fossils will appear in the record—so-called technofossils. Before aluminum smelting was invented, in the nineteenth century, aluminum existed on Earth only in combination with other elements. Future geologists will thus be able to distinguish the current epoch via the remains of beer cans—the Bud Light layer.

These and other “distinctive attributes of the recent geological record support the formalization of the Anthropocene as a stratigraphic entity,” members of the A.W.G. noted in a paper that appeared in Science in 2016.

When Crutzen and Stoermer initially proposed the Anthropocene, they suggested that it had begun with the first stirrings of the Industrial Revolution, in the late eighteenth century. The A.W.G. considered this possibility, but ultimately rejected it. In the decades following the Second World War, resource consumption skyrocketed—a development that’s become known as the Great Acceleration. The fantastic growth in the production of new materials such as aluminum and plastic, the group decided, made a date closer to 1950 a more logical starting point for the new epoch.

Last summer, under pressure from the International Commission on Stratigraphy to finish its work, the A.W.G. announced its proposal for a golden spike. It chose a marker similar to the one used for the base of the Holocene, although, in this case, the core came not from an ice sheet but from a lake bottom.

Crawford Lake, which is about thirty miles southwest of Toronto, is what’s known as meromictic, which means that its top and bottom waters don’t mix. As a result of this and other unusual qualities, everything that falls into the lake, from pollen grains to radioactive particles, gets preserved in layers of sediment that can be very precisely dated. The idea was to designate the base of the Anthropocene as the layer of Crawford Lake sediment laid down in 1952—and, more specifically, as the 1952 layer preserved in one particular core kept in a freezer in Quebec. (The United States conducted the first H-bomb tests in 1952, and the fallout from these clearly shows up in the lake bed as a spike in plutonium.) The working group announced its choice of the Crawford Lake core while stratigraphers from around the world were gathered for a conference in Lille, France. But, in a sign of things to come, the group was barred from making the announcement at the conference hall and had to rent a room in a nearby hotel.
