It’s literally epoch-defining news. A group of experts tasked with considering the question of whether we have officially entered the Anthropocene – the geological age characterised by humans’ influence on the planet – has delivered its answer: yes.
The British-led Working Group on the Anthropocene (WGA) told a geology conference in Cape Town that, in its considered opinion, the Anthropocene epoch began in 1950 – the start of the era of nuclear bomb tests, disposable plastics and the human population boom.
The Anthropocene has fast become an academic buzzword and has achieved a degree of public visibility in recent years. But the more the term is used, the more confusion reigns, at least for those not versed in the niceties of the underpinning science.
Roughly translated, the Anthropocene means the “age of humans”. Geologists examine layers of rock called “strata”, which tell a story of changes to the functioning of Earth’s surface and near-surface processes, be these oceanic, biological, terrestrial, riverine, atmospheric, tectonic or chemical.
When geologists identify boundaries between layers that appear to be global, those boundaries become candidates for formal recognition by the International Commission on Stratigraphy (ICS). The commission produces the International Chronostratigraphic Chart, which delimits verified changes during the planet’s 4.5 billion-year evolution.
The chart features a hierarchy of terms like “system” and “stage”; generally, the suffix “cene” refers to a geologically brief stretch of time and sits at the bottom of the hierarchy. We have spent the past 11,500 years or so living in the so-called Holocene epoch, the interglacial period during which Homo sapiens has flourished.
If the Holocene has now truly given way to the Anthropocene, it’s because a single species – us – has significantly altered the character of the entire hydrosphere, cryosphere, biosphere, lithosphere and atmosphere.
The end of an era?
Making this call is not straightforward, because the Anthropocene proposition is being investigated in different areas of science, using different methods and criteria for assessing the evidence. Despite its geological ring, the term Anthropocene was coined not by a geologist, but by the Nobel Prize-winning atmospheric chemist Paul Crutzen in 2000.
He and his colleagues in the International Geosphere-Biosphere Program have amassed considerable evidence about changes to everything from nutrient cycles to ocean acidity to levels of biodiversity across the planet.
Comparing these changes to those occurring during the Holocene, they concluded that we humans have made an indelible mark on our one and only home. We have altered the Earth system qualitatively, in ways that call into question our very survival over the coming few centuries.
Crutzen’s group talks of the post-1950 period as the “Great Acceleration”, when a range of factors – from human population numbers, to disposable plastics, to nitrogen fertiliser – began to increase exponentially. But their benchmark for identifying this as a significant change has nothing to do with geological stratigraphy. Instead, they ask whether the present period is qualitatively different to the situation during the Holocene.
Meanwhile, a small group of geologists has been investigating the stratigraphic evidence for the Anthropocene. A few years ago a subcommission of the ICS set up the WGA, which has now suggested that human activity has left an indelible mark on the stratigraphic record.
The major problem with this approach is that no such signal is yet captured in rock. Humans have not been around long enough for any planet-wide impacts to be evident in Earth's geology itself. This means that any evidence for a Holocene-Anthropocene boundary must, for now, be sought in less permanent media such as ice sheets, soil layers or ocean sediments.
The ICS has always considered evidence for boundaries that pertain to the past, usually the deep past. The WGA is thus working against convention by looking for present-day stratigraphic markers that might demonstrate humans’ planetary impact. Only in thousands of years’ time might future geologists (if there are any) confirm that these markers are geologically significant.
In the meantime, the group must be content to identify specific calendar years in which significant human impacts became evident. One candidate is 1945, when the Trinity atomic device was detonated in New Mexico. This and subsequent bomb tests have left global markers of radioactivity that ought still to be evident in 10,000 years.
Alternatively, geographers Simon Lewis and Mark Maslin have suggested that 1610 might be a better candidate for a crucial human-induced step change. That was the year when atmospheric carbon dioxide dipped markedly, arguably a human fingerprint: forests regrew on farmland abandoned after European colonisation devastated indigenous American populations, although this idea is contested.
The fact that the WGA has picked a more recent date, 1950, suggests that it agrees with the idea of defining the Great Acceleration of the latter half of the 20th century as the moment we stepped into the Anthropocene.
It’s not a decision that is taken lightly. The ICS is extremely scrupulous about amending the International Chronostratigraphic Chart. The WGA’s suggestion will face a rigorous evaluation before it can be scientifically accepted by the commission. It may be many years before it is formally ratified.
Elsewhere, the term is fast becoming a widely used description of how people now relate to our planet, rather like the Iron Age or the Renaissance. These words describe real changes in history and enjoy widespread use in academia and beyond, without the need for rigorously defined “boundary markers” to delimit them from prior periods.
Does any of this really matter? Should we care that the jury is still out in geology, while other scientists feel confident that humans are altering the entire Earth system?
Writing on The Conversation, geologist James Scourse suggests not. He feels that the geological debate is “manufactured” and that humans’ impact on Earth is sufficiently well recognised that we have no need of a new term to describe it.
Clearly, many scientists beg to differ. A key reason, arguably, is the failure of virtually every society on the planet to acknowledge the sheer magnitude of the human impact on Earth. Only last year did we finally negotiate a truly global treaty to confront climate change.
In this light, the Anthropocene allows scientists to assemble a set of large-scale human impacts under one graphic conceptual banner. Its scientific status therefore matters a great deal if people worldwide are at long last to wake up to the environmental effects of their collective actions.
But the more loosely scientists use the term, the more the scientific credibility of the Anthropocene proposition is likely to be called into question. Here the recent history of climate science in the public domain is instructive.
Even more than the concept of global warming, the Anthropocene is provocative because it implies that our current way of life, especially in wealthy parts of the world, is utterly unsustainable. Large companies that profit from environmental despoliation – oil multinationals, chemical companies, car makers and countless others – have much to lose if the concept becomes linked with political agendas devoted to things like degrowth and decarbonisation. When one considers the organised attacks on climate science in the United States and elsewhere, it seems likely that Anthropocene science will be challenged on ostensibly scientific grounds by non-scientists who dislike its implications.
Sadly, such attacks are likely to succeed. In geology, the WGA’s unconventional proclamation potentially leaves any ICS definition open to challenge. If accepted, it also means that all indicators of the Holocene would now have to be referred to as things of the past, despite evidence that the transition to a human-shaped world is not quite complete in some places.
Some climate contrarians still refuse to accept that researchers can truly distinguish a human signature in the climate. Similarly, scientists who address themselves to the Anthropocene will doubtless face questions about how much these changes to the planet are really beyond the range of natural variability.
If “Anthropocene sceptics” gain the same momentum as climate deniers have enjoyed, they will sow seeds of confusion into what ought to be a mature public debate about how humans can transform their relationship with the Earth. But we can resist this confusion by recognising that we don’t need the ICS’s imprimatur to appreciate that we are indeed waving goodbye to Earth as we have known it throughout human civilisation.
We can also recognise that Earth system science is not as precise as nuclear physics or geometry. This lack of precision does not mean that the Anthropocene is pure scientific speculation. It means that science knows enough to sound the alarm, without knowing all the details about the unfolding emergency.
The Anthropocene deserves to become part of our lexicon – a way we understand who we are, what we’re doing and what our responsibilities are as a species – so long as we remember that not all humans are equal contributors to our planetary maladies, with many being victims.