Genetic studies reveal diversity of early human populations – and pin down when we left Africa

By George Busby, University of Oxford.

Humans are a success story like no other. We are now living in the “Anthropocene” age, meaning much of what we see around us has been made or influenced by people. Amazingly, all humans alive today – from the inhabitants of Tierra del Fuego on the southern tip of the Americas to the Sherpa in the Himalayas and the mountain tribes of Papua New Guinea – came from one common ancestor.

We know that our lineage arose in Africa and quickly spread to the four corners of the globe. But the details are murky. Was there just one population of early humans in Africa at the time? When exactly did we first leave the continent and was there just one exodus? Some scientists believe that all non-Africans today can trace their ancestry back to a single migrant population, while others argue that there were several different waves of migration out of Africa.

Now, three new studies mapping the genetic profiles of more than 200 populations across the world, published in Nature, have started to answer some of these questions.

Out of Africa

Humans initially spread out of Africa through the Middle East, ranging further north into Europe, east across Asia and south to Australasia. They later spread north-east over the top of Beringia into the Americas. We are now almost certain that on their way across the globe, our ancestors interbred with at least two archaic human species: the Neanderthals in Eurasia, and the Denisovans in Asia.

Genetics has been invaluable in understanding this past. While hominin fossils hinted that Africa was the birthplace of humanity, it was genetics that proved this to be so. Patterns of genetic variation – how similar or different people’s DNA sequences are – have not only shown that most of the diversity we see in humans today is present within Africa, but also that there are fewer differences within populations the further you get from Africa.

These observations support the “Out of Africa” model: the idea that a small number of Africans moved out of the continent, taking a much-reduced gene pool with them. This genetic bottleneck, and the subsequent growth of non-African populations, meant that there was less genetic diversity to go round, and so there are fewer differences, on average, between the genomes of non-Africans compared to Africans.

When we scan two genomes to identify where these differences, or mutations, lie, we can estimate how long in the past those genomes split from each other. If two genomes share long stretches with no differences, it’s likely that their common ancestor was in the more recent past than the ancestor of two genomes with shorter shared stretches. By interrogating the distribution of mutations between African and non-African genomes, two of the papers just about agree that the genetic bottleneck caused by the migration out of Africa occurred roughly 60,000 years ago. This is also broadly in line with dating from archaeological investigations.
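The arithmetic behind such estimates is straightforward to sketch. The snippet below is a deliberately simplified illustration, not the method used in the papers: the mutation rate and generation time are commonly assumed round figures, and real analyses must also model recombination, sequencing error and changing population size.

```python
# Back-of-the-envelope divergence dating from the density of differences
# between two genome sequences. The constants are illustrative assumptions,
# not values taken from the Nature studies.

MUTATION_RATE = 1.25e-8   # assumed mutations per base pair per generation
GENERATION_TIME = 29      # assumed years per human generation

def split_time_years(num_differences, sequence_length_bp):
    """Estimate how long ago two sequences shared a common ancestor.

    Mutations accumulate independently on both lineages after the split,
    hence the factor of 2 in the denominator.
    """
    generations = num_differences / (2 * MUTATION_RATE * sequence_length_bp)
    return generations * GENERATION_TIME

# Two genomes differing at 52 sites across a 1-megabase stretch would,
# under these assumptions, have split roughly 60,000 years ago.
estimate = split_time_years(52, 1_000_000)
```

Longer shared stretches with fewer differences push the estimate towards the present, which is exactly the intuition described above.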

Their research also manages to settle a long-running debate about the structure of African populations at the beginning of the migration. Was the small group of humans who left Africa representative of the whole continent at that time, or had they split off from more southerly populations earlier?

SGDP model of the relationships among diverse humans (select ancient samples are shown in red) that fits the data.
Credit: Swapan Mallick, Mark Lipson and David Reich.

The Simons Genome Diversity Project compared the genomes of 142 worldwide populations, including 20 from across Africa. It conclusively shows that modern African hunter-gatherer populations split from the group that became non-Africans around 130,000 years ago, and from West Africans around 90,000 years ago. This indicates that there was substantial population substructure in Africa prior to the wave of migration. A second study, led by the Danish geneticist Eske Willerslev, with far fewer African samples, used similar methods to show that divergence within Africa also started before the migration, around 125,000 years ago.

More migrations?

Following the move out of the continent, the pioneers must then have journeyed incredibly quickly to Australia. The Danish study, the most comprehensive analysis of Aboriginal Australian and Papuan genomes to date, is the first to really examine the position of Australia at the end of the migration.

They found that the ancestors of populations from “Sahul” – Tasmania, Australia and New Guinea – split from the common ancestor of Europeans and Asians 51,000-72,000 years ago. This is prior to Europeans and Asians splitting from each other around 29,000-55,000 years ago, and almost immediately after the move out of Africa. It implies that the group of people who ended up in the Sahul split away from others almost as soon as the initial group left Africa. Substantial mixing with Denisovans is only seen in Sahulians, which is consistent with this early split.

Crucially, because the ancestors of modern-day Europeans and Asians had not yet split in two at this point, they must still have been somewhere in western Eurasia. This means that there must have been a second migration from west Eurasia into east Asia later on. The Simons Genome Diversity Project study, by contrast, albeit with a far smaller sample of Sahulian genomes, found no evidence for such an early Sahulian split. It instead shows that the ancestors of East Asians and Sahulians split from western Eurasians before splitting from each other, and therefore that Denisovan admixture occurred after East Asians and Sahulians had separated.

A graphic representation of the interaction between modern and archaic human lines, showing traces of an early out of Africa (xOoA) expansion within the genome of modern Sahul populations.
Credit: Dr Mait Metspalu at the Estonian Biocentre, Tartu, Estonia.

Meanwhile, a third paper proposes an earlier, “extra” migration out of Africa, some 120,000 years ago. This migration is only visible in the genomes of a separate set of Sahulians sequenced as part of the Estonian Biocentre Human Genome Diversity Panel. Only around 2 per cent of these genomes can be traced to this earlier migration event, which implies that it left few descendants in the present day. If true (the two other papers find little support for it), this suggests that there must have been a migration across Asia prior to the big one about 60,000 years ago, and that anatomically modern humans left Africa earlier than many think.

Whatever the reality of the detail of the Out of Africa event, these studies provide some benchmarks for the timings of some of the key events. Importantly, they are also a huge resource of over 600 new and diverse human genomes that provide the genomics community with the opportunity for further understanding of the paths our ancestors took towards the Anthropocene.

George Busby, Research Associate in Statistical Genomics, University of Oxford

This article was originally published on The Conversation. Read the original article.

Now, Check Out:

Alligators Are Ancient: New Studies Show They Haven’t Evolved for Over 8 Million Years

While many of today’s top predators are more recent products of evolution, the modern American alligator is a reptile from another time.

New research shows these prehistoric-looking creatures have remained virtually untouched by major evolutionary change for at least 8 million years, and may be up to 6 million years older than previously thought. Besides some sharks and a handful of others, very few living vertebrate species have such a long duration in the fossil record with so little change.

“If we could step back in time 8 million years, you’d basically see the same animal crawling around then as you would see today in the Southeast. Even 30 million years ago, they didn’t look much different,” says Evan Whiting, a former University of Florida undergraduate and the lead author of two studies published during summer 2016 that document the alligator’s evolution—or lack thereof.

“We were surprised to find fossil alligators from this deep in time that actually belong to the living species, rather than an extinct one,” he says.

Alligators and humans

Whiting, now a doctoral student at the University of Minnesota, describes the alligator as a survivor, withstanding sea-level fluctuations and extreme changes in climate that would have caused some less-adaptive animals to rapidly change or go extinct. Whiting also discovered that early American alligators likely shared the Florida coastline with a 25-foot now-extinct giant crocodile.

In modern times, however, he said alligators face a threat that could hinder the scaly reptiles’ ability to thrive like nothing in their past—humans.

Despite their resilience and adaptability, alligators were nearly hunted to extinction in the early 20th century. The Endangered Species Act has significantly improved the number of alligators in the wild, but there are still ongoing encounters between humans and alligators that are not desirable for either species and, in many places, alligator habitats are being destroyed or humans are moving into them, Whiting says.

“The same traits that allowed alligators to remain virtually the same through numerous environmental changes over millions of years can become a bit of a problem when they try to adapt to humans,” Whiting says. “Their adaptive nature is why we have alligators in swimming pools or crawling around golf courses.”

Whiting hopes his research findings serve to inform the public that the alligator was here first, and we should act accordingly by preserving the animal’s wild populations and its environment. By providing a more complete evolutionary history of the alligator, his research provides the groundwork for conserving habitats where alligators have dominated for millions of years.

“If we know from the fossil record that alligators have thrived in certain types of habitats since deep in time, we know which habitats to focus conservation and management efforts on today,” Whiting says.

Giant crocodiles

The researchers began re-thinking the alligator’s evolutionary history after Whiting examined an ancient alligator skull, originally thought to be an extinct species, unearthed in Marion County, Florida, and found it to be virtually identical to the iconic modern species. He compared the ancient skull with dozens of other fossils and modern skeletons to look at the whole genus and trace major changes, or the lack thereof, in alligator morphology.

Whiting also studied the carbon and oxygen compositions of the teeth of both ancient alligators and the 20- to 25-foot extinct crocodile Gavialosuchus americanus that once dominated the Florida coastline and died out about 5 million years ago for unknown reasons. The presence of alligator and Gavialosuchus fossils at several localities in north Florida suggests the two species may have coexisted in places near the coast, he says.

Analysis of the teeth suggests, however, that the giant croc was a marine reptile, which sought its prey in ocean waters, while alligators tended to hunt in freshwater and on land. That doesn’t mean alligators weren’t occasionally eaten by the monster crocs, though.

“Evan’s research shows alligators didn’t evolve in a vacuum with no other crocodilians around,” says coauthor David Steadman, ornithology curator at the Florida Museum of Natural History at the University of Florida. “The gators we see today do not really compete with anything, but millions of years ago it was not only competing with another type of crocodilian, it was competing with a much larger one.”

Steadman says the presence of the ancient crocodile in Florida may have helped keep the alligators in freshwater habitats, though it appears alligators have always been most comfortable in freshwater.

While modern alligators do look prehistoric, study authors say they are not somehow immune to evolution. On the contrary, they are the result of an incredibly ancient evolutionary line. The group they belong to, Crocodylia, has been around for at least 84 million years and has diverse ancestors dating as far back as the Triassic, more than 200 million years ago.

Whiting’s studies were published in the Journal of Herpetology and in Palaeogeography, Palaeoclimatology, Palaeoecology.

Source: Republished as a derivative work under the Attribution 4.0 International license. Original article posted to Futurity.

Now, Check Out:

Simulating evolution: how close do computer models come to reality?

By Christoph Adami, Michigan State University.

Darwin’s theory of evolution is a simple but powerful framework that explains how complexity can come from simplicity: how everything biological around us – from the microbial biofilms on your teeth to the majestic redwood trees – emerged from the very simplest of beginnings.

How exactly this happened is, of course, a matter of intense research. Each species is finely adapted to thrive in its environment, which in turn has shaped that species’ evolutionary history. But those environmental forces exerted on a species occurred over a very long period of time, in the often very distant past. How can we understand which environmental features were responsible for which adaptations we see today?

As an example, my research group recently got interested in what makes people dislike taking risks. Of course we can’t travel through time to go back and run a controlled experiment on our early human ancestors to see how that tendency might have evolved. But as scientists, we want to do more than just come up with an untestable hypothesis.

So we turned to computers to simulate the dynamics of ancient people for thousands of generations. By carefully choosing the starting parameters for our computer simulation, we were able to see how in small groups of about 150 people – the size common during the Stone Age – gambles that pay off big time (but only rarely) end up being genetically costly. We also found that risky behavior had no consequences as long as populations were large. I can’t think of another way an evolutionary study like this could have been carried out. Here’s why we can believe what these kinds of computer simulations tell us.
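To give a flavour of what such a simulation involves, here is a minimal toy sketch of the general idea. This is not the authors' model: the payoffs, probabilities, generation counts and population sizes are all invented for illustration.

```python
import random

# Toy evolutionary simulation of a risk-taking trait. Each agent carries a
# heritable strategy; offspring are drawn in proportion to parental payoff.
# All parameter values are invented for this sketch.

def fitness(risky, rng):
    if not risky:
        return 1.0                        # safe strategy: a steady payoff
    # Risky strategy: a big payoff that arrives only rarely
    # (same average payoff as playing it safe).
    return 10.0 if rng.random() < 0.1 else 0.0

def final_risky_frequency(pop_size, generations=200, seed=42):
    rng = random.Random(seed)
    # Found the population with half risk-takers, half risk-avoiders.
    pop = [i < pop_size // 2 for i in range(pop_size)]
    for _ in range(generations):
        weights = [fitness(risky, rng) for risky in pop]
        if not any(weights):
            # Every gambler lost and no safe agents remain: redraw evenly.
            weights = [1.0] * pop_size
        pop = rng.choices(pop, weights=weights, k=pop_size)
    return sum(pop) / pop_size

# In a Stone Age-sized group of ~150, a run of bad luck can wipe out the
# risky lineage entirely; in a much larger population the gamble averages
# out across many individuals.
small = final_risky_frequency(150)
large = final_risky_frequency(15_000)
```

The key point the sketch captures is variance, not average payoff: in small groups, chance runs of failure matter far more, which is the intuition behind the result described above.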

Passing on a constant flux of traits

Darwin’s theory of evolution is simple in the sense that it requires only three necessary (and sufficient) components for the process to work: inheritance, variations and differential survival (sometimes called “selection”).

Natural selection is one mechanism for how evolution happens. Elembis, CC BY-SA

Inheritance guarantees that anything new discovered by the process is not lost. Variation ensures that new things are being tried out constantly. And differential survival implies that differences matter – variations that help rather than hurt have consequences for the descendants of the first individual that carried that beneficial change.

But even though these principles are straightforward, how they play out in a complex world is far from simple. We might be able to work out in our head how one beneficial change (say, a larger body size that allows an individual to withstand a predator’s assaults) can also have negative consequences (more time spent foraging to support the body weight exposes the individual to more predation). Such simple trade-offs can be captured by mathematical formulas, and their consequences can be worked out.

But in real biology, every single trait could conceivably affect every other. It’s not easy to work out the net benefit of a set of traits, either in your head or with mathematics. This is where computers come in.

Computers run through scenarios, fast

What computers really do within scientific research is often misrepresented or misunderstood. I frequently hear the phrase: “With a computer, you can get any result you want.” But this is not true. What a computer does is keep track of things for you.

To a large extent, this is what mathematics does too. I like to point out that mathematics is “the crutch of the feeble-minded”; it allows us to use symbols to embody complex relationships that we can then manipulate according to strict rules.

The computer is no different, except it allows us to keep track of vastly more variables, and to work out the consequences of the relationships over long periods of time. Since we set strict rules, of course, we can’t get “anything we want.” We get only what is allowed according to the rules.

But what are those rules?

In mathematics, you start with a set of assumptions, and you work out the consequences according to the rules of logic. This is still true inside a computer, but now we can also implement very specific rules – for example, the laws of chemistry, the effects of friction or the cost of finding a mate.

Researchers in a variety of fields turn to computer simulations to help them test ideas that they can’t investigate any other way. Astrophysicists use these kinds of models to simulate how stars form. Material scientists simulate the aging of nuclear weapons to predict if they will still work in the future.

In evolutionary biology, we might ask which factor shaped a particular trait or behavior. For instance, my colleague Kay Holekamp has been observing hyenas in Kenya for over 25 years, and she’s collected an enormous data set pertaining to the hunting habits (among other traits) of these animals. But even all those observations can’t tell us why she sees what she sees in the field. The reasons may lie in pressures that the population was under in the past, or maybe the pressures manifest themselves only over thousands of generations.

Even decades of observation leave us with questions about why animals behave in certain ways. Anne Engh, CC BY-NC-ND

To answer questions such as “Why don’t the highest-ranking female hyenas participate in the hunt?,” we have to study the consequences of different assumptions on the long-term survival of the group.

Evolutionary theory says that only beneficial traits survive in the long run, but it can often be hard to understand how a certain trait might help. This is because of all those trade-offs I mentioned, and sometimes the benefit of a trait only becomes clear after a long time. After all, evolution has had millions of years of trials, failures and successes. Even 50 years of observation might not reveal to us the long-term consequences of a set of traits and how they interact and play out in a complex world.

But a computer might work this out in minutes, as a population of 1,000 gazelles and a group of, say, 150 hyenas can be followed over thousands of simulated generations.

Matching theory to observation

In evolutionary science, computers thus are prediction machines: they answer questions like “What would happen under these rules, given I started in this world with these starting conditions?”

In our study of the evolutionary origins of risk aversion, for example, we could ask what happens to risk aversion if the total population was large, but composed of small groups with migration between them. Running the scenario, we found that risk aversion still evolved unless the migration rate was exceedingly high.

Of course, if you start with the wrong rules, or inappropriate starting conditions, the results may not match what we observe in reality. But this is exactly what we require in the scientific process. If the predictions are wrong, then we must modify either the rules, or the initial conditions (or both).

Once we do obtain a match between the computer simulations and real-world observations, we can’t stop there and conclude we’ve discovered the rules that correctly reflect what is happening in nature. We must, instead, test whether these rules also predict other things that we didn’t set out to test in the first place. For example, do the same set of rules also explain the observation that the spoils of a kill are not distributed equally among the hyenas?

This kind of thinking is no different from the way theory and experiment have worked in unison to build the complex and powerful framework of theoretical physics. In that quest, theories were laid down, for the most part, mathematically. In evolutionary biology, though, this is usually not possible simply because biology is too complicated.

Evolutionary simulations allow us to test hypotheses, but they’re not asking or even answering questions. We ask “What if,” and the computer dutifully responds: “In this case, this is what you would get.” The computer helps us “think forward in time” with blazing speed, and in evolutionary science this is precisely what is required to generate understanding.

Christoph Adami, Professor of Microbiology and Molecular Genetics & Physics and Astronomy, Michigan State University

This article was originally published on The Conversation. Read the original article.

Now, Check Out:

How sexually transmitted diseases might have driven the evolution of monogamy

By Rob Knell, Queen Mary University of London.

Exactly why so many humans choose monogamous pair bonds over juggling multiple partners has long been a mystery to scientists. After all, having several partners at the same time should lead to more offspring – an outcome you’d think evolution would favour. Now a new study has linked the phenomenon to sexually transmitted diseases, arguing that monogamy could have evolved because it offered protection against the threat of infection.

Monogamy is, of course, the norm in Western societies. But there are many cultures where a husband can have more than one wife (polygyny) or, less commonly, a wife can have more than one husband (polyandry). This diversity of human mating systems is also hard to explain. What we do know, however, is that many hunter-gatherer societies, living in small groups, were most often polygynous (and many remaining groups still are). But with the rise of agriculture, societies tended to become more complex – and less polygynous. In the most strictly monogamous societies, there was often a social punishment for polygynists, either informally or, as in many modern societies, through a legal system.

Many explanations for this evolution have been put forward, including changes to the way that women chose their partners, such as being faithful to men who invested in provisioning for them. Another possibility is that groups of monogamists may have performed better than groups of polygynists. But the new research adds a further option: could an increased risk of infection from sexually transmitted infections associated with polygyny have contributed to – or even driven – the overall move from polygyny to monogamy?

The indigenous Himba people of northern Namibia, some of whom are hunter-gatherers, are polygynous. Hans Stieglitz/wikimedia, CC BY-SA

Sexually transmitted diseases have been infecting humans for a long time. Prior to modern medicine, they also often caused significant harm – especially to the reproductive system. Clearly, these diseases infect polygynists more than monogamists, and it has been argued that when a polygynist and a serial monogamist have the same number of partners overall, the polygynist is more likely to pick up a dose of something nasty than the monogamist. According to computer modelling, this is because contact networks are more connected when you have concurrent partners than when you have serial partner change. Either way, overall, these effects could have had a big enough impact on the well-being of polygynists to allow monogamous individuals to take over a population.

The challenges of modelling

It’s certainly a good argument. But it’s hard to assess how likely it is to be true. This is because we know very little about the risk of sexually transmitted diseases in hunter-gatherer societies or historical societies transitioning to agriculture. This is a common problem in science: we can only make progress when we can test an idea, but plausible ideas are sometimes very hard to evaluate without massive effort.

One option in these cases is to do your experiment in the form of a computer simulation. This is what the researchers behind the new study did, modelling the impact of a bacterial sexually transmitted disease similar to gonorrhoea or chlamydia. Their results strongly back the hypothesis that such diseases could have triggered monogamy.

Light blue means polygamy is permitted while dark blue means it is not fully criminalised. Black means fully outlawed. wikimedia, CC BY-SA

In their model, sexually transmitted diseases tend to “fade out” from small groups such as polygynist hunter-gatherers. This occurs because of random chance events that are more likely to be important in small groups, such as all the infected people suddenly getting better or dying. In larger, agricultural groups, however, such fade-out is much less likely, so sexually transmitted diseases tend to persist, damaging the health and reducing the birth rates of polygynists while allowing monogamists to take over.
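The fade-out mechanism is easy to see in a minimal stochastic model. The sketch below is an illustration of the general principle with invented disease parameters; it is not the published simulation.

```python
import random

# Minimal SIS-style (susceptible-infected-susceptible) simulation of an
# infection in a single group. Transmission and recovery probabilities are
# invented for illustration; this is not the model from the study.

def persists(group_size, steps=2000, beta=0.12, recovery=0.1, rng=None):
    """Run one simulated epidemic and report whether infection remains."""
    rng = rng or random.Random(1)
    infected = max(1, group_size // 10)   # seed a small outbreak
    for _ in range(steps):
        if infected == 0:
            return False                  # the infection faded out by chance
        susceptible = group_size - infected
        # Each susceptible person has a small chance of catching the
        # infection; each infected person a chance of recovering.
        new_infections = sum(rng.random() < beta * infected / group_size
                             for _ in range(susceptible))
        recoveries = sum(rng.random() < recovery for _ in range(infected))
        infected = infected + new_infections - recoveries
    return infected > 0

# In a small hunter-gatherer-sized band, random recoveries can easily drive
# the infection to zero; in a large agricultural settlement, fade-out by
# chance becomes vanishingly unlikely and the disease persists.
small_band = persists(30)
large_settlement = persists(3000)
```

The same transmission rules produce opposite long-run outcomes purely because of group size, which is the effect the researchers' model relies on.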

What’s more, the monogamists that are most likely to take over a group for a long period are those that follow a “punishment strategy”, which fits with what we observe in many societies today.

So is the puzzle solved? Not quite yet. Computer simulations are useful and can tell us important things, but they are always limited and necessarily simplify the real world. In this case, for example, the researchers assumed that the disease they were modelling had similar pathological effects on men and women, whereas in reality many sexually transmitted diseases affect women more severely than men, potentially changing the effect of the disease on polygynists.

Promiscuous fellow. Gilles San Martin/wikimedia, CC BY-SA

Further questions are raised by research into sexually transmitted diseases in animals, which hasn’t really found a clear relationship between promiscuity and disease. In fact, computer modelling work focused on animals has found that promiscuous and monogamous individuals can coexist even in the presence of a dangerous disease. What’s more, there are examples of highly promiscuous animals which are heavily infected with sexually transmitted diseases yet carry on regardless (two-spot ladybirds in Continental Europe are one example, believe it or not).

As the researchers themselves point out, there are indeed some challenges associated with this idea. More detailed simulations or better data on sexually transmitted infections in societies where people live in small groups would make the picture clearer. For now, it remains an intriguing and plausible suggestion that we should explore further. Given the continuing threats posed by sexually transmitted diseases today it’s surprising that it’s taken this long for someone to put two and two together and suggest that the advent of monogamy may have served a very practical purpose.

Rob Knell, Senior Lecturer, Queen Mary University of London

This article was originally published on The Conversation. Read the original article.

Featured Photo Credit: wikimedia

Now, Check Out:

Human History Tells Us That We are an Invasive Species

Human populations have not always grown unchecked. A new study of South America’s colonization finds that for much of human history on the continent, human populations grew like an invasive species, which is regulated by the environment as it spreads into new places.

Populations grew exponentially when people first colonized South America. But then they crashed, recovered slightly, and plateaued for thousands of years after over-consuming local natural resources and reaching continental carrying capacity, according to the analysis.


“The question is: Have we overshot Earth’s carrying capacity today?” says Elizabeth Hadly, a professor in environmental biology at Stanford University and senior author of the new paper in the journal Nature.

“Because humans respond as any other invasive species, the implication is that we are headed for a crash before we stabilize our global population size.”

The paper is the first in a series on the interaction of local animal populations, humans, and climate during the massive changes of the last 25,000 years in South America. The series will be featured at the Latin American Paleontology Congress this fall.

The study lays a foundation for understanding how humans contributed to the Pleistocene era’s largest extinction of big mammals, such as ground sloths, horses and elephant-like creatures called gomphotheres.


It reconstructs the history of human population growth in South America using a newly assembled database of radiocarbon dates from more than 1,100 archaeological sites. Unlike many archaeological studies that examine environmental change at a single site, this study provides a picture of long-term changes, such as climatic fluctuations, that affected human populations across the continent rather than a single culture or ecosystem.

The researchers found strong evidence for two distinct phases of demographic growth in South America. The first phase, characterized by logistic growth, occurred between 14,000 and 5,500 years ago and began with a rapid spread of people and explosive population growth throughout the continent.

Then, consistent with other invasive species, humans appear to have undergone an early population decline consistent with over-exploitation of their resources. This coincided with the last pulses of an extinction of big animals. Subsequent to the loss of these big animals, humans experienced a long period of constant population size across the continent.


The second phase, from about 5,500 to 2,000 years ago, saw exponential population growth. This pattern is distinct from those seen in North America, Europe, and Australia.

The seemingly obvious explanation for the second phase—initial domestication of animals and crops—had minimal impact on this shift, the researchers write. Instead, the rise of sedentary societies is the most likely reason for exponential population growth.

Practices such as intensive agriculture and inter-regional trade led to sedentism, which allowed for faster and more sustained population growth. Profound environmental impacts followed.
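The two growth regimes described above can be sketched numerically. The growth rate, carrying capacity and starting population below are arbitrary illustrative values, not estimates from the radiocarbon analysis.

```python
# Toy comparison of the two demographic phases: logistic growth (rapid
# spread, then a plateau at carrying capacity) versus unchecked exponential
# growth. Parameter values are invented for illustration.

def logistic_step(n, r=0.05, k=1_000_000):
    """One step of discrete logistic growth: expansion slows as the
    population n approaches the carrying capacity k."""
    return n + r * n * (1 - n / k)

def exponential_step(n, r=0.05):
    """One step of unchecked exponential growth, as in the later,
    sedentary phase."""
    return n + r * n

n_log = n_exp = 1_000.0
for _ in range(400):
    n_log = logistic_step(n_log)
    n_exp = exponential_step(n_exp)
# n_log has plateaued just below the carrying capacity;
# n_exp has kept growing without bound.
```

The logistic curve reproduces the boom-then-plateau pattern of the first phase, while the exponential curve matches the kind of sustained growth that sedentism made possible.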

“Thinking about the relationship between humans and our environment, unchecked growth is not a universal hallmark of our history, but a very recent development,” says co-lead author Amy Goldberg, a biology graduate student at Stanford. “In South America, it was settled societies, not just the stable food sources of agriculture, that profoundly changed how humans interact with and adapt their environment.”

Today, as the world’s population continues to grow, we turn to technology and culture to reset nature’s carrying capacity and harvest or even create new resources.

“Technological advances, whether they are made of stone or computers, have been critical in helping to shape the world around us up until this point,” says co-lead author Alexis Mychajliw, a graduate student in biology. “That said, it’s unclear if we can invent a way out of planetary carrying capacities.”

The team’s paper is published in the journal Nature.

Source: Republished as a derivative work under the Attribution 4.0 International license. Original article posted to Futurity.

Now, Check Out:


How a 3D-Printed Dracula Orchid Helped Scientists Understand How it Tricks Bugs

Using a 3D printer, scientists have unlocked the mystery of how plants called Dracula orchids use mimicry to attract flies and ensure their survival.

The research, done in the last unlogged watershed in western Ecuador, is a win in the field of evolutionary biology and helps provide information that should benefit conservation efforts. The approach could also be applicable to studies of other plant-pollinator systems, researchers say.

A 3D-printed orchid (lower right) is being prepped in the lab. The two orchids in the cup are real. (Credit: Melinda Barnadas)

“Mimicry is one of the best examples of natural selection that we have,” says Barbara “Bitty” Roy, a biologist at the University of Oregon. “How mimicry evolves is a big question in evolutionary biology. In this case, there are about 150 species of these orchids. How are they pollinated? What sorts of connections are there? It’s a case where these orchids plug into an entire endangered system.”

Dracula orchids grow in Central America and the northwestern reaches of the Andes Mountains in South America. The Dracula label literally means “little dragon,” a reference to a face-like feature in the flowers. Some observers say they see Count Dracula, or the bat that appears in vampire depictions in literature and the movies.

“Dracula orchids look and smell like mushrooms,” says Tobias Policha, an adjunct instructor and plant scientist in the Institute of Ecology and Evolution and lead author of the study that is published online in the journal New Phytologist. “We wanted to understand what it is about the flowers that is attractive to these mushroom-visiting flies.”



Bigger Brains Seem to be an Extinction Risk

Many of the pressures that have put animal life on the precipice of the sixth mass extinction are easy to spot: pollution, climate change, over-hunting, fractured habitats. Now research suggests relative brain size could be another important factor.

The findings, published in the Proceedings of the Royal Society B, come as something of a surprise. Research has long shown that larger brains can offer advantages, such as the ability to solve problems in a changing environment.


“If the landscape becomes colder, an animal might not be able to grow dense fur, but these animals can problem-solve,” says study author Eric Abelson, who conducted the work when he was a doctoral researcher working with Rodolfo Dirzo, professor of biology at Stanford University.

“They might use cognition to overcome a colder environment by building a warmer nest, or choosing to spend more time in the sun.”

But there’s a tradeoff to these benefits, Abelson says. Neural tissue is incredibly expensive to grow and support, so the animal has to eat more or spend fewer calories doing other things.

This expense might expose such animals to a bigger hit from the other pressures typically related to extinction, such as resource scarcity. Or there might simply be scenarios that a relatively larger brain can’t out-think—a relatively bigger brain, for instance, may not help an aquatic animal that lives in polluted waters.

To study the relationship, Abelson calculated the relative brain size – a measure that normalizes absolute brain size against body size – of several hundred living mammal species. He then compared this with the International Union for Conservation of Nature (IUCN) list of each species’ current endangerment status. Species with larger relative brain sizes were more likely to be threatened with extinction.
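The study’s own formula isn’t reproduced here, but one standard way to compute such a measure is as the residual from a log-log regression of brain mass on body mass. The species and masses below are rough illustrative numbers, not Abelson’s data:

```python
import math

# Hypothetical (brain g, body g) masses -- illustrative values only,
# not the study's data.
species = {
    "mouse":   (0.4,    20.0),
    "rat":     (2.0,    300.0),
    "raccoon": (39.0,   6000.0),
    "wolf":    (120.0,  40000.0),
    "human":   (1350.0, 65000.0),
}

def relative_brain_size(data):
    """Residual from an ordinary least-squares fit of log(brain mass)
    against log(body mass). A positive value means the brain is larger
    than the allometric trend predicts for that body size."""
    xs = [math.log(body) for _, body in data.values()]
    ys = [math.log(brain) for brain, _ in data.values()]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return {
        name: math.log(brain) - (intercept + slope * math.log(body))
        for name, (brain, body) in data.items()
    }

resid = relative_brain_size(species)
```

Because the fit includes an intercept, the residuals sum to zero by construction, so positive values pick out the species whose brains are relatively large for their bodies.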

And smaller mammals with a larger relative brain size seem to fare the worst.

This is just a start to understanding the role brain size plays in the extinction vulnerability of mammalian species, Abelson says. Additional studies are needed to better understand the relationships between brain size and extinction risk.

For now, the discovery might provide another tool to assess which mammals might be at risk and how to focus conservation efforts.

The Cozumel raccoon (Procyon pygmaeus), also called the pygmy raccoon, is a critically endangered species of island raccoon endemic to Cozumel Island off the coast of the Yucatan Peninsula, Mexico. (Credit: Cristopher Gonzalez/Flickr)
“Right now, conservation efforts could benefit from better predictions of which animals might become endangered in the future,” says Abelson, who is currently a researcher at the US Forest Service’s Pacific Southwest Research Station. “Understanding the role that relative brain size plays in endangerment risk might give us another tool to identify the animals that might face trouble down the road.”

The data also revealed that the cost-benefit trade-off of a relatively large brain plays out differently in small-bodied mammals than in large ones, a pattern that Abelson plans to investigate further. He hopes that insights into an easily measured trait—brain size—might play a useful role in designing new conservation strategies. The research is published in Proceedings of the Royal Society B.

Republished as a derivative work under the Attribution 4.0 International license. Original article posted to Futurity.

Featured Photo Credit: Márcio Motta/Flickr, CC BY


Testing Ancient Human Hearing Via Fossilized Ear Bones

Rolf Quam, Binghamton University, State University of New York

How did the world sound to our ancient human relatives two million years ago?

While we obviously don’t have any sound recordings or written records from anywhere near that long ago, we do have one clue: the fossilized bones from inside their ears. The internal anatomy of the ear influences its hearing abilities.

Using CT scans and careful virtual reconstructions, my international colleagues and I think we’ve demonstrated how our very ancient ancestors heard the world. And this isn’t just an academic enterprise; hearing abilities are closely tied with verbal communication. By figuring out when certain hearing capacities emerged during our evolutionary history, we might be able to shed some light on when spoken language started to evolve. That’s one of the most hotly debated questions in paleoanthropology, since many researchers consider the capacity for spoken language a defining human feature.

Many primates vocalize; only people have full-blown language. CC BY-SA

Human hearing is unique among primates

We modern human beings have better hearing across a wider range of frequencies than most other primates, including chimpanzees, our closest living relative. Generally, we’re able to hear sounds very well between 1.0 and 6.0 kHz, a range that includes many of the sounds emitted during spoken language. Most vowel sounds fall below about 2.0 kHz, while the higher frequencies mainly carry consonants.

Thanks to hearing tests carried out in the lab, we know that chimpanzees and most other primates aren’t as sensitive in that same range. Chimpanzee hearing – like that of most other primates that also live in Africa, including baboons – shows a loss in sensitivity between 1.0 and 4.0 kHz. In contrast, human beings maintain good hearing throughout this frequency range.

We’re interested in finding out when this human hearing pattern first emerged during our evolutionary history. In particular, if we could find a similar pattern of good hearing between 1.0 and 6.0 kHz in a fossil human species, then we could make an argument that language was present.

Testing the hearing of a long-gone individual

To study hearing using fossils, we measure a large number of dimensions of the ancient ears – including the length of the ear canal, the size of the ear drum and so on – using virtual reconstructions of the fragile skulls on the computer. Then we input all these data into a computer model.

Published previously in the bioengineering literature, the model predicts how a person hears based on the anatomy of their ear. It treats the ear as the receiver of a signal, similar to an antenna. The results tell us how efficiently the ear transmits sound energy from the environment to the brain.

We first tested the model on chimpanzee skulls, and got results similar to those of researchers who tested chimpanzee hearing in the lab. Since we know the model accurately predicts how humans hear and how chimpanzees hear, it should provide reliable results for our fossil human ancestors as well.

Excavations at Sterkfontein. This area contained regions of open savanna when these fossil hominins lived here. (Credit: John Walker)


Myth of the ‘Missing Link’ in Evolution Does Science No Favors

Sean Nee, Pennsylvania State University

This spring, the world learned of a newly discovered missing link between microbes and humans called Lokiarchaeota. The actual story is that the microbe Lokiarchaeota, discovered on the deep sea floor by a hydrothermal vent called Loki’s Castle, shares features with both bacteria and us. The spin is that this makes it a missing link between the two. Microbiologists have been discreetly quiet about this narrative fiction; although the microbe is fascinating, and so deserves the spotlight, it is no more a missing link than the platypus is a missing link between ducks and humans.

This missing link imagery, based on the idea that evolution is a methodical process with logical, continuous connections to be discovered and mapped, might set up a good story. But it’s wrong – and can detrimentally influence our understanding of immediately threatening processes like the rapid evolution of flu.

The Great Chain of Being

The notion of missing links in evolution comes from medieval theology’s Great Chain of Being, an idea that survived Darwin and still persists. It is compelling – not least because you-know-who winds up at the presumed pinnacle of evolution.

Two views of relationships among contemporary animals known to medieval theologians.

On the left of this figure is the usual picture of life on Earth, a chain of creatures smoothly lined up from fish, through frogs, lizards and on up to human beings. The idea is that, step by step, life continually “advanced” to gain greater and greater complexity. The Great Chain of Being is ingrained enough that it’s even become a way for comedy and cartoons to mock the zeitgeist.

But what is the figure on the right?

Both are correct and show exactly the same information about the relationships – humans are more closely related to monkeys than either is to lizards, and all three are more closely related to one another than any is to frogs, and so on. But we know people are not a link between fish and frogs in any meaningful sense. Any of the nodes in a family tree can be rotated without changing the correct relationships among the present-day members – the rotation on the left is just the usual one because we like to be on top!
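The rotation argument can be made concrete in code. In this sketch (the nesting and species list are illustrative, not a real phylogeny), a tree is a nested tuple, and a canonical form that sorts each node’s children maps any rotation of the same tree onto one structure:

```python
def canonical(tree):
    """Leaves (strings) stay as-is; internal nodes sort their children,
    so trees that differ only by rotating nodes collapse to one form."""
    if isinstance(tree, str):
        return tree
    return tuple(sorted((canonical(child) for child in tree), key=str))

# The usual "humans on top" ordering...
left = ("fish", ("frog", ("lizard", ("monkey", "human"))))
# ...and the same relationships with every internal node flipped.
right = (((("human", "monkey"), "lizard"), "frog"), "fish")

assert canonical(left) == canonical(right)

# A genuinely different set of relationships does NOT collapse to the
# same form -- here humans really would be the outgroup:
other = ("human", ("monkey", ("lizard", ("frog", "fish"))))
assert canonical(left) != canonical(other)
```

The first assertion is the article’s point: the two drawings differ only in which child is drawn on top at each node, so they encode identical relationships.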

The power of this kind of imagery in science could not have been more starkly revealed than in the positioning of Lokiarchaeota as a missing link in such a fictional chain.

One of the great triumphs of our understanding of life is that the eukaryotic cell, your kind of cell, is an ancient fusion through symbiogenesis of entirely disparate life forms. It’s a chimera or Frankenstein. The microbe Lokiarchaeota is a hodgepodge of disparate elements resembling the three Domains of Life, some of which we share. It is a Frankenstein we have not seen before, but a Frankenstein nonetheless – not a link in a chain.

We see more and more such startling discoveries, all arising from our use of new molecular technologies to explore the Earth’s biodiversity. For example, Penny Chisholm’s lab at MIT has discovered viruses that have borrowed cassettes of photosynthetic information from marine plankton. A remarkable discovery, but not because it is a missing link between viruses and plants.

The continuous chain imagery, step-by-step and link-by-link, is all-pervasive in how we think about evolution – and it affects your daily life. Consider flu, which requires yearly vaccination because it constantly evolves to evade our immune system; what protected you last year will not protect you this year. Ignoring the thankfully rare emergence of monstrosities like the Spanish Flu at the end of WWI, the never-ending evolution of the influenza virus is typically modeled as occurring by antigenic drift – a smooth, continuous wander through evolutionary space, finding new places to hide from the immune system.