A Stone-Age Meeting of Minds

Thomas Wynn & Frederick L. Coolidge. American Scientist. Volume 96, Issue 1. Jan/Feb 2008.

More than 40 millennia ago, bands of Homo sapiens trekked up the Danube River into central Europe. There they ran into the native Neandertals (Homo neanderthalensis), who were also formidable hunter-gatherers and who had similarly survived dramatic, even catastrophic, shifts in climate over the preceding 40,000 years. Strangely enough, the two had met before. Around the beginning of the last ice age (some 80,000 years ago), Neandertals had pushed into the Near East, where Homo sapiens had resided for perhaps 20,000 years. These people apparently could not compete with their brawny cousins, and they retreated into Africa, where they had originated. But the second encounter had a very different outcome: Within 10,000 years, more or less, Neandertals were extinct.

Anatomically, these later Homo sapiens were indistinguishable from the earlier ones; both looked just like modern humans (so we refer to them here as “humans” for simplicity). And both were equally puny compared with the powerfully built Neandertals. So what had changed? Paleoanthropologists trying to answer this question are slowly inching toward a consensus, although it is not yet unanimous. They believe that these later Homo sapiens must have had certain cognitive abilities not shared by Neandertals. The many spectacular cave paintings preserved in France and Spain suggest that one key difference may have been our Paleolithic ancestors’ capacity to use symbols as an aid in abstract thought. An even more popular (and more speculative) candidate is the human facility for language. But in our view the archaeological evidence actually points to an alternative: the evolution in Homo sapiens of an enhanced ability to plan and strategize. Comprehending how such skills allowed them ultimately to prevail requires, first, a basic understanding of the mental capacity of Neandertals.

Behind the Brow Ridges

Paleoanthropologists know a fair amount about Neandertals. Not only did they live in the relatively recent past (the last one died less than 30,000 years ago), they also resided in Europe, where the archaeological record is rich and well documented. From the abundant evidence found there, scholars have been able to paint a rather good picture of the species, one that shows an impressive adaptation to glacial and interglacial conditions.

Neandertals’ stone implements were sophisticated, and they had a variety of methods for producing the sharpened pieces of stone that were the basis of their technology. They varied their tool-making operations according to the nature of the raw material available to them, and as problems arose they could subtly adjust their technique. They shaped their stone tools appropriately for various purposes: steep edges for scraping, serrated ones for sawing and so forth. Among these artifacts are sharp points, which the Neandertals attached to the ends of thrusting spears. These stone-tipped weapons brought down many of the large mammals of the Neandertals’ world: mammoth, rhinoceros, horse, bison, reindeer. Indeed, just about any big package of meat went on the Neandertal menu.

The archaeological evidence of their hunting prowess and the measurement of trace elements from their bones both indicate that Neandertals were committed carnivores, deriving a substantial proportion of their diet from large herbivorous mammals. Neandertals appear to have been very effective tacticians who used features of the landscape to their advantage, again and again slaughtering considerable numbers of a single species at the same locale.

It was, however, a risky business. Many of their skeletons show healed upper-body fractures, incurred no doubt in bouts with dangerous prey. That the injured hunters survived attests to the effectiveness of their social groups, which were small, some probably comprising only 30 or so individuals. It seems these bands were nevertheless large enough to support the casualties of hunting and also to care for the aged. They occupied relatively circumscribed areas, each group perhaps limiting itself to a single river valley, and only occasionally did they venture farther afield. (Archaeologists can surmise as much from the nature of the raw materials Neandertals employed in making various artifacts.) All in all, paleoanthropologists regard Neandertals as a very successful species, one that endured in Europe for more than 200,000 years.

But there are several things missing from the Neandertal record that archaeologists usually find among the vestiges of modern humans. In particular, there is no indication of continuing innovation: Neandertals made the same kinds of tools for 200,000 years without ever tinkering with the basic components. They never produced hunting gear that was more elaborate than their stone-tipped spears. What’s more, Neandertal hearths were ephemeral, ad hoc affairs, which were rarely placed in pits or lined with stone or even used for intense burning over long periods. Although Neandertals used powdered mineral pigments (perhaps for body painting), they produced no convincing examples of durable art or depictions of any sort. They did not fabricate personal ornaments, at least not until their second encounter with modern humans some 40,000 years ago.

From a cognitive perspective, the Neandertal suite of mental abilities (and missing abilities) suggests that they possessed minds that were flexible and could respond to changing conditions, but that they practiced little, if any, conscious invention and did not form any innovative plans of action. This mindset most resembles that of a modern person who has developed some kind of expertise.

Expertise is based on patterns and procedures that are held in long-term memory and that can be accessed quickly and deployed with a minimum of attention. This ability underpins most of the mundane activities of the modern world (driving to work, cleaning the house, cooking) but also many things that we hold in high esteem, say, playing a musical instrument or rendering a medical diagnosis. One significant difference between expertise and other forms of problem solving is the way it is normally acquired: by apprenticeship. Because the major components are procedural memories, one must learn them through practice, failure and more practice. The time required for developing expert abilities in just about any field is typically a decade or so. It takes this long because the novice needs to learn not just the basic rules and motor memories but also how to adapt them in the face of changing conditions. This flexibility enables the expert to solve even novel problems by drawing from a large, sometimes vast, range of internalized routines and procedures.

Expertise remains perhaps the most common problem-solving strategy people use today. For Neandertals, it appears to have been the only one. Their lack of inventiveness and lack of long-range planning suggest that they did not have a second kind of modern problem-solving ability, one known to psychologists as “executive functions,” which are largely enabled by something called working memory.

Working Memory

The term working memory is a bit of a misnomer. Some people think that it refers only to short-term acoustic storage of the kind employed when one reads or hears a new phone number and repeats it aloud or under one’s breath to keep the sequence of numerals in mind long enough to dial it. Indeed, the ability to recall a few words in sequence is a component of working memory, a concept that Alan D. Baddeley and Graham Hitch, both of the University of York, originally proposed in 1974. Subsequently, their concept became the predominant model of how the mind interacts with what it hears and sees and how it puts a record of these experiences into short- and long-term storage.

As Baddeley currently views it, buried within each of our minds is a central executive whose responsibilities include attention to certain tasks and the decision making required to ensure that they are consistent with our overall goals, despite interference from a flood of irrelevant stimuli. Other important functions of the central executive include the inhibition of automatic responses and the updating of relevant information during the performance of the job at hand, allowing the chosen strategy to be adjusted as unexpected problems arise.

This central executive has two handy tools at its disposal to aid in remembering things. The first is something psychologists call a phonological loop, which includes the vocal and subvocal articulation that is so helpful in remembering phone numbers. The second might be thought of as a mental sketchpad for the short-term storage of visual and spatial information. The central executive also has a temporary memory bank for the integration of phonological and visual or spatial information.
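For readers who find it helpful to see the arrangement laid out schematically, the multicomponent model just described can be caricatured in a few lines of code. The sketch below is purely illustrative, our own schematic rather than Baddeley’s formal model; the class name, the four-item capacities and the relevance test are assumptions made for the sake of the example.

    from collections import deque

    class WorkingMemory:
        """A toy schematic of the multicomponent model described above."""

        def __init__(self, span=4):
            # Two limited-capacity storage systems (the spans here are arbitrary).
            self.phonological_loop = deque(maxlen=span)       # verbal/acoustic rehearsal
            self.visuospatial_sketchpad = deque(maxlen=span)  # visual and spatial imagery
            self.episodic_buffer = []                         # temporary multimodal store

        def rehearse(self, word):
            # Subvocal articulation keeps an item alive in the loop.
            self.phonological_loop.append(word)

        def visualize(self, image):
            # Hold a visual or spatial representation on the sketchpad.
            self.visuospatial_sketchpad.append(image)

        def central_executive(self, is_relevant):
            # Attend to current contents, set aside what does not serve the goal,
            # and bind the remainder into the episodic buffer for further use.
            contents = list(self.phonological_loop) + list(self.visuospatial_sketchpad)
            bound = [item for item in contents if is_relevant(item)]
            self.episodic_buffer.append(bound)
            return bound

None of this is a claim about how the brain implements these components; it is simply a compact restatement of the division of labor described above.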

Baddeley’s ideas have received much empirical support, and an important portion of this research has attempted to identify the parts of the brain involved in working memory. Investigators have found, for example, that the attention and decision-making aspects of working memory depend on both the prefrontal cortex (roughly, the outer, front part of the brain) and the cingulate cortex (which lies deeper and midway back). Storage of sounds spoken by someone else and the mental processing necessary to generate articulate speech appear to be functions of the parietal lobes, which are found toward the back of the head. The mental sketchpad has been harder to localize: It appears to involve both the prefrontal cortex and the parietal lobes, as well as a region located at the very back of the head called the occipital lobe. These results, drawn from various brain-imaging studies, are quite impressive, but they are also a little misleading: There is probably no part of the mental skill set that Baddeley and Hitch described that relies solely on a single circumscribed region of the brain.

Some of the recent research into working memory is concerned with its relation to other mental activities that take place at a similarly high cognitive level, such as language, reasoning and intelligence. One tack investigators have taken has been to identify people who appear to have greater and lesser working-memory capacity (also labeled working-memory span). However, because there is no definitive way to measure this mental attribute, the results vary depending on the cognitive domain under consideration.

For example, psychologists have found that general working-memory span correlates well with a person’s fluid intelligence, the kind of intelligence involved in solving novel problems. Researchers have also determined that phonological-storage capacity is closely related to a person’s ability to comprehend and produce language (for example, to the breadth of someone’s vocabulary). And when investigators measure phonological-storage capacity by asking the subject to repeat increasingly long strings of digits backwards, the results tend to correlate with a person’s ability to produce and understand metaphors. (There is even some recent evidence that reduced working-memory capacity may be related to psychopathologies such as schizophrenia and personality disorders.)

It may be hard to see immediately how a Neandertal hunter could have made much use of the kinds of abilities measured in these tests, remembering phone numbers and having a rich vocabulary not being all that useful during the Ice Age. But as we argue below, he or she might have benefited enormously. As it was, Neandertals never acquired that mental equipment; only humans did.

We suggest that a genetic mutation in the ancestors of modern humans caused a rewiring of brain neurons that resulted in an enhanced working-memory capacity (or in an enhancement of one of its components). Critics of this basic position argue that modern thinking is unlikely to have evolved via a single genetic mutation, and we agree: It is unlikely, but not impossible. And some recent work in molecular genetics seems to point to the importance of one or perhaps a small number of mutations in shaping mental capacities.

For example, in 2001, Cecilia S. L. Lai of the University of Oxford and her coworkers found that disruption of a single gene known as FOXP2 causes severe problems in both articulation and language comprehension. On that basis, they suggested the normal version of this gene is somehow involved in the developmental process that gives people their unique abilities with speech and language. In 2002, Wolfgang Enard of the Max Planck Institute for Evolutionary Anthropology, Lai and several colleagues showed that although this gene exists in many mammalian species, the version found in Homo sapiens underwent some unique changes during recent evolution. Although it may once have been tempting to think that this gene accounted for the difference between human and Neandertal mental capacities, just this past October Johannes Krause, also at the Max Planck Institute for Evolutionary Anthropology, and his coworkers showed that Neandertals had the human version of this gene.

Another example comes from recent work of Bruce T. Lahn of the University of Chicago and his colleagues. They found that a particular form of a gene that is active in the fetal human brain, one known as microcephalin (or MCPH1), became prominent roughly 37,000 years ago. These investigators suggested that a change in this single gene could have caused human brains to become significantly larger or more capable at that time. They noted that the observable effects of the genetic mutation could not be determined from their study, although they speculated that among the possibilities were changes in personality or cognition.

This study notwithstanding, we think it more likely that something other than a change in a single gene accounts for the difference in mental capabilities between humans and Neandertals. Of course, our arguments still hinge on the assumption that working-memory capacity and its related functions are heritable, and indeed, much recent research amply demonstrates that they are. These investigations have shown that working memory is under the influence of multiple genes that must interact in some complex fashion. And if a particular set of genetic changes produced a phenotype with even a small reproductive advantage, it could have increased in frequency fairly rapidly.

The noted geneticist J. B. S. Haldane devised the following illustrative example: If a new gene gives the individuals who have it only a 1 percent advantage in reproduction (that is, a group of people with the gene would produce 1 percent more offspring than a same-size group lacking it), it would increase in frequency from 0.1 percent to 99.9 percent in just 4,000 generations. For humans, 4,000 generations amounts to only 100,000 years, a very brief moment in evolutionary time and exactly the interval that most of the people who study modern human origins limit themselves to considering.
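For readers who like to see the arithmetic behind Haldane’s figure, the dynamic can be sketched with the standard deterministic allele-frequency recursion. The snippet below is a minimal illustration rather than a reconstruction of Haldane’s own calculation; it assumes additive (semidominant) selection and an effectively infinite population, and the exact generation count depends on such assumptions about dominance.

    s = 0.01       # a 1 percent reproductive advantage for the new variant
    p = 0.001      # starting frequency of 0.1 percent
    generations = 0

    while p < 0.999:
        # Marginal fitnesses under an additive model (heterozygotes get half the benefit).
        w_new = p * (1 + s) + (1 - p) * (1 + s / 2)
        w_old = p * (1 + s / 2) + (1 - p) * 1.0
        w_mean = p * w_new + (1 - p) * w_old
        p = p * w_new / w_mean   # one generation of selection
        generations += 1

    print(generations)  # roughly 2,800 generations under these assumptions

Whatever the precise modeling choices, the conclusion survives: a 1 percent advantage is enough to carry a rare variant to near-fixation within a few thousand generations, comfortably inside the 4,000-generation (100,000-year) window cited above.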

Cognitive Archaeology

To document the presence of modern executive functions in our ancient forebears, one needs to be sure, first, that the archaeological signature is a reliable indicator of a specific activity and, second, that it is one that indeed requires such modern mental capabilities. A prehistoric artisan may have used some well-developed executive functions to plan and fabricate a hafted spear, for example, but if expertise can also produce such a weapon (and it can), archaeologists would be wise to conclude that the simpler skill was responsible. Nevertheless, some persuasive archaeological evidence for the existence of modern executive functions can be found in the nature of artifacts produced tens of thousands of years ago.

The most visible results come from a consideration of ancient technology, some of which requires the delayed gratification and remote planning in space and time that are typical of modern executive functions. Some of the earliest evidence of this kind in the archaeological record comes from Near Eastern desert kites. Ancient hunters used these large-scale lines of mounded stone to funnel gazelle into corrals, where the animals could be killed relatively easily. The earliest kites date back 12,000 years, to the very end of the Pleistocene. Later examples of such technology include remotely operated traps and elaborate fish weirs to aid in rounding up prey. Good records of such facilities come from the European Mesolithic period (which spans the interval between about 10,000 and 6,000 years ago) and the North American Archaic period (8,000 to 3,000 years ago), during which such apparatuses were important elements in systems of managed foraging.

Slightly less compelling is the evidence for what archaeologist Peter Bleed of the University of Nebraska-Lincoln termed “reliable” weaponry. These are systems of complex gear that require a great deal of up-front preparation time and also down time for maintenance. A case in point would be harpoons with detachable heads that were affixed to separate “spear throwers.” Such things are especially common in high latitudes, where hunters must focus on a narrow range of species and where complex technologies yield a significant payback. Inuit implements provide perhaps the best nonindustrial exemplars in contemporary times, and archaeologists have been able to trace the development of such technology back more than 4,000 years into the prehistory of that culture.

Identifying such technologies in the archaeological record of more ancient times is a challenge, because it can be difficult to recognize the investment of labor required for construction and maintenance of a particular tool. But archaeologists have found some compelling examples that date back about 20,000 years further into the past, during the late Paleolithic in Europe, where combinations of harpoons and spear throwers were clearly in operation.

Neandertals, who were also high-latitude hunters, used neither such elaborate systems of hunting gear nor sophisticated traps, either of which would have reduced the daily dangers of their foraging and increased its effectiveness. The difference is one of the clearest contrasts in the archaeological signatures of modern humans and Neandertals. It suggests strongly to us that Neandertals lacked the advanced executive functions and working-memory capacity that people have today.

But the evidence goes further than just hunting and trapping gear. Modern hunter-gatherers all manage their food supply to some extent. Such efforts can be as simple as scheduling the use of specific resources during certain seasons or as complex as altering habitats and manipulating the plants and animals themselves. Some indigenous groups in Western Australia, for example, burned tracts of land to force a second green-up of grass to attract herbivores. They also used a system of rotation in which they would burn specific areas only after letting them lie undisturbed for several seasons. It is such long-range planning, often with a horizon measured in years, that most clearly reflects the presence of modern executive functions.

Archaeological evidence for managed foraging is ubiquitous for the relatively recent past. During the Archaic period in North America, for example, foragers exploited a large array of plant resources even though their continued use required careful scheduling. These people also manipulated the growth of plants such as goosefoot (Chenopodium sp.) to ensure future yields. Specifically, they sometimes moved plants to a new location or weeded out competing growth. According to Bruce D. Smith of the Smithsonian Institution’s National Museum of Natural History, these activities led eventually to the cultivation of goosefoot. Indeed, after the end of the Pleistocene, managed foraging appeared virtually everywhere, and in several areas agriculture soon followed. This synchrony has always been a puzzle, and the commonly accepted explanations that cite climate change or demographic growth have always seemed strained (especially in those local cases where there is no evidence for either). But if agriculture emerged out of foraging systems that required modern mental capacities, human cognitive evolution itself may have been a prerequisite for such food production.

Evidence for managed foraging extends even more deeply into the past. Cambridge archaeologist Graeme Barker and colleagues have recently made a provocative argument for use of fire management in the area around Niah Cave in Borneo. Pollen profiles from the site indicate times with high percentages of plant species known to invade burned landscapes, episodes that were more frequent than one would expect for the relatively moist conditions that surrounded the cave at the time. Some corroboration comes from finding the faunal remains of pigs of all ages, an observation that implies the use of traps. These features are clearly quite ancient, although the chronology at Niah is not precise. The relevant archaeological levels date to some time between 43,000 and 28,000 years ago. Almost as old (23,000 years) and two continents away in southern Africa, there is evidence for a significant reliance on corms, which are fleshy, sub-surface storage organs of certain tropical plants. Corms are adapted to fire ecologies, and heavy corm use almost certainly required the intentional setting of bush fires.

These are the earliest indications of ecological management, but both of these examples provide only indirect evidence. They are provocative, but do not qualify as smoking guns (although the burned land surely did a lot of smoking). More convincing examples, such as the pits used to freeze and store meat in the Ukraine, are available, although they date to only about 20,000 years ago.

We do not mean to imply that the much earlier human and Neandertal systems of foraging were not effective means of subsistence; obviously they were. They allowed both species to survive for some 200,000 years. But these systems appear to have been largely tactical, rather than strategic, in nature. Thus they suggest a lack of significant working-memory capacity.

Perhaps the most direct evidence for modern levels of working memory comes from artifacts that embody abstract concepts or that aid in making calculations. The Hohlenstein-Stadel figurine, unearthed from a site in southwestern Germany, is one of the most evocative of such objects. This carved-ivory representation of a person with the head of a lion was made about 32,000 years ago. Such chimeras are common elements in modern myth, folklore and religion. So, if nothing else, the Hohlenstein-Stadel figurine has a familiarity to it that is reminiscent of modern forms of symbolic thought. But the image also has important implications for working-memory capacity. To conceive of such an abstract idea as “lion-man,” someone had to have held two very different concepts in mind, considered their features and combined a subset of each into a new, totally imaginary beast. Such thinking belongs to the province of working memory. Humans have such abilities; we come equipped with the necessary cognitive machinery. Nobody would deny that the man or woman who imagined this fantasy creature had a modern mind.

Almost as impressive are the enigmatic objects sometimes termed tally sticks. These are rods or plaques of bone engraved with dots, slashes or notches. The oldest were made perhaps 28,000 years ago. The late Alexander Marshack, a journalist and independent scholar, and Francesco d’Errico of the Université de Bordeaux examined most of these objects and in many cases were able to document that the marks were produced by different tools at different times. It thus appears as though someone was keeping track of something. D’Errico has argued that they were memory aids, but we think that there was a bit more to them. They must have helped people perform some sort of calculation, even if it was as simple as keeping a cumulative record. Transferring such data to an object frees up working memory and therefore expands a person’s mental-processing capacity. The existence of such artifacts is a strong indication that their users had a working memory that was at a modern level. Indeed, these artifacts may represent the first physical means of expanding working memory itself.

Out of Africa?

Among the items we have discussed, there is no convincing evidence for executive decision making in the modern sense prior to about 32,000 years ago. Alert readers will have noticed a slight chronological inconsistency: The archaeological record indicates that modern humans moved into Central Europe before 40,000 years ago, and yet the artifacts they left behind do not give any hint that they used modern executive functions until some 8,000 years later.

There are several ways to understand this gap. The most parsimonious is that it took a very long time for significant differences in technology, subsistence practices and social interactions to develop. A more provocative idea is that the genes for enhanced working memory did not reach frequencies high enough to enable significant numbers of people to rely on them until long after modern humans arrived in Europe (and in other parts of Eurasia). In either case, the genes responsible for enhanced working-memory capacity would have evolved first in Africa, perhaps along with or soon after the hypothesized population bottleneck that Stanley H. Ambrose of the University of Illinois at Urbana-Champaign thinks occurred about 70,000 years ago.

Whenever and however they were acquired, these genes, and the executive functions they made possible, provided modern humans with an ability to conceive of and carry out long-range plans of action, an ability Neandertals never enjoyed. The difference was small, but it ultimately had a profound consequence: Our species survived and flourished, whereas Neandertals met their demise.