May 20, 2011

External symbolic storage

Posted in Language and Myth, Values and the pfc at 10:11 am by Jeremy

My last post described what I call the “external pfc” – the accumulation of symbolic networks of meaning that literally sculpt the growing brain of every infant born into a particular culture.  This idea is not completely new.  The celebrated and influential cognitive neuroscientist Merlin Donald has described the power of what he calls “external symbolic storage,” an idea that has generated much interest in academic circles, including symposia dedicated solely to exploring it further.  This section of my book, Liology: Towards an Integration of Science and Spirit, describes the linkage between Donald’s idea of “external symbolic storage” and my concept of the external pfc.

[PREVIOUS POST]

External symbolic storage

Through these mechanisms, the external pfc exerts a profound influence on the shaping of the individual mind.  However, the power of the external pfc is magnified even further by the existence of what Donald refers to as “external symbolic storage.”  When early humans arrived in Europe and began carving and painting their first artefacts, they were forming external manifestations of the symbolic web of meaning structured by their mythic consciousness.  These were, in Donald’s words, “the first irrefutable expressions of a symbolic process that is capable of conveying a rich cultural heritage of images and probably stories from generation to generation.  And they are the first concrete evidence of the storage of such symbolic information outside of a human brain.  They mark a change in the structure of human cultures.”[1]  These were the original forms of external symbolic storage: the set of physical objects constructed, shaped or used by humans to hold and communicate a symbolic meaning beyond mere utilitarian function.

An early example of external symbolic storage: cave paintings from Lascaux, France.

While artwork is the most obvious example of external symbolic storage, it also includes personal ornamentation such as jewellery, stone-working styles, and even the spatial patterns of how a campsite is used.[2]  The crucial importance of this new form of symbolic storage is that the external pfc no longer resides merely in the network of other people’s minds.  It now takes up permanent residence in a set of concrete symbols that remain fixed, outliving those who initially constructed them and communicating stable symbolic meaning to countless new generations.  As Donald puts it, “this is more than a metaphor; each time the brain carries out an operation in concert with the external symbolic storage system, it becomes part of a network.  Its memory structure is temporarily altered; and the locus of cognitive control changes.”[3]

The power of external symbolic storage to shape the human mind arises partially from its fixed and stable attributes, but also because the nature of its symbolic meaning is different from the meaning that arises within a human mind.  Donald explains this crucial distinction by contrasting the biological memory records created by the brain, known as engrams, with external symbols, which he calls “exograms.”  Engrams, he writes, are “impermanent, small, hard to refine, impossible to display in awareness for any length of time, and difficult to locate and recall…  In contrast, external symbols give us stable, permanent, virtually unlimited memory records.”[4]

Because of this distinction, engrams and exograms store a qualitatively different type of information.  Consider a common abstract notion, such as patriotism.  Each time you think of your country, your mind will produce something slightly different than the previous time.  The concept arises within a tangled, momentary web of feeling, emotion, symbol, memory and narrative.  Now think of your nation’s flag.  The information stored in this external symbol is far more fixed, virtually unalterable.  The next time the flag is unfurled it will store the same symbolic information that it held the previous time.  Of course, over extended periods, even the information of exograms may degrade or disappear.  We no longer know what the Lascaux cave paintings symbolize.  But it is the relatively fixed nature of exograms that gives them so much power to influence each new generation of human minds.

A more modern example of external symbolic storage

External symbolic storage may therefore be said to stabilize symbolic meaning within a group, thus permitting communities to expand massively in size and complexity without disintegrating.  As Tomasello has pointed out, institutions that we take for granted, such as marriage, money or government, exist only because their reality is grounded in “the collective practices and beliefs of a social group” that relies on external symbolic storage to maintain permanent and stable meaning.  Since the days of the Upper Paleolithic revolution, the sheer volume of external symbolic storage has of course expanded vastly.  In our modern world, it incorporates virtually everything around us, including books, newspapers, the internet, television, music, architecture, interior design, fashion, road signs… the list is endless.  Without external symbolic storage, human civilization could never have developed.  However, it has implications for the autonomy of each individual pfc’s search for meaning that need to be clearly understood.


[1] Donald (2001) op. cit., 374.

[2] For a full discussion of these other types of external symbolic storage, see Wadley, L. (2001). “What is Cultural Modernity?  A General View and a South African Perspective from Rose Cottage Cave.” Cambridge Archaeological Journal, 11(2), 201-21.

[3] Donald, (2001) op. cit., 313.

[4] Ibid., 308-10.

November 28, 2010

The patterning instinct

Posted in Hunter-gatherers, Language and Myth at 12:11 am by Jeremy

The human prefrontal cortex (pfc) instills in us a patterning instinct: a drive to shape patterns of meaning that make sense of our world.  This section from my book Finding the Li: Towards a Democracy of Consciousness explains how this patterning instinct forms the essence of the mythic consciousness that is the source of religious thought.  It then begins to explore the question of how an infant’s pfc first begins to lock into the patterns of meaning of its specific culture.

[PREVIOUS POST]

The patterning instinct

The !Kung Bushmen possess one of the most ancient unbroken cultural traditions in the world.  As noted earlier, they belong genetically to one of the earliest lineages of the human race, dating back to before the takeover by the L3 lineage, which now dominates the globe.  Their technology, “if uncovered by an archeologist and taken in isolation, would place them in the late Stone Age.”  Not surprisingly, anthropologists have been drawn to study them to gain insights into the earliest forms of human cognition.  Merlin Donald describes how “myth and religion permeate every activity” of their daily lives, from the way they hunt wild animals to the celebration of a girl’s first menstruation.  The !Kung take their beliefs so seriously that they will rarely even discuss them; when they do, it’s only in hushed voices, and they’re afraid even to utter the names of their gods.  Donald summarizes their mythical thought as “a unified, collectively held system of explanatory and regulatory metaphors.”  He sees their sophisticated and complex ritual and myth as a paradigmatic example of how the human mind “has expanded its reach … to a comprehensive modeling of the entire human universe.”[1]

!Kung Bushmen: their mythic consciousness arises from the patterning instinct of the pfc

This is the essence of the mythic consciousness that arose with the Upper Paleolithic revolution, and it’s one that Donald relates closely to the development of fully modern language.  Modern language was first used, he proposes, “to construct conceptual models of the human universe.  Its function was evidently tied to the development of integrative thought – to the grand unifying synthesis of formerly disconnected, time-bound snippets of information.”  The pre-eminence of myth in early human society, Donald argues, is “testimony that humans were using language for a totally new kind of integrative thought,” which involved the “first attempts at symbolic models of the human universe.”[2] This is why, as Boyer has put it, “religion as we know it probably appeared with the modern mind.”[3]

In the previous chapter, we discussed how the pfc’s patterning instinct works to mold the young infant’s brain by picking up patterns in the voices she hears around her until she locks into those sounds that match her particular language, ignoring those that don’t fit.[4] Similarly, we now see the pfc homing in on patterns of meaning to make sense of the everyday world, to create Donald’s “comprehensive modeling of the entire human universe.”  Crucially, the pfc applies meaning by using the same symbolic behavior that it had developed for its social and linguistic capabilities.  As Deacon describes it, “the symbolic capacity seems to have brought with it a predisposition to project itself into what it models.”  Deacon compares the pfc’s symbolic predisposition to the relentlessly focused perceptions of an autistic savant.  The savant, he writes, “instead of seeing a field of wildflowers, sees 247 flowers.  Similarly, we don’t just see a world of physical processes, accidents, reproducing organisms, and biological information processors churning out complex plans, desires, and needs.  Instead, we see the handiwork of an infinite wisdom, the working out of a divine plan, the children of a creator, and a conflict between those on the side of good and those on the side of evil.”  This is the inevitable and all-embracing power of the mythic consciousness.  “Wherever we look, we expect to find purpose.  All things can be seen as signs and symbols of an all-knowing consciousness at work… We are not just applying symbolic interpretations to human words and events; all the universe has become a symbol.”[5]

It’s only in recent years that advances in cognitive neuroscience have enabled the linkage of our symbolic drive for meaning with the physiology of the pfc.  However, earlier observers noticed the same unyielding drive for meaning in the human condition without explicitly attributing it to the pfc.  The father of evolutionary theory, Charles Darwin, saw this “craving to understand” as a natural consequence of human cognition, writing that “as soon as the important faculties of the imagination, wonder, and curiosity, together with some power of reasoning, had become partially developed, man would naturally crave to understand what was passing around him, and would have vaguely speculated on his own existence.”[6] The influential 20th century anthropologist Clifford Geertz saw something similar, describing a human as a “symbolizing, conceptualizing, meaning-seeking animal,” whose “drive to make sense out of experience, to give it form and order, is evidently as real and as pressing as the more familiar biological needs.”  Geertz saw religion, art and ideology – the products of mythic consciousness – as “attempts to provide orientation for an organism which cannot live in a world it is unable to understand.”[7] More recently, other observers have arrived at conceptions similar to the pfc’s patterning instinct, one group describing a “cognitive imperative” for humans to “construct myths to explain their world,” and another researcher summarizing it as a “narrative drive” to “create meaning to our world.”[8]

Clifford Geertz: saw a human as a "symbolizing, conceptualizing, meaning-seeking animal"

Powerful as this patterning instinct of the pfc appears to be, we would severely understate its overwhelming influence in molding our human consciousness unless we look more closely at how this molding and patterning take place in an infant’s developing mind.  Just as language “warps the perception” of an infant as she listens to the patterns of sounds around her, to the extent that a grown Japanese person can’t distinguish between the sounds /r/ and /l/, so the mythic patterns of thought informing the culture a child is born into will literally shape how that child’s pfc constructs meaning in her world.  It’s as though there is an external pfc, created by the cumulative symbolic constructions of generations of minds gone before, which has already assembled the comprehensive mythological structures of thought that will be inherited by the new generation.  How this “external pfc” molds each individual’s own pfc as they grow up in their culture is what we’ll now examine.


[1] Donald, M. (1991). Origins of the Modern Mind: Three Stages in the Evolution of Culture and Cognition, Cambridge, Mass.: Harvard University Press, 213-16, 267.  Donald cites an earlier study of the !Kung Bushmen in his evaluation of their cultural traditions: Lee, R.B. and De Vore, I. (1976). Kalahari hunter-gatherers: Studies of the !Kung San and their neighbors. Cambridge, Mass.: MIT Press.

[2] Ibid.

[3] Boyer (2001) op. cit., 323.

[4] Chapter 3, page 39.

[5] Deacon (1997) op. cit., 435.

[6] Cited from Darwin, C. (1871). The Descent of Man, by Sjöblom (2007) op. cit.

[7] Cited by Guthrie (1993) op. cit., 32.

[8] d’Aquili, E., and Newberg, A. B. (1999). The Mystical Mind: Probing the Biology of Religious Experience, Minneapolis: Fortress Press, 86; Sjöblom (2007) op. cit.

October 31, 2010

Out of Africa

Posted in Language and Myth at 11:07 pm by Jeremy

This section of my book, Finding the Li: Towards a Democracy of Consciousness, covers the exodus of modern humans from Africa, and describes what happened when they met the Neanderthals in Europe.  It’s taken from the chapter “The Rise of Mythic Consciousness.”  The section begins by answering the question posed at the end of the previous section, known as the “sapient paradox”: if modern humans evolved more than 150,000 years ago, why did it take until 40,000 years ago for humans to show symbolic behavior in the Upper Paleolithic revolution?

[PREVIOUS SECTION]

Out of Africa

Well actually, according to a growing number of experts, it did happen sooner.  A lot sooner.  In fact, there’s evidence that the beginnings of cultural modernity may have occurred at least seventy-five thousand years ago.  It’s just that it wasn’t in Europe that these stirrings of modernity first showed up, but in South Africa.  In recent years, excavations at two important sites on the coastline of South Africa – Howieson’s Poort and Blombos Cave – have uncovered startling new evidence of symbolic behavior by our human ancestors a full thirty-five thousand years before the Upper Paleolithic revolution in Europe.  Some of the findings include engraved ostrich eggshells and perforated shells that were probably used as personal ornaments, but the most striking treasure unearthed to date has been one particular piece of ochre with a series of complex cross-hatched lines engraved into it.[1] [Figure 3.]  These lines, in the view of archaeologists Renfrew and Mellars, “seem certainly to be deliberate patterning” and represent “the earliest unambiguous forms of abstract ‘art’ so far recorded,” and, along with the other findings, suggest that “the human revolution developed first in Africa … between 150,000 and 70,000 years ago.”[2] In fact, some additional engraved pieces have been found that are even older, leading Mellars to assert that “there is now no question that explicitly symbolic behavior was taking place by 100,000 years ago or earlier.”[3]*

Ochre with cross-hatching from Blombos Cave, South Africa

If our ancestors were thinking symbolically and behaving like modern humans a hundred thousand years ago, then what about the Upper Paleolithic revolution and the Great Leap Forward?  Doesn’t it perhaps begin to seem like a series of tentative steps rather than a great leap?  Certainly some observers think so.  Two archaeologists, Sally McBrearty and Alison Brooks, have caused a stir with an article entitled “The revolution that wasn’t: a new interpretation of the origin of modern human behavior,” arguing exactly this point.[4] And even the momentous findings in Blombos and Howieson’s Poort seem to peter out of the archaeological record after that, suggesting “intermittent” advances in modernity rather than one sweeping tidal wave of progress.[5] Mellars describes the process as possibly “a gradual working out of these new cognitive capacities” of our human ancestors “under the stimulus of various kinds of environmental, demographic, or social pressures.”[6]

But if the excitement of the Great Leap Forward is somewhat diminished, another epic story, perhaps grander than any other, has come into the foreground.  It’s a story that’s emerged through advances in mitochondrial DNA analysis, through which scientists can trace the patterns of previous molecular changes in the DNA of modern humans and thus establish accurate time estimates for the migrations of different human groups.  The story can only be called “Out of Africa,” and it goes something like this.  At some time around sixty to eighty thousand years ago, a certain lineage of humans (known as L3 based on their mitochondrial DNA type) began to expand throughout Africa, becoming the majority population throughout the continent with the exception of the Khoisan (Bushmen) and the Biaka (Pygmies).  One group of this L3 lineage got as far north as Ethiopia, and from this group a small initial contingent, no more than a few hundred people at most, migrated across the mouth of the Red Sea, through Arabia and eastward along southern Asia until reaching Australia.  This epic journey happened sometime between fifty and sixty-five thousand years ago.  At some point during this migration, another group headed north into western or central Asia, and from there arrived in Europe, where their descendants eventually instigated the Upper Paleolithic “revolution.”  A couple of startling facts arise from this story.  The first is that all non-African people alive today are descendants of this very small group of several hundred that made its way across the Red Sea.  The second is that, because of this, there is far wider genetic diversity among different African populations than among all the non-African people on the planet.[7]

It’s a grand story, but it still raises as many questions as it answers.  What led to the original expansion of the L3 group through Africa?  And how does that tie in with the findings at Blombos Cave?  And we still have the “sapient paradox” to contend with: if humans were acting so modern all this time, why is there nothing special to show for it in the archaeological record other than some pierced shells and cross-hatched ochre until the flowering of achievements in Europe forty thousand years ago?

Archaeologist Richard Klein believes that the answer to the first set of questions may be genetic.  In his view, a genetic mutation, most likely in the “neural capacity for language or for ‘symboling’,” is the best explanation for the dramatic changes that ensued.  Here’s how he argues his case:

When the full sweep of human evolution is considered, it is surely reasonable to propose that the shift to a fully modern behavioral mode and the geographic expansion of modern humans were also coproducts of a selectively advantageous genetic mutation. Arguably, this was the most significant mutation in the human evolutionary series, for it produced an organism that could alter its behavior radically without any change in its anatomy and that could cumulate and transmit the alterations at a speed that anatomical innovation could never match. As a result, the archeological record changed more in a few millennia after 40 ky ago than it had in the prior million years.[8]

There are, however, other explanations for the dramatic transformation in human behavior which don’t require a genetic mutation to happen just at the right time.  Mellars has suggested that a positive feedback loop may have begun with the more efficient hunting weapons that the Blombos and Howieson’s Poort groups would have been capable of constructing.  Increased hunting efficiency, along with expanded trading and exchange networks between different groups, may have led to a sustained growth in population.  In fact, the mitochondrial DNA analysis does suggest rapid population growth between sixty and eighty thousand years ago.[9] Another group of archaeologists has produced mathematical studies showing that once a certain demographic critical size is reached, there is a greater impetus for more innovation and, perhaps most importantly, these innovations are more likely to be copied by other communities, creating a “cultural ratchet effect.”[10]

Either a genetic mutation or the positive feedback loop from denser populations could explain the successful migration out of Africa.  But neither of these is sufficient to explain the Upper Paleolithic revolution.  The population densities in Europe were no greater than those in Africa, and the people who made it to Europe were genetically no different from the rest of the L3 group.  So how might we explain that explosion in symbolic behavior and thus resolve the “sapient paradox”?  An important clue might be found in examining what these L3 humans encountered when they arrived in Europe.

Neanderthals: modern humans met them when they first arrived in Europe

When our human ancestors first showed up in Europe, they weren’t the only ones around.  The continent was already populated by Neanderthals, close cousins of homo sapiens who had diverged genetically only a few hundred thousand years earlier.[11]*  The Neanderthals had withstood more than two hundred thousand years of climatic variations in the cold reaches of Ice Age Europe, and with their heavy-set bodies they would have seemed better equipped than the homo sapiens arriving from Africa to handle Europe’s Ice Age climate.  But within ten thousand years of the arrival of homo sapiens on the scene, the Neanderthals were extinct.

To many anthropologists, the evidence seems cut and dried: the Neanderthals were outcompeted by their cognitively superior cousins.  They were “driven to extinction” by the homo sapiens invaders simply because they were “unable to compete for resources.”  They “perceived and related to the environment around them very differently” than our human ancestors and, as a result, “wielded culture less effectively.”  There’s even been mention of a “Pleistocene holocaust” prompting some observers to look at our more recent historical record and note acerbically that “homo sapiens has not been notable for a tolerance of differences or a drive toward coexistence with differing cultures – to say nothing of competing species.”[12]

Other archaeologists have, however, argued that the situation was not so simple.  In fact, they claim, the Neanderthals showed evidence of symbolic behavior just as sophisticated as that of their homo sapiens competitors.  Traditionally, when bone tools and ornaments were dug up from Neanderthal sites, they were dismissed with arguments that the Neanderthals were just mimicking the homo sapiens invaders without understanding the true meanings of these things.  But recently, the same kinds of ornaments have been discovered that date back to fifty thousand years ago, or ten thousand years before modern humans came on the scene, offering unequivocal evidence of Neanderthal symbolic thought.[13] So what should we make of that?

A possible resolution to this debate arises if we go back and consider the three stages of language evolution posited in the previous chapter.[14] Under that hypothesis, the hominids living around three hundred thousand years ago had reached the second stage of language evolution, with a protolanguage that accompanied the stone-working complexity known as Levallois technology (which is associated with the Neanderthals).  Possibly, the Neanderthals had reached that level of cognitive sophistication, but were unable to make the leap across the metaphoric threshold to modern language.  It’s easy to imagine how a group that could say to each other “fire stone hot” would be outcompeted by another group that could say “I put the stone that you gave me in the fire and now it’s hot.”  A recent paper by Coolidge and Wynn speculating on the Neanderthal mind is consistent with this hypothesis, proposing that homo sapiens had greater “syntactical complexity” than the Neanderthals, including the use of subjunctive and future tenses, and that this enhanced use of language may have given modern humans “their ultimate selective advantage over Neandertals.”[15]

However, the competition between homo sapiens and Neanderthals was probably fierce and most likely endured for thousands of years.  In fact, it’s this very competition that might have been the catalyst for the dramatic achievements of the Upper Paleolithic revolution, thus providing a possible solution to the “sapient paradox.”  As we know from modern history, warfare is frequently the grim handmaiden of major technological innovations, and it’s reasonable to believe that the same could have been true of that much earlier conflict.  Conard, for example, has raised the possibility that “processes such as competition at the frontiers between modern and archaic humans contributed to the development of symbolically mediated life as we know it today.”[16]*

There’s more at stake in the possible distinction between Neanderthal and modern human cognition than just a forensic post mortem of how the Neanderthals became extinct.  As we’ll see in the next section, this distinction may help us to understand the underlying sources of the mythic consciousness that became the hallmark of everything accomplished by homo sapiens from that time on.

[NEXT SECTION]


[1] Henshilwood, C. S., d’Errico, F., Vanhaeren, M., van Niekerk, K., and Jacobs, Z. (2004). “Middle Stone Age Shell Beads from South Africa.” Science, 304, 404; Henshilwood, C. S., and Marean, C. W. (2003). “The Origin of Modern Human Behavior: Critique of the Models and Their Test Implications.” Current Anthropology, 627-651; Henshilwood, C. S. et al. (2002). “Emergence of Modern Human Behavior: Middle Stone Age Engravings from South Africa.” Science, 295, 1278-80.

[2] Renfrew, op. cit.; Mellars, P. (2006). “Why did modern human populations disperse from Africa ca. 60,000 years ago?  A new model.” PNAS, 103(June 20, 2006), 9381-9386.

[3] Mellars (2006) op. cit.  Notably, the program director at Blombos, Christopher Henshilwood, sees these findings as evidence that, “at least in southern Africa, Homo sapiens was behaviorally modern about 77,000 years ago,” and other archaeologists who were initially skeptical of these claims are increasingly coming around, acknowledging that “the new material removes any doubt whatsoever.”  See Henshilwood (2002) op. cit., and Balter, M. (2009). “Early Start for Human Art?  Ochre May Revise Timeline.” Science, 323(30 January 2009), 569, quoting archaeologist Paul Pettitt.

[4] McBrearty, S., and Brooks, A. S. (2000). “The revolution that wasn’t: a new interpretation of the origin of modern human behavior.” Journal of Human Evolution, 39(2000), 453-563.

[5] Powell, A., Shennan, S., and Thomas, M. G. (2009). “Late Pleistocene Demography and the Appearance of Modern Human Behavior.” Science, 324(5 June 2009), 1298-1301.

[6] Mellars (2006) op. cit.

[7] Forster, P. (2004). “Ice Ages and the mitochondrial DNA chronology of human dispersals: a review.” Phil. Trans. R. Soc. Lond. B, 359(2004), 255-264; Mellars (2006) op. cit.

[8] Klein, R. G. (2000). “Archeology and the Evolution of Human Behavior.” Evolutionary Anthropology, 9(1), 17-36.

[9] Mellars (2006) op. cit.

[10] Powell, A., Shennan, S., and Thomas, M. G. (2009). “Late Pleistocene Demography and the Appearance of Modern Human Behavior.” Science, 324(5 June 2009), 1298-1301; Culotta, E. (2010). “Did Modern Humans Get Smart Or Just Get Together?” Science, 328, 164.

[11] The Neanderthals and other hominids (for example, homo erectus) had already colonized southern Asia and Europe beginning over a million years ago.  See Mithen, S. (1996). The Prehistory of the Mind, London: Thames & Hudson, 29; Forster, P. (2004) op. cit.

[12] Quotations taken, in order, from the following sources: Forster, P. (2004) op. cit.; Mithen, S. (2006). The Singing Neanderthals: The Origins of Music, Language, Mind, and Body, Cambridge, Mass.: Harvard University Press; Tattersall, I. (2008). “An Evolutionary Framework for the Acquisition of Symbolic Cognition by Homo sapiens.” Comparative Cognition & Behavior Reviews, 3, 99-114; Klein, R. G. (2003). “Whither the Neanderthals?” Science, 299, 1525-1527; Proctor, R. N. (2003). “The Roots of Human Recency: Molecular Anthropology, the Refigured Acheulean, and the UNESCO Response to Auschwitz.” Current Anthropology, 44(2: April 2003), 213-239; Ehrlich, P. R. (2002). Human Natures: Genes, Cultures, and the Human Prospect, New York: Penguin.  See also Mellars, P. (2005). “The Impossible Coincidence. A Single-Species Model for the Origins of Modern Human Behavior in Europe.” Evolutionary Anthropology, 14(1), 12-27 for a valuable discussion on the topic.

[13] Bahn, P. G. (1998). “Neanderthals emancipated.” Nature, 394(20 August 1998), 719-721; Zilhao, J. (2010). “Symbolic use of marine shells and mineral pigments by Iberian Neandertals.” PNAS, 107(3), 1023-1028; d’Errico, F., Zilhao, J., Julien, M., Baffier, D., and Pelegrin, J. (1998). “Neanderthal Acculturation in Western Europe?  A Critical Review of the Evidence and Its Interpretation.” Current Anthropology, 39(Supplement), S1-S43.

[14] See page 41.

[15] Wynn, T., and Coolidge, F. L. (2004). “The expert Neandertal mind.” Journal of Human Evolution, 46(4), 467-487.

[16] Conard (2010) op. cit.  This viewpoint is also argued by David Lewis-Williams who writes: “It was not cooperation but social competition and tension that triggered an ever-widening spiral of social, political and technological change that continued long after the last Neanderthal had died, indeed throughout human history.”  See Lewis-Williams, D. (2002). The Mind In the Cave, London: Thames & Hudson, 96.

September 24, 2010

Language evolution: “gradual and early” or “sudden and recent”?

Posted in Language and Myth at 6:04 pm by Jeremy

Did language evolve early and gradually in human evolution, or was it a more recent development?  This is a major topic of debate among linguists, archeologists and anthropologists, with significant implications for understanding how our minds work.  This section of my book, Finding the Li: Towards a Democracy of Consciousness, introduces this debate.

[PREVIOUS SECTION]

Language evolution: “gradual and early” or “sudden and recent”?

It seems, at first sight, fairly straightforward.  If language evolved socially as an increasingly sophisticated substitute for grooming, then it must have happened gradually, and a long time ago.  It’s therefore no surprise that Aiello and Dunbar, the grooming theorists, are also proponents of the “gradual and early” emergence of language, arguing that “the evolution of language involved a gradual and continuous transition from non-human primate communication systems,” beginning as far back as two million years ago.  They believe that language most likely “crossed the Rubicon” to its modern state about 300,000 years ago, shortly preceding the emergence of anatomically modern humans.  By 250,000 years ago (an era known as the Middle Paleolithic), they believe “groups would have become so large that language with a significant social information content would have been essential.”[1] They are certainly not alone in this view.  For example, another well-regarded team of archaeologists describes “a sense of continuity, rather than discontinuity, between human and nonhuman primate cognitive and communicative abilities… We infer that some form of language originated early in human evolution, and that language existed in a variety of forms throughout its long evolution.”[2]

So what’s the problem?  Well, it’s probably become clear by now that language is a network of symbols, connected together by the magical weave of syntax.  If that’s the case, then whoever could produce the symbolic expression of language must have been thinking in a symbolic way, and therefore would likely have produced other material expressions of symbolism.  It therefore seems reasonable to expect that language users would have left some trace of symbolic artifacts such as body ornamentations (e.g. pierced and/or painted shells), carvings of figures, cave paintings, sophisticated hunting and trapping tools (e.g. boomerangs, bows, nets, spear throwers), and maybe even musical instruments.

And in fact, the archeological evidence does indeed point to a time when all these clear expressions of symbolic behavior suddenly emerged.  There’s just one problem.  That time was around thirty to forty thousand years ago in Europe.  Most certainly not 250,000 years ago, when Aiello and Dunbar believe that language was “essential.”  Here’s how Steve Mithen describes this “creative explosion”:

Art makes a dramatic appearance in the archaeological record.  For over 2.5 million years after the first stone tools appear, the closest we get to art are a few scratches on unshaped pieces of bone and stone.  It is possible that these scratches have symbolic significance – but this is highly unlikely.  They may not even be intentionally made.  And then, a mere 30,000 years ago … we find cave paintings in southwest France – paintings that are technically masterful and full of emotive power.[3]


Upper Paleolithic cave art: does it signify the emergence of modern language?

In recent years, as new archeological findings have been unearthed, the timing for what’s known as the “Upper Paleolithic revolution” has been pushed back to around forty to forty-five thousand years ago, but the shift remains as dramatic as ever.  We find the “first consistent presence of symbolic behavior, such as abstract and realistic art and body decoration (e.g., threaded shell beads, teeth, ivory, ostrich egg shells, ochre, and tattoo kits),” ritual artifacts and musical instruments.[4] It’s a veritable “crescendo of change.”[5] This revolution of symbols, which Jared Diamond has aptly named the “Great Leap Forward,”[6] is so important that we’ll be reviewing it in more detail in the next chapter, but for now we need to focus on its implications for when language first emerged.

As you might expect, those who emphasize the symbolic nature of language are the strongest proponents of the “late and sudden” school of language emergence.  The most notable of these is the psychologist/archaeologist team Bill Noble and Iain Davidson, who boldly make their claim as follows:

The late emergence of language in the early part of the Upper Pleistocene accounts for the sharp break in the archaeological record after about 40,000 years ago.  This involved … world-wide changes in the technology of stone and especially bone tools, the first well-documented evidence for ritual and disposal of the dead, the emergence of regional variation in style, social differentiation, and the emergence of both fisher-gatherer-hunters and agriculturalists.  All these characteristics of modern human behavior can be attributed to the greater information flow, planning depth and conceptualization consequent upon the emergence of language.

Noble and Davidson don’t actually claim that language use began forty thousand years ago.  They point out that the first human colonization of Australia occurred about twenty thousand years earlier than that, and they believe this huge feat required the sophistication arising from language.  On account of this, they’re willing to push back their date of language emergence, concluding that “sometime between about 100,000 and 70,000 years before the present the behaviour emerged which has become identified as linguistic.”  Still, that’s a lot later than Aiello and Dunbar’s 250,000 years ago.

The disagreement is not just a matter of timing.  It’s also about the way in which language arose.  Noble and Davidson believe that, because of the symbolic nature of language, you can no more have a “half-language” than you can be half-pregnant.  “Our criterion for symbol-based communication,” they state, “is ‘all-or-none.’… As with the notion of something having, or not having, ‘meaning’, symbols are either present or absent, they cannot be halfway there.”  They are joined in this view by Fauconnier and Turner, the team that described the “double-scope conceptual blending” characteristic of language.  The appearance of language, they write, is “a discontinuity…, a singularity much like the rapid crystallization that occurs when a dust speck is dropped into a supersaturated solution.”[7] The logic is powerful.  Once a group of humans realizes that one symbol (i.e. a word) can relate to another symbol through syntax, then the sky’s the limit.  Any word can work.  All you need is the underlying set of neural connections to make the realization in the first place, a community that stumbles upon this miraculous power, and then it’s all over.  The symbols weave themselves into language, which then reinforces other symbolic networks such as art, religion and tool use.  “Language assisted social interaction, social interaction assisted the cultural development of language, and language assisted the elaboration of tool use… all intertwined.”[8]

Archaeologist Richard Klein suggests a genetic mutation may have caused the emergence of modern language

Another celebrated archaeologist, Richard Klein, points out the difference in sheer complexity between life in the Middle Paleolithic era and life after the Upper Paleolithic revolution.  The artifacts of the Middle Paleolithic were “remarkably homogeneous and invariant over vast areas and long time spans.”  Their tools, camp sites and graves were all “remarkably simple.”  By contrast, Upper Paleolithic remains are far more complex, implying “ritual or ceremony.”  For Klein, the difference is so dramatic that he thinks it could be best explained by a “selectively advantageous genetic mutation” that was, “arguably… the most significant mutation in the human evolutionary series.”  What kind of mutation would this have been?  “It is especially tempting to conclude,” writes Klein, “that the change was in the neural capacity for language or for ‘symboling.’”  Another team of archaeologists gets even more specific, proposing that “a genetic mutation affected neural networks in the prefrontal cortex approximately 60,000 to 130,000 years ago.”[9]

It’s a powerful argument.  And one that seems incompatible with the “gradual and early” camp.  How should we make sense of it?  Perhaps there’s another way to approach the problem.  At the beginning of the chapter, I mentioned another raging debate over language: whether or not there’s a “language instinct.”  Surely this would help resolve the issue?  After all, if there is a language instinct, then you’d think it would be embedded so deep in the human psyche that we must have been talking to each other at least a few hundred thousand years ago.  So let’s see what light this other debate sheds on the problem.

[NEXT SECTION]


[1] Aiello & Dunbar, op. cit.

[2] McBrearty, S., and Brooks, A. S. (2000). “The revolution that wasn’t: a new interpretation of the origin of modern human behavior.” Journal of Human Evolution, 39(2000), 453-563.

[3] Quoted in Fauconnier & Turner, op. cit., 183.

[4] Powell, A., Shennan, S., and Thomas, M. G. (2009). “Late Pleistocene Demography and the Appearance of Modern Human Behavior.” Science, 324(5 June 2009), 1298-1301.

[5] Hauser, M. D. (2009). “The possibility of impossible cultures.” Nature, 460(9 July 2009), 190-196.

[6] Diamond, J. (1993). The Third Chimpanzee: The Evolution and Future of the Human Animal, New York: Harper Perennial.

[7] Fauconnier, G., and Turner, M. (2002). The Way We Think: Conceptual Blending and the Mind’s Hidden Complexities, New York: Basic Books, 183.

[8] Ibid.

[9] Coolidge, F. L., and Wynn, T. (2005). “Working Memory, its Executive Functions, and the Emergence of Modern Thinking.” Cambridge Archaeological Journal, 15(1), 5-26.

August 30, 2010

The neuroanatomy of language

Posted in Language and Myth tagged , , at 11:24 pm by Jeremy

What parts of the brain are responsible for language?  Most people up to speed on the subject would argue for Broca’s area and Wernicke’s area.  But it’s really the prefrontal cortex and its symbolizing capability that’s responsible for our language capability.  Here’s a section of my book draft, Finding the Li: Towards a Democracy of Consciousness, that explains in more detail.

[Go to previous section]

The neuroanatomy of language

Considering the crucial importance of the pfc in enabling symbolic thought, it has until recently been relatively ignored as a major anatomical component of our capability for language.  Traditionally, when researchers studied the anatomical evolution of language, they focused attention not just on the brain’s capacity but also on our descended larynx, which was thought to be a unique feature of the human vocal tract.  However, recent studies have shown that a number of other species lower their larynx during vocalization (dogs, for example, do so when barking), and some mammals even have a permanently descended larynx.  An even more powerful argument against the descended larynx as a prerequisite of language is that infants born deaf can learn American Sign Language with as much speed and fluency as hearing children learn spoken language.  There seems little doubt that the human larynx co-evolved with our language capacity to enable our fine, subtle distinctions in speech sounds, but it doesn’t seem to have been required for language development.[1] In the words of Merlin Donald, “it is the brain, not the vocal cords, that matters most.”[2]

Broca discovered a crucial area relating to language in the 19th century

Even within the brain itself, the pfc hasn’t had much press in relation to language.  In the nineteenth century, two European physicians named Paul Broca and Carl Wernicke focused attention on two different regions in the left hemisphere of the cerebral cortex – now named, appropriately enough, Broca’s area and Wernicke’s area – as the parts of the brain that control language.  They made their discoveries primarily through observing patients who had suffered physical damage to their brains in these regions and had lost their ability to speak normally (a condition known as aphasia).  For over a hundred years, it has been generally accepted that these two areas are the “language centers” of the brain.[3] Equally importantly, both of these areas lie on the left side of the brain, and in recent decades neuroanatomical research has shown that the left hemisphere is generally the one most used for sequential processing, for creating “a narrative and explanation for our actions,” for acting as our “interpreter.”[4]

However, although Broca’s and Wernicke’s areas have long been viewed as unique to humans, recent research has shown them also to be active in other primates.  In one study, for example, the brains of three chimpanzees were scanned as they gestured and called to a person, requesting food that was out of their reach.  As they did so, the chimps showed activation in the brain region that corresponds to Broca’s area in humans.[5] Terrence Deacon believes that, rather than viewing these areas as “language centers” controlling our ability to speak, we should think of language as using a network of different processes in the brain.  Broca’s area is adjacent to the part of the brain that controls our mouth, tongue and larynx; and Wernicke’s area is adjacent to our auditory cortex.  Therefore, these areas likely evolved as key nodes in the language network of the brain, which would explain the aphasia resulting from damage to them.  “Broca’s and Wernicke’s areas,” Deacon explains, “represent what might be visualized as bottlenecks for information flow during language processing; weak links in a chain of processes.”[6] Neuroscientist Jean-Pierre Changeux agrees, arguing that “efficient communication of contextualized knowledge involves the concerted activity of many more cortical areas than the ‘language areas’ identified by Broca and Wernicke.”[7]

Deacon also warns against reading too much into left hemisphere specialization, known as lateralization.  He sees lateralization as “probably a consequence and not a cause or even precondition for language evolution,” pointing out that several other mammals, including other primates, also show lateralization, and that even in humans, nearly 10 percent of people are “not left-lateralized in this way.”  Lateralization, in his view, “is more an adaptation of the brain to language than an adaptation of the brain for language.”[8]

So, if it’s not the larynx, not Broca’s and Wernicke’s areas, and not lateralization, is there anything about the human anatomy that makes it uniquely capable of creating language?  No prizes for guessing that the answer may be the pfc.  As Deacon puts it, “two of the most central features of the human language adaptation” are “the ability to speak and the ability to learn symbolic associations.”[9] We’ve already noted that skilled vocalizations are a helpful, but not a necessary, part of our language capability.  So that leaves “the symbol-learning problem,” which “can be traced to the expansion of the prefrontal cortical region, and the preeminence of its projections in competition for synapses throughout the brain.”[10] Changeux once again agrees, noting that “propositions and structured speech can be seen as evolutionary phenomena accompanying the expansion of the prefrontal cortex,”[11] as does celebrated neuroscientist Joaquin Fuster who writes that “given the role of prefrontal networks in cognitive functions, it is reasonable to infer that the development of those networks underlies the development of highly integrative cognitive functions, such as language.”[12]

If the pfc was, in fact, the central driver of the emergence of language, what light (if any) does that shed on those raging debates about when and at what rate language evolved, and whether there is something that can be called a “language instinct”?  In order to answer that, we need to understand a little more about the social context in which language emerged.

[NEXT SECTION]


[1] For a full review of this issue, see Fitch, W. T. (2005). “The evolution of language: a comparative review.” Biology and Philosophy, 20, 193-230.

[2] Donald, M. (1991). Origins of the Modern Mind: Three Stages in the Evolution of Culture and Cognition, Cambridge, Mass.: Harvard University Press, 39.

[3] See Donald op. cit., 45-94, for a full discussion of the history of anatomical theories of human language.

[4] Gazzaniga, M. S. (2009). “Humans: the party animal.” Dædalus(Summer 2009), 21-34.

[5] Taglialatela, J. P., Russell, J. L., Schaeffer, J. A., and Hopkins, W. D. (2008). “Communicative Signaling Activates ‘Broca’s’ Homolog in Chimpanzees.” Current Biology, 18, 343-348.

[6] Deacon, op. cit., 288.

[7] Changeux, J.-P. (2002). The Physiology of Truth: Neuroscience and Human Knowledge, M. B. DeBevoise, translator, Cambridge, Mass.: Harvard University Press, 123.

[8] Deacon, op. cit., 310, 315.  Italics in original.

[9] Ibid., 220.

[10] Ibid.

[11] Changeux, op. cit., 123-4.

[12] Fuster, J. M. (2002). “Frontal lobe and cognitive development.” Journal of Neurocytology, 31(December 2002), 373-385.

August 23, 2010

Language: weaving a net of symbols

Posted in Language and Myth tagged , at 10:46 pm by Jeremy

Here’s the first section of Chapter 3 of my book draft, Finding the Li: Towards a Democracy of Consciousness.  This chapter’s about the evolution of language.  This first section delves into what’s special about language, contrasting it to the calls of vervet monkeys described by Seyfarth & Cheney.  It parses a typical sentence to highlight linguistic features such as “double-scope conceptual blending,” displacement, counterfactuals, and the “magical weave” of syntax.

As always, constructive comments are warmly welcomed.

Weaving a net of symbols.

Given that it’s something every one of us uses every day of our lives, and which has been studied for millennia, it’s amazing how much the experts still disagree about language.  For example, consider the question of when language first emerged.  Some researchers argue for a long, slow evolution of language, beginning in the time of our hominid ancestors several million years ago, and gradually developing into what we now think of as modern language.  Other experts argue for a much later and more sudden rise of language, perhaps as recently as 40,000 years ago.  There’s even more raging disagreement about the relationship between language and our brains.  Some famous theorists have proposed that we have a “language instinct,” an innate set of neural pathways that have evolved to comprehend the unique attributes of language such as syntax and grammar.  Other researchers argue back that this is impossible, and that what’s innate in our brains is something more fundamental than language itself.

One thing nobody seems to disagree about is the central importance of language to our human experience.  “More than any other attribute,” writes one team of biologists, “language is likely to have played a key role in driving genetic and cultural human evolution.”[1] When you consider your daily life, your interactions with your family, your work, even the way you think about things, you quickly realize that language is necessary for virtually everything.  In the words of one linguist, “everything you do that makes you human, each one of the countless things you can do that other species can’t, depends crucially on language.  Language is what makes us human.  Maybe it’s the only thing that makes us human.”[2]

As we’ll see, language is equally important to the rise of the pfc’s power in human consciousness.  We’ll explore in this chapter how language first gave the pfc the capability to expand its purview beyond its original biological function.  In pursuing this exploration, we’ll find that understanding language in terms of the pfc may help us to untangle some of those debates about language that continue to galvanize the experts, and as we do so, to uncover some insights into the very nature of how we think.

What’s special about language?

Vervet monkeys can call out different threats... but it's not language

First, though, we need to get a handle on what language really is and what’s so special about it.  Perhaps a good place to start is what language isn’t.  Back in 1980, a team of researchers spent over a year in the Amboseli National Park in Kenya, watching groups of vervet monkeys interact, and recording their vocalizations.  What they found made waves in the field of animal communication.  The monkeys have three important natural predators – leopards, eagles and pythons – each of which attacks them in a different way: leopards pounce at them, eagles strike from the sky, and pythons from the ground.  The researchers discovered that the monkeys had developed completely different vocalizations to warn their group of each predator: short tonal calls for leopards, low-pitched staccato grunts for eagles and high-pitched “chutters” for snakes.  When the monkeys heard the leopard call, they’d climb up in the trees; an eagle call caused them to look up or run into dense bush; and a snake call had them looking down at the ground around them.  The researchers could induce the different behaviors in the monkeys by playing tape recordings of each call.  “By giving acoustically distinct alarms to different predators,” they explained, “vervet monkeys effectively categorized other species.”  These fascinating findings showed that vervet monkeys were capable of what was described as “perceptual categorization… of rudimentary semantic signals.”[3] It certainly showed how smart vervet monkeys are.  But it wasn’t language.

A fundamental characteristic of language is that, in the words of researchers Noble and Davidson, it involves the “symbolic use of communicative signs”.[4] The anthropologist and neuroscientist Terrence Deacon agrees with this, suggesting that “when we strip away the complexity, only one significant difference between language and nonlanguage communication remains: the common, everyday miracle of word meaning and reference … which can be termed symbolic reference.”[5] But, an alert reader might ask at this point, wasn’t that what the vervet monkeys were doing?  If we consider the definition of “symbol” from the previous chapter, as something that has a purely arbitrary relationship to what it signifies, then the vervet calls seem to meet that definition.  It’s only because the other vervet monkeys understand the meaning of the grunts or chutters that they know whether to look up or look down.  That may be true, but there’s another aspect of language that sets it apart from the vervet calls: syntax.

“Animal communication is typically non-syntactic, which means that signals refer to whole situations,” explains a team of language researchers.  Human language, on the other hand, “is syntactic, and signals consist of discrete components that have their own meaning… The vast expressive power of human language would be impossible without syntax, and the transition from non-syntactic to syntactic communication was an essential step in the evolution of human language.”[6] So, when a vervet monkey gives a low-pitched grunt, he’s not saying the word “eagle.”  He’s saying, in one grunt: “There’s an eagle coming, and we’d all better head for the bushes.”  If he grunted twice, that wouldn’t mean “two eagles.”  And if he gave out a grunt followed by a chutter, that wouldn’t mean “an eagle just attacked a snake.”  The vervet monkeys can’t get out of the context of their specific situation.  They can’t use syntax to make “infinite use of finite means.”[7]
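The contrast can be made concrete with a toy sketch (purely illustrative – the signals and grammar rules below are invented for the example, not a model of actual vervet calls or of any real grammar).  A holistic system is a fixed, closed list of signals, each naming a whole situation; even a tiny compositional grammar, by contrast, generates unboundedly many novel sentences from the same finite rules:

```python
# Holistic "vervet-style" system: each signal maps to a whole situation.
# The repertoire is closed -- there is no way to combine signals into new meanings.
HOLISTIC_CALLS = {
    "bark": "leopard approaching -- climb a tree",
    "grunt": "eagle overhead -- head for the bushes",
    "chutter": "snake on the ground -- look down",
}

# Compositional system: a handful of words plus one rule that can re-enter
# itself (a noun phrase may contain another noun phrase).
def noun_phrase(depth):
    """Build a noun phrase; each level of depth embeds one more phrase."""
    np = "the eagle"
    if depth > 0:
        np += " near " + noun_phrase(depth - 1)  # recursive embedding
    return np

def sentence(depth):
    """Combine discrete components, each with its own meaning, via syntax."""
    return noun_phrase(depth) + " attacked the snake"

# The holistic inventory is finite; the grammar is not.
print(len(HOLISTIC_CALLS))  # 3 signals, and that is all there will ever be
print(sentence(0))          # the eagle attacked the snake
print(sentence(2))          # same finite rules, a longer, entirely novel sentence
```

Increasing `depth` never requires a new rule, only reuse of the old one: exactly the “infinite use of finite means” that the holistic call list cannot achieve.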

To fully understand the power of language, consider the following sentence:

You remember that guy from New York we met at the cocktail party the other day, who told us that if the Fed doesn’t ease the money supply, stocks would fall?

It seems like a simple sentence, but there’s a lot going on under the surface.  First, let’s begin with the words “cocktail party.”  A cocktail refers to a mixed drink.  A party refers to a group of people getting together.  But we all know that “cocktail party” refers to a specific type of party.  It wasn’t necessary for anyone to actually be drinking a cocktail to make it a “cocktail party.”  They might have been serving wine and champagne, but we wouldn’t call it a “wine and champagne party.”  This crucial element of language takes two completely separate aspects of reality – a mixed drink and a social gathering – and blends them together to create a brand new concept. Cognitive scientists Fauconnier and Turner have aptly called this “conceptual blending” and consider it to be “one (and perhaps the) mental operation whose evolution was crucial for language.”[8]

Stock prices don't really fall the way people do

But the complexity really gets going when we come to phrases like “ease the money supply” and “stocks would fall.”  Here, we meet one of the most ubiquitous aspects of modern language, which is the use of tangible metaphors to convey abstract meaning.  We’re so comfortable with these metaphors in our daily language that we don’t even consider them as such, but ponder for a moment what it means to “ease the money supply.”  There’s an underlying metaphor of some kind of reservoir of liquid, perhaps water, which would normally come flowing out to people.  But someone has their hands on a lever of some sort, which keeps the supply controlled.  Now, this person – the Fed – wants everyone to have a little more of the liquid, so they ease up on the lever, allowing some more to flow out.  Similarly, stocks don’t really fall.  People, animals or things might fall, off a table or out of a tree.  But of course when something falls, it goes from a high position to a lower position.  So, we naturally understand that a falling stock is one whose price is moving from higher to lower.  These metaphors are examples of what Fauconnier and Turner see as an advanced form of conceptual blending, which they term “double-scope conceptual blending.”  It’s called “double scope” because it integrates “two or more conceptual arrays… which typically conflict in radical ways on vital conceptual relations” – such as in this case stock prices and falling things – into a “novel conceptual array” which “develops emergent structure not found in either of the inputs.”[9]

There’s still more amazing complexity to that simple sentence.  Notice that it’s referring to someone we met “the other day.”  He’s not there talking to us now.  It all happened somewhere else and in the past, but through language we can bring the past back to the present in a matter of seconds, and we can whisk people or things from anywhere in the universe to be present in our minds with just a few words.  This near-magical power of language is known as displacement, “the ability to make reference to absent entities.”[10]

The magic of language goes even further than displacement.  Consider that we’re being asked to imagine a scenario where stocks would fall if the Fed doesn’t ease the money supply.  This is something that hasn’t actually happened.  It may never happen.  But we can still talk about the scenario with as much ease as if it were happening right now.  This ability of language to create hypothetical situations out of thin air is known as a “counterfactual,” a reference to something that’s not a concrete fact but can still exist in our minds and get communicated through language.

There’s already a lot to be impressed about in that one sentence, with its double-scope conceptual blending, its displacements and its counterfactuals.  But the coup de grâce of this sentence and most other sentences in every language of the world is its syntax.  If language is like a net of symbols, we can think of syntax as a magical weave that can link each section of the net to any other section at a moment’s notice.  Look at how many miraculous conceptual leaps we make while still holding a meaningful narrative together in our minds.  (1) “You remember” (asked in a questioning tone): we’re asked to access our memory;  (2) “that guy”: focus on the category of male humans;  (3) “from New York”: narrow down that category based on where the person is from; (4) “we met at the cocktail party the other day”: create a mental image of the party;  (5) “who told us”: shift from a mere recall of the person to a recollection of the conversation;  (6) “if the Fed doesn’t ease the money supply…”: abrupt transition from an image of the cocktail party to a hypothetical financial scenario.

Recursion: the magical weave of language

This magical weave that we pull off incessantly every day without even being aware of it is known as “recursion,” and is viewed as the most powerful and characteristic feature of modern language, accomplished by the proper placement and linkage of multiple concepts through the syntax of the sentence.  Humans alone took “the power of recursion to create an open-ended and limitless system of communication,” writes a team of linguistic experts, who propose that this power was perhaps “a consequence (by-product)” of some kind of “neural reorganization” that arose from evolutionary pressure on humans, causing previously separate modular aspects of the brain to connect together and create new meaning.[11]*  As we already know from the previous chapter, there’s one part of the brain that’s uniquely connected to permit this cognitive fluidity that underlies our human capabilities: the pfc.
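The open-endedness of recursion is easy to sketch in code (an illustrative toy, assuming nothing about real grammars: the point is only that a single self-invoking rule licenses clause nesting to any depth, much like the embedded clauses of the cocktail-party sentence above):

```python
def embed(clauses):
    """Nest each clause inside the previous one via a 'that' complement.

    One rule, applied to its own output -- the defining move of recursion.
    """
    if not clauses:
        return ""
    head, rest = clauses[0], clauses[1:]
    inner = embed(rest)  # the rule re-applies to the remaining clauses
    return head + (" that " + inner if inner else "")

# Four independent clauses woven into one sentence:
report = embed([
    "you remember",
    "the guy told us",
    "the Fed believes",
    "stocks would fall",
])
print(report)
# you remember that the guy told us that the Fed believes that stocks would fall

# Each added clause reuses the very same rule -- there is no upper bound on depth.
print(embed(["she said"] * 3 + ["it rained"]))
# she said that she said that she said that it rained
```

Nothing in `embed` counts how deep it has gone; the limitlessness comes for free once the rule can contain an instance of itself, which is why recursion is singled out as the engine of “infinite use of finite means.”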

[Next section]


[1] Szathmary, E., and Szamado, S. (2008). “Language: a social history of words.” Nature, 456(6 November, 2008), 40-41.

[2] Bickerton, D. (2009). Adam’s Tongue: How Humans Made Language, How Language Made Humans, New York: Hill and Wang, 4.

[3] Seyfarth, R. M., Cheney, D. L., and Marler, P. (1980). “Monkey Responses to Three Different Alarm Calls: Evidence of Predator Classification and Semantic Communication.” Science, 210(November 14, 1980), 801-803.

[4] Noble, W., and Davidson, I. (1991). “The Evolutionary Emergence of Modern Human Behaviour: Language and its Archaeology.” Man, 26(2), 223-253.

[5] Deacon, T. W. (1997). The Symbolic Species: The Co-evolution of Language and the Brain, New York: Norton, 43 – italics in the original.

[6] Nowak, M. A., Plotkin, J. B., and Jansen, V. A. A. (2000). “The evolution of syntactic communication.” Nature, 404(30 March 2000), 495-498.

[7] Nowak, ibid.

[8] Fauconnier, G., and Turner, M. (2008). “The origin of language as a product of the evolution of double-scope blending.” Behavioral and Brain Sciences, 31(5 (2008)), 520-521.

[9] Fauconnier & Turner, ibid.

[10] Liszkowski, U., Schafer, M., Carpenter, M., and Tomasello, M. (2009). “Prelinguistic Infants, but Not Chimpanzees, Communicate About Absent Entities.” Psychological Science, 20(5:17 April 2009), 654-660.

[11] Hauser, M. D., Chomsky, N., and Fitch, W. T. (2002). “The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?” Science, 298(22 November 2002), 1569-1579. Without denying the immense importance of recursion, I would note that their view that “no other animal” possesses it is still open to debate.  There are possibilities of some kind of recursion in birdsong and in elephant, dolphin and whale communication, which has been described by Marler (1998) as “phonological syntax” in “Animal communication and human language” in: The Origin and Diversification of Language, eds., G. Jablonski & L. C. Aiello. California Academy of Sciences.

August 18, 2010

So what really makes us human?

Posted in Language and Myth tagged , , , , , , at 10:39 pm by Jeremy

Here’s a pdf file of Chapter 2 of my book draft, Finding the Li: Towards a Democracy of Consciousness.  This chapter’s called “So What Really Makes Us Human?”  It traces human evolution over several million years, reviewing Merlin Donald’s idea of mimetic culture and the crucial breakthrough of “theory of mind.”  It examines the “social brain hypothesis” and explores the current thinking in “altruistic punishment” as a key to our unique human capability for empathy and group values.  Finally, it looks at how social intelligence metamorphosed into cognitive fluidity, and how the pfc’s newly evolving connective abilities enabled humans to discover the awesome power of symbols.

Open the pdf file of Chapter 2: “So What Really Makes Us Human?”

As always, constructive comments and critiques from readers of my blog are warmly welcomed.

August 17, 2010

What the pfc did for early humans

Posted in Language and Myth tagged , , , , at 10:49 pm by Jeremy

This section of my book draft, Finding the Li: Towards a Democracy of Consciousness, looks at how the unique powers of the prefrontal cortex gave early humans the capability to construct tools, exercise self-control, and begin to control aspects of the environment around them.  But most of all it gave them the power of symbolic thought, which has become the basis of all human achievement since then.   As always, constructive comments are welcomed.

[PREVIOUS SECTION]

What the pfc did for early humans

Mithen’s “cognitive fluidity” and Coolidge and Wynn’s “enhanced working memory” are really two different ways of describing the same basic dynamic of the pfc connecting up diverse aspects of the mind’s intelligence to create coherent meaning that wasn’t there before.  But what specifically did this enhanced capability do for our early human ancestors?

To begin with, it enabled us to make tools.  It used to be conventional wisdom that humans are the only tool-makers, so much so that the earliest known species of the genus homo, which lived around two million years ago, is named homo habilis, or “handy man.”  Then, in the 1960s, Jane Goodall discovered that chimpanzees also used primitive tools, such as placing stalks of grass into termite holes.  When Goodall’s boss, Louis Leakey, heard this, he famously replied “Now we must redefine ‘tool’, redefine ‘man’, or accept chimpanzees as humans!”[1] Well, as we’ve seen in the preceding pages, there’s been plenty of work done in redefining “man” since then, but none of this takes away from the fact that humans clearly use tools vastly more effectively than chimpanzees or any other mammals.

Oldowan tools: better than what any chimpanzee can do.

To be fair to our old “handy man” homo habilis, even the primitive stone tools they left behind, called Oldowan artifacts after the Olduvai Gorge in East Africa where they were first found, represented a major advance in working memory over our chimpanzee cousins.  Steven Mithen has pointed out that some Oldowan tools were clearly manufactured to make other tools, such as “the production of a stone flake to sharpen a stick.”[2] Making a tool to make another tool is unknown in chimpanzees, and requires determined planning, holding the idea of the second tool in your working memory while you’re preparing your first tool.  Oldowan artifacts remained the same for a million years, so even though they were an advance over chimp technology, there was none of the innovation that we associate with our modern pfc functioning.  The next generation of tools, called the Acheulian industry, required more skillful stone knapping, and showed attractive bilateral symmetry, but they also remained the same for another million years or so.[3] It was around three hundred thousand years ago, shortly before anatomically modern humans emerged, that stone knapping really took off, with stone-tipped spears and scrapers with handles representing “an order-of-magnitude increase in technological complexity.”[4]

Acheulian tools: improved on Oldowan but stayed the same for a million years

None of these tools – even the more primitive Oldowan and Acheulian – can be made by chimpanzees, and they could never have existed without the power of abstraction provided by the pfc.[5]*   Planning for this kind of tool-making required a concept of the future, when the hard work put into making the tool would turn out to be worthwhile.  As psychologists Liberman and Trope have pointed out, transcending the present to mentally traverse distances in time and in space “is made possible by the human capacity for abstract processing of information.”  Making function-specific tools, they note, “required constructing hypothetical alternative scenarios of future events,” which could only be done through activating a “brain network involving the prefrontal cortex.”[6]

Another fundamental human characteristic arising from this abstraction of past and future is the power of self-control.  As one psychologist observes, “self-control is nearly impossible if there is not some means by which the individual is capable of perceiving and valuing future over immediate outcomes.”[7] Anyone who has watched children grow up and gradually become more adept at valuing delayed rewards over immediate gratification will not be surprised that the pfc doesn’t fully develop in a modern human until her early twenties.

This abstraction of the future gave humans the power not only to control themselves but also to control things around them.  A crucial pfc-derived human characteristic is the notion of will, the conscious intention to perform a series of activities, sometimes over a number of years, to achieve a goal.  Given the fundamental nature of this capability, it’s not surprising that, as Tomasello points out, in many languages the word that denotes the future is also the word “for such things as volition or movement to a goal.”   In English, for example, the original notion of “I will it to happen” is embedded in the future tense in the form “It will happen.”[8]

This is already an impressive range of powerful competencies made available to early humans by the pfc.  But of all the powers granted to humans by the awesome connective faculties of the pfc, there seems little doubt that the most spectacular is the power to understand and communicate sets of meaningful symbols, known as symbolization.

The symbolic net of human experience

Ernst Cassirer: first to define humans as "animal symbolicum"

A full generation before Louis Leakey realized it was time to “redefine man,” a German philosopher named Ernst Cassirer who had fled the Nazis was already doing so, writing in 1944 that “instead of defining man as an animal rationale we should define him as an animal symbolicum.”[9] He wasn’t alone in this view.  A leading American anthropologist, Leslie White, also believed that the “capacity to use symbols is a defining quality of humankind.”[10] Because of our use of symbols, Cassirer wrote, “compared with the other animals man lives not merely in a broader reality; he lives, so to speak, in a new dimension of reality.”[11]

Why would the use of symbols take us to a different dimension of reality?  First, it’s important to understand what exactly is meant by the word “symbol.”  In the terminology adopted by cognitive anthropologists, we need to differentiate between an icon, an index, and a symbol.  A simple example may help us to understand the differences.  Suppose it’s time for you to feed your pet dog.  You open your pantry and look at the cans of pet food available.  Each can has a picture on it of the food that’s inside.  That picture is known as an icon, meaning it’s a “representative depiction” of the real thing.  Now, you open the can and your dog comes running, because he smells the food.  The smell is an index of the food, meaning it’s “causally linked” to what it signifies.  But now suppose that instead of giving your hungry dog the food, you wrote on a piece of paper “FOOD IN TEN MINUTES” and put it in your dog’s bowl.  That writing is a symbol, meaning that it has a purely arbitrary relationship to what it signifies, a relationship that can be understood only by someone who shares the same code.  Clearly, your dog doesn’t understand symbols, and now he’s pawing at the pantry door trying to get to his food.[12]*

A hungry dog doesn't respond to a note saying "food in ten minutes"

To understand how symbols arose, and why they are so important, it helps to begin with the notion of working memory discussed earlier.   Terrence Deacon has suggested that symbolic thought is “a way of offloading redundant details from working memory, by recognizing a higher-order regularity in the mess of associations, a trick that can accomplish the same task without having to hold all the details in mind.”[13] Remember the image of working memory as a blackboard?  Now imagine a teacher asking twenty-five children to come up and write on the blackboard what they had to eat that morning before they came to school.  The blackboard would quickly fill up with words like cereal and eggs, pancakes and waffles.  Now, suppose that, once the blackboard’s filled up, the teacher erases it all and just writes on the blackboard the word “BREAKFAST”.  That one word, by common consent, symbolizes everything that had previously been written on the blackboard.  And now it’s freed up the rest of the blackboard for anything else.

That’s the powerful effect that the use of symbols has on human cognition.  But there’s another equally powerful aspect of writing that one word “BREAKFAST” on the blackboard.  Every schoolchild has her own experience of what she ate that morning, but by sharing in the symbol “BREAKFAST,” she can rise above the specifics of her own particular meal and understand that there’s something more abstract that is being communicated, referring to the meal all the kids had before they came to school regardless of what it was.  For this reason, symbols are an astonishingly powerful means of communicating, allowing people to  transcend their individual experiences and share them with others.  Symbolic communication can therefore be seen as naturally emerging from human minds evolving on the basis of social intelligence.  This has led one research team to define modern human behavior as “behavior that is mediated by socially constructed patterns of symbolic thinking, actions, and communication.”[14]

Once it got going, symbolic thought became so powerful that it pervaded every aspect of how we think about the world.  In Cassirer’s words:

Man cannot escape from his own achievement… No longer in a merely physical universe, man lives in a symbolic universe.  Language, myth, art, and religion are parts of this universe.  They are the varied threads which weave the symbolic net, the tangled web of human experience.  All human progress in thought and experience refines upon and strengthens this net.[15]

Because of our symbolic capabilities, Deacon adds, “we humans have access to a novel higher-order representation system that… provides a means of representing features of a world that no other creature experiences, the world of the abstract.” We live our lives not just in the physical world, “but also in a world of rules of conduct, beliefs about our histories, and hopes and fears about imagined futures.” [16]

For all the power of symbolic thought, there was one crucial ingredient it needed before it could so dramatically take over human cognition.  It needed a means by which individuals could agree on the code used to reference what they meant.  It had to be a code that everyone could learn and that could be communicated very easily, taking into account the vast array of different things that could carry symbolic meaning.  In short, it needed language – that all-encompassing network of symbols that we’ll explore in the next chapter.


[1] Cited in McGrew, W. C. (2010). “Chimpanzee Technology.” Science, 328, 579-580.

[2] Mithen 1996, op. cit., 96.

[3] Proctor, R. N. (2003). “The Roots of Human Recency: Molecular Anthropology, the Refigured Acheulean, and the UNESCO Response to Auschwitz.” Current Anthropology, 44(2: April 2003), 213-239.

[4] Ambrose, S. H. (2001). “Paleolithic Technology and Human Evolution.” Science, 291(2 March 2001), 1748-1753.

[5] Mithen 1996, op. cit., p. 97 relates a failed attempt to get a famous bonobo named Kanzi, who was very advanced in linguistic skills, to make Oldowan-style stone tools.

[6] Liberman, N., and Trope, Y. (2008). “The Psychology of Transcending the Here and Now.” Science, 322(21 November 2008), 1201-1205.

[7] Barkley, op. cit.

[8] Tomasello, op. cit., p. 43.

[9] Cassirer, E. (1944). An Essay on Man, New Haven: Yale University Press, 26.

[10] Cited by Renfrew, C. (2007). Prehistory: The Making of the Human Mind, New York: Modern Library: Random House, 91.

[11] Cassirer, op. cit.

[12] The distinction, originally made by American philosopher Charles Sanders Peirce, is described in detail in Deacon, T. W. (1997). The Symbolic Species: The Co-evolution of Language and the Brain, New York: Norton; and is also referred to by Noble, W., and Davidson, I. (1996). Human Evolution, Language and Mind: A psychological and archaeological inquiry, New York: Cambridge University Press.  I am grateful to Noble  & Davidson for the powerful image of writing words to substitute for food in the dog’s bowl as an example of a symbol.

[13] Deacon, op. cit., p. 89.

[14] Henshilwood, C. S., and Marean, C. W. (2003). “The Origin of Modern Human Behavior: Critique of the Models and Their Test Implications.” Current Anthropology, 44(5), 627-651.

[15] Cassirer, op. cit.

[16] Deacon, op. cit., p. 423.