October 9, 2010

The metaphoric threshold

Posted in Language and Myth at 8:19 pm by Jeremy

In my previous post, I proposed three stages of language evolution, with fully modern language emerging between 50,000 and 100,000 years ago.  Along with fully modern language, I suggest, came the first usage of metaphor.  This was far more significant than merely adding to the impact of language.  Rather, it involved humanity crossing what I call the “metaphoric threshold,” necessary for humans to achieve abstract thought of any kind, including the search for meaning and the construction of mythic and religious ideas.  In this final section of my chapter “The Magical Weave of Language” (from my book Towards a Democracy of Consciousness), I describe the importance of the metaphoric threshold in human thought.

[Click here for the pdf version of the chapter “The Magical Weave of Language.”]


The metaphoric threshold

We generally think of metaphor as a technique used by poets and other creative writers, but not really something that’s an integral part of our everyday speech.  However, in a truly groundbreaking book published in 1980, cognitive philosophers Lakoff and Johnson showed how virtually every aspect of our normal speech uses underlying metaphors to communicate abstract ideas and concepts.[1] We saw earlier how simple statements like “stocks falling” or “the Fed easing the money supply” utilize metaphors that work below our level of conscious awareness.  If you examine your regular speech, you will soon discover that it is in fact virtually impossible to say anything with any level of abstraction without using an underlying metaphor that usually relates to something more concrete.

Here are some simple examples of how these unconscious metaphors work:

“I gave you that idea” – Metaphor: AN IDEA IS AN OBJECT

“My spirits rose”; “I fell into a depression” – Metaphor: HAPPY IS UP; SAD IS DOWN

“He broke under cross-examination” – Metaphor: THE MIND IS A BRITTLE OBJECT

“His ideas have finally come to fruition” – Metaphor: IDEAS ARE PLANTS

“He’s a giant among writers” – Metaphor: SIGNIFICANT IS BIG

“I’ve had a full life” – Metaphor: LIFE IS A CONTAINER

“She gave me a warm smile” – Metaphors: FACIAL EXPRESSION IS A GIFT; INTIMACY IS WARMTH

“I don’t have much time to give you” – Metaphor: TIME IS A VALUABLE RESOURCE[2]
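One way to make the pattern in these examples explicit is to tabulate each expression against its underlying conceptual mapping.  Here is a minimal Python sketch of such a lookup table; the data structure and the helper function are my own illustration, not anything proposed by Lakoff & Johnson (who simply use the SMALL-CAPS naming convention reproduced here):

```python
# Toy lookup table pairing everyday expressions with the conceptual
# metaphor(s) underlying them.  The table and helper are illustrative only;
# a real analysis would match patterns, not whole fixed sentences.
CONCEPTUAL_METAPHORS = {
    "I gave you that idea": ["AN IDEA IS AN OBJECT"],
    "My spirits rose": ["HAPPY IS UP; SAD IS DOWN"],
    "He broke under cross-examination": ["THE MIND IS A BRITTLE OBJECT"],
    "His ideas have finally come to fruition": ["IDEAS ARE PLANTS"],
    "He's a giant among writers": ["SIGNIFICANT IS BIG"],
    "I've had a full life": ["LIFE IS A CONTAINER"],
    "She gave me a warm smile": ["FACIAL EXPRESSION IS A GIFT",
                                 "INTIMACY IS WARMTH"],
    "I don't have much time to give you": ["TIME IS A VALUABLE RESOURCE"],
}

def metaphors_for(expression):
    """Return the conceptual metaphor(s) behind an everyday expression,
    or an empty list if the expression isn't in our toy table."""
    return CONCEPTUAL_METAPHORS.get(expression, [])
```

Note that “She gave me a warm smile” maps to two metaphors at once: a single sentence can blend more than one underlying mapping, which is part of why these structures go unnoticed in ordinary speech.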

The examples are limitless, and I urge you to observe your own language and that of others around you, to discover the full extent of our reliance on metaphors.  In the Upper Paleolithic example, this new form of thought might have enabled our human ancestor to turn to his friend and say, “Since I lost my son, my heart has turned to stone.”

This dynamic has profound implications for how we humans structure our understanding of ourselves and the world around us, and in this book there will be many occasions to return to it.  Right now, there are two important observations to point out.  The first is that metaphors are the quintessential example of Fauconnier & Turner’s  “double scope conceptual blending” described above, whereby two different conceptual arrays of experience are blended together to create new, emergent meaning.  As such, the pfc, with its unique powers of connectivity, would be the logical part of the brain to mediate the creation of metaphoric thought.  The second observation is that, without metaphor, we are simply unable to conceptualize and communicate abstract thoughts about feelings or ideas.

Therefore, I believe that the first use of metaphor in language was not just another milestone in the increased sophistication of human linguistic abilities.  It was the threshold necessary for human thought to cross in order to achieve abstract thought of any kind, including the search for meaning in life and in the universe and the creation of mythic and religious ideas.  In short, the crossing of the metaphoric threshold of thought led to the first stirrings of the pfc’s power in the human mind, opening the gateway to the Upper Paleolithic revolution of symbols.  This led to the emergence of a mythic consciousness in human thought, which imposed meaning and structure on the natural world based on a metaphoric transformation of the tangible qualities of everyday life, using them as a scaffolding for more abstract conceptions.  This new world of mythic consciousness which arose on the other side of the metaphoric threshold is what we’ll examine in the next chapter.

[1] Lakoff, G., and Johnson, M. (1980/2003). Metaphors We Live By, Chicago: University of Chicago Press.

[2] Examples taken from Lakoff & Johnson, ibid.


October 5, 2010

Three stages in the evolution of language

Posted in Language and Myth at 11:00 pm by Jeremy

A debate has been raging for years among linguists as to whether the development of language was gradual and early or sudden and more recent.  In this section of my book, Towards a Democracy of Consciousness, I argue for three stages of language evolution: (1) mimetic language beginning as long as four million years ago; (2) protolanguage emerging around 300,000 years ago (which would also have been spoken by the Neanderthals); and (3) modern language, which may have begun to emerge about 100,000 years ago but probably achieved fully modern syntax only around the time of the Upper Paleolithic revolution, some 40,000 years ago.


The co-evolution of language and the pfc

Imagine the world of our hominid ancestors over the four million years between Ardi and the emergence of Homo sapiens.  As we’ve discussed, it was a mimetic, highly social world, where increasingly complex group dynamics were developing.  Communication in this world was probably a combination of touching, gestures, facial expressions, and complex vocalizations, including the many kinds of grunts, growls, shrieks and laughs that we still make to this day.  Terrence Deacon believes that it was in this world that “the first use of symbolic reference by some distant ancestors changed how natural selection processes have affected hominid brain evolution ever since.”[1] He argues that “the remarkable expansion of the brain that took place in human evolution, and indirectly produced prefrontal expansion, was not the cause of symbolic language but a consequence of it.”[2]

Enhanced prefrontal cortex was probably a major driver of evolutionary success for pre-human hominids

The findings from Kuhl may help to explain how prefrontal expansion could have been a consequence of symbolic language.  If we apply what we learned about how a patterning instinct leads the brain to shape itself based on the patterns it perceives, then we can imagine how a pre-human growing up in mimetic society would hear, see and feel the complex communication going on around him, and how his pfc would shape itself accordingly.  Those infants whose pfcs were able to make the best connections would be more successful at realizing how the complex mélange of grunts, rhythms, gestures and expressions they were hearing and seeing patterned themselves into meaning.  As they grew up, they would be better integrated within their community and, as such, tend to be healthier and more attractive as mates, passing on their genes for enhanced pfc connectivity to the next generation.  It was no longer the biggest, fastest or strongest pre-humans that were most successful, but the ones with the most enhanced pfcs.  In Deacon’s words, “symbol use selected for greater prefrontalization” in an ever-increasing cycle, whereby “each assimilated change enabled even more complex symbol systems to be acquired and used, and in turn selected for greater prefrontalization, and so on.”[3]

Deacon’s view of this positive feedback loop between language and evolution is shared by others, including linguist Nicholas Evans who describes it as “a coevolutionary intertwining of biological evolution, in the form of increased neurological capacity to handle language, and cultural evolution, in the form of increased complexity in the language(s) used by early hominids.  Both evolutionary tracks thus urge each other on by positive feedback, as upgraded neurological capacity allows more complex and diversified language systems to evolve, which in turn select for more sophisticated neurological platforms.”[4] While this view represents some of the most advanced thinking in the field, it’s interesting to see that it has a solid pedigree – as far back, in fact, as Charles Darwin himself, who wrote in 1871:

If it be maintained that certain powers, such as self-consciousness, abstraction etc., are peculiar to man, it may well be that these are the incidental results of other highly advanced intellectual faculties, and these again are mainly the result of the continued use of a highly developed language.[5]
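The runaway character of the feedback loop that Deacon and Evans describe is easy to caricature in a few lines of code.  The following Python sketch is a toy model only: the starting values and the growth rate are invented numbers, chosen purely to show the dynamic in which each variable’s growth feeds the other’s:

```python
def coevolve(generations, gain=0.05):
    """Toy positive-feedback model of language/pfc coevolution.
    Language complexity and prefrontal ("pfc") capacity each grow in
    proportion to the other's current level; the gain and starting
    values are arbitrary illustrations, not empirical estimates."""
    language, pfc = 1.0, 1.0
    for _ in range(generations):
        # Simultaneous update: each side's increment depends on the
        # other's level at the start of the generation.
        language, pfc = language + gain * pfc, pfc + gain * language
    return language, pfc

language, pfc = coevolve(100)
```

With any positive gain the two quantities amplify each other without limit, while a gain of zero leaves both flat forever, which is the intuition behind calling this a positive feedback loop rather than two independent trends.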

Perhaps now, armed with this new perspective on our patterning instinct and the coevolution of language and the pfc, we are finally equipped to tackle the conundrum of when and how language actually evolved.  Surely it was early and gradual after all, if Deacon and Evans are to be believed?  It’s difficult to conceive how it could be anything else.  But what, then, accounts for the Upper Paleolithic Great Leap Forward?  How could we have had language for millions of years and not produced anything else with symbolic qualities until forty thousand years ago?

Three stages of language evolution

There is, in fact, a possible resolution to this conundrum that permits both the “gradual and early” and the “sudden and recent” camps to be right.  This approach views language as evolving in different stages, with major transitions occurring between each stage, and was first proposed as a solution by linguist Ray Jackendoff in a paper entitled “Possible stages in the evolution of the language capacity.”[6] Jackendoff was well aware that his proposal could “help defuse a long-running dispute,” and he pointed out that if this approach to language evolution becomes widely accepted, it will no longer be meaningful to ask whether one or another hominid “had language,” but rather “what elements of a language capacity” a particular hominid had.

Jackendoff proposed nine different stages of language development, but for the sake of simplicity, I’d like to suggest three clearly demarcated stages.  Additionally, I think these three stages can be closely correlated to the different stages of tool technology found in the archaeological record, so an approximate timeframe can also be applied to each stage.  Importantly, the last of these stages, the transition to modern language, would be contemporaneous with the Upper Paleolithic revolution, and would therefore solve the conundrum posed by the “sudden and recent” camp.  If we consider the analogy of language as a “net of symbols,” then we can visualize each stage as a different kind of net: the first stage may be visualized as a small net that you might use to catch a single fish in a pond; the second stage would be analogous to a series of those small nets tied together; and the third stage could be seen as the vast kind of net that a modern trawler uses to catch fish on an industrial scale.  I’ll describe each stage in turn.

Stage 1: Mimetic language. This stage may have begun as early as Ardi, over four million years ago, and continued until slightly before the advent of modern Homo sapiens, around 200,000-300,000 years ago.  It would have been concurrent with what Merlin Donald calls the mimetic stage of human development, as discussed in Chapter 1.  It would have involved single words that began to be used in different contexts, thus differentiating them from the vervet calls discussed earlier, which have meaning only in a specific context.  Examples of these words could be shhh, yes, no or hot.  Jackendoff gives the example of a little child first learning language, who has learned to say the single word kitty “to draw attention to a cat, to inquire about the whereabouts of the cat, to summon the cat, to remark that something resembles a cat, and so forth.”[7] Imagine a campfire a million years ago: a hominid may have pointed to a stone next to the fire and said “hot!” and then might later have caused his friends to laugh by using the same word to describe how he felt after running on a sunny day.

Single fishing net: analogous to early mimetic language

The correlative level of technology would have been the Oldowan and Acheulean stone tools that, as described in Chapter 2, changed very little over millions of years.  Interestingly, a recent study employed brain scanning technology to analyze what parts of the brain people use when they make these kinds of stone tools.  Oldowan tool-making showed no pfc activity at all, while Acheulean tools required some limited use of the pfc, activating a particular area used for “the coordination of ongoing, hierarchically organized action sequences.”[8]

Stage 2: Protolanguage. This stage, which has also been proposed by linguist Derek Bickerton,[9] may have gradually emerged around 300,000 years ago (the period when Aiello and Dunbar believe that language “crossed the Rubicon”) and remained predominant until the timeframe suggested by Noble & Davidson for the emergence of modern language, roughly 70,000-100,000 years ago.  It would have involved chains of words linked together in a simple sentence, but without modern syntax.  If we go back to our Stone Age campfire, imagine that the fire’s gone out, but an early human wants to tell his friends that the stones from the fire are still hot.  He might point to the area and say “stone hot fire” or alternatively “fire hot stone” or even “hot fire stone.”  The breakthrough from mimetic language is that different concepts are now being placed together to create a far more valuable emergent meaning, but the words are still chained together without the magical weave of syntax.

Interestingly, it was around 300,000 years ago that new advances were being made in stone tool technology, leaving behind the old Acheulean stagnation.  As described by anthropologist Stanley Ambrose, “regional stylistic and technological variants are clearly identifiable, suggesting the emergence of true cultural traditions and culture areas.”  The new techniques, known as Levallois technology from the place in France where they were first discovered, represent according to Ambrose “an order-of-magnitude increase in technological complexity that may be analogous to the difference between primate vocalizations and human speech.”  Ambrose believes that this type of “composite tool manufacture” requires the kind of complex problem solving, planning and coordination that is mediated by the pfc, and may even have “influenced the evolution of the frontal lobe.”[10]

Fishing net: analogous to the magical weave of modern language

Stage 3: Modern language. This stage may have begun to emerge around 100,000 years ago, but possibly only achieved the magical weave of full syntax around the time of the Upper Paleolithic revolution, about 40,000 years ago.  By this time, our fully human ancestor could have told his friend: “I put the stone that you gave me in the fire and now it’s hot,” with full syntax and recursion.  The correlative level of technology would be the sophisticated tools associated with the Upper Paleolithic revolution, including grinding and pounding tools, spear throwers, bows, nets and boomerangs.  The same brains that could handle syntax and recursion could also handle the complex planning and hierarchy of the activities required to conceptualize and make these tools.

But as we’ve seen, the Great Leap Forward involved more than sophisticated tools.  It also delivered the first evidence of human behavior that’s not just purely functional but also has ritual or symbolic significance.  For the first time, humans are creating art, consistently decorating their bodies, constructing musical instruments and ritual artifacts.  I suggest that these innovations in the material realm are correlated to one particular aspect of language that may also have emerged at this time: the use of metaphor.


[1] Deacon, op. cit., 321-2.

[2] Ibid., 340.

[3] Ibid.

[4] Evans 2003, op. cit.

[5] Darwin (1871). The descent of man, and selection in relation to sex. Cited by Bickerton (2009) op. cit., 5.

[6] Jackendoff, R. (1999). “Possible stages in the evolution of the language capacity.” Trends in Cognitive Sciences, 3(7), 272-279.

[7] Ibid.

[8] Stout, D., Toth, N., Schick, K., and Chaminade, T. (2008). “Neural correlates of Early Stone Age toolmaking: technology, language and cognition in human evolution.” Phil. Trans. R. Soc. Lond. B, 363, 1939-1949.

[9] Bickerton, D. (1990) Language and Species, Chicago: University of Chicago Press.

[10] Ambrose, S. H. (2001). “Paleolithic Technology and Human Evolution.” Science, 291, 1748-1753.

September 27, 2010

Is the “language instinct” really a “patterning instinct”?

Posted in Language and Myth at 11:15 pm by Jeremy

Steven Pinker’s theory of a “language instinct” has become highly influential in the last 15 years.  But I argue in this section of my book, Finding the Li: Towards a Democracy of Consciousness, that it may be something more fundamental – a “patterning instinct” – that enables humans to learn languages so easily.  This approach is supported by a barrage of recent criticisms of Pinker’s and Chomsky’s idea of a “universal grammar” innate in a human being.


The “language instinct”

“The language instinct” is, in fact, the name of a popular book published in 1994 by renowned cognitive scientist Steven Pinker.  Pinker’s title says it all, and he makes no bones about his position in the language debate.  “Language is not a cultural artifact that we learn the way we learn to tell time or how the federal government works,” he writes.  “Instead, it is a distinct piece of the biological makeup of our brains.”  He goes on to explain why he uses “the quaint term ‘instinct.’  It conveys the idea that people know how to talk in more or less the sense that spiders know how to spin webs.”  As if to draw a line in the sand of linguistic debate, Pinker makes himself even more clear: “Language is no more a cultural invention than is upright posture.  It is not a manifestation of a general capacity to use symbols.”[1]

Steven Pinker: argues for a "language instinct" in the tradition of Noam Chomsky

Pinker sees himself as following in a widely respected philosophical tradition begun by Noam Chomsky, probably the most famous linguist of the twentieth century, and generally considered to be the father of modern linguistics.  Chomsky believes that every human being has an innate knowledge of language, which he calls a “universal grammar.”  The differences in languages around the world merely reflect superficial variations in how the universal grammar is interpreted by different cultures.  With his penchant for catchy terms, Pinker calls this universal grammar “mentalese,” explaining that knowing a language is simply “knowing how to translate mentalese into strings of words and vice versa.  People without language would still have mentalese.”[2]

If language were in fact an instinct, that would surely seem to support the “gradual and early” camp of language evolution, and indeed Pinker makes himself equally clear on this issue, arguing that “there must have been a series of steps leading from no language at all to language as we now find it, each step small enough to have been produced by a random mutation or recombination. Every detail of grammatical competence that we wish to ascribe to selection must have conferred a reproductive advantage on its speakers, and this advantage must be large enough to have become fixed in the ancestral population.”[3]

The combination of Chomsky’s august authority and Pinker’s communicative skills has caused this theory of the origins of language to be widely influential for many years.  However, a barrage of criticism has recently been leveled against this theory based ultimately on the tenet that it “ignores the central organizing theory of modern biology and all that has sprung from it.”[4] The gist of the argument against a language instinct is that language is far too intricate and rapidly changing for any combination of genes to have evolved to control for it specifically.  It makes much more sense to look for the underlying capabilities that evolved to enable language, rather than to view language itself as the result of evolution.  To take a more extreme example for the sake of clarity, if someone argued that there was a “driving instinct” because of the ease with which most people around the world learned to drive a car, we’d want to argue back that we should instead look for the underlying evolved human traits that permitted cars and driving to become ubiquitous, such as our ability to see things far away, to respond quickly to changes in the line of vision, to rapidly assess changes in speed and to employ sophisticated hand/eye/foot coordination.  Just as automobiles, roads and freeways took their shape as a result of our human traits and capabilities, so language evolved as a function of what our brains were capable of doing.  In the words of one well-regarded team, “language is easy for us to learn and use, not because our brains embody knowledge of language, but because language has adapted to our brains.”[5]

Developing infant: by nine months, she's identifying the sound patterns of her native tongue

An important breakthrough in the debate about a language instinct has been offered by researcher Patricia Kuhl, who has carefully studied how infants distinguish between the different sounds they hear when people speak to them.[6] Kuhl has shown that long before infants have any idea that such a thing as language exists, they are already able to distinguish the different sounds, or “phonetic units,” that make up human speech.  What’s fascinating is that an infant, in her first six months, will discriminate between all kinds of phonetic units, regardless of the language used.  However, at nine months, she’s already more interested in the phonetic units of her particular language.  So, for example, “American infants listen longer to English words, whereas Dutch infants show a listening preference for Dutch words.”  By twelve months, the infant has learned to ignore phonetic units that don’t exist in her own language, and can “no longer discriminate non-native phonetic contrasts.”  The likely reason for this is that right from the beginning, an infant’s mind uses a kind of “statistical inferencing” process[7],  looking for patterns in the sounds she hears, and locking into the more frequent sound patterns.  As time goes on, the infant gets increasingly adept at distinguishing the sound patterns of her own language and begins ignoring those that don’t fit into the patterns she’s already identified.  On the basis of these findings, we can perhaps say that humans possess a “patterning instinct” rather than a language instinct.  Because all infants grow up in societies where language is spoken, this underlying patterning instinct locks into the patterns of language; and it’s this second-order application of the “patterning instinct” that Chomsky and Pinker have seen as a “language instinct.”
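Kuhl’s “statistical inferencing” idea can be sketched as a simple frequency filter: the learner keeps whatever sound categories it hears often and discards the rare ones.  The Python toy model below is of course a drastic simplification of infant learning; the function, the threshold value, and the example data are all my own invention for illustration:

```python
from collections import Counter

def learn_sound_categories(heard, threshold=0.05):
    """Toy model of statistical phonetic learning: keep only the sound
    categories that make up at least `threshold` of the input, the way
    an infant locks into the frequent patterns of her native tongue."""
    counts = Counter(heard)
    total = sum(counts.values())
    return {sound for sound, n in counts.items() if n / total >= threshold}

# An environment where /r/ and /l/ are both common keeps them as
# separate categories (a rare foreign "click" falls below threshold)...
english_ear = learn_sound_categories(["r"] * 40 + ["l"] * 40 + ["click"] * 2)
# ...while an environment dominated by a single r/l category never
# learns the contrast, even if stray /r/ and /l/ tokens occur.
japanese_ear = learn_sound_categories(["rl"] * 80 + ["r"] * 1 + ["l"] * 1)
```

In this caricature, the twelve-month-old’s inability to “discriminate non-native phonetic contrasts” corresponds to categories that never cross the frequency threshold: the pattern was available in principle, but the statistics of the ambient language filtered it out.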

But Kuhl’s research demonstrates something even more far-reaching in its implications than a resolution of this particular language debate.  It shows, in her words, that “language experience warps perception.”  By a very early age, the infant’s brain has literally been shaped by the language that she hears around her, causing her to notice some distinctions in sounds and to ignore others.  This early shaping quickly hardens, like a plastic molding, for the rest of her life.  As an example of this, Kuhl describes the inability of monolingual Japanese speakers to distinguish between the sounds /r/ and /l/, even though to a Western speaker this distinction seems obvious.  Japanese listeners “hear one category of sounds, not two.”  These results suggest to Kuhl that “linguistic experience produces mental maps for speech that differ substantially for speakers of different languages.”[8]

If we do, indeed, have a patterning instinct, and if the patterns of the sounds we hear as infants affect the sound patterns we hear for the rest of our lives, then what does that mean for the other kinds of patterns in language?  After all, language is not just about sounds.  It’s also about symbols and meaning.  Is it possible, then, that language shapes our perception, not just of the sounds we hear, but of the very symbols we perceive as having meaning?  If this is the case, it implies that, on an evolutionary timescale, language may perhaps have been instrumental in shaping how we think, perhaps even how the connections within our pfc evolved.


[1] Pinker, S. (1994). The Language Instinct: How the Mind Creates Language, New York: HarperPerennial, 4-5.

[2] Ibid., 44-73.

[3] Pinker, S. and Bloom, P. (1990).  “Natural Language and Natural Selection.” Behavioral and Brain Sciences, 13 (4), 707-784.

[4] Margoliash, D., and Nusbaum, H. C. (2009). “Language: the perspective from organismal biology.” Trends in Cognitive Sciences, 13(12), 505-510.

[5] Christiansen, M. H., and Chater, N. (2008). “Language as shaped by the brain.” Behavioral and Brain Sciences, 31, 489-558.  For other critiques of the theory of a “language instinct” and “universal grammar,” see Evans, N. (2003). “Context, Culture, and Structuration in the Languages of Australia.” Annual Review of Anthropology, 32, 13-40; Chater, N., Reali, F., and Christiansen, M. H. (2009). “Restrictions on biological adaptation in language evolution.” PNAS, 106(4), 1015-1020; Deacon, op. cit., 27; Fauconnier, G., and Turner, M. (2002). The Way We Think: Conceptual Blending and the Mind’s Hidden Complexities, New York: Basic Books, 173; Aboitiz, F., and Garcia, R. V. (1997). “The evolutionary origin of the language areas in the human brain: A neuroanatomical perspective.” Brain Research Reviews, 25, 381-396; and Tomasello, M. (2000). The Cultural Origins of Human Cognition, Cambridge, Mass.: Harvard University Press, 94.

[6] Kuhl, P. K. (2000). “A new view of language acquisition.” PNAS, 97(22), 11850-11857.

[7] Fauconnier & Turner 2002, op. cit., 173.

[8] Kuhl, op. cit.

September 24, 2010

Language evolution: “gradual and early” or “sudden and recent”?

Posted in Language and Myth at 6:04 pm by Jeremy

Did language evolve early and gradually in human evolution, or was it a more recent development?  This is a major topic of debate among linguists, archeologists and anthropologists, with significant implications for understanding how our minds work.  This section of my book, Finding the Li: Towards a Democracy of Consciousness, introduces this debate.


Language evolution: “gradual and early” or “sudden and recent”?

It seems, at first sight, fairly straightforward.  If language evolved socially as an increasingly sophisticated substitute for grooming, then it must have happened gradually, and a long time ago.  It’s therefore no surprise that Aiello and Dunbar, the grooming theorists, are also proponents of the “gradual and early” emergence of language, arguing that “the evolution of language involved a gradual and continuous transition from non-human primate communication systems,” beginning as far back as two million years ago.  They believe that language most likely “crossed the Rubicon” to its modern state about 300,000 years ago, shortly preceding the emergence of anatomically modern humans.  By 250,000 years ago (an era known as the Middle Paleolithic), they believe “groups would have become so large that language with a significant social information content would have been essential.”[1] They are certainly not alone in this view.  For example, another well-regarded team of archaeologists describes “a sense of continuity, rather than discontinuity, between human and nonhuman primate cognitive and communicative abilities… We infer that some form of language originated early in human evolution, and that language existed in a variety of forms throughout its long evolution.”[2]

So what’s the problem?  Well, it’s probably become clear by now that language is a network of symbols, connected together by the magical weave of syntax.  If that’s the case, then whoever could produce the symbolic expression of language must have been thinking in a symbolic way, and therefore would likely have produced other material expressions of symbolism.  It therefore seems reasonable to expect that language users would have left some trace of symbolic artifacts such as body ornamentations (e.g. pierced and/or painted shells), carvings of figures, cave paintings, sophisticated hunting and trapping tools (e.g. boomerangs, bows, nets, spear throwers), and maybe even musical instruments.

And in fact, the archeological evidence does indeed point to a time when all these clear expressions of symbolic behavior suddenly emerged.  There’s just one problem.  That time was around thirty to forty thousand years ago in Europe.  Most certainly not 250,000 years ago, when Aiello and Dunbar believe that language was “essential.”  Here’s how Steven Mithen describes this “creative explosion”:

Art makes a dramatic appearance in the archaeological record.  For over 2.5 million years after the first stone tools appear, the closest we get to art are a few scratches on unshaped pieces of bone and stone.  It is possible that these scratches have symbolic significance – but this is highly unlikely.  They may not even be intentionally made.  And then, a mere 30,000 years ago … we find cave paintings in southwest France – paintings that are technically masterful and full of emotive power.[3]

Upper Paleolithic cave art: does it signify the emergence of modern language?

In recent years, as new archeological findings have been unearthed, the timing for what’s known as the “Upper Paleolithic revolution” has been pushed back to around forty to forty-five thousand years ago, but the shift remains as dramatic as ever.  We find the “first consistent presence of symbolic behavior, such as abstract and realistic art and body decoration (e.g., threaded shell beads, teeth, ivory, ostrich egg shells, ochre, and tattoo kits),” ritual artifacts and musical instruments.[4] It’s a veritable “crescendo of change.”[5] This revolution of symbols, which has been aptly named by scientist Jared Diamond the “Great Leap Forward,”[6] is so important that we’ll be reviewing it in more detail in the next chapter, but for now we need to focus on its implications for when language first emerged.

As you might expect, those who emphasize the symbolic nature of language are the strongest proponents of the “late and sudden” school of language emergence.  The most notable of these is the psychologist/archaeologist team Bill Noble and Iain Davidson, who boldly make their claim as follows:

The late emergence of language in the early part of the Upper Pleistocene accounts for the sharp break in the archaeological record after about 40,000 years ago.  This involved … world-wide changes in the technology of stone and especially bone tools, the first well-documented evidence for ritual and disposal of the dead, the emergence of regional variation in style, social differentiation, and the emergence of both fisher-gatherer-hunters and agriculturalists.  All these characteristics of modern human behavior can be attributed to the greater information flow, planning depth and conceptualization consequent upon the emergence of language.

Noble and Davidson don’t actually claim that language use began forty thousand years ago.  They point out that the first human colonization of Australia occurred about twenty thousand years earlier than that, and they believe this huge feat required the sophistication arising from language.  On account of this, they’re willing to push back their date of language emergence, concluding that “sometime between about 100,000 and 70,000 years before the present the behaviour emerged which has become identified as linguistic.”  Still, that’s a lot later than Aiello and Dunbar’s 250,000 years ago.

The disagreement is not just a matter of timing.  It’s also about the way in which language arose.  Noble and Davidson believe that, because of the symbolic nature of language, you can no more have a “half-language” than you can be half-pregnant.  “Our criterion for symbol-based communication,” they state, “is ‘all-or-none.’… As with the notion of something having, or not having, ‘meaning’, symbols are either present or absent, they cannot be halfway there.”  They are joined in this view by Fauconnier and Turner, the team that described the “double scope conceptual blending” characteristic of language. The appearance of language, they write, is “a discontinuity…, a singularity much like the rapid crystallization that occurs when a dust speck is dropped into a supersaturated solution.”[7] The logic is powerful.  Once a group of humans realizes that one symbol (i.e. a word) can relate to another symbol through syntax, then the sky’s the limit.  Any word can work.  All you need is the underlying set of neural connections to make the realization in the first place, a community that stumbles upon this miraculous power, and then it’s all over.  The symbols weave themselves into language, which then reinforces other symbolic networks such as art, religion and tool use.  “Language assisted social interaction, social interaction assisted the cultural development of language, and language assisted the elaboration of tool use… all intertwined.”[8]

Archaeologist Richard Klein suggests a genetic mutation may have caused the emergence of modern language

Another celebrated archaeologist, Richard Klein, points out the difference in the sheer complexity of life from the Middle Paleolithic era to the Upper Paleolithic revolution.  The artifacts of the Middle Paleolithic were “remarkably homogeneous and invariant over vast areas and long time spans.”  Their tools, camp sites and graves were all “remarkably simple.”  By contrast, Upper Paleolithic remains are far more complex, implying “ritual or ceremony.”  For Klein, the difference is so dramatic that he thinks it could be best explained by a “selectively advantageous genetic mutation” that was, “arguably… the most significant mutation in the human evolutionary series.”  What kind of mutation would this have been?  “It is especially tempting to conclude,” writes Klein, “that the change was in the neural capacity for language or for ‘symboling.'”  Another team of archaeologists gets even more specific, proposing that “a genetic mutation affected neural networks in the prefrontal cortex approximately 60,000 to 130,000 years ago.”[9]

It’s a powerful argument.  And one that seems incompatible with the “gradual and early” camp.  How should we make sense of it?  Perhaps there’s another way to approach the problem.  At the beginning of the chapter, I mentioned another raging debate over language: whether or not there’s a “language instinct.”  Surely this would help resolve the issue?  After all, if there is a language instinct, then you’d think it would be embedded so deep in the human psyche that we must have been talking to each other at least a few hundred thousand years ago.  So let’s see what light this other debate sheds on the problem.


[1] Aiello & Dunbar, op. cit.

[2] McBrearty, S., and Brooks, A. S. (2000). “The revolution that wasn’t: a new interpretation of the origin of modern human behavior.” Journal of Human Evolution, 39(2000), 453-563.

[3] Quoted in Fauconnier & Turner, op. cit., 183.

[4] Powell, A., Shennan, S., and Thomas, M. G. (2009). “Late Pleistocene Demography and the Appearance of Modern Human Behavior.” Science, 324(5 June 2009), 1298-1301.

[5] Hauser, M. D. (2009). “The possibility of impossible cultures.” Nature, 460(9 July 2009), 190-196.

[6] Diamond, J. (1993). The Third Chimpanzee: The Evolution and Future of the Human Animal, New York: Harper Perennial.

[7] Fauconnier, G., and Turner, M. (2002). The Way We Think: Conceptual Blending and the Mind’s Hidden Complexities, New York: Basic Books, 183.

[8] Ibid.

[9] Coolidge, F. L., and Wynn, T. (2005). “Working Memory, its Executive Functions, and the Emergence of Modern Thinking.” Cambridge Archaeological Journal, 15(1), 5-26.

September 20, 2010

From grooming to gossip

Posted in Language and Myth tagged , at 9:53 pm by Jeremy

This section of my chapter on language looks at the social networking aspects of the evolution of language.  In a way, the development of language happened a lot like the recent growth of the internet.  Here’s the section, from the working draft of my book, Finding the Li: Towards a Democracy of Consciousness.


From grooming to gossip

Imagine you’re standing in a cafeteria line.  You hear multiple conversations around you: “… I heard that she bought it in…”, “can you believe what Joe did…”, “how much did that cost you…”, “so I said to him…”.  Random, meaningless gossip.  But don’t be so quick to dismiss it.  What you’re hearing may be the very foundation of human language and, as such, the key to our entire human civilization.

Was grooming the precursor of language?

This is the remarkable and influential hypothesis of anthropologist team Leslie Aiello and Robin Dunbar.  It begins with the well-recognized fact that chimpanzees and other primates use the time spent grooming each other as an important mode of social interaction, through which they establish and maintain cliques and social hierarchies.  You may recall, from the previous chapter, the “social brain hypothesis” which is based partially on the correlation noticed between primates living in larger groups and the size of their neocortex.  Aiello and Dunbar ingeniously calculated how much time different species would need to spend grooming for their social group to remain cohesive.  Larger groups meant significantly more time spent grooming, with some populations spending “up to 20% of their day in social grooming.”  When they then calculated the group sizes that early humans probably lived in, they realized that they would have had “to spend 30-45% of daytime in social grooming in order to maintain the cohesion of the groups.”  As they point out, this was probably an unsustainable amount of time.  Gradually, in their view, the mimetic forms of communication discussed in the previous chapter would have grown more significant, offering a more efficient form of social interaction than grooming, until finally developing into language.[1]
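The logic of Aiello and Dunbar’s calculation can be sketched in a few lines of code.  A caveat: the regression coefficients below are approximations of Dunbar’s published equations, quoted here purely for illustration, and the human neocortex ratio is likewise an approximate figure – treat this as a toy model of the argument, not their actual analysis.

```python
import math

def predicted_group_size(neocortex_ratio: float) -> float:
    """Dunbar (1992), approximately: log10(N) = 0.093 + 3.389 * log10(CR)."""
    return 10 ** (0.093 + 3.389 * math.log10(neocortex_ratio))

def grooming_time_percent(group_size: float) -> float:
    """Grooming time scales roughly linearly with group size (illustrative slope)."""
    return 0.287 * group_size  # percent of the waking day

human_cr = 4.1  # approximate human neocortex ratio
n = predicted_group_size(human_cr)  # roughly 150 -- "Dunbar's number"
t = grooming_time_percent(n)        # roughly 40% of the day

print(f"predicted group size: {n:.0f}")
print(f"grooming time needed: {t:.0f}% of the day")
```

Run with these illustrative numbers, the model lands in the same range as Aiello and Dunbar’s conclusion: a predicted human group size of around 150, demanding an unsustainable 40-plus percent of the day spent grooming – which is exactly the gap that language is proposed to fill.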

Researchers have also suggested that the magical weave of language – its syntax – may have arisen from the complexity of social interactions.  “In fact,” they say, “the bulk of our grammatical machinery enables us to engage in the kinds of social interaction on which the efficient spread of these tasks would have depended. We can combine sentences about who did what to whom, who is going to do what to whom, and so on, in a fast, fluent and largely unconscious way. This supports the notion that language evolved in a highly social, potentially cooperative context.”[2]

Language is an interconnected network - just like the internet

Until now, we’ve been looking at language from the point of view of how an individual’s brain understands it and uses it.  But the crucial importance of the social aspect of language suggests that we also need to view it from the perspective of a network.  Many theorists, including Merlin Donald, see this perspective as all-important, placing “the origin of language in cognitive communities, in the interconnected and distributed activity of many brains.”[3] Words like “interconnected” and “distributed” bring to mind the recent phenomenon of the rise of the internet, and this is no accident.  In many ways, the dynamics of language evolution offer an ancient parallel to the explosive growth of the internet.  One person could no more come up with language than one computer could create the internet.  In each case, the individual network node – the human brain or the individual computer – needed to achieve enough processing power to participate in a meaningful network, but once that network got going, it became far more important as a driver of change than any individual node.

Another interesting parallel between language and internet evolution is that, in both cases, their growth was self-organized, an emergent network arising from a great many unique interactions without a pre-ordained design.  Linguist Nicholas Evans points out that “language structure is seen to emerge as an unintentional product of intentional communicative acts, such as the wish to communicate or to sound (or not sound) like other speakers.”[4] Donald notes that, through these group dynamics, the complexity of language can become far greater than any single brain could ever design.  “Highly complex patterns,” he writes, “can emerge on the level of mass behavior as the result of interactions between very simple nervous systems… Language would only have to emerge at the group level, reflecting the complexity of a communicative environment.  Brains need to adapt to such an environment only as parts of a distributed web.  They do not need to generate, or internalize, the entire system as such.”[5]

Donald compares this dynamic to how an ant colony can demonstrate far more intelligence than any individual ant.  We’ll explore this kind of self-organized intelligence in more detail in the second section of this book, but at this point, the time has come to consider what these perspectives might bring to the unresolved question of when language actually arose in our history.


[1] Aiello, L. C., and Dunbar, R. I. M. (1993). “Neocortex Size, Group Size, and the Evolution of Language.” Current Anthropology, 34(2), 184-193.

[2] Szathmary, E., and Szamado, S. (2008), op. cit.

[3] Donald, op. cit., 253-4.

[4] Evans, N. (2003). “Context, Culture, and Structuration in the Languages of Australia.” Annual Review of Anthropology, 32(2003), 13-40.

[5] Donald, op. cit., 284.

August 30, 2010

The neuroanatomy of language

Posted in Language and Myth tagged , , at 11:24 pm by Jeremy

What parts of the brain are responsible for language?  Most people up to speed on the subject would argue for Broca’s area and Wernicke’s area.  But it’s really the prefrontal cortex and its symbolizing capability that underpin our capacity for language.  Here’s a section of my book draft, Finding the Li: Towards a Democracy of Consciousness, that explains in more detail.

[Go to previous section]

The neuroanatomy of language

Considering the crucial importance of the pfc in enabling symbolic thought, it’s surprising that, until recently, it was relatively ignored as a major anatomical component of our capacity for language.  Traditionally, when researchers studied the anatomical evolution of language, they focused attention not just on the brain’s capacity but also on our descended larynx, which was thought to be a unique feature of the human vocal tract.  However, recent studies have shown that a number of other species – dogs, for example, when they bark – lower their larynx during vocalization, and some mammals even have a permanently descended larynx.  An even more powerful argument against the descended larynx as a prerequisite of language is that infants born deaf can learn American Sign Language with as much speed and fluency as hearing children learn spoken language.  There seems little doubt that the human larynx co-evolved with our language capacity to enable the fine, subtle distinctions of human speech sounds, but it doesn’t seem to have been required for language development.[1] In the words of Merlin Donald, “it is the brain, not the vocal cords, that matters most.”[2]

Broca discovered a crucial area relating to language in the 19th century

Even within the brain itself, the pfc hasn’t had much press in relation to language.  In the nineteenth century, two European physicians, Paul Broca and Carl Wernicke, focused attention on two different regions in the left hemisphere of the cerebral cortex – now named, appropriately enough, Broca’s area and Wernicke’s area – as the parts of the brain that control language.  They made their discoveries primarily through observing patients who had suffered physical damage to their brains in these regions and had lost their ability to speak normally (a condition known as aphasia).  For over a hundred years, it has been generally accepted that these two areas are the “language centers” of the brain.[3] Equally importantly, both of these areas were noticed to be on the left side of the brain, and in recent decades neuroanatomical research has shown that the left hemisphere is generally the one most used for sequential processing, for creating “a narrative and explanation for our actions,” for acting as our “interpreter.”[4]

However, although Broca’s and Wernicke’s areas have long been viewed as unique to humans, recent research has shown them also to be active in other primates.  In one study, for example, the brains of three chimpanzees were scanned as they gestured and called to a person, requesting food that was out of their reach.  As they did so, the chimps showed activation in the brain region that corresponds to Broca’s area in humans.[5] Terrence Deacon believes that, rather than view these areas as “language centers” controlling our ability to speak, we should think of language as using a network of different processes in the brain.  Broca’s area is adjacent to the part of the brain that controls our mouth, tongue and larynx; and Wernicke’s area is adjacent to our auditory cortex.  Therefore, these areas likely evolved as key nodes in the language network of the brain, which would explain the aphasia resulting from damage to them.  “Broca’s and Wernicke’s areas,” Deacon explains, “represent what might be visualized as bottlenecks for information flow during language processing; weak links in a chain of processes.”[6] Neuroscientist Jean-Pierre Changeux agrees, arguing that “efficient communication of contextualized knowledge involves the concerted activity of many more cortical areas than the ‘language areas’ identified by Broca and Wernicke.”[7]

Deacon also warns against reading too much into left hemisphere specialization, known as lateralization.  He sees lateralization as “probably a consequence and not a cause or even precondition for language evolution,” pointing out that several other mammals, including other primates, also show lateralization, and that even in humans, nearly 10 percent of people are “not left-lateralized in this way.”  Lateralization, in his view, “is more an adaptation of the brain to language than an adaptation of the brain for language.”[8]

So, if it’s not the larynx, not Broca’s and Wernicke’s areas, and not lateralization, is there anything about the human anatomy that makes it uniquely capable of creating language?  No prizes for guessing that the answer may be the pfc.  As Deacon puts it, “two of the most central features of the human language adaptation” are “the ability to speak and the ability to learn symbolic associations.”[9] We’ve already noted that skilled vocalizations are a helpful, but not a necessary, part of our language capability.  So that leaves “the symbol-learning problem,” which “can be traced to the expansion of the prefrontal cortical region, and the preeminence of its projections in competition for synapses throughout the brain.”[10] Changeux once again agrees, noting that “propositions and structured speech can be seen as evolutionary phenomena accompanying the expansion of the prefrontal cortex,”[11] as does celebrated neuroscientist Joaquin Fuster who writes that “given the role of prefrontal networks in cognitive functions, it is reasonable to infer that the development of those networks underlies the development of highly integrative cognitive functions, such as language.”[12]

If the pfc was, in fact, the central driver of the emergence of language, what light (if any) does that shed on those raging debates about when and at what rate language evolved, and whether there is something that can be called a “language instinct”?  In order to answer that, we need to understand a little more about the social context in which language emerged.


[1] For a full review of this issue, see Fitch, W. T. (2005). “The evolution of language: a comparative review.” Biology and Philosophy, 20, 193-230.

[2] Donald, M. (1991). Origins of the Modern Mind: Three Stages in the Evolution of Culture and Cognition, Cambridge, Mass.: Harvard University Press, 39.

[3] See Donald op. cit., 45-94, for a full discussion of the history of anatomical theories of human language.

[4] Gazzaniga, M. S. (2009). “Humans: the party animal.” Dædalus(Summer 2009), 21-34.

[5] Taglialatela, J. P., Russell, J. L., Schaeffer, J. A., and Hopkins, W. D. (2008). “Communicative Signaling Activates ‘Broca’s’ Homolog in Chimpanzees.” Current Biology, 18, 343-348.

[6] Deacon, op. cit., 288.

[7] Changeux, J.-P. (2002). The Physiology of Truth: Neuroscience and Human Knowledge, M. B. DeBevoise, translator, Cambridge, Mass.: Harvard University Press, 123.

[8] Deacon, op. cit., 310, 315.  Italics in original.

[9] Ibid., 220.

[10] Ibid.

[11] Changeux, op. cit., 123-4.

[12] Fuster, J. M. (2002). “Frontal lobe and cognitive development.” Journal of Neurocytology, 31(December 2002), 373-385.

August 23, 2010

Language: weaving a net of symbols

Posted in Language and Myth tagged , at 10:46 pm by Jeremy

Here’s the first section of Chapter 3 of my book draft, Finding the Li: Towards a Democracy of Consciousness.  This chapter’s about the evolution of language.  This first section delves into what’s special about language, contrasting it to the calls of vervet monkeys described by Seyfarth & Cheney.  It parses a typical sentence to highlight linguistic features such as “double-scope conceptual blending,” displacement, counterfactuals, and the “magical weave” of syntax.

As always, constructive comments are warmly welcomed.

Weaving a net of symbols.

Given that it’s something every one of us uses every day of our lives, and which has been studied for millennia, it’s amazing how much the experts still disagree about language.  For example, consider the question of when language first emerged.  Some researchers argue for a long, slow evolution of language, beginning in the time of our hominid ancestors several million years ago, and gradually developing into what we now think of as modern language.  Other experts argue for a much later and more sudden rise of language, perhaps as recently as 40,000 years ago.  There’s even more raging disagreement about the relationship of language and our brains.  Some famous theorists have proposed that we have a “language instinct,” an innate set of neural pathways that have evolved to comprehend the unique attributes of language such as syntax and grammar.  Other researchers argue back that this is impossible, and that what’s innate in our brains is something more fundamental than language itself.

One thing nobody seems to disagree about is the central importance of language to our human experience.  “More than any other attribute,” writes one team of biologists, “language is likely to have played a key role in driving genetic and cultural human evolution.”[1] When you consider your daily life, your interactions with your family, your work, even the way you think about things, you quickly realize that language is necessary for virtually everything.  In the words of one linguist, “everything you do that makes you human, each one of the countless things you can do that other species can’t, depends crucially on language.  Language is what makes us human.  Maybe it’s the only thing that makes us human.”[2]

As we’ll see, language is equally important to the rise of the pfc’s power in human consciousness.  We’ll explore in this chapter how language first gave the pfc the capability to expand its purview beyond its original biological function.  In pursuing this exploration, we’ll find that understanding language in terms of the pfc may help us to untangle some of those debates about language that continue to galvanize the experts, and as we do so, to uncover some insights into the very nature of how we think.

What’s special about language?

Vervet monkeys can call out different threats... but it's not language

First, though, we need to get a handle on what language really is and what’s so special about it.  Perhaps a good place to start is what language isn’t.  Back in 1980, a team of researchers spent over a year in the Amboseli National Park in Kenya, watching groups of vervet monkeys interact, and recording their vocalizations.  What they found made waves in the field of animal communication.  The monkeys have three important natural predators: leopards, eagles and pythons, each of which has a different style of attack: pouncing on them, striking from the sky, or striking from the ground.  The researchers discovered that the monkeys had developed completely different vocalizations to warn their group of each predator: short tonal calls for leopards, low-pitched staccato grunts for eagles and high-pitched “chutters” for snakes.  When the monkeys heard the leopard call, they’d climb up in the trees; an eagle call caused them to look up or run into dense bush; and a snake call had them looking down at the ground around them.  The researchers could induce the different behaviors in the monkeys by playing tape recordings of each call.  “By giving acoustically distinct alarms to different predators,” they explained, “vervet monkeys effectively categorized other species.” These fascinating findings showed that vervet monkeys were capable of what was described as “perceptual categorization… of rudimentary semantic signals.”[3] It certainly showed how smart vervet monkeys are.  But it wasn’t language.

A fundamental characteristic of language is that, in the words of researchers Noble and Davidson, it involves the “symbolic use of communicative signs”.[4] Anthropologist/neuroscientist Deacon agrees with this, suggesting that “when we strip away the complexity, only one significant difference between language and nonlanguage communication remains: the common, everyday miracle of word meaning and reference … which can be termed symbolic reference.”[5] But, an alert reader might ask at this point, wasn’t that what the vervet monkeys were doing?  If we consider the definition of “symbol” from the previous chapter, as something that has a purely arbitrary relationship to what it signifies, then the vervet calls seem to meet that definition.  It’s only because the other vervet monkeys understand the meaning of the grunts or chutters that they know whether to look up or look down.  That may be true, but there’s another aspect of language that sets it apart from the vervet calls: syntax.

“Animal communication is typically non-syntactic, which means that signals refer to whole situations,” explains a team of language researchers.  Human language, on the other hand, “is syntactic, and signals consist of discrete components that have their own meaning… The vast expressive power of human language would be impossible without syntax, and the transition from non-syntactic to syntactic communication was an essential step in the evolution of human language.”[6] So, when a vervet monkey gives a low-pitched grunt, he’s not saying the word “eagle.”  He’s saying, in one grunt: “There’s an eagle coming, and we’d all better head for the bushes.”  If he grunted twice, that wouldn’t mean “two eagles.”  And if he gave out a grunt followed by a chutter, that wouldn’t mean “an eagle just attacked a snake.”  The vervet monkeys can’t get out of the context of their specific situation.  They can’t use syntax to make “infinite use of finite means.”[7]
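The combinatorial payoff of syntax can be made concrete with a toy sketch (the vocabulary and the code are invented for illustration, not drawn from the vervet research itself).  A holistic system pairs each signal with one whole situation; a syntactic system multiplies meanings out of reusable parts:

```python
# Holistic signaling: each call maps to one entire situation.
vervet_calls = {
    "tonal call": "leopard! climb a tree",
    "staccato grunt": "eagle! head for the bushes",
    "chutter": "snake! scan the ground",
}
# 3 signals -> exactly 3 expressible situations, and no way to combine them.

# Syntactic signaling: discrete components with their own meanings combine.
subjects = ["the eagle", "the snake", "the leopard"]
verbs = ["attacked", "watched", "avoided"]
objects = ["the eagle", "the snake", "the leopard"]
sentences = [f"{s} {v} {o}" for s in subjects for v in verbs for o in objects]

print(len(vervet_calls), "holistic meanings vs", len(sentences), "syntactic ones")
# 3 holistic meanings vs 27 syntactic ones
```

Nine words yield twenty-seven sentences here, and once sentences can be embedded inside other sentences the set becomes unbounded – which is the “infinite use of finite means” that the vervets’ three calls can never reach.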

To fully understand the power of language, consider the following sentence:

You remember that guy from New York we met at the cocktail party the other day, who told us that if the Fed doesn’t ease the money supply, stocks would fall?

It seems like a simple sentence, but there’s a lot going on under the surface.  First, let’s begin with the words “cocktail party.”  A cocktail refers to a mixed drink.  A party refers to a group of people getting together.  But we all know that “cocktail party” refers to a specific type of party.  It wasn’t necessary for anyone to actually be drinking a cocktail to make it a “cocktail party.”  They might have been serving wine and champagne, but we wouldn’t call it a “wine and champagne party.”  This crucial element of language takes two completely separate aspects of reality – a mixed drink and a social gathering – and blends them together to create a brand new concept. Cognitive scientists Fauconnier and Turner have aptly called this “conceptual blending” and consider it to be “one (and perhaps the) mental operation whose evolution was crucial for language.”[8]

Stock prices don't really fall the way people do

But the complexity really gets going when we come to phrases like “ease the money supply” and “stocks would fall.”  Here, we meet one of the most ubiquitous aspects of modern language, which is the use of tangible metaphors to convey abstract meaning.  We’re so comfortable with these metaphors in our daily language that we don’t even consider them as such, but ponder for a moment what it means to “ease the money supply.”  There’s an underlying metaphor of some kind of reservoir of liquid, perhaps water, which would normally come flowing out to people.  But someone has their hands on a lever of some sort, which keeps the supply controlled.  Now, this person – the Fed – wants everyone to have a little more of the liquid, so they ease up on the lever, allowing some more to flow out.  Similarly, stocks don’t really fall.  People, animals or things might fall, off a table or out of a tree.  But of course when something falls, it goes from a high position to a lower position.  So, we naturally understand that a falling stock is one whose price is moving from higher to lower.  These metaphors are examples of what Fauconnier and Turner see as an advanced form of conceptual blending, which they term “double-scope conceptual blending.”  It’s called “double scope” because it integrates “two or more conceptual arrays… which typically conflict in radical ways on vital conceptual relations” – such as in this case stock prices and falling things – into a “novel conceptual array” which “develops emergent structure not found in either of the inputs.”[9]

There’s still more amazing complexity to that simple sentence.  Notice that it’s referring to someone we met “the other day.”  He’s not there talking to us now.  It all happened somewhere else and in the past, but through language we can bring the past back to the present in a matter of seconds, and we can whisk people or things from anywhere in the universe to be present in our minds with just a few words.  This near-magical power of language is known as displacement, “the ability to make reference to absent entities.”[10]

The magic of language goes even further than displacement.  Consider that we’re being asked to imagine a scenario where stocks would fall if the Fed doesn’t ease the money supply.  This is something that hasn’t actually happened.  It may never happen.  But we can still talk about the scenario with as much ease as if it were happening right now.  This ability of language to create hypothetical situations out of thin air is known as a “counterfactual,” a reference to something that’s not a concrete fact but can still exist in our minds and get communicated through language.

There’s already a lot to be impressed about in that one sentence, with its double-scope conceptual blending, its displacements and its counterfactuals.  But the coup de grâce of this sentence and most other sentences in every language of the world is its syntax.  If language is like a net of symbols, we can think of syntax as a magical weave that can link each section of the net to any other section at a moment’s notice.  Look at how many miraculous conceptual leaps we make while still holding a meaningful narrative together in our minds.  (1) “You remember” (asked in a questioning tone): we’re asked to access our memory;  (2) “that guy”: focus on the category of male humans;  (3) “from New York”: narrow down that category based on where the person is from; (4) “we met at the cocktail party the other day”: create a mental image of the party;  (5) “who told us”: shift from a mere recall of the person to a recollection of the conversation;  (6) “if the Fed doesn’t ease the money supply…”: abrupt transition from an image of the cocktail party to a hypothetical financial scenario.

Recursion: the magical weave of language

This magical weave that we pull off incessantly every day without even being aware of it is known as “recursion,” and is viewed as the most powerful and characteristic feature of modern language, accomplished by the proper placement and linkage of multiple concepts through the syntax of the sentence.  Humans alone took “the power of recursion to create an open-ended and limitless system of communication,” writes a team of linguistic experts, who propose that this power was perhaps “a consequence (by-product)” of some kind of “neural reorganization” that arose from evolutionary pressure on humans, causing previously separate modular aspects of the brain to connect together and create new meaning.[11]  As we already know from the previous chapter, there’s one part of the brain that’s uniquely connected to permit this cognitive fluidity that underlies our human capabilities: the pfc.
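Recursion’s “open-ended and limitless” character can be shown with a toy grammar (the rules and vocabulary are invented for this sketch, loosely echoing the cocktail-party sentence).  A handful of finite rewrite rules generates unboundedly many sentences, because a noun phrase can contain a relative clause that contains another noun phrase, and so on:

```python
import random

# A toy recursive grammar: finitely many rules, unboundedly many sentences.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["that", "N"], ["that", "N", "RC"]],   # the RC option makes NP recursive
    "RC": [["who", "VP"]],                        # relative clause -> VP -> NP -> ...
    "VP": [["told", "NP"], ["remembered", "NP"]],
    "N":  [["guy"], ["broker"], ["economist"]],
}

def expand(symbol: str, depth: int = 0, max_depth: int = 4) -> list:
    """Rewrite a symbol until only words remain, bounding the recursion depth."""
    if symbol not in GRAMMAR:
        return [symbol]  # a terminal word
    options = GRAMMAR[symbol]
    if depth >= max_depth:
        options = [min(options, key=len)]  # force the shortest (non-recursive) rule
    rule = random.choice(options)
    return [word for part in rule for word in expand(part, depth + 1, max_depth)]

print(" ".join(expand("S")))
# e.g. "that broker who told that guy remembered that economist"
```

Raising `max_depth` lets clauses nest ever deeper without adding a single new rule or word – a miniature version of the “infinite use of finite means” that syntax gives us.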

[Next section]

[1] Szathmary, E., and Szamado, S. (2008). “Language: a social history of words.” Nature, 456(6 November, 2008), 40-41.

[2] Bickerton, D. (2009). Adam’s Tongue: How Humans Made Language, How Language Made Humans, New York: Hill and Wang, 4.

[3] Seyfarth, R. M., Cheney, D. L., and Marler, P. (1980). “Monkey Responses to Three Different Alarm Calls: Evidence of Predator Classification and Semantic Communication.” Science, 210(November 14, 1980), 801-803.

[4] Noble, W., and Davidson, I. (1991). “The Evolutionary Emergence of Modern Human Behaviour: Language and its Archaeology.” Man, 26(2), 223-253.

[5] Deacon, T. W. (1997). The Symbolic Species: The Co-evolution of Language and the Brain, New York: Norton, 43 – italics in the original.

[6] Nowak, M. A., Plotkin, J. B., and Jansen, V. A. A. (2000). “The evolution of syntactic communication.” Nature, 404(30 March 2000), 495-498.

[7] Nowak, ibid.

[8] Fauconnier, G., and Turner, M. (2008). “The origin of language as a product of the evolution of double-scope blending.” Behavioral and Brain Sciences, 31(5 (2008)), 520-521.

[9] Fauconnier & Turner, ibid.

[10] Liszkowski, U., Schafer, M., Carpenter, M., and Tomasello, M. (2009). “Prelinguistic Infants, but Not Chimpanzees, Communicate About Absent Entities.” Psychological Science, 20(5:17 April 2009), 654-660.

[11] Hauser, M. D., Chomsky, N., and Fitch, W. T. (2002). “The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?” Science, 298(22 November 2002), 1569-1579. Without denying the immense importance of recursion, I would note that their view that “no other animal” possesses it is still open to debate.  There are possibilities of some kind of recursion in birdsong and in elephant, dolphin and whale communication, which has been described by Marler (1998) as “phonological syntax” in “Animal communication and human language” in: The Origin and Diversification of Language, eds., N. G. Jablonski & L. C. Aiello. California Academy of Sciences.

August 18, 2010

So what really makes us human?

Posted in Language and Myth tagged , , , , , , at 10:39 pm by Jeremy

Here’s a pdf file of Chapter 2 of my book draft, Finding the Li: Towards a Democracy of Consciousness.  This chapter’s called “So What Really Makes Us Human?”  It traces human evolution over several million years, reviewing Merlin Donald’s idea of mimetic culture and the crucial breakthrough of “theory of mind.”  It examines the “social brain hypothesis” and explores the current thinking on “altruistic punishment” as a key to our unique human capability for empathy and group values.  Finally, it looks at how social intelligence metamorphosed into cognitive fluidity, and how the pfc’s newly evolving connective abilities enabled humans to discover the awesome power of symbols.

Open the pdf file of Chapter 2: “So What Really Makes Us Human?”

As always, constructive comments and critiques from readers of my blog are warmly welcomed.

August 17, 2010

What the pfc did for early humans

Posted in Language and Myth at 10:49 pm by Jeremy

This section of my book draft, Finding the Li: Towards a Democracy of Consciousness, looks at how the unique powers of the prefrontal cortex gave early humans the capability to construct tools, exercise self-control, and begin to control aspects of the environment around them.  But most of all it gave them the power of symbolic thought, which has become the basis of all human achievement since then.   As always, constructive comments are welcomed.


What the pfc did for early humans

Mithen’s “cognitive fluidity” and Coolidge and Wynn’s “enhanced working memory” are really two different ways of describing the same basic dynamic of the pfc connecting up diverse aspects of the mind’s intelligence to create coherent meaning that wasn’t there before.  But what specifically did this enhanced capability do for our early human ancestors?

To begin with, it enabled us to make tools.  It used to be conventional wisdom that humans are the only tool-makers, so much so that the earliest known species of the genus Homo, which lived around two million years ago, is named Homo habilis, or "handy man."  Then, in the 1960s, Jane Goodall discovered that chimpanzees also used primitive tools, such as placing stalks of grass into termite holes.  When Goodall's boss, Louis Leakey, heard this, he famously replied "Now we must redefine 'tool', redefine 'man', or accept chimpanzees as humans!"[1] Well, as we've seen in the preceding pages, there's been plenty of work done in redefining "man" since then, but none of this takes away from the fact that humans clearly use tools vastly more effectively than chimpanzees or any other mammals.

Oldowan tools: better than what any chimpanzee can do.

To be fair to our old "handy man" Homo habilis, even the primitive stone tools they left behind, called Oldowan artifacts after the Olduvai Gorge in East Africa where they were first found, represented a major advance in working memory over our chimpanzee cousins.  Steve Mithen has pointed out that some Oldowan tools were clearly manufactured to make other tools, such as "the production of a stone flake to sharpen a stick."[2] Making a tool to make another tool is unknown in chimpanzees, and requires determined planning: holding the idea of the second tool in your working memory while you're preparing the first.  Oldowan artifacts remained the same for a million years, so even though they were an advance over chimp technology, there was none of the innovation that we associate with modern pfc functioning.  The next generation of tools, called the Acheulian industry, required more skillful stone knapping and showed attractive bilateral symmetry, but they also remained the same for another million years or so.[3] It was around three hundred thousand years ago, shortly before anatomically modern humans emerged, that stone knapping really took off, with stone-tipped spears and scrapers with handles representing "an order-of-magnitude increase in technological complexity."[4]

Acheulian tools: improved on Oldowan but stayed the same for a million years

None of these tools – even the more primitive Oldowan and Acheulian – can be made by chimpanzees, and they could never have existed without the power of abstraction provided by the pfc.[5]*   Planning for this kind of tool-making required a concept of the future, when the hard work put into making the tool would turn out to be worthwhile.  As psychologists Liberman and Trope have pointed out, transcending the present to mentally traverse distances in time and in space “is made possible by the human capacity for abstract processing of information.”  Making function-specific tools, they note, “required constructing hypothetical alternative scenarios of future events,” which could only be done through activating a “brain network involving the prefrontal cortex.”[6]

Another fundamental human characteristic arising from this abstraction of past and future is the power of self-control.  As one psychologist observes, "self-control is nearly impossible if there is not some means by which the individual is capable of perceiving and valuing future over immediate outcomes."[7] Anyone who has watched children grow up and gradually become more adept at valuing delayed rewards over immediate gratification will not be surprised that the pfc doesn't fully develop in a modern human until her early twenties.

This abstraction of the future gave humans not only the power to control themselves but also to control things around them.  A crucial pfc-derived human characteristic is the notion of will, the conscious intention to perform a series of activities, sometimes over a number of years, to achieve a goal.  Given the fundamental nature of this capability, it’s not surprising that, as Tomasello points out, in many languages the word that denotes the future is also the word “for such things as volition or movement to a goal.”   In English, for example, the original notion of “I will it to happen” is embedded in the future tense in the form “It will happen.”[8]

This is already an impressive range of powerful competencies made available to early humans by the pfc.  But of all the powers granted to humans by the awesome connective faculties of the pfc, there seems little doubt that the most spectacular is the power to understand and communicate sets of meaningful symbols, known as symbolization.

The symbolic net of human experience

Ernst Cassirer: first to define humans as "animal symbolicum"

A full generation before Louis Leakey realized it was time to “redefine man,” a German philosopher named Ernst Cassirer who had fled the Nazis was already doing so, writing in 1944 that “instead of defining man as an animal rationale we should define him as an animal symbolicum.”[9] He wasn’t alone in this view.  A leading American anthropologist, Leslie White, also believed that the “capacity to use symbols is a defining quality of humankind.”[10] Because of our use of symbols, Cassirer wrote, “compared with the other animals man lives not merely in a broader reality; he lives, so to speak, in a new dimension of reality.”[11]

Why would the use of symbols take us to a different dimension of reality?  First, it's important to understand what exactly is meant by the word "symbol."  In the terminology adopted by cognitive anthropologists, we need to differentiate between an icon, an index, and a symbol.  A simple example may help us to understand the differences.  Suppose it's time for you to feed your pet dog.  You open your pantry and look at the cans of pet food available.  Each can has a picture on it of the food that's inside.  That picture is known as an icon, meaning it's a "representative depiction" of the real thing.  Now, you open the can and your dog comes running, because he smells the food.  The smell is an index of the food, meaning it's "causally linked" to what it signifies.  But now suppose that instead of giving your hungry dog the food, you wrote on a piece of paper "FOOD IN TEN MINUTES" and put it in your dog's bowl.  That writing is a symbol, meaning that it has a purely arbitrary relationship to what it signifies, one that can only be understood by someone who shares the same code.  Clearly, your dog doesn't understand symbols, and now he's pawing at the pantry door trying to get to his food.[12]*

A hungry dog doesn't respond to a note saying "food in ten minutes"
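The dog-bowl example can be sketched in code.  This is my own toy illustration, not something from the chapter or from Peirce: icons and indices carry their meaning directly, by resemblance or causal connection, while a symbol conveys nothing to an interpreter who lacks the shared code.

```python
# Toy model of Peirce's three sign types (my illustration, not Peirce's formalism).
icon = {"kind": "icon", "link": "resemblance", "form": "picture of the food"}
index = {"kind": "index", "link": "causal", "form": "smell of the food"}
symbol = {"kind": "symbol", "link": "arbitrary convention", "form": "FOOD IN TEN MINUTES"}

# The convention is deliberately arbitrary: nothing about the string
# "FOOD IN TEN MINUTES" resembles, or is caused by, food.
SHARED_CODE = {"FOOD IN TEN MINUTES": "food, arriving soon"}

def interpret(sign, code):
    """Return what the sign conveys to this particular interpreter."""
    if sign["link"] != "arbitrary convention":
        return sign["form"]        # icons and indices work without any convention
    return code.get(sign["form"])  # symbols require the shared code

print(interpret(symbol, SHARED_CODE))  # a human reader: 'food, arriving soon'
print(interpret(symbol, {}))           # the dog, with no code: None
print(interpret(index, {}))            # the smell works regardless of any code
```

The point of the sketch is that only the third sign type depends on a socially agreed mapping, which is why the dog, who shares no code, gets nothing from the note.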

To understand how symbols arose, and why they are so important, it helps to begin with the notion of working memory discussed earlier.   Terrence Deacon has suggested that symbolic thought is “a way of offloading redundant details from working memory, by recognizing a higher-order regularity in the mess of associations, a trick that can accomplish the same task without having to hold all the details in mind.”[13] Remember the image of working memory as a blackboard?  Now imagine a teacher asking twenty-five children to come up and write on the blackboard what they had to eat that morning before they came to school.  The blackboard would quickly fill up with words like cereals and eggs, pancakes and waffles.  Now, suppose that, once the blackboard’s filled up, the teacher erases it all and just writes on the blackboard the word “BREAKFAST”.  That one word, by common consent, symbolizes everything that had previously been written on the blackboard.  And now it’s freed up the rest of the blackboard for anything else.
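The blackboard example can be sketched computationally.  What follows is a minimal toy model of my own, not Deacon's; the seven-item capacity is just the classic rule of thumb, used purely for illustration.  Replacing the full list of details with one agreed symbol frees the rest of the board, while the details stay recoverable through the shared lexicon.

```python
# Toy sketch of "offloading redundant details from working memory" (my analogy).
CAPACITY = 7  # the classic "about seven items" limit, used here only for illustration

blackboard = ["cereal", "eggs", "pancakes", "waffles", "toast", "juice", "fruit"]
assert len(blackboard) == CAPACITY  # the board is completely full

# Erase the details and replace them with one symbol, agreed by common consent:
LEXICON = {"BREAKFAST": list(blackboard)}  # the shared convention
blackboard = ["BREAKFAST"]                 # one slot now stands for all seven items

print(len(blackboard))       # 1: six slots are freed for other work
print(LEXICON["BREAKFAST"])  # the details remain recoverable through the symbol
```

The higher-order regularity ("these are all things eaten before school") is named once, and the naming itself is what releases the capacity.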

That’s the powerful effect that the use of symbols has on human cognition.  But there’s another equally powerful aspect of writing that one word “BREAKFAST” on the blackboard.  Every schoolchild has her own experience of what she ate that morning, but by sharing in the symbol “BREAKFAST,” she can rise above the specifics of her own particular meal and understand that there’s something more abstract that is being communicated, referring to the meal all the kids had before they came to school regardless of what it was.  For this reason, symbols are an astonishingly powerful means of communicating, allowing people to  transcend their individual experiences and share them with others.  Symbolic communication can therefore be seen as naturally emerging from human minds evolving on the basis of social intelligence.  This has led one research team to define modern human behavior as “behavior that is mediated by socially constructed patterns of symbolic thinking, actions, and communication.”[14]

Once it got going, symbolic thought became so powerful that it pervaded every aspect of how we think about the world.  In Cassirer’s words:

Man cannot escape from his own achievement… No longer in a merely physical universe, man lives in a symbolic universe.  Language, myth, art, and religion are parts of this universe.  They are the varied threads which weave the symbolic net, the tangled web of human experience.  All human progress in thought and experience refines upon and strengthens this net.[15]

Because of our symbolic capabilities, Deacon adds, “we humans have access to a novel higher-order representation system that… provides a means of representing features of a world that no other creature experiences, the world of the abstract.” We live our lives not just in the physical world, “but also in a world of rules of conduct, beliefs about our histories, and hopes and fears about imagined futures.” [16]

For all the power of symbolic thought, there was one crucial ingredient it needed before it could so dramatically take over human cognition.  It needed a means by which each individual could agree on the code to be used in referencing what they meant.  It had to be a code which everyone could learn and that could be communicated very easily, taking into account the vast array of different things that could carry symbolic meaning.  In short, it needed language – that all-encompassing network of symbols that we’ll explore in the next chapter.

[1] Cited in McGrew, W. C. (2010). “Chimpanzee Technology.” Science, 328, 579-580.

[2] Mithen 1996, op. cit., 96.

[3] Proctor, R. N. (2003). “The Roots of Human Recency: Molecular Anthropology, the Refigured Acheulean, and the UNESCO Response to Auschwitz.” Current Anthropology, 44(2: April 2003), 213-239.

[4] Ambrose, S. H. (2001). “Paleolithic Technology and Human Evolution.” Science, 291(2 March 2001), 1748-1753.

[5] Mithen 1996, op. cit., p. 97 relates a failed attempt to get a famous bonobo named Kanzi, who was very advanced in linguistic skills, to make Oldowan-style stone tools.

[6] Liberman, N., and Trope, Y. (2008). “The Psychology of Transcending the Here and Now.” Science, 322(21 November 2008), 1201-1205.

[7] Barkley, op. cit.

[8] Tomasello, op. cit., p. 43.

[9] Cassirer, E. (1944). An Essay on Man, New Haven: Yale University Press, 26.

[10] Cited by Renfrew, C. (2007). Prehistory: The Making of the Human Mind, New York: Modern Library: Random House, 91.

[11] Cassirer, op. cit.

[12] The distinction, originally made by American philosopher Charles Sanders Peirce, is described in detail in Deacon, T. W. (1997). The Symbolic Species: The Co-evolution of Language and the Brain, New York: Norton; and is also referred to by Noble, W., and Davidson, I. (1996). Human Evolution, Language and Mind: A psychological and archaeological inquiry, New York: Cambridge University Press.  I am grateful to Noble & Davidson for the powerful image of writing words to substitute for food in the dog's bowl as an example of a symbol.

[13] Deacon, op. cit., p. 89.

[14] Henshilwood, C. S., and Marean, C. W. (2003). "The Origin of Modern Human Behavior: Critique of the Models and Their Test Implications." Current Anthropology, 44(5: December 2003), 627-651.

[15] Cassirer, op. cit.

[16] Deacon, op. cit., p. 423.

August 10, 2010

From social intelligence to cognitive fluidity

Posted in Language and Myth at 9:47 pm by Jeremy

This section of my book draft, Finding the Li: Towards a Democracy of Consciousness, examines how humans originally developed a social intelligence which evolved over time into what renowned archaeologist Steve Mithen has called "cognitive fluidity."  It ties in Mithen's view with the theory of Coolidge & Wynn that enhanced working memory is responsible for this unique aspect of human cognition.  As always, constructive comments are welcomed.


From social intelligence to cognitive fluidity

Whether our social intelligence has caused us to be fundamentally cooperative, competitive, or both, there’s one aspect of it that most researchers can agree on: it’s driven by the actions of the pfc.  And increasingly, it’s believed that most of the special capabilities of the pfc emerged from its evolution as a tool of social intelligence.  Tomasello, among others, speculates that “the evolutionary adaptations aimed at the ability of human beings to coordinate their social behavior with one another” is what underlies “the ability of human beings to reflect on their own behavior and so to create systematic structures of explicit knowledge.”[1] Another researcher notes that “the neuropsychological functions that create the capacity for culture are very much akin to those capacities attributed to executive functioning—inhibition, self-awareness, self-regulation, imitation and vicarious learning.”[2]

In this view, many of our unique abilities that are mediated by the pfc – abstract thinking, rule-making, mental time travel into the past and the future – arose not because they were in themselves vital for human adaptation but as an accidental by-product of our social cognitive skills.  Surprisingly, this phenomenon has been found to be fairly common in evolution, and has been given the name "exaptation," meaning a characteristic that evolved for one purpose and was later co-opted for its current role.  A classic example of exaptation is bird feathers, which are thought to have originally evolved for regulation of body heat and only later became used as a means of flying.[3]

What is it, then, about the pfc that could take a set of social cognitive skills and transform them into an array of such varied and astonishing capabilities?  One answer to this question might be that the pfc is connected to virtually all other parts of the brain, and this gives it the unique capability to merge different inputs, such as vision and hearing, instinctual urges, emotions and memories, into one integrated story.  This has led one research team to speculate that the "outstanding intelligence of humans" may result not from "qualitative differences" compared with other primates, but from the pfc's combination of the same functions that may have developed separately in other species.[4] In fact, the human pfc's connectivity is dramatically greater than that of other primates.  The celebrated neuroscientist Jean-Pierre Changeux notes that "from chimpanzee to man one observes an increase of at least 70 percent in the possible connections among prefrontal workspace neurons – undeniably a change of the highest importance."[5]

At what point did the human brain stop acting like a Swiss army knife?

Archaeologist Steve Mithen has proposed an influential theory of human evolution on this basis.[6] Mithen begins with the premise that early hominids may have developed specialized, or “domain-specific” skills.  For example, they may have developed social intelligence (as discussed above), technical intelligence for tool-making, or increasing knowledge about the natural world, but they were unable to connect these intelligences together.  It’s helpful to imagine these domain-specific intelligences like the blades and tools in a Swiss army knife.  You can use each of them, but you’d be hard pressed to use them all together at the same time.[7] But, Mithen suggests, at some time in the development of the modern human mind, we developed what he calls “cognitive fluidity,” whereby we started combining these domain-specific intelligences into an integrated meta-intelligence.  He gives an example of Neanderthals who may have been socially intelligent and technically able to make clothes and jewellery, but only modern humans, in his view, made the evolutionary jump to combine these skills and make their artefacts in a particular way to “mediate those social relationships.”[8]*

Working memory acts like the blackboard of the mind

Another research team, Coolidge and Wynn, have focused their attention on a particular pfc capability, known as "working memory," which may have been the linchpin permitting this kind of cognitive fluidity in humans.[9] Working memory is the ability to consciously "hold something in your mind" for a short time.  For example, if someone tells you a phone number and you have to go across the room to write it down, you'll use your working memory to hold it in your mind until it's down on paper, at which point it's freed up for something else.  But working memory is far more than just "short-term memory."  Comparable to the random access memory ("RAM") of a computer, it's the process used by the mind to keep enough discrete items up and running so they can be joined together to arrive at a new understanding or a new plan.  It's been referred to as a "global workspace… onto which can be written those facts that are needed in a current mental program,"[10] or perhaps more concisely, "the blackboard of the mind."[11] Changeux notes that there is "a very clear difference between … higher primates and man with regard to the quantity of knowledge that they are capable of holding on working memory for purposes of evaluation and planning,"[12] and Coolidge and Wynn have gone on to argue that it's the enhanced working memory of humans that's the crucial differentiating factor for our uniqueness.
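The blackboard/RAM analogy can be made concrete with a small sketch, which is entirely my own illustration rather than Coolidge and Wynn's model (the phone number is made up): working memory as a fixed-capacity buffer in which new items displace the oldest once the capacity is reached.

```python
from collections import deque

# Toy model of working memory as a small fixed-capacity buffer (my illustration).
class WorkingMemory:
    def __init__(self, capacity):
        self.buffer = deque(maxlen=capacity)  # oldest item falls off when full

    def hold(self, item):
        self.buffer.append(item)

    def contents(self):
        return list(self.buffer)

# A buffer too small for the whole phone number keeps only the most recent digits:
wm = WorkingMemory(capacity=4)
for digit in "4155550199":
    wm.hold(digit)

print(wm.contents())  # only the last four digits survive: ['0', '1', '9', '9']
```

The sketch also hints at why the symbolic chunking described in the previous chapter matters so much: a larger or better-managed buffer, like the "enhanced working memory" Coolidge and Wynn propose, lets more discrete items be held and joined together at once.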


[1] Tomasello, op. cit. p. 197.

[2] Barkley, R. A. (2001). “The Executive Functions and Self-Regulation: An Evolutionary Neuropsychological Perspective.” Neuropsychology Review, 11(1), 1-29.

[3] Gould, S. J., and Vrba, E. S. (1982). “Exaptation – A Missing Term in the Science of Form.” Paleobiology, 8(1: Winter 1982), 4-15.

[4] Roth, G., and Dicke, U. (2005). “Evolution of the brain and intelligence.” Trends in Cognitive Sciences, 9(5: May 2005), 250-253.

[5] Changeux, J.-P. (2002). The Physiology of Truth: Neuroscience and Human Knowledge, M. B. DeBevoise, translator, Cambridge, Mass.: Harvard University Press, 108-9.

[6] Mithen, S. (1996). The Prehistory of the Mind, London: Thames & Hudson.

[7] The Swiss army knife metaphor is attributed to Leda Cosmides & John Tooby in Mithen, op. cit. 42.

[8] It should be noted that, although Mithen contrasts modern humans with Neanderthals, the cognitive difference between the two is a matter of great controversy.  See Zilhao, J. (2010). “Symbolic use of marine shells and mineral pigments by Iberian Neandertals.” PNAS, 107(3), 1023-1028, for an argument that the difference between the two was demographic/social rather than genetic/cognitive.  However, the Neanderthal issue is not crucial to Mithen’s underlying thesis.

[9] Coolidge, F. L., and Wynn, T. (2001). “Executive Functions of the Frontal Lobes and the Evolutionary Ascendancy of Homo Sapiens.” Cambridge Archaeological Journal, 11(2:2001), 255-60. See also: Coolidge, F. L., and Wynn, T. (2005). “Working Memory, its Executive Functions, and the Emergence of Modern Thinking.” Cambridge Archaeological Journal, 15(1), 5-26.

[10] Duncan, J. (2001). “An Adaptive Coding Model of Neural Function in Prefrontal Cortex.” Nature Reviews: Neuroscience, 2, 820-829.

[11] Patricia Goldman-Rakic quoted by Balter, M. (2010). “Did Working Memory Spark Creative Culture?” Science, 328, 160-163

[12] Changeux, op. cit.
