September 24, 2010

Language evolution: “gradual and early” or “sudden and recent”?

Posted in Language and Myth at 6:04 pm by Jeremy

Did language evolve early and gradually in human evolution, or was it a more recent development?  This is a major topic of debate among linguists, archaeologists and anthropologists, with significant implications for understanding how our minds work.  This section of my book, Finding the Li: Towards a Democracy of Consciousness, introduces this debate.

[PREVIOUS SECTION]

Language evolution: “gradual and early” or “sudden and recent”?

It seems, at first sight, fairly straightforward.  If language evolved socially as an increasingly sophisticated substitute for grooming, then it must have happened gradually, and a long time ago.  It’s therefore no surprise that Aiello and Dunbar, the grooming theorists, are also proponents of the “gradual and early” emergence of language, arguing that “the evolution of language involved a gradual and continuous transition from non-human primate communication systems,” beginning as far back as two million years ago.  They believe that language most likely “crossed the Rubicon” to its modern state about 300,000 years ago, shortly preceding the emergence of anatomically modern humans.  By 250,000 years ago (an era known as the Middle Paleolithic), they believe “groups would have become so large that language with a significant social information content would have been essential.”[1] They are certainly not alone in this view.  For example, another well-regarded team of archaeologists describes “a sense of continuity, rather than discontinuity, between human and nonhuman primate cognitive and communicative abilities… We infer that some form of language originated early in human evolution, and that language existed in a variety of forms throughout its long evolution.”[2]

So what’s the problem?  Well, it’s probably become clear by now that language is a network of symbols, connected together by the magical weave of syntax.  If that’s the case, then whoever could produce the symbolic expression of language must have been thinking in a symbolic way, and therefore would likely have produced other material expressions of symbolism.  It therefore seems reasonable to expect that language users would have left some trace of symbolic artifacts such as body ornamentations (e.g. pierced and/or painted shells), carvings of figures, cave paintings, sophisticated hunting and trapping tools (e.g. boomerangs, bows, nets, spear throwers), and maybe even musical instruments.

And in fact, the archaeological evidence does indeed point to a time when all these clear expressions of symbolic behavior suddenly emerged.  There’s just one problem.  That time was around thirty to forty thousand years ago in Europe.  Most certainly not 250,000 years ago, when Aiello and Dunbar believe that language was “essential.”  Here’s how Steve Mithen describes this “creative explosion”:

Art makes a dramatic appearance in the archaeological record.  For over 2.5 million years after the first stone tools appear, the closest we get to art are a few scratches on unshaped pieces of bone and stone.  It is possible that these scratches have symbolic significance – but this is highly unlikely.  They may not even be intentionally made.  And then, a mere 30,000 years ago … we find cave paintings in southwest France – paintings that are technically masterful and full of emotive power.[3]


Upper Paleolithic cave art: does it signify the emergence of modern language?

In recent years, as new archaeological findings have been unearthed, the timing for what’s known as the “Upper Paleolithic revolution” has been pushed back to around forty to forty-five thousand years ago, but the shift remains as dramatic as ever.  We find the “first consistent presence of symbolic behavior, such as abstract and realistic art and body decoration (e.g., threaded shell beads, teeth, ivory, ostrich egg shells, ochre, and tattoo kits),” ritual artifacts and musical instruments.[4] It’s a veritable “crescendo of change.”[5] This revolution of symbols, which scientist Jared Diamond has aptly named the “Great Leap Forward,”[6] is so important that we’ll be reviewing it in more detail in the next chapter, but for now we need to focus on its implications for when language first emerged.

As you might expect, those who emphasize the symbolic nature of language are the strongest proponents of the “late and sudden” school of language emergence.  The most notable of these is the psychologist/archaeologist team Bill Noble and Iain Davidson, who boldly make their claim as follows:

The late emergence of language in the early part of the Upper Pleistocene accounts for the sharp break in the archaeological record after about 40,000 years ago.  This involved … world-wide changes in the technology of stone and especially bone tools, the first well-documented evidence for ritual and disposal of the dead, the emergence of regional variation in style, social differentiation, and the emergence of both fisher-gatherer-hunters and agriculturalists.  All these characteristics of modern human behavior can be attributed to the greater information flow, planning depth and conceptualization consequent upon the emergence of language.

Noble and Davidson don’t actually claim that language use began forty thousand years ago.  They point out that the first human colonization of Australia occurred about twenty thousand years earlier than that, and they believe this huge feat required the sophistication arising from language.  On account of this, they’re willing to push back their date of language emergence, concluding that “sometime between about 100,000 and 70,000 years before the present the behaviour emerged which has become identified as linguistic.”  Still, that’s a lot later than Aiello and Dunbar’s 250,000 years ago.

The disagreement is not just a matter of timing.  It’s also about the way in which language arose.  Noble and Davidson believe that, because of the symbolic nature of language, you can no more have a “half-language” than you can be half-pregnant.  “Our criterion for symbol-based communication,” they state, “is ‘all-or-none.’… As with the notion of something having, or not having, ‘meaning’, symbols are either present or absent, they cannot be halfway there.”  They are joined in this view by Fauconnier and Turner, the team that described the “double scope conceptual blending” characteristic of language. The appearance of language, they write, is “a discontinuity…, a singularity much like the rapid crystallization that occurs when a dust speck is dropped into a supersaturated solution.”[7] The logic is powerful.  Once a group of humans realizes that one symbol (i.e. a word) can relate to another symbol through syntax, then the sky’s the limit.  Any word can work.  All you need is the underlying set of neural connections to make the realization in the first place, a community that stumbles upon this miraculous power, and then it’s all over.  The symbols weave themselves into language, which then reinforces other symbolic networks such as art, religion and tool use.  “Language assisted social interaction, social interaction assisted the cultural development of language, and language assisted the elaboration of tool use… all intertwined.”[8]

Archaeologist Richard Klein suggests a genetic mutation may have caused the emergence of modern language

Another celebrated archaeologist, Richard Klein, points out the difference in the sheer complexity of life from the Middle Paleolithic era to the Upper Paleolithic revolution.  The artifacts of the Middle Paleolithic were “remarkably homogeneous and invariant over vast areas and long time spans.”  Their tools, camp sites and graves were all “remarkably simple.”  By contrast, Upper Paleolithic remains are far more complex, implying “ritual or ceremony.”  For Klein, the difference is so dramatic that he thinks it could be best explained by a “selectively advantageous genetic mutation” that was, “arguably… the most significant mutation in the human evolutionary series.”  What kind of mutation would this have been?  “It is especially tempting to conclude,” writes Klein, “that the change was in the neural capacity for language or for ‘symboling.'”  Another team of archaeologists gets even more specific, proposing that “a genetic mutation affected neural networks in the prefrontal cortex approximately 60,000 to 130,000 years ago.”[9]

It’s a powerful argument.  And one that seems incompatible with the “gradual and early” camp.  How should we make sense of it?  Perhaps there’s another way to approach the problem.  At the beginning of the chapter, I mentioned another raging debate over language: whether or not there’s a “language instinct.”  Surely this would help resolve the issue?  After all, if there is a language instinct, then you’d think it would be embedded so deep in the human psyche that we must have been talking to each other at least a few hundred thousand years ago.  So let’s see what light this other debate sheds on the problem.

[NEXT SECTION]


[1] Aiello & Dunbar, op. cit.

[2] McBrearty, S., and Brooks, A. S. (2000). “The revolution that wasn’t: a new interpretation of the origin of modern human behavior.” Journal of Human Evolution, 39(2000), 453-563.

[3] Quoted in Fauconnier & Turner, op. cit., 183.

[4] Powell, A., Shennan, S., and Thomas, M. G. (2009). “Late Pleistocene Demography and the Appearance of Modern Human Behavior.” Science, 324(5 June 2009), 1298-1301.

[5] Hauser, M. D. (2009). “The possibility of impossible cultures.” Nature, 460(9 July 2009), 190-196.

[6] Diamond, J. (1993). The Third Chimpanzee: The Evolution and Future of the Human Animal, New York: Harper Perennial.

[7] Fauconnier, G., and Turner, M. (2002). The Way We Think: Conceptual Blending and the Mind’s Hidden Complexities, New York: Basic Books, 183.

[8] Ibid.

[9] Coolidge, F. L., and Wynn, T. (2005). “Working Memory, its Executive Functions, and the Emergence of Modern Thinking.” Cambridge Archaeological Journal, 15(1), 5-26.
