August 23, 2010
Language: weaving a net of symbols
Here’s the first section of Chapter 3 of my book draft, Finding the Li: Towards a Democracy of Consciousness. This chapter’s about the evolution of language. This first section delves into what’s special about language, contrasting it to the calls of vervet monkeys described by Seyfarth & Cheney. It parses a typical sentence to highlight linguistic features such as “double-scope conceptual blending,” displacement, counterfactuals, and the “magical weave” of syntax.
As always, constructive comments are warmly welcomed.
Weaving a net of symbols
Given that it’s something every one of us uses every day of our lives, and which has been studied for millennia, it’s amazing how much the experts still disagree about language. For example, consider the question of when language first emerged. Some researchers argue for a long, slow evolution of language, beginning in the time of our hominid ancestors several million years ago and gradually developing into what we now think of as modern language. Other experts argue for a much later and more sudden rise of language, perhaps as recently as 40,000 years ago. There’s even more raging disagreement about the relationship of language and our brains. Some famous theorists have proposed that we have a “language instinct,” an innate set of neural pathways that have evolved to comprehend the unique attributes of language such as syntax and grammar. Other researchers argue back that this is impossible, and that what’s innate in our brains is something more fundamental than language itself.
One thing nobody seems to disagree about is the central importance of language to our human experience. “More than any other attribute,” writes one team of biologists, “language is likely to have played a key role in driving genetic and cultural human evolution.” When you consider your daily life, your interactions with your family, your work, even the way you think about things, you quickly realize that language is necessary for virtually everything. In the words of one linguist, “everything you do that makes you human, each one of the countless things you can do that other species can’t, depends crucially on language. Language is what makes us human. Maybe it’s the only thing that makes us human.”
As we’ll see, language is equally important to the rise of the pfc’s power in human consciousness. We’ll explore in this chapter how language first gave the pfc the capability to expand its purview beyond its original biological function. In pursuing this exploration, we’ll find that understanding language in terms of the pfc may help us to untangle some of those debates about language that continue to galvanize the experts, and as we do so, to uncover some insights into the very nature of how we think.
What’s special about language?
First, though, we need to get a handle on what language really is and what’s so special about it. Perhaps a good place to start is what language isn’t. Back in 1980, a team of researchers spent over a year in the Amboseli National Park in Kenya, watching groups of vervet monkeys interact and recording their vocalizations. What they found made waves in the field of animal communication. The monkeys have three important natural predators: leopards, eagles and pythons, each of which attacks in a different way: leopards pounce, eagles swoop from the sky, and pythons strike from the ground. The researchers discovered that the monkeys had developed completely different vocalizations to warn their group of each predator: short tonal calls for leopards, low-pitched staccato grunts for eagles and high-pitched “chutters” for snakes. When the monkeys heard the leopard call, they’d climb up into the trees; an eagle call caused them to look up or run into dense bush; and a snake call had them looking down at the ground around them. The researchers could induce the different behaviors in the monkeys by playing tape recordings of each call. “By giving acoustically distinct alarms to different predators,” they explained, “vervet monkeys effectively categorized other species.” These fascinating findings showed that vervet monkeys were capable of what was described as “perceptual categorization… of rudimentary semantic signals.” It certainly showed how smart vervet monkeys are. But it wasn’t language.
A fundamental characteristic of language is that, in the words of researchers Noble and Davidson, it involves the “symbolic use of communicative signs”. Anthropologist/neuroscientist Deacon agrees with this, suggesting that “when we strip away the complexity, only one significant difference between language and nonlanguage communication remains: the common, everyday miracle of word meaning and reference … which can be termed symbolic reference.” But, an alert reader might ask at this point, wasn’t that what the vervet monkeys were doing? If we consider the definition of “symbol” from the previous chapter, as something that has a purely arbitrary relationship to what it signifies, then the vervet calls seem to meet that definition. It’s only because the other vervet monkeys understand the meaning of the grunts or chutters that they know whether to look up or look down. That may be true, but there’s another aspect of language that sets it apart from the vervet calls: syntax.
“Animal communication is typically non-syntactic, which means that signals refer to whole situations,” explains a team of language researchers. Human language, on the other hand, “is syntactic, and signals consist of discrete components that have their own meaning… The vast expressive power of human language would be impossible without syntax, and the transition from non-syntactic to syntactic communication was an essential step in the evolution of human language.” So, when a vervet monkey gives a low-pitched grunt, he’s not saying the word “eagle.” He’s saying, in one grunt: “There’s an eagle coming, and we’d all better head for the bushes.” If he grunted twice, that wouldn’t mean “two eagles.” And if he gave out a grunt followed by a chutter, that wouldn’t mean “an eagle just attacked a snake.” The vervet monkeys can’t get out of the context of their specific situation. They can’t use syntax to make “infinite use of finite means.”
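The idea of making “infinite use of finite means” can be sketched computationally. The toy grammar below is purely my own illustration (the rules and vocabulary are invented for this sketch, not drawn from the cited research): a handful of rewrite rules, one of which can re-embed the sentence symbol inside a noun phrase, is enough to generate an unbounded variety of sentences from a finite rule set.

```python
import random

# A toy recursive grammar (illustrative only): the NP rule can embed a whole
# new sentence S inside itself ("the monkey that saw the eagle..."), so a
# finite set of rules yields an unbounded set of possible sentences.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "S"]],  # second rule is recursive
    "VP": [["V", "NP"], ["V"]],
    "N":  [["monkey"], ["eagle"], ["snake"]],
    "V":  [["saw"], ["warned"], ["fled"]],
}

def expand(symbol, rng, depth=0, max_depth=4):
    """Recursively expand a grammar symbol into a list of words."""
    if symbol not in GRAMMAR:
        return [symbol]                # terminal vocabulary item
    rules = GRAMMAR[symbol]
    if depth >= max_depth:
        rules = [rules[0]]             # cap recursion so expansion always halts
    words = []
    for part in rng.choice(rules):
        words.extend(expand(part, rng, depth + 1, max_depth))
    return words

rng = random.Random(42)
print(" ".join(expand("S", rng)))
```

Nothing remotely like this is available to the vervets: each alarm call is an unanalyzable whole, with no parts that can be recombined or embedded inside one another.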
To fully understand the power of language, consider the following sentence:
You remember that guy from New York we met at the cocktail party the other day, who told us that if the Fed doesn’t ease the money supply, stocks would fall?
It seems like a simple sentence, but there’s a lot going on under the surface. First, let’s begin with the words “cocktail party.” A cocktail refers to a mixed drink. A party refers to a group of people getting together. But we all know that “cocktail party” refers to a specific type of party. It wasn’t necessary for anyone to actually be drinking a cocktail to make it a “cocktail party.” They might have been serving wine and champagne, but we wouldn’t call it a “wine and champagne party.” This crucial element of language takes two completely separate aspects of reality – a mixed drink and a social gathering – and blends them together to create a brand new concept. Cognitive scientists Fauconnier and Turner have aptly called this “conceptual blending” and consider it to be “one (and perhaps the) mental operation whose evolution was crucial for language.”
But the complexity really gets going when we come to phrases like “ease the money supply” and “stocks would fall.” Here, we meet one of the most ubiquitous aspects of modern language, which is the use of tangible metaphors to convey abstract meaning. We’re so comfortable with these metaphors in our daily language that we don’t even consider them as such, but ponder for a moment what it means to “ease the money supply.” There’s an underlying metaphor of some kind of reservoir of liquid, perhaps water, which would normally come flowing out to people. But someone has their hands on a lever of some sort, which keeps the supply controlled. Now, this person – the Fed – wants everyone to have a little more of the liquid, so they ease up on the lever, allowing some more to flow out. Similarly, stocks don’t really fall. People, animals or things might fall, off a table or out of a tree. But of course when something falls, it goes from a high position to a lower position. So, we naturally understand that a falling stock is one whose price is moving from higher to lower. These metaphors are examples of what Fauconnier and Turner see as an advanced form of conceptual blending, which they term “double-scope conceptual blending.” It’s called “double scope” because it integrates “two or more conceptual arrays… which typically conflict in radical ways on vital conceptual relations” – such as in this case stock prices and falling things – into a “novel conceptual array” which “develops emergent structure not found in either of the inputs.”
There’s still more amazing complexity to that simple sentence. Notice that it’s referring to someone we met “the other day.” He’s not there talking to us now. It all happened somewhere else and in the past, but through language we can bring the past back to the present in a matter of seconds, and we can whisk people or things from anywhere in the universe to be present in our minds with just a few words. This near-magical power of language is known as displacement, “the ability to make reference to absent entities.”
The magic of language goes even further than displacement. Consider that we’re being asked to imagine a scenario where stocks would fall if the Fed doesn’t ease the money supply. This is something that hasn’t actually happened. It may never happen. But we can still talk about the scenario with as much ease as if it were happening right now. This ability of language to create hypothetical situations out of thin air is known as a “counterfactual,” a reference to something that’s not a concrete fact but can still exist in our minds and get communicated through language.
There’s already a lot to be impressed about in that one sentence, with its double-scope conceptual blending, its displacements and its counterfactuals. But the coup de grâce of this sentence, and of most other sentences in every language of the world, is its syntax. If language is like a net of symbols, we can think of syntax as a magical weave that can link each section of the net to any other section at a moment’s notice. Look at how many miraculous conceptual leaps we make while still holding a meaningful narrative together in our minds. (1) “You remember” (asked in a questioning tone): we’re asked to access our memory; (2) “that guy”: focus on the category of male humans; (3) “from New York”: narrow down that category based on where the person is from; (4) “we met at the cocktail party the other day”: create a mental image of the party; (5) “who told us”: shift from a mere recall of the person to a recollection of the conversation; (6) “if the Fed doesn’t ease the money supply…”: abrupt transition from an image of the cocktail party to a hypothetical financial scenario.
This magical weave that we pull off incessantly every day without even being aware of it is known as “recursion,” and is viewed as the most powerful and characteristic feature of modern language, accomplished by the proper placement and linkage of multiple concepts through the syntax of the sentence. Humans alone took “the power of recursion to create an open-ended and limitless system of communication,” writes a team of linguistic experts, who propose that this power was perhaps “a consequence (by-product)” of some kind of “neural reorganization” that arose from evolutionary pressure on humans, causing previously separate modular aspects of the brain to connect together and create new meaning.* As we already know from the previous chapter, there’s one part of the brain that’s uniquely connected to permit this cognitive fluidity that underlies our human capabilities: the pfc.
 Szathmary, E., and Szamado, S. (2008). “Language: a social history of words.” Nature, 456(6 November, 2008), 40-41.
 Bickerton, D. (2009). Adam’s Tongue: How Humans Made Language, How Language Made Humans, New York: Hill and Wang, 4.
 Seyfarth, R. M., Cheney, D. L., and Marler, P. (1980). “Monkey Responses to Three Different Alarm Calls: Evidence of Predator Classification and Semantic Communication.” Science, 210(November 14, 1980), 801-803.
 Noble, W., and Davidson, I. (1991). “The Evolutionary Emergence of Modern Human Behaviour: Language and its Archaeology.” Man, 26(2), 223-253.
 Deacon, T. W. (1997). The Symbolic Species: The Co-evolution of Language and the Brain, New York: Norton, 43 – italics in the original.
 Nowak, M. A., Plotkin, J. B., and Jansen, V. A. A. (2000). “The evolution of syntactic communication.” Nature, 404(30 March 2000), 495-498.
 Nowak, ibid.
 Fauconnier, G., and Turner, M. (2008). “The origin of language as a product of the evolution of double-scope blending.” Behavioral and Brain Sciences, 31(5), 520-521.
 Fauconnier & Turner, ibid.
 Liszkowski, U., Schafer, M., Carpenter, M., and Tomasello, M. (2009). “Prelinguistic Infants, but Not Chimpanzees, Communicate About Absent Entities.” Psychological Science, 20(5), 654-660.
 Hauser, M. D., Chomsky, N., and Fitch, W. T. (2002). “The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?” Science, 298(22 November 2002), 1569-1579. Without denying the immense importance of recursion, I would note that their view that “no other animal” possesses it is still open to debate. There are possibilities of some kind of recursion in birdsong and in elephant, dolphin and whale communication, which has been described by Marler (1998) as “phonological syntax” in “Animal communication and human language” in: The Origin and Diversification of Language, eds., G. Jablonski & L. C. Aiello. California Academy of Sciences.