August 17, 2010

What the pfc did for early humans

Posted in Language and Myth at 10:49 pm by Jeremy

This section of my book draft, Finding the Li: Towards a Democracy of Consciousness, looks at how the unique powers of the prefrontal cortex gave early humans the capability to construct tools, exercise self-control, and begin to control aspects of the environment around them.  But most of all it gave them the power of symbolic thought, which has become the basis of all human achievement since then.   As always, constructive comments are welcomed.


What the pfc did for early humans

Mithen’s “cognitive fluidity” and Coolidge and Wynn’s “enhanced working memory” are really two different ways of describing the same basic dynamic of the pfc connecting up diverse aspects of the mind’s intelligence to create coherent meaning that wasn’t there before.  But what specifically did this enhanced capability do for our early human ancestors?

To begin with, it enabled us to make tools.  It used to be conventional wisdom that humans are the only tool-makers, so much so that the earliest known species of the genus Homo, which lived around two million years ago, is named Homo habilis, or “handy man.”  Then, in the 1960s, Jane Goodall discovered that chimpanzees also used primitive tools, such as placing stalks of grass into termite holes.  When Goodall’s boss, Louis Leakey, heard this, he famously replied “Now we must redefine ‘tool’, redefine ‘man’, or accept chimpanzees as humans!”[1] Well, as we’ve seen in the preceding pages, there’s been plenty of work done in redefining “man” since then, but none of this takes away from the fact that humans clearly use tools vastly more effectively than chimpanzees or any other mammals.

Oldowan tools: better than what any chimpanzee can do.

To be fair to our old “handy man” Homo habilis, even the primitive stone tools they left behind, called Oldowan artifacts after the Olduvai Gorge in East Africa where they were first found, represented a major advance in working memory over our chimpanzee cousins.  Steve Mithen has pointed out that some Oldowan tools were clearly manufactured to make other tools, such as “the production of a stone flake to sharpen a stick.”[2] Making a tool to make another tool is unknown in chimpanzees, and requires determined planning: holding the idea of the second tool in your working memory while you’re preparing the first.  Oldowan artifacts remained the same for a million years, so even though they were an advance over chimp technology, there was none of the innovation that we associate with our modern pfc functioning.  The next generation of tools, called the Acheulian industry, required more skillful stone knapping and showed attractive bilateral symmetry, but it too remained the same for another million years or so.[3] It was around three hundred thousand years ago, shortly before anatomically modern humans emerged, that stone knapping really took off, with stone-tipped spears and scrapers with handles representing “an order-of-magnitude increase in technological complexity.”[4]

Acheulian tools: improved on Oldowan but stayed the same for a million years

None of these tools – even the more primitive Oldowan and Acheulian – can be made by chimpanzees, and they could never have existed without the power of abstraction provided by the pfc.[5]  Planning for this kind of tool-making required a concept of the future, when the hard work put into making the tool would turn out to be worthwhile.  As psychologists Liberman and Trope have pointed out, transcending the present to mentally traverse distances in time and space “is made possible by the human capacity for abstract processing of information.”  Making function-specific tools, they note, “required constructing hypothetical alternative scenarios of future events,” which could only be done by activating a “brain network involving the prefrontal cortex.”[6]

Another fundamental human characteristic arising from this abstraction of past and future is the power of self-control.  As one psychologist observes, “self-control is nearly impossible if there is not some means by which the individual is capable of perceiving and valuing future over immediate outcomes.”[7] Anyone who has watched children grow up and gradually become more adept at valuing delayed rewards over immediate gratification will not be surprised at the fact that the pfc doesn’t fully develop in a modern human until her early twenties.

This abstraction of the future gave humans not only the power to control themselves but also to control things around them.  A crucial pfc-derived human characteristic is the notion of will, the conscious intention to perform a series of activities, sometimes over a number of years, to achieve a goal.  Given the fundamental nature of this capability, it’s not surprising that, as Tomasello points out, in many languages the word that denotes the future is also the word “for such things as volition or movement to a goal.”   In English, for example, the original notion of “I will it to happen” is embedded in the future tense in the form “It will happen.”[8]

This is already an impressive range of powerful competencies made available to early humans by the pfc.  But of all the powers granted to humans by the awesome connective faculties of the pfc, there seems little doubt that the most spectacular is the power to understand and communicate sets of meaningful symbols, known as symbolization.

The symbolic net of human experience

Ernst Cassirer: first to define humans as "animal symbolicum"

A full generation before Louis Leakey realized it was time to “redefine man,” a German philosopher named Ernst Cassirer who had fled the Nazis was already doing so, writing in 1944 that “instead of defining man as an animal rationale we should define him as an animal symbolicum.”[9] He wasn’t alone in this view.  A leading American anthropologist, Leslie White, also believed that the “capacity to use symbols is a defining quality of humankind.”[10] Because of our use of symbols, Cassirer wrote, “compared with the other animals man lives not merely in a broader reality; he lives, so to speak, in a new dimension of reality.”[11]

Why would the use of symbols take us to a different dimension of reality?  First, it’s important to understand what exactly is meant by the word “symbol.”  In the terminology adopted by cognitive anthropologists, we need to differentiate between an icon, an index, and a symbol.  A simple example may help us to understand the differences.  Suppose it’s time for you to feed your pet dog.  You open your pantry and look at the cans of pet food available.  Each can has a picture on it of the food that’s inside.  That picture is known as an icon, meaning it’s a “representative depiction” of the real thing.  Now, you open the can and your dog comes running, because he smells the food.  The smell is an index of the food, meaning it’s “causally linked” to what it signifies.  But now suppose that instead of giving your hungry dog the food, you wrote on a piece of paper “FOOD IN TEN MINUTES” and put it in your dog’s bowl.  That writing is a symbol, meaning that it has a purely arbitrary relationship to what it signifies, one that can only be understood by someone who shares the same code.  Clearly, your dog doesn’t understand symbols, and now he’s pawing at the pantry door trying to get to his food.[12]

A hungry dog doesn't respond to a note saying "food in ten minutes"

To understand how symbols arose, and why they are so important, it helps to begin with the notion of working memory discussed earlier.   Terrence Deacon has suggested that symbolic thought is “a way of offloading redundant details from working memory, by recognizing a higher-order regularity in the mess of associations, a trick that can accomplish the same task without having to hold all the details in mind.”[13] Remember the image of working memory as a blackboard?  Now imagine a teacher asking twenty-five children to come up and write on the blackboard what they had to eat that morning before they came to school.  The blackboard would quickly fill up with words like cereals and eggs, pancakes and waffles.  Now, suppose that, once the blackboard’s filled up, the teacher erases it all and just writes on the blackboard the word “BREAKFAST”.  That one word, by common consent, symbolizes everything that had previously been written on the blackboard.  And now it’s freed up the rest of the blackboard for anything else.
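For readers who think in code, Deacon’s “offloading” has a close analogue in the dictionary compression computers use, where one agreed-upon token stands in for a whole set of particulars.  Here is a minimal sketch of the blackboard example in Python – purely an analogy, with the data and function names invented for illustration:

```python
# The blackboard: each child's breakfast, spelled out in full detail.
blackboard = ["cereal", "eggs", "pancakes", "waffles", "toast"]

# A shared symbol table: one arbitrary token stands, by common consent,
# for the whole messy set of particulars.
symbol_table = {"BREAKFAST": set(blackboard)}

def compress(details, table):
    """Replace a run of details covered by a known symbol with that symbol."""
    for symbol, referents in table.items():
        if set(details) <= referents:
            return [symbol]   # one token frees up the rest of the board
    return details            # no shared symbol: keep all the details

print(compress(["eggs", "toast"], symbol_table))  # ['BREAKFAST']
print(len(compress(blackboard, symbol_table)))    # 1 -- the board is freed
```

The point of the analogy is the last line: once the shared symbol exists, the whole list of particulars collapses into a single token, freeing the rest of the “blackboard” for new work – and the token only means anything to someone holding the same symbol table.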

That’s the powerful effect that the use of symbols has on human cognition.  But there’s another equally powerful aspect of writing that one word “BREAKFAST” on the blackboard.  Every schoolchild has her own experience of what she ate that morning, but by sharing in the symbol “BREAKFAST,” she can rise above the specifics of her own particular meal and understand that there’s something more abstract that is being communicated, referring to the meal all the kids had before they came to school regardless of what it was.  For this reason, symbols are an astonishingly powerful means of communicating, allowing people to transcend their individual experiences and share them with others.  Symbolic communication can therefore be seen as naturally emerging from human minds evolving on the basis of social intelligence.  This has led one research team to define modern human behavior as “behavior that is mediated by socially constructed patterns of symbolic thinking, actions, and communication.”[14]

Once it got going, symbolic thought became so powerful that it pervaded every aspect of how we think about the world.  In Cassirer’s words:

Man cannot escape from his own achievement… No longer in a merely physical universe, man lives in a symbolic universe.  Language, myth, art, and religion are parts of this universe.  They are the varied threads which weave the symbolic net, the tangled web of human experience.  All human progress in thought and experience refines upon and strengthens this net.[15]

Because of our symbolic capabilities, Deacon adds, “we humans have access to a novel higher-order representation system that… provides a means of representing features of a world that no other creature experiences, the world of the abstract.” We live our lives not just in the physical world, “but also in a world of rules of conduct, beliefs about our histories, and hopes and fears about imagined futures.” [16]

For all the power of symbolic thought, there was one crucial ingredient it needed before it could so dramatically take over human cognition.  It needed a means by which individuals could agree on the code used to reference what they meant.  It had to be a code that everyone could learn and that could be communicated very easily, taking into account the vast array of different things that could carry symbolic meaning.  In short, it needed language – that all-encompassing network of symbols that we’ll explore in the next chapter.

[1] Cited in McGrew, W. C. (2010). “Chimpanzee Technology.” Science, 328, 579-580.

[2] Mithen 1996, op. cit., 96.

[3] Proctor, R. N. (2003). “The Roots of Human Recency: Molecular Anthropology, the Refigured Acheulean, and the UNESCO Response to Auschwitz.” Current Anthropology, 44(2: April 2003), 213-239.

[4] Ambrose, S. H. (2001). “Paleolithic Technology and Human Evolution.” Science, 291(2 March 2001), 1748-1753.

[5] Mithen 1996, op. cit., p. 97 relates a failed attempt to get a famous bonobo named Kanzi, who was very advanced in linguistic skills, to make Oldowan-style stone tools.

[6] Liberman, N., and Trope, Y. (2008). “The Psychology of Transcending the Here and Now.” Science, 322(21 November 2008), 1201-1205.

[7] Barkley, op. cit.

[8] Tomasello, op. cit., p. 43.

[9] Cassirer, E. (1944). An Essay on Man, New Haven: Yale University Press, 26.

[10] Cited by Renfrew, C. (2007). Prehistory: The Making of the Human Mind, New York: Modern Library: Random House, 91.

[11] Cassirer, op. cit.

[12] The distinction, originally made by American philosopher Charles Sanders Peirce, is described in detail in Deacon, T. W. (1997). The Symbolic Species: The Co-evolution of Language and the Brain, New York: Norton; and is also referred to by Noble, W., and Davidson, I. (1996). Human Evolution, Language and Mind: A psychological and archaeological inquiry, New York: Cambridge University Press.  I am grateful to Noble & Davidson for the powerful image of writing words to substitute for food in the dog’s bowl as an example of a symbol.

[13] Deacon, op. cit., p. 89.

[14] Henshilwood, C. S., and Marean, C. W. (2003). “The Origin of Modern Human Behavior: Critique of the Models and Their Test Implications.” Current Anthropology, 44, 627-651.

[15] Cassirer, op. cit.

[16] Deacon, op. cit., p. 423.

March 2, 2010

So What Really Makes Us Human?

Posted in Language and Myth at 2:38 pm by Jeremy

What is the human version of the elephant’s trunk?

Elephants have trunks; giraffes have necks; anteaters have tongues.  What do we humans have that makes us unique?  At first blush, it seems that answering that should be pretty easy, since we do so much that no other animal does: we build cities, write books, send rockets into space, create art and play music.  But these are all the results of our uniqueness, not the cause.  OK, how about language?  That seems to be something universal to all human beings, which no other animal possesses.[1] Language definitely is a major element in human uniqueness.  But what if we try to go back even further, before language as we know it fully developed?  What was it about our early ancestors that caused them to even begin the process that ended in language?

The influential cognitive neuroscientist Merlin Donald has suggested that, beginning as far back as two million years ago, there was a long and crucial period of human development that he calls the “mimetic phase.”  Here’s how he describes it:

a layer of cultural interaction that is based entirely on a collective web of conventional, expressive nonverbal actions.  Mimetic culture is the murky realm of eye contact, facial expressions, poses, attitude, body language, self-decoration, gesticulation, and tones of voice.[2]

What’s fascinating about the mimetic phase is that we modern humans never left it behind.  We’ve added language on top of it, but our mimetic communication is still, in Donald’s words, “the primary dimension that defines our personal identity.”  You can get a feeling for the power of mimetic expression when you think of communications we make that are non-verbal: prayer rituals, chanting and cheering in a sports stadium, expressions of contempt or praise, intimacy or hostility.  It’s amazing how much we can communicate without using a single word.

A cheering crowd reflects the continued power of mimetic communication.

So before we talked, we chanted, grunted, cheered and even sang.[3] But that still doesn’t explain how we started doing these things that no other creature had done over billions of years of evolution.  Over the past twenty years, a powerful theory, called the Social Brain Hypothesis, has gained increasing acceptance as an explanation for the development of our unique human cognition.  This hypothesis states that “intelligence evolved not to solve physical problems, but to process and use social information, such as who is allied with whom and who is related to whom.”[4]

The underlying logic of this approach is that, when hominids first began adapting themselves to a less wooded environment, they didn’t have a lot of physical advantages: they couldn’t compete well with other predators for food, and were pretty vulnerable themselves to hungry carnivores.  So, more than ever before, they banded together.  As they did so, they faced ever-increasing cognitive demands from being in bigger social groups.  And it wasn’t just the size of the group, but the complexity of the lifestyle that increased.  If you were going out with your buddies on a long hunting trip, how could you know for sure that nobody else was going to jump into bed with your partner while you were gone?

With dilemmas like this to face, early hominids got involved in “an ‘arms race’ of increasing social skill”, learning to use “manipulative tactics” to their best advantage.[5] But a newly emerging implication of this line of research is that cooperation may have played just as large a part as competition in contributing to our human uniqueness.  Neuroscientist Michael Gazzaniga summarizes this viewpoint as follows:

although cognition in general was driven mainly by social competition, the unique aspects of human cognition – the cognitive skills of shared goals, joint attention, joint intentions, and cooperative communication needed to create such things as complex technologies, cultural institutions, and systems of symbols – were driven, even constituted, not by social competition, but social cooperation.[6]

In fact, some prominent anthropologists go farther and suggest that it was “the particular demands” of our unusually “intense forms of pairbonding that was the critical factor that triggered” the evolution of our large brains.[7]

Whether it was competition, cooperation or love, all these new forms of social complexity required a radical breakthrough in the human brain: the ability to look at others and realize that they had a mind that functioned somewhat like your own; to realize that when they did something, they were most likely being motivated by the same sort of things that motivated you.  This realization has been called “theory of mind” and, in the past thirty years, it has come to be recognized as fundamental to human development.[8]

Once you become aware that other people seem to have minds like yours, you will naturally start speculating on what goals, beliefs and intentions they might be holding in those minds.  And at that point, it takes only a small step to turn the gaze inward and start asking the same questions about your own mind.  A small step, but a gigantic leap for mankind.  Because that inward gaze, that application of “theory of mind” to one’s own mind, was what led to the dramatic emergence of self-awareness, the consciousness of oneself as an autonomously existing entity.[9]

It was in this momentous evolutionary transformation that the human prefrontal cortex (pfc) first initiated what Terrence Deacon calls “the translation of social behavior into symbolic form,” which I refer to as its “stirrings of power.”  Neuroscientists have found that the unique pfc-mediated attributes known as “executive function” – self-awareness, inhibition, symbolization, etc. – are the same skills required for theory of mind and other aspects of social interaction.[10] In particular, a part of the pfc called the medial frontal cortex has been identified as having “a special role in social cognition,” including knowledge about the self and perceptions of the motivations of others.[11]

Identifying hoofprints: an early adaptive advantage of the pfc.

In terms of daily survival, this meant that our early ancestors could take the same cognitive tools they were using to figure out the motivations of others, and apply them to the external world.  This would have opened up an enormous set of possibilities for better foraging.  Archaeologist Steve Mithen gives a great example: identifying hoofprints.  “Footprints,” he points out, “just like symbols, must be placed into an appropriate category if correct meaning is to be attributed.”[12] A deer’s hoofprint looks nothing like the deer itself, so only a human mind, equipped with its symbolizing pfc, is capable of inferring the meaning of one from the other.

Another powerful capability arising from this new cognitive toolkit was a sense of past and future.  Research in neuroscience has shown that “thinking about the future, remembering the past, and taking another person’s perspective activate a common brain network involving the prefrontal cortex.”[13] As psychologist Russell Barkley has noted, without a sense of the future, it would be “nearly impossible” to exercise self-control.[14] After all, you’re not going to stop yourself from instant gratification unless you can convince yourself that the same “you” will still exist in some future period.  This is the reason why little children, with undeveloped pfcs, have such difficulty deferring immediate rewards.

The cognitive toolbox of the pfc, in Deacon’s words, “provides a means of representing features of a world that no other creature experiences, the world of the abstract.”[15] But these great leaps in human capability didn’t come without some dire costs.  Perhaps the greatest of them all is awareness of our own eventual deaths.  As psychologist Gordon Gallup points out, “to be aware of your own existence raises the possibility of confronting the inevitability of your eventual demise… Death awareness is a unique price that we pay for self-awareness.”[16]

We humans, alone among the animals of this world, know that we’re going to die.  We’re also alone, as far as we know, in asking the question “Why?”  The “social intelligence”-driven question of “Why did my partner do what she did?” eventually leads to: “Why does the sun rise in the morning?”  “Why did my loved one have to die?” and ultimately: “Why are we here?”

Other animals, to varying degrees and in different forms, ask the other major questions in life: “What?” “Who?” “Where?” and even to a limited extent “When?”  But only we humans seem to have the capability to ask “Why?”

So the next time someone asks you what’s really unique about humans, I suggest that the best way to respond to them is with this very simple, profound and memorable verse:

Fish gotta swim
Bird gotta fly
Man gotta sit and say
Why why why.[17]

[1] Other animals such as chimpanzees, parrots and dolphins, have been shown to have the rudimentary capabilities of language; but no other animals appear able to communicate with each other using the complex, recursive web of symbols characteristic of human language.

[2] Donald, M. (2001). A Mind So Rare: The Evolution of Human Consciousness, New York: Norton, 265.

[3] A number of recent theories of language suggest that we sang long before we spoke, and raise the possibility that language evolved from a form of song.  See, for example, Mithen, S. (2006). The Singing Neanderthals: The Origins of Music, Language, Mind, and Body, Cambridge, Mass.: Harvard University Press.

[4] Emery, N. J., and Clayton, N. S. (2004). “The Mentality of Crows: Convergent Evolution of Intelligence in Corvids and Apes.” Science, 306(December 10, 2004), 1903-1907, summarizing the original hypothesis published by Byrne & Whiten in Machiavellian Intelligence: Social Evolution in Monkeys, Apes and Humans.

[5] Byrne & Whiten quoted by Gazzaniga, M. S. (2009). “Humans: the party animal.” Dædalus (Summer 2009), 21-34.

[6] Gazzaniga, op. cit. describing the so-called “Vygotskian Intelligence Hypothesis” of Henrike Moll and Michael Tomasello.

[7] Dunbar, R. I. M., and Shultz, S. (2007). “Evolution in the Social Brain.” Science, 317(7 September 2007).

[8] See Povinelli, D. J., and Preuss, T. M. (1995). “Theory of mind: evolutionary history of a cognitive specialization.” Trends in Neurosciences, 18(9), 418-424; also Singer, T. (2006). “The neuronal basis and ontogeny of empathy and mind reading: Review of literature and implications for future research.” Neuroscience and Biobehavioral Reviews, 30, 855-863.

[9] See Povinelli & Preuss; Singer op. cit.  Note that some theorists (e.g. Gallup, G. G. Jr. (1998). “Self-awareness and the evolution of social intelligence.” Behavioural Processes, 42, 239-247) propose a different direction of development than my description, from self-awareness to theory of mind.

[10] See Barkley, R. A. (2001). “The Executive Functions and Self-Regulation: An Evolutionary Neuropsychological Perspective.” Neuropsychology Review, 11(1), 1-29; also Roth, G., and Dicke, U. (2005). “Evolution of the brain and intelligence.” Trends in Cognitive Sciences, 9(5: May 2005), 250-253.

[11] Amodio, D. M., and Frith, C. D. (2006). “Meeting of minds: the medial frontal cortex and social cognition.” Nature Reviews: Neuroscience, 7(April 2006), 268-277.

[12] Mithen, S. (1996). The Prehistory of the Mind, London: Thames & Hudson, 161-2.

[13] Liberman, N., and Trope, Y. (2008). “The Psychology of Transcending the Here and Now.” Science, 322(21 November 2008), 1201-1205.

[14] Barkley, op. cit.

[15] Deacon, T. W. (1997). The Symbolic Species: The Co-evolution of Language and the Brain, New York: Norton, 423.

[16] Gallup, op. cit.

[17] Quoted by McEvilley, T. (2002). The Shape of Ancient Thought: Comparative Studies in Greek and Indian Philosophies, New York: Allworth Press.

February 12, 2010

What does “the tyranny of the prefrontal cortex” really mean? A detailed critique.

Posted in Pfc tyranny: overview at 3:26 pm by Jeremy

Throughout this blog, I make the argument that in our modern society we are experiencing a tyranny of the prefrontal cortex (pfc) over other aspects of our consciousness.  Some people have a hard time swallowing this argument, for a number of reasons.  So I’ve written this post for anyone who’s interested enough to read further, but who’s feeling skeptical about what I’m suggesting.

[If you’re not familiar with my blog, please click here for an introduction to my theme, and then click back to this post for more detail.]

How could the prefrontal cortex really be a tyrant?

First, what do I mean by “tyranny”?  I’m suggesting that the unique evolutionary expansion of the pfc in the human brain, combined with the dynamics of culture (itself a product of pfc activity) has created a positive feedback loop leading to an imbalance within the human psyche, both collectively and individually.  Collectively, this imbalance manifests in the extreme characteristics of our global society, such as our unsustainable use of natural resources to fuel exponentially accelerating material growth.  Individually, this tyranny refers to our unreflective absorption of fundamental values that prioritize pfc-mediated attributes at the expense of other aspects of our humanity.  I believe that this dynamic is the ultimate source of a large part of the social and individual discontent we all experience on a daily basis.

This entire blog is dedicated to explaining and providing the evidence for this argument.  The rest of this post, however, raises some fundamental and reasonable objections to my use of the phrase “tyranny of the pfc” to describe this dynamic, and attempts to answer them.

Please feel free to leave comments below if you find yourself with objections to my approach that remain unanswered.

“How can you refer to the pfc as a ‘tyrant’ when it’s just a part of our brain?”

This is a great place to begin.  Back in 2003, neuroscientist M.R. Bennett and philosopher P.M.S. Hacker teamed up to accuse many other neuroscientists of committing what they called the “mereological fallacy in neuroscience.”[1] This, they explained, is the fallacy of ascribing human attributes like thinking, believing, understanding, etc., to the human brain, when these attributes can only reasonably be applied to the complete human being.  “Only a human being,” they write, “can intelligibly and literally be said to see or be blind, hear or be deaf, ask questions or refrain from asking.”  It’s called the “mereological” fallacy because mereology is the study of relations between parts and wholes.

So doesn’t accusing the pfc of tyranny fall foul of the mereological fallacy?  The pfc can’t act like a tyrant; only a person can.  Well, that’s true to the extent that a tyranny literally means rule by a tyrant.  But, as Merriam-Webster tells us, a tyranny can also refer to “a rigorous condition imposed by some outside agency or force,” as in the phrase “living under the tyranny of the clock.”  That’s the sense in which I’m using the word.  There’s one definition of tyranny that I came across (unfortunately I can no longer find its source) which captures well what I’m describing.  It goes as follows:

Excessive control wrested by one particular agent disrupting a previous balance, in which power is maintained and used for the benefit of the controlling agent to the potential detriment of the group(s) being tyrannized.

So, when I refer to the pfc’s imbalance as a tyranny, I mean that there’s been a shift in power within our individual and collective consciousness, and the predominant pfc-mediated values that have arisen in our global society, as a result of this imbalance, work to the detriment of other aspects of our humanity.

The pfc is more frequently viewed as our benign chief executive.

By the way, this “mereological fallacy” is pervasive throughout neuroscientific thought, especially when applied to the pfc.  Usually, though, the pfc is referred to in more benign terms as our “chief executive” rather than our tyrant.  For example, in his book on the prefrontal cortex, neuroscientist Elkhonon Goldberg refers to:

…the frontal lobes as the brain’s CEO, capable of taking ‘an aerial view’ of all the other functions of the brain and coordinating them; the frontal lobes as the brain’s conductor, coordinating the thousand instruments in the brain’s orchestra.  But above all, the frontal lobes as the brain’s leader, leading the individual into the novelty, the innovations, the adventures of life.[2]

I think that everything Goldberg says about the pfc here makes sense (if you can accept the mereological fallacy).  The difference is: I argue that in Western civilization over the past two thousand years, our “leader” has taken inordinate control, and this leadership might now be viewed more accurately as a tyranny.

“But why should we even use a political metaphor in the first place to describe the workings of the human brain?”

Using a political metaphor in describing our human cognitive process is part of an old tradition that linguistic philosophers Lakoff & Johnson refer to as the “society of mind” metaphor.  Here’s how they describe it:

The Society of Mind metaphor is basic to faculty psychology.  In the metaphor, the mind is conceptualized as a society whose members perform distinct, nonoverlapping tasks necessary for the successful functioning of that society.  The capacities of the mind are thereby conceptualized as autonomous, individual people, each with a different job and each with a distinct, appropriate personality.[3]

They then go on to describe in detail the “folk model of faculty psychology” composed of “individual people, each with a different job and each with a distinct, appropriate personality.”  For example, Feeling is “undisciplined, volatile, and sometimes out of control.”  Reason “has good judgment, is cool, controlled, wise, and utterly reliable.”  Will “is the only person in the society who can move the body to action.”  They note that, “after several hundred years, a version of this folk theory of the mind is still influential in philosophy of mind, as well as in the various cognitive sciences.”

In support of this claim, three leading cognitive scientists (Varela, Thompson & Rosch) strongly defend the “model of the mind as a society of numerous agents,” arguing that:

… the overall picture of mind not as a unified, homogenous entity, nor even as a collection of entities, but rather as a disunified, heterogeneous collection of networks of processes seems not only attractive but also strongly resonant with the experience accumulated in all the fields of cognitive science.[4]

So in this blog, I’m taking a model used by others but turning it around somewhat, arguing that these friendly old characters like Reason and Will may actually be agents of a force that’s become tyrannical, and that perhaps some of the other folk, like Feeling, may have been unfairly tarnished by the tyrant’s propaganda.

In fact, I’ve come to believe that the “society of mind” metaphor may have run its useful course, and that as our understanding of consciousness reaches a new level of sophistication, there may be far more helpful metaphors to use, such as “music”, in describing the workings of human cognition.  I offer this approach in my other blog, called Finding the Li: Towards a Democracy of Consciousness.

“You can’t localize any significant brain function in one place, like the pfc.  All major brain functions are highly distributed.  This is like positing the pfc as a ‘homunculus,’ an idea that’s been discredited in neuroscience.”

Don’t look for a homunculus in the brain. There isn’t one.

I agree that all major brain functions are highly distributed.  And it’s wrong to attribute “intelligence” or “agency” to any one part of the brain, including the pfc.  However, it’s equally apparent from neuroscience that certain parts of the brain are necessary (but not sufficient) for enabling a particular function.  Obvious examples are Broca’s and Wernicke’s areas for language, the visual cortex for sight, and the amygdala for fear responses.  There is a vast body of evidence from the past twenty years of neuro-imaging that the pfc is responsible for mediating symbolic meaning, among its other functions.  And it’s this symbolizing function of the pfc that I believe has led to its tyranny.

Here’s how anthropologist/neuroscientist Terrence Deacon describes the evolutionary process:

The prominent enlargement of prefrontal cortex and the correlated shifts in connection patterns that occurred during human brain evolution … gave human prefrontal circuits a greater role in many neural processes unrelated to language.

… prefrontal overdevelopment has made us all savants of language and symbolic learning… We tend to apply our one favored cognitive style to everything… we cannot help but see the world in symbolic categorical terms, dividing it up according to opposed features, and organizing our lives according to themes and narratives…  We find pleasure in manipulating the world so that it fits into a symbolic Procrustean[5] bed, and when it does fit and seems to obey symbolic rules, we find the result comforting, even beautiful.[6]

“Your separation of conceptual consciousness (pfc-mediated) from animate consciousness makes no sense.  Brain processes are all integrated and embodied.  There is no separate conceptual consciousness.”

On a neurophysiological basis, this is absolutely true.  I’m not suggesting that there are separate neural pathways for conceptual consciousness.  But most sophisticated analyses of consciousness distinguish primary consciousness (which we share with other animals) from secondary consciousness, which is uniquely human (with the possible exception, to a very limited degree, of chimps and bonobos).  Here’s how neuroscientist Gerald Edelman describes the distinction:

In animals with primary consciousness, the self that emerges and serves as a reference is not self-conscious. Only with the flowering of higher-order consciousness and linguistic capabilities does a self arise that is nameable to itself…

[H]igher order consciousness… is dependent on the emergence of semantic capabilities and, ultimately, of language… [W]e can, through symbolic exchange and higher-order consciousness, create narratives, fictions, and histories.  We can ask questions about how we can know and thereby deliver our selves to the doorstep of philosophy.[7]

When I’m describing conceptual consciousness, I’m referring to the exclusively human attributes of what Edelman calls our “higher order consciousness.”

“So how can that be a bad thing?  In describing a ‘tyranny of the pfc,’ aren’t you criticizing the very essence of what makes us human?”

Criticizing the prefrontal cortex is as nonsensical as criticizing the heart or the liver.  It’s a fundamental part of our existence and, as we’ve seen above, is probably the most significant part of our anatomy that distinguishes us from other animals.

Most people who study the pfc end up marveling at its awesome creative power.  Goldberg proposes that “without the great development of the frontal lobes in the human brain … civilization could never have arisen.”[8] I wholeheartedly agree with him.  The prominent neuroscientist Antonio Damasio describes the “admirable” and “sublime” operations of the pfc in providing us the mechanisms for “consciousness, reasoned deliberation, and willpower.”[9] I share his admiration and awe.

We are all ensnared in a web of symbols.

But I’m not criticizing the pfc.  Rather, I’m describing a dynamic that has evolved through the interplay of the pfc and the human culture it helped create, specifically the culture that has arisen in the Western world over the past two thousand years.  This is the dynamic that, in my view, has led to a tyranny, to an imbalance in our individual psyches and in our society that is both harmful and unsustainable.  As Terrence Deacon puts it:

… the symbolic universe has ensnared us in an inescapable web… and now by virtue of the irresistible urge it has instilled in us to turn everything we encounter and everyone we meet into symbols, we have become the means by which it unceremoniously propagates itself throughout the world…

[T]he invention of durable icons… was the beginning of a new phase of cultural evolution – one that is much more independent of individual human brains and speech, and one that has led to a modern runaway process which may very well prove to be unsustainable into the distant future.[10]

That’s the “tyranny” that I’ll be tracking in the rest of this blog.

“OK.  But I still don’t get it.  Neuroscience is one thing.  Human history is something quite different.  How can you meaningfully analyze history in terms of a neurological function, even one as pervasive as the pfc?”

In this blog, I’ll be attempting to construct what I call a “cognitive history” of human cultural evolution.  This is something that I believe is fairly ground-breaking, but not unique.  For example, psychiatrist Iain McGilchrist has recently published a book called The Master and his Emissary, which traces the development of Western philosophy, art and literature in terms of conflict between the right and left hemispheres of the brain.[11]

It’s an approach that I hope goes some way toward answering the call of prominent anthropologist Bruce Trigger, who believes that the study of human behavior needs to be driven more by a biological, neuroscience component, and who writes:

What is needed is a better understanding, derived from psychology and neuroscience, of how the human brain shapes understanding and influences behavior… Evolution, both biological and cultural, is a process that adapts humans with specific but as yet poorly understood biological, social, and psychological predispositions and needs to the natural and social environment in which they live…Social and cultural phenomena have their own emergent properties and cannot wholly be explained in psychological or biological terms.  Yet neither can human behavior or the nature of society and culture be understood without judiciously taking account of the findings of evolutionary psychology and neuroscience.[12]

I believe that looking at human history within the framework of the ever-increasing domination of the pfc’s functionality permits us to distinguish key stages of human development – language, agriculture, dualism, scientific method – through which we can trace the dynamics of our current civilization from a cognitive historical perspective.  It can allow us to see where Western thought diverged from other thought traditions, such as the one that evolved in East Asia.  It can identify foundational concepts, such as “truth” or “progress”, which we take for granted in today’s world, as products of a unique Western set of values.  Finally, I believe that such an approach also points the way to what we can do as individuals to undo some of the pfc’s tyranny and work towards what I call a “democracy of consciousness.”  This is something that I explore in more detail in my sister blog, Finding the Li.

So, if you’ve read this far, please browse the blog and enjoy, and don’t hesitate to leave any comments below if you’re still not convinced!

[1] M. R. Bennett, P. M. S. Hacker (2003).  Philosophical Foundations of Neuroscience, Blackwell Publishing, pp. 68-73.

[2] Goldberg, E. (2001). The Executive Brain: Frontal Lobes and the Civilized Mind, New York: Oxford University Press, ix.

[3] Lakoff, G., and Johnson, M. (1999). Philosophy in the Flesh: The Embodied Mind and its Challenge to Western Thought, New York: Basic Books, 410.

[4] Varela, F. J., Thompson, E., and Rosch, E. (1993). The Embodied Mind: Cognitive Science and Human Experience, Cambridge, Massachusetts: MIT Press.

[5] Procrustean: Producing or designed to produce strict conformity by ruthless or arbitrary means – American Heritage Dictionary

[6] Deacon, T. W. (1997). The Symbolic Species: The Co-evolution of Language and the Brain, New York: Norton, 416-17.

[7] Edelman, G. M. (2003). “Naturalizing consciousness: A theoretical framework.” PNAS, 100(9), 5520-5524 and Edelman, G. M., and Tononi, G. (2000). A Universe of Consciousness: How Matter Becomes Imagination, New York: Basic Books.

[8] Goldberg, op. cit., ix.

[9] Damasio, A. (1994). Descartes’ Error: Emotion, Reason, and the Human Brain, New York: Penguin Books, 123-4.

[10] Deacon, op. cit., 436, 375.

[11] McGilchrist, I. (2009).  The Master and His Emissary: The Divided Brain and the Making of the Western World. London: Yale University Press.

[12] Trigger, B. G. (2003). Understanding Early Civilizations, New York: Cambridge University Press, 686-7.