A debate has raged over the past fifteen years as to whether music is an adaptation, a trait that emerged as a result of an evolutionary process. Several well-known psychologists and neuroscientists have weighed in on the subject, including Steven Pinker, Daniel J. Levitin, and Pascal Boyer. Pinker has dismissed music as "auditory cheesecake," while Levitin is convinced that music is an adaptation. Neither, in my view, is precisely correct. In the parlance of evolution, an adaptation is a trait that has a functional role in the life of an organism and that evolved and is maintained as a result of natural selection. What we typically think of as music --- a song, with or without words --- is not an adaptation; it is not a trait. As is the case with language and learning, it is our capacity for appreciating, learning, creating, and performing music that is the trait, and it is that capacity that has a functional role in the life of an organism.
In How the Mind Works, Steven Pinker referred to music as "auditory cheesecake." Music confers no survival advantage, he asserts; it is merely a "confection crafted to tickle the sensitive spots of at least six of our mental faculties." Music, says Pinker, is a "technology" (or a "spandrel," as Stephen Jay Gould might have said), not an adaptation. With music, humans merely exploit the language and communication system that evolved through survival and sexual selection pressures.
Levitin, a former musician and sound specialist turned neuroscientist at McGill University, explains that he "took notice" when Pinker called music "auditory cheesecake" and described it as "useless" as far as biological cause and effect are concerned. In contrast, Pinker puts language, vision, and social reasoning in the category of adaptations that have survival value for the human species. "Music could disappear tomorrow," Pinker says, and our "lifestyle would be virtually unchanged." The musician in Levitin was clearly provoked, and he wrote a book on the subject, This Is Your Brain on Music. The final chapter of that book, where Levitin challenges Pinker's views, is titled "The Music Instinct," borrowing a phrase from an earlier Pinker book, The Language Instinct.
By "instinct," Pinker means that language does not have to be learned. He does not mean, of course, that a particular language --- English, Chinese, or French --- does not have to be learned. Humans are not born with genes for English, Chinese, or French; specific languages are not inherited. At the biological level, we are born with a capacity for language and speech and a capacity for learning and speaking a particular language. In support of his claim that language capacity is a human adaptation, Pinker relies on several attributes, including the fact that it is universal across all cultures and that specific brain structures recognize the rules of speech.
Levitin argues that our capacity for music --- for learning, creating, and performing it --- is no different. The auditory system that detects, senses, and computes the attributes of music; the hands and feet that can be used to create or establish rhythm; the vocal system that can create tone and pitch; and the brain structures that uniquely relate to tone, rhythm, pitch, and chords are all physical traits. "Music's evolutionary origin," Levitin writes, "is established because it is present across all humans (meeting the biologists' criterion of being widespread in a species); it has been around a long time (refuting the notion that it is merely auditory cheesecake); it involves specialized brain structures, including dedicated memory systems that can remain functional when other memory systems fail (when a physical brain system develops across all humans, we assume that it has an evolutionary basis); and it is analogous to music making in other (non-human) species. Rhythmic sequences optimally excite recurrent neural networks in mammalian brains, including feedback loops among the motor cortex, the cerebellum, and the frontal regions. Tonal systems, pitch transitions, and chords scaffold on certain properties of the auditory system that were themselves products of the physical world, of the inherent nature of vibrating objects. Our auditory system develops in ways that play on the relation between scales and the overtone series. Musical novelty attracts attention and overcomes boredom, increasing memorability."
Just as we must learn English, Chinese, or French, we still need to learn classical music, folk music, jazz, and rock and roll, and we need to learn how to perform (speak) these different types of music. And we create technologies for performing music, just as we have created technologies for communicating words.
In The Information, James Gleick (see August 15, 2011 post) cites a 19th century missionary's experience in Africa with tribesmen who communicated across great distances with drum beats --- on the one hand an early analogue of Morse code, while on the other hand the drumming relied as much on rhythm as on the tone of the beat for conveying meaning. Although it will likely prove impossible to determine, I do not think we can rule out that music (not the same kind of music we think of today) may have been an early prototype for language. Linguistics has led to the discovery that the human brain has formal rules for language syntax. Is the brain not hardwired with formal rules for mathematics and music as well?
Pascal Boyer writes in Religion Explained: "The fact that the brain comes equipped with many specialized inferences and can run them in the decoupled mode may explain why humans the world over engage in a host of activities that carry no clear adaptive value. To illustrate this, consider the auditory cortex of humans, which must perform several complicated tasks. One of these is to sort out the sounds of language from other noises. Information about noises is sent to associative cortical areas that categorize the sounds and identify the nature of their source. Information about the source's location is handled by other specialized circuitry and sent to specific systems. The auditory system must also isolate the sounds of language. All normal humans have the ability to segment a stream of sound emerging from someone else's mouth in terms of isolated sounds, then send this purified representation to cortical areas specialized in word identification. To turn a stream into segments, the system must pay attention to the specific frequencies that define each vowel and the complex noises of consonants, as well as their duration and their effects on each other. To do this, the auditory cortex comprises different subsystems, some of which specialize in pure tones and others in more complex stimuli. All this is clearly part of a complex, evolved architecture specialized in fine-grained sound analysis, a task of obvious adaptive value for a species that depends on speech for virtually all communication. But it also has the interesting consequence that humans are predisposed to detect, produce, remember, and enjoy music. This is a human universal. There is no human society without some musical tradition. Although the traditions are very different, some principles can be found everywhere. For instance, musical sounds are always closer to pure sound than to noise.
"The equivalence between octaves and the privileged role of particular intervals like fifths and fourths are consequences of the organization of the cortex. To exaggerate a little, what you get from musical sounds are super-vowels (the pure frequencies as opposed to the mixed ones that define ordinary vowels) and pure consonants (produced by rhythmic instruments and the attack of most instruments). These properties make music an intensified form of sound experience from which the cortex receives purified and therefore intense doses of what usually activates it. So music is not really a direct product of our dispositions but a cultural product that is particularly successful because it activates some of our capacities in a particularly intense way." Boyer adds, "This phenomenon is not unique to music. Humans everywhere also fill their environments with artifacts that overstimulate their visual cortex, for instance by providing pure saturated color instead of the dull browns and greens of their familiar environment. . . . These activities recruit our cognitive capacities in ways that make some cultural artifacts very salient and likely to be transmitted."
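Boyer's point about octaves, fifths, and fourths can be made concrete with a bit of arithmetic on the overtone series. The sketch below is my own illustration, not Boyer's (the 110 Hz fundamental and the helper name `octave_reduce` are arbitrary choices): it folds the harmonics of a vibrating object into a single octave and recovers the just-intonation fifth (3:2) and fourth (4:3).

```python
# The overtone (harmonic) series of a vibrating object consists of the
# integer multiples of its fundamental frequency. "Octave equivalence"
# means we treat any frequency and its doubling as the same pitch class,
# so we fold each harmonic down into the range [1, 2).

def octave_reduce(ratio):
    """Fold a frequency ratio into [1, 2) --- i.e., apply octave equivalence."""
    while ratio >= 2:
        ratio /= 2
    return ratio

fundamental = 110.0  # Hz (the A below middle C's A; an arbitrary choice)
harmonics = [fundamental * n for n in range(1, 9)]  # first 8 harmonics

# The 3rd harmonic (330 Hz) folds down to the ratio 3/2: a perfect fifth.
fifth = octave_reduce(harmonics[2] / fundamental)

# The perfect fourth, 4/3, is the octave complement of the fifth:
# a fifth up plus a fourth up equals one octave (3/2 * 4/3 = 2).
fourth = 2 / fifth

print(fifth)   # 1.5
print(fourth)  # 1.333...
```

Seen this way, the fifth and fourth are simply the first non-octave relationships the overtone series produces, which is one reading of Boyer's claim that their privileged role reflects the physics of vibrating objects as registered by the cortex.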
My own view is that language and music are means of communicating information and that language and music were preceded by proto-language and proto-music, both probably emerging in the same relative period of human history. Both, in my view, were likely essential to human evolution as a social species. It also may be true, as Darwin surmised, that music was favored by sexual selection pressures. I think back to the views of V.S. Ramachandran (see October 25, 2011 post) attempting to resolve the discrepancy between Steven Pinker's and S.J. Gould's views on language and evolution. For Ramachandran, language did not evolve from some general mechanism for thinking (Gould), but neither did it evolve specifically for purposes of communication (Pinker). What is innate, and what evolved, says Ramachandran, is the competence to acquire the rules of language. The actual acquisition of language occurs as a result of social interaction. Ramachandran believes that language was enabled by cross-linkages in the brain between different motor maps (e.g., the area responsible for manual gestures and the area responsible for orofacial movements). Can we say that what is innate about music is the competence to acquire the rules of music, and that the actual acquisition of music occurs as a result of social interaction? If we think of music simply (at least initially) in terms of rhythm and vocal intonation, there is little to separate music from language, including their symbolic attachments. The most significant difference, however, is that music appears to reach and appeal to human emotions in a way that language perhaps does not. (See January 14, 2012 post and November 6, 2011 post.)
Neuroscientist Daniel J. Levitin has made his career studying music and the human brain. In This Is Your Brain on Music, Levitin explains each of the attributes of music --- pitch, rhythm, tempo, contour, timbre, loudness, reverberation, meter, key, melody, and harmony --- and describes how the brain's architecture is essentially hardwired to deal with each element. "Different aspects of music are handled by different neural regions --- the brain uses functional segregation for music processing, and employs a system of feature detectors whose job it is to analyze specific aspects of the musical signal, such as pitch, tempo, timbre, and so on. Some of the music processing has points in common with the operations required to analyze other sounds; understanding speech, for example, requires that we segregate a flurry of sounds into words, sentences, and phrases, and that we are able to understand aspects beyond the words, such as sarcasm. Several different dimensions of a musical sound need to be analyzed --- usually involving several quasi-independent neural processes --- and they need to be brought together to form a coherent representation of what we are listening to."
When we comprehend music not as a song but in terms of its attributes --- pitch, rhythm, tempo, contour, timbre, loudness, reverberation, meter, key, melody, and harmony --- we can recognize that these are not the "technologies" Pinker makes music out to be.
In the discussion of Christoph Wolff's biography of J.S. Bach (see January 14, 2012 post), I mentioned the long association between music and mathematics. Steven Pinker describes our "mathematical intuition": babies register quantities very early, which may not involve counting as we know it, but rather distinguishing between more and less and, later, distinguishing intuitively in terms of probabilities. From early mathematical intuition emerge human activities such as counting, measuring, shaping, estimating, moving, and proving. Each of these activities leads to more formal mathematical reasoning: counting (arithmetic), measuring (real numbers, calculus), shaping (geometry, topology), estimating (probability, statistics), moving (mechanics, calculus, dynamics), and proving (logic). Formal mathematics emerges from our mathematical intuition, says Pinker. The same reasoning tells me that formal music emerges from our musical intuition --- our capacity for appreciating, learning, creating, and performing music.
While music exploits some of the same neural pathways that speech and language exploit, the fact that there are specially evolved components of the brain used in processing some of the elements of music suggests that our capacity to appreciate, learn, and manipulate the attributes of music had some independent evolutionary value. Levitin observes that music "technology" has been around a long time --- at least 60,000 years, based on musical artifacts that have been uncovered. But our capacity for music --- by which I mean our capacity for appreciating, learning, creating, and performing music --- must have preceded the creation of musical artifacts: there is music in song; there is music in tapping fingers and feet, which requires no flute or drum. The origins of human language, and whether it preceded music, are, like the origins of music, murky. There is evidence that human language is at least 50,000 to 100,000 years old. For those who subscribe to the view that a FOXP2 gene mutation contributed to the development of human speech, that might put language in the range of 50,000 to 60,000 years ago, about the same age as the oldest musical artifacts. It is therefore not entirely outside the realm of plausibility that our language instincts and our music instincts co-evolved, or that one only slightly --- on the eons-long evolutionary timescale --- preceded the other in human evolution.
Wednesday, February 15, 2012
Saturday, February 4, 2012
Robert Trivers, The Folly of Fools (2011)
The subtitle of Robert Trivers' tome on his nearly lifelong obsession with deceit is The Logic of Deceit and Self-Deception in Human Life. Here is the "logic":
1) Deception is widespread across nature. We are most familiar with this fact in the context of camouflage: evolution has favored genetic characteristics that conceal a species from its predators. This is not deception in the sense of an intentional or purposeful mental act, but it is deception nevertheless. Trivers also documents deceptive behavioral acts in the non-human animal kingdom that are designed to enhance reproductive success, but again these are typically not intentional or purposeful mental acts; they are genetically driven, hardwired behaviors of the species. Cognition-based acts of deception do, however, appear to occur among non-human primates.
2) Deception is therefore an adaptive strategy that favors survival and reproductive success and must be understood in that light.
3) Deception is widespread in the human species. Trivers cites examples of deception in courtship that appear to be related to reproductive success, and there are examples of deception that are related to survival. The Folly of Fools is a catalogue of the examples and means by which humans deceive other humans, a subject that a prior posting on Shakespeare's King Lear addressed (see August 28, 2011 post). And while deceit is a frequent, clever device in Shakespeare's works, deceit has been a feature of the literature of tragedy and comedy since ancient times; false identity is a frequent dramatic device in comedies. The big difference between acts of deception by humans and deception in other species, however, is that human deception is largely, though not entirely, a cognitive phenomenon.
This brings us to the larger topic of Trivers' book: self-deception in human life. The question he poses from the outset is this: our sensory and neurological systems are devoted to gathering information about our physical well-being and the environment around us in an accurate and detailed manner. Why, then, do we act to "destroy" or depreciate the quality of that information through self-deception? Intuitively, one would think that gathering, correctly interpreting, and using accurate, detailed information would be the successful evolutionary adaptive strategy. However, the extensive catalog of the ways in which humans deceive themselves --- The Folly of Fools overwhelms its readers with examples that make you feel this is all we do, a point I will come back to later --- suggests, by its prevalence, that self-deception is the successful evolutionary strategy. Trivers' explanation: self-deception is essential to humans' ability to practice deception --- "we deceive ourselves the better to deceive others." Trivers' lament is that if humanity better understood how the reinforcing cycle of self-deception and deception we practice so frequently produces such disastrous consequences --- he discusses aviation and space disasters, war and other conflicts, and even professional disasters in the social sciences --- we would be better at fighting self-deception and would reap benefits, both individual and social, that we are forgoing by succumbing too easily to deceit.
Readers who are willing to cast aside their various biases --- cultural, religious, and personal, including emotional --- and mentally transport themselves to the state that one of my college professors, John Harsanyi, and later John Rawls called "the veil of ignorance" (see May 12, 2010 post and January 11, 2011 post) will readily accept that the litany of ways in which Trivers says we deceive ourselves is real. The telling of false historical narratives begins with self-deception. This includes self-deception that is deployed for purposes of in-group integration, nation-building, and the construction of religion and religious institutions. Importantly, self-deception is aimed at inflating the self (ego) or, correlatively, derogating others, inducing a sense of empowerment, moral superiority, and control. These examples occur at the level of individuals; however, they are deployed at the group level as well and have their group-level "us versus them" correlates: inflating the family, the community, the corporation, the tribe, the nation, the religion, the race, the species, etc., and derogating other families, communities, corporations, tribes, nations, religions, races, and species.
Where The Folly of Fools falls flat is in the absence of any significant discussion of how self-deception actually occurs. The fact of self-deception is well documented by Trivers, but the mechanism is not. There is certainly a larger story here, and some of the prior postings on this blog cover some of its elements: bias, imagination, and memory. The term "bias" appears many times in this book, but it is nowhere systematically explored as it is in Michael Shermer's The Believing Brain (see June 12, 2011 post). Trivers' tome would benefit from inclusion, even if only by reference, of a discussion of the literature on bias, which is instructive for how false historical narratives, political beliefs, religious beliefs, and even our assumptions about the behavior of others are formed. Underlying the formation of bias is how the mind really works --- something we now know a lot about, although our knowledge is by no means complete. The idea of heuristics as described by Shermer (again, see June 12, 2011 post) --- the brain's capacity to solve problems through intuition, trial and error, rules of thumb, or other informal shortcuts when there is no formal means for solving the problem --- is significant in the formation of beliefs. If Shermer is right that evolution has brought us to form beliefs first and only later to try to inform our beliefs with facts, then Trivers' starting point for his thesis (stated above) --- that our sensory and neurological systems are devoted to gathering information about our physical well-being and the environment around us in an accurate and detailed manner --- is misplaced, or at least misses an important aspect of how our minds work: we do not always take in "accurate and detailed" information.
Other postings on this blog have discussed the fact that areas of our brain are devoted in part to trying to explain the information that our sensory organs have delivered to the brain. (See e.g. May 22, 2011 post and November 6, 2011 post.) In other posts I have referred to this as our "storytelling" capability, but it includes our capacity for abstraction, imagination, and analysis. Imagination is deployed for a variety of mental acts: to explain the physical world that is either too large or too small for us to see (see July 30, 2011 post and November 6, 2011 post); to explain history after rigorous research supported by contemporaneous documentation (see January 14, 2012 post, December 16, 2010 post and March 24, 2010 post); to create pure fantasy (see e.g., June 28, 2011 post and March 28, 2010 post); and to merge both fantasy and history in a retelling that is either fiction or historical fiction (see July 17, 2011 post and November 16, 2011 post). Douglas Hofstadter, whom I mentioned in the previous post, had this to say in Gödel, Escher, Bach:
"Not all descriptions of a person need to be attached to some central symbol for that person, which stores that person's name. Descriptions can be manufactured and manipulated in themselves. We can invent non-existent people by making descriptions of them; we can merge two descriptions when we find they represent a single entity; we can split one description into two when we find it represents two things, not one --- and so on. This 'calculus of descriptions' is at the heart of thinking. It is said to be intensional and not extensional, which means we can 'float' without being anchored down to specific objects. The intensionality of thought is connected to its flexibility; it gives us the ability to imagine hypothetical worlds, to amalgamate different descriptions or chop one description into separate pieces, and so on. Fantasy and fact intermingle very closely in our minds, and this is because thinking involves the manufacture and manipulation of complex descriptions."
This is what our mind does, and self-deception is one potential outcome of our cognitive processes. Sometimes that self-deception is accidental, sometimes it is unknowing, and sometimes it is intentional. As Michael Shermer and others have documented, the deception can begin at a very early age, before we are mature enough to act against it, and by the time we reach an age at which we can question what we believe, we are too invested in our ingrained beliefs, or it is too costly, to rebut them. At this point it becomes a matter of memory, and human memory is not limited to "accurate and detailed information." (See September 20, 2011 post.) The brain has ways of categorizing information in less than a detailed way.
It is possible to read The Folly of Fools and conclude that humans suffer from a persistent state of delusion. If that is true, the skepticism that Enlightenment philosophers confronted (see February 27, 2011 post) may well still be warranted, and John Searle's view that by the 21st century the era of skepticism was long past (see January 21, 2011 post) is perhaps unwarranted. I don't think so. In Mapping the Mind (see November 6, 2011 post), Rita Carter cites research indicating that truth-telling appears to be the default position of the human brain, and that deception involves extra cognitive effort requiring more energy. Whether humans are more inclined to tell the truth or to deceive themselves is, in my view, an unanswered question, but I am not inclined to the view that our default state is self-deception. In the prior post discussing Carter's observation, I noted that she had not discussed or accounted for the mental shortcuts we often engage in (heuristics), which may rely on certain biases in our perception or understanding of things observed. Those shortcuts may circumvent the extra cognitive effort that self-deception requires.
Trivers seems to think that our difficulty in addressing reality is that our neurophysiological system (and hence our conscious experience) is always a fraction of a second behind the actual sensory experience. "Regarding one's personal life, the problem with learning from living is that living is like riding a train while facing backward. That is, we see reality only after it has passed us by. Neurophysiologists have shown that this is literally true. We see (consciously) incoming information as well as our internal intention to act, well after the fact. It seems as if it is difficult to learn after the fact what to predict ahead of the fact; thus our ability to see the future, even that of our own behavior, is often very limited." It is true, as Trivers says, that the left side of our brain, devoted to explaining what it is we are experiencing, follows by milliseconds the actual sensory experience of what is happening to us. But we are talking milliseconds. The fact of the matter is that humans do have the ability to see the future coming (sometimes imperfectly, but sometimes with greater prescience than we realize). As we live our lives, we are, during our waking hours, facing forward. One of the most amazing capacities of the mind --- perhaps in some other species as well, but certainly in humans --- is that it plays what Antonio Damasio referred to as the "movie within a movie," and we are able mentally to visualize and anticipate what is about to happen. Mirror neurons may trigger something as we watch another person that enables us to anticipate what is going to happen to someone else. So we are not living our lives facing backward. Perhaps it is when we are asleep, as Rita Carter noted (see November 6, 2011 post), and our mind is busy building memories, that we are looking backward.
1) Deception is widespread across nature. We are most familiar with this fact in the context of camouflage. Evolution has favored genetic characteristics that conceal a species from its predators. This is not deception in the sense of an intentional or purposeful mental act, but it is nevertheless deception. Trivers also documents deceptive behavioral acts in the non-human animal kingdom that are designed to enhance reproductive success, but again these are typically not intentional or purposeful mental acts, but genetically-driven hardwired behavior in the species. Cognitive-based acts of deception in the non-human primate community appear to occur.
2) Deception is therefore an adaptive strategy that favors survival and reproductive success and must be understood in that light.
3) Deception is widespread in the human species. Trivers cites examples of deception in courtship that appear to be related to reproductive success. There are examples of deception that are related to survival. The Folly of Fools is a catalogue of the examples and means by which humans deceive other humans, a subject that a prior posting in Shakespeare's King Lear addressed (see August 28, 2011 post). And while deceit is a frequent, clever device in Shakespeare's works, deceit has been a feature of the literature of tragedy and comedy since ancient times. False identity is a frequent dramatic device in comedies. The big difference, however, in acts of deception by humans and deception in other species is that human deception is largely, but not entirely a cognitive phenomenon.
This brings us to the larger topic of Trivers' book: self-deception in human life. The question he poses from the outset of this book is this: our sensory and neurological systems are devoted to gathering information about our physical well-being and the environment around us in an accurate and detailed manner. Why then, do we act to "destroy" or depreciate the quality of that information through self-deception? Intuitively, one would think that gathering, correctly interpreting and using accurate, detailed information would the be the successful evolutionary adaptive strategy; however, the extensive catalog of the ways in which humans deceive themselves, and The Folly of Fools overwhelms its readers with examples that make you feel that that is all that we do (a point I will come back to later), suggests, by its prevalence, that self-deception is the successful evolutionary strategy. Trivers' explanation: self-deception is essential to humans' ability to practice deception --- "we deceive ourselves the better to deceive others." Trivers' lament is that if humanity understood better that the reinforcing cycle of self-deception and deception we practice frequently has such disastrous consequences for humans --- he discusses aviation and space disasters, war and other conflicts, and even professional disasters in the social sciences --- we would be better at fighting self-deception and reap benefits, both individual and social that we are foregoing by succumbing too easily to deceit.
Readers who are willing to cast aside his and her various biases --- cultural and religious, personal including emotional --- and mentally transport themselves to a state that one of my college professors, John Harsanyi, and later John Rawls called "the veil of ignorance" (see May 12, 2010 post and January 11, 2011 post) will easily accept that the litany of ways Trivers describes that we deceive ourselves are true. The telling of false historical narratives begins with self-deception. This includes self-deception that is deployed for purposes of in-group-integration, nation-building and the construction of religion and religious institutions. Importantly, self-deception is aimed at inflating the self (ego), or, correlatively, derogating others, inducing a sense of empowerment, moral superiority, and control. These examples occur at the level of individuals, however they are deployed at the group level and have their group-level "us versus them" correlates: inflating the family, the community, the corporation, the tribe, the nation, the religion, the race, the species, etc. and derogating other families, communities, corporations, tribes, nations, religions, races and species.
Where The Folly of Fools falls flat is the absence of any significant discussion of how self-deception actually occurs. The fact of self-deception is well-documented by Trivers, but the mechanism is not. There is certainly a larger story here, and some of the prior postings on this blog cover some of the elements: bias, imagination, and memory. The term "bias" appears many times in this book, but it is nowhere systematically explored as it is in Robert Shermer's The Believing Brain (see June 12, 2011 post). Trivers' tome would benefit from inclusion, even if only by reference, of a discussion of the literature of bias. It is instructive for how false historical narratives, political beliefs, religious beliefs, and even our assumptions about the behavior of others are formed. Underlying the formation of bias is how the mind really works --- something we know a lot about now although our knowledge is by no means complete either. The idea of heuristics, as described by Shermer (again, see June 12, 2011 post), the brain's capacity to solve problems through intuition, trial and error, rules of thumb, or other informal shortcuts, when there is no formal means for solving the problem is significant in the formation of beliefs. If Shermer is right that evolution has brought us to form beliefs first and only later do we try to inform our beliefs with facts, then Trivers' starting point to his thesis (stated above) --- that our sensory and neurological systems are devoted to gathering information about our physical well-being and the environment around us in an accurate and detailed manner --- is misplaced, or is at least missing an important aspect of how our mind works, that we do not always take in "accurate and detailed" information.
Other postings on this blog have discussed the fact that areas of our brain are devoted in part to trying to explain the information that our sensory organs have delivered to the brain. (See, e.g., May 22, 2011 post and November 6, 2011 post). In other posts I have referred to this as our "storytelling" capability, but it includes our capacity for abstraction, imagination, and analysis. Imagination is deployed for a variety of mental acts: to explain the physical world that is either too large or too small for us to see (see July 30, 2011 post and November 6, 2011 post); to explain history after rigorous research supported by contemporaneous documentation (see January 14, 2012 post, December 16, 2010 post, and March 24, 2010 post); to create pure fantasy (see, e.g., June 28, 2011 post and March 28, 2010 post); and to merge both fantasy and history in a retelling that is either fiction or historical fiction (see July 17, 2011 post and November 16, 2011 post). Douglas Hofstadter, whom I mentioned in the previous post, had this to say in Gödel, Escher, Bach:
"Not all descriptions of a person need to be attached to some central symbol for that person, which stores that person's name. Descriptions can be manufactured and manipulated in themselves. We can invent non-existent people by making descriptions of them; we can merge two descriptions when we find they represent a single entity; we can split one description into two when we find it represents two things, not one --- and so on. This 'calculus of descriptions' is at the heart of thinking. It is said to be intensional and not extensional, which means we can 'float' without being anchored down to specific objects. The intensionality of thought is connected to its flexibility; it gives us the ability to imagine hypothetical worlds, to amalgamate different descriptions or chop one description into separate pieces, and so on. Fantasy and fact intermingle very closely in our minds and this is because thinking involves the manufacture and manipulation of complex descriptions."
This is what our mind does, and self-deception is one potential outcome of our cognitive processes. Sometimes that self-deception is accidental, sometimes it is unknowing, and sometimes it is intentional. As Michael Shermer and others have documented, the deception can begin at a very early age, before we are mature enough to act against it, and by the time we reach an age at which we can question what we believe, we are too invested in our ingrained beliefs, or it is too costly to rebut them. At this point it is a matter of memory, and human memory is not limited to "accurate and detailed information." (See September 20, 2011 post). The brain has ways of categorizing information in less than a detailed way.
It is possible to read The Folly of Fools and conclude that humans suffer from a persistent state of delusion. If that is true, the skepticism that Enlightenment philosophers confronted (see February 27, 2011 post) may well still be warranted, and John Searle's view that by the 21st century the era of skepticism was long past (see January 21, 2011 post) may be unwarranted. I don't think so. In Mapping the Mind (see November 6, 2011 post), Rita Carter cites research indicating that truth telling appears to be the default position of the human brain, and that deception involves extra cognitive effort requiring more energy. Whether humans are more inclined to tell the truth or to deceive themselves remains, in my view, an open question, but I am not inclined to the view that our default state is self-deception. In the prior post discussing Carter's observation, I noted that she had not discussed or accounted for the mental shortcuts we often engage in (heuristics), which may rely on certain biases in our perception or understanding of things observed. Those shortcuts may circumvent the extra cognitive effort that self-deception requires.
Trivers seems to think that our difficulty in addressing reality is that our neurophysiological system (and hence our conscious experience) is always a fraction of a second behind the actual sensory experience. "Regarding one's personal life, the problem with learning from living is that living is like riding a train while facing backward. That is, we see reality only after it has passed us by. Neurophysiologists have shown that this is literally true. We see (consciously) incoming information as well as our internal intention to act, well after the fact. It seems as if it is difficult to learn after the fact what to predict ahead of the fact; thus our ability to see the future, even that of our own behavior, is often very limited." It is true, as Trivers says, that the left side of our brain, devoted to explaining what it is we are experiencing, follows by milliseconds the actual sensory experience of what is happening to us. But we are talking about milliseconds. The fact of the matter is that humans do have the ability to see the future coming (sometimes imperfectly, but sometimes with greater prescience than we realize). As we live our lives, we are, during our waking hours, facing forward. One of the most amazing capacities of the human mind --- present perhaps in some other species as well, but certainly in humans --- is that it plays what Antonio Damasio referred to as the "movie within a movie," and we are able to mentally visualize and anticipate what is about to happen. Mirror neurons, triggered as we watch another person, may enable us to anticipate what is going to happen to someone else. So we are not living our lives facing backward. Perhaps it is when we are asleep, as Rita Carter noted (see November 6, 2011 post), and our mind is busy building memories, that we are looking backward.