Saturday, February 4, 2012

Robert Trivers, The Folly of Fools (2011)

The subtitle of Robert Trivers' tome on his nearly lifelong obsession with deceit is The Logic of Deceit and Self-Deception in Human Life. Here is the "logic":

1) Deception is widespread across nature. We are most familiar with this fact in the context of camouflage: evolution has favored genetic characteristics that conceal a species from its predators. This is not deception in the sense of an intentional or purposeful mental act, but it is deception nevertheless. Trivers also documents deceptive behaviors in the non-human animal kingdom that enhance reproductive success, but again these are typically not intentional or purposeful mental acts; they are genetically hardwired behaviors of the species. Cognitively based acts of deception do, however, appear to occur among non-human primates.

2) Deception is therefore an adaptive strategy that favors survival and reproductive success and must be understood in that light.

3) Deception is widespread in the human species. Trivers cites examples of deception in courtship that appear to be related to reproductive success, as well as examples related to survival. The Folly of Fools is a catalogue of the examples and means by which humans deceive other humans, a subject that a prior posting on Shakespeare's King Lear addressed (see August 28, 2011 post). And while deceit is a frequent, clever device in Shakespeare's works, it has been a feature of the literature of tragedy and comedy since ancient times; false identity is a frequent dramatic device in comedies. The big difference, however, between acts of deception by humans and deception in other species is that human deception is largely, but not entirely, a cognitive phenomenon.

This brings us to the larger topic of Trivers' book: self-deception in human life. The question he poses from the outset is this: our sensory and neurological systems are devoted to gathering accurate, detailed information about our physical well-being and the environment around us. Why, then, do we act to "destroy" or depreciate the quality of that information through self-deception? Intuitively, one would think that gathering, correctly interpreting, and using accurate, detailed information would be the successful evolutionary adaptive strategy. Yet the sheer prevalence of the ways in which humans deceive themselves --- and The Folly of Fools overwhelms its readers with examples that make you feel that is all we do (a point I will come back to later) --- suggests that self-deception is the successful evolutionary strategy. Trivers' explanation: self-deception is essential to humans' ability to practice deception --- "we deceive ourselves the better to deceive others." Trivers' lament is that if humanity better understood that the reinforcing cycle of self-deception and deception we practice so frequently has such disastrous consequences --- he discusses aviation and space disasters, war and other conflicts, and even professional disasters in the social sciences --- we would be better at fighting self-deception and would reap benefits, both individual and social, that we now forgo by succumbing too easily to deceit.

Readers who are willing to cast aside their various biases --- cultural, religious, and personal, including emotional --- and mentally transport themselves to a state that one of my college professors, John Harsanyi, and later John Rawls called "the veil of ignorance" (see May 12, 2010 post and January 11, 2011 post) will readily accept that the litany of ways Trivers describes in which we deceive ourselves is true. The telling of false historical narratives begins with self-deception, including self-deception deployed for purposes of in-group integration, nation-building, and the construction of religion and religious institutions. Importantly, self-deception aims at inflating the self (ego) or, correlatively, derogating others, inducing a sense of empowerment, moral superiority, and control. These examples occur at the level of individuals, but they are also deployed at the group level and have their group-level "us versus them" correlates: inflating the family, the community, the corporation, the tribe, the nation, the religion, the race, the species, etc., and derogating other families, communities, corporations, tribes, nations, religions, races, and species.

Where The Folly of Fools falls flat is in the absence of any significant discussion of how self-deception actually occurs. The fact of self-deception is well documented by Trivers; the mechanism is not. There is certainly a larger story here, and some prior postings on this blog cover some of its elements: bias, imagination, and memory. The term "bias" appears many times in this book, but it is nowhere systematically explored as it is in Michael Shermer's The Believing Brain (see June 12, 2011 post). Trivers' tome would benefit from a discussion, even if only by reference, of the literature on bias, which is instructive for how false historical narratives, political beliefs, religious beliefs, and even our assumptions about the behavior of others are formed. Underlying the formation of bias is how the mind really works --- something we now know a lot about, although our knowledge is by no means complete. The idea of heuristics, as described by Shermer (again, see June 12, 2011 post) --- the brain's capacity to solve problems through intuition, trial and error, rules of thumb, or other informal shortcuts when there is no formal means of solving the problem --- is significant in the formation of beliefs. If Shermer is right that evolution has led us to form beliefs first and only later try to inform those beliefs with facts, then the starting point of Trivers' thesis (stated above) --- that our sensory and neurological systems are devoted to gathering accurate, detailed information about our physical well-being and the environment around us --- is misplaced, or at least misses an important aspect of how our mind works: we do not always take in "accurate and detailed" information.

Other postings on this blog have discussed the fact that areas of our brain are devoted in part to trying to explain the information that our sensory organs have delivered to the brain. (See, e.g., May 22, 2011 post and November 6, 2011 post.) In other posts I have referred to this as our "storytelling" capability, but it includes our capacity for abstraction, imagination, and analysis. Imagination is deployed for a variety of mental acts: to explain the physical world that is either too large or too small for us to see (see July 30, 2011 post and November 6, 2011 post); to explain history after rigorous research supported by contemporaneous documentation (see January 14, 2012 post, December 16, 2010 post and March 24, 2010 post); to create pure fantasy (see, e.g., June 28, 2011 post and March 28, 2010 post); and to merge fantasy and history in a retelling that is either fiction or historical fiction (see July 17, 2011 post and November 16, 2011 post). Douglas Hofstadter, whom I mentioned in the previous post, had this to say in Gödel, Escher, Bach:

"Not all descriptions of a person need to be attached to some central symbol for that person, which stores that person's name. Descriptions can be manufactured and manipulated in themselves. We can invent non-existent people by making descriptions of them; we can merge two descriptions when we find they represent a single entity; we can split one description into two when we find it represents two things, not one --- and soon. This 'calculus of descriptions' is at the heart of thinking. It is said to be intensional and not extensional, which means we can 'float' without being anchored down to specific objects. The intensionality of thought is connected to its flexibility; it gives us the ability to imagine hypothetical worlds, to amalgamate different descriptions or chop one description into separate pieces, and so on. Fantasy and fact intermingle very closely in our minds and this is because thinking involves the manufacture and manipulation of complex descriptions."

This is what our mind does, and self-deception is one potential outcome of our cognitive processes. Sometimes that self-deception is accidental, sometimes unknowing, and sometimes intentional. As Michael Shermer and others have documented, the deception can begin at a very early age, before we are mature enough to act against it, and by the time we reach an age when we might question what we believe, we are too invested in our ingrained beliefs, or it is too costly to rebut them. At this point it becomes a matter of memory, and human memory is not limited to "accurate and detailed information." (See September 20, 2011 post.) The brain has ways of categorizing information in less than a detailed way.

It is possible to read The Folly of Fools and conclude that humans suffer from a persistent state of delusion. If that is true, the skepticism that Enlightenment philosophers confronted (see February 27, 2011 post) may well still be warranted, and John Searle's view that by the 21st century the era of skepticism was long past (see January 21, 2011 post) may be unwarranted. I don't think so. In Mapping the Mind (see November 6, 2011 post), Rita Carter cites research indicating that truth-telling appears to be the default position of the human brain, and that deception involves extra cognitive effort requiring more energy. Whether humans are more inclined to tell the truth or to deceive themselves remains, in my view, an open question, but I am not inclined to the view that our default state is self-deception. In the prior post discussing Carter's observation, I noted that she had not discussed or accounted for the mental shortcuts we often engage in (heuristics) that may rely on certain biases in our perception or understanding of things observed. Those shortcuts may circumvent the extra cognitive effort that self-deception requires.

Trivers seems to think that our difficulty in addressing reality is that our neurophysiological system (and hence our conscious experience) is always a fraction of a second behind the actual sensory experience: "Regarding one's personal life, the problem with learning from living is that living is like riding a train while facing backward. That is, we see reality only after it has passed us by. Neurophysiologists have shown that this is literally true. We see (consciously) incoming information as well as our internal intention to act, well after the fact. It seems as if it is difficult to learn after the fact what to predict ahead of the fact; thus our ability to see the future, even that of our own behavior, is often very limited." It is true, as Trivers says, that the left side of our brain, devoted to explaining what we are experiencing, follows by milliseconds the actual sensory experience of what is happening to us. But we are talking milliseconds. The fact of the matter is that humans do have the ability to see the future coming (sometimes imperfectly, but sometimes with greater prescience than we realize). As we live our lives, we are, during our waking hours, facing forward. One of the most amazing capacities of the mind, perhaps shared with some other species but certainly present in humans, is that it plays what Antonio Damasio referred to as the "movie within a movie," and we are able mentally to visualize and anticipate what is about to happen. Mirror neurons may be triggered as we watch another person, enabling us to anticipate what is going to happen to someone else. So we are not living our lives facing backward. Perhaps it is when we are asleep, as Rita Carter noted (see November 6, 2011 post), and our mind is busy building memories, that we are looking backward.
