Donald Kagan, Pericles of Athens and the Birth of Democracy (1991)<div>This post was written in 2015, but not posted then.</div><div><br /></div>Over the past twelve months I have read a number of books about Greece and the Greek philosophers of the 5th and 4th centuries BCE, focusing particularly on Socrates and Plato. The order in which I read these books followed no particular plan; however, as I look back, and considering that my objective was to understand more about the relevance of Plato in the 21st century --- the subject of Rebecca Goldstein's book <i><a href="http://www.nytimes.com/2014/04/20/books/review/plato-at-the-googleplex-by-rebecca-newberger-goldstein.html?_r=0" target="_blank">Plato at the Googleplex</a></i> --- there was an order in which I could present them here that made more sense to me than the order in which I read them. Since my initial interest is the context in which these philosophers emerged (see <a href="http://csilcox-thebookshelf.blogspot.com/2014/12/john-boardman-et-al-oxford-history-of.html" target="_blank">previous post</a>), I turn first to <a href="http://history.yale.edu/people/donald-kagan" target="_blank">Donald Kagan</a>'s <i><a href="https://www.commentarymagazine.com/article/pericles-of-athens-and-the-birth-of-democracy-by-donald-kagan/" target="_blank">Pericles of Athens and the Birth of Democracy</a></i>. <br />
<br />
I begin this investigation with a discussion of Athenian democracy --- "Greek" democracy would be a misnomer, since it was not a form of government shared or favored across the entire land we now know as Greece --- because for Socrates and Plato democracy was a third or possibly fourth best form of political governance. Yet both were Athenians and thrived in Athens. Late in life, Socrates found himself at odds with his community, and he was ultimately convicted of crimes against it and executed, in substantial part because of his disdain for democratic governance as practiced in 5th century Athens. Plato must have found himself at odds with Athens too; he felt more comfortable pursuing self-exile from Athens for a period of time after the death of Socrates, probably because he was identified with Socrates. So to understand Socrates and Plato, I follow the <a href="http://csilcox-thebookshelf.blogspot.com/2014/12/john-boardman-et-al-oxford-history-of.html" target="_blank">previous post's</a> discussion of the <a href="http://www.uvm.edu/~jbailly/courses/clas21/notes/atheniandemocracy.html" target="_blank">early evolution of democracy in Attica</a> with a study of democracy in 5th century Athens and the Athenian leader most closely identified with Athenian democracy, Pericles.<br />
<br />
In contemporary America we can become weary of the two- and four-year cycles of electioneering for public office in our <i>representative</i> democracy. It seems that just as one election is over and the representatives are sworn into office, the campaign to re-elect or unseat them begins anew. We treat only the first 100 days following a President's inauguration as a "wait and see" period, in which political adversaries politely leave the new leader alone before challenging the new President's policies and speculating about a successor. Contemporary Americans may be surprised to learn that in 5th century BCE democratic Athens, the leadership --- composed of ten generals --- was elected to a one-year term, and re-elected or removed from office every year. While it is not clear to what extent there was any electioneering for these offices, tenure was short, and those who survived to serve longer were few. <br />
<br />
Athens enjoyed a direct legislative democracy, responsible for enacting laws and deciding policy, as well as judicial forums to resolve private and public disputes. There was no executive branch of government that would be familiar to contemporary Americans. While there were leaders of the legislative body, their powers were limited. Real power rested with the Athenian Assembly; with a Council of 500, chosen by lot from the citizenry; and, particularly since 5th century BCE Athens was a litigious society, with the judicial forums, which were really a type of mini-assembly as well. <br />
<br />
The Athenian Assembly met in the area of Athens known as the <a href="http://www.stoa.org/athens/sites/pnyx.html" target="_blank">Pnyx</a>, a hill below the Acropolis, on which the Parthenon was later built under Pericles' leadership. Participation in the Assembly was open to all who were eligible --- citizens. If you chose to participate, you just showed up. And thousands of men did, and this Assembly of thousands addressed every issue that a public body might be expected to discuss or legislate. It met forty times each year. Kagan describes how this process was managed as follows:<br />
<br />
"An assembly of thousands, of course, could not do its business without help. For that it relied on the Council of 500, chosen by lot from all the Athenian citizens [who were term limited at two years]. Although it performed many public functions that the larger body could not handle efficiently, its main responsibility was to prepare legislation for consideration by the people. In this respect, as in all others, the council was the servant of the assembly. The assembly could vote down a bill drafted by the council, change it on the floor, send it back with instructions for redrafting, or replace it with an entirely different bill. Full sovereignty and the real exercise of public authority rested directly with these great mass meetings. Almost no constitutional barrier prevented a majority of the citizens assembly on the Pnyx on a particular day from anything they liked.<br />
<br />
"In Athens, the executive was severely limited in extent, discretionary, and power, and the distinction between legislative and judicial authority was far less clear than in our own society. To begin with, there was no prime minister, no cabinet or any elected official responsible for the management of the state in general, for formulating or proposing a general policy. There was nothing that Americans would call an "Administration" or that the British would call a "government." The chief elected officials were the ten generals all serving one-year terms. As their title indicates, they were basically military officials who commanded the army and navy. They could be reelected without limit, and extraordinary men like Cimon and Pericles were elected almost every year. But they were most exceptional. The political power such men exercised was limited by their personal ability to persuade their fellow-citizens in the assembly to follow their advice. They had no special political or civil authority, and, except on military and naval campaigns, they could give no orders.<br />
<br />
"Even in military matters, the powers of the generals were severely limited. Leaders of expeditions were selected by vote of the full Athenian assembly, which also determined the size of the force and its goals. Before the generals took office they were subjected to a scrutiny of their qualifications by the Council of 500. After completing their year of service, their performance on the job, and especially their financial accounts, were subject to an audit in a process called <i><a href="http://www.ledonline.it/Dike/allegati/Dike10_Efstathiou_Euthina.pdf" target="_blank">euthyna</a></i>. ***<br />
<br />
"Even with these severe controls, the Athenians filled only a few public offices by election, choosing their military officials, naval architects, some of their treasurers, and the superintendents of the water supply in that matter. All other officials were chosen by lot, in accordance with the democratic principle that any citizen was capable of performing civic responsibilities well enough, and this corollary that feared the fall of executive and administrative power into the hands of a few men, even those with experience or special abilities."<br />
<br />
Among the Council of 500 there was a board of presidents who presided over meetings of the assembly, with a chairman for each day's meeting. The treasurers were likewise selected by lot, as were the officials who farmed out public contracts to operate public facilities (like mines), the collectors of taxes, the examiners who checked the accounts of officials, the inspectors of weights and measures, and the like.<br />
<br />
If one had to settle on the one aspect of Kagan's portrait of Athenian democracy that disturbed Socrates and Plato the most, it would be the principle that "any citizen was capable of performing civic responsibilities well enough," followed by direct participation in legislating and public discourse by any citizen. The idea that officials were selected by lot rather than by skill or intelligence or even birthright would be anathema to both Socrates and Plato. <br />
<br />
And the Athenian judicial system was not ideal in the minds of Socrates and Plato either. Socrates would ultimately face that judicial system in 399 BCE and lose his life. As Kagan describes the Athenian judicial system:<br />
<br />
"The distinction between the assembly and law courts . . . is almost a technicality. The idea behind both institutions is the same: full, direct, popular sovereignty. The panel of six thousand jurors who enlisted to serve in the courts each year, in fact, was called the Heliaea, a name given in other states to the assembly. From this panel on any given day jurors were assigned to specific courts and specific cases. The usual size of a jury was 501, although there were juries from 51 to as many as 1501, depending on whether the case was public or private and how important it was. To avoid any possibility of bribery or partiality, the Athenians evolved an astonishingly complicated system of assignments that effectively prevented tampering.<br />
<br />
"Legal procedure was remarkably different from what takes place in a modern American court The first surprise is the absence of any public prosecutor or state's attorney. There were, in fact, no lawyers at all. Complaints, whether civil or criminal, public or private, large or small, were registered and argued by private citizens. Plaintiff and defendant, suer and sued, each made his case in his own voice, if not in his own language. Anyone was free to hire a speechwriter to help him prepare his case, and the profession flourished, although it did not reach its peak until many years after the days of Pericles. Another surprise is the lack of any judge. The jury was everything. No self-respecting Athenian democrat would allow some individual, whatever his qualification, to tell him what was relevant evidence, what was not, or which laws and precedents applied. That would give too much weight to learning and expertise; it would also increase the danger of corruption and of undemocratic prejudice. It was therefore, up to the contestants in the case to cite the relevant laws and precedents and up to the jurrors to decide between them. Thus, in fundamental matters of justice and fairness, the Athenian democrat put little faith in experts." Penalties were proposed by the Plaintiff and, if found guilty or liable, the defendant would counter-propose an alternate penalty. The jury would select one or the other, but could not choose another.<br />
<br />
This particular form of self-governance by a city was unique in the history of human affairs up to that time, and that distinction is one of the reasons historians give significant attention to this time and place in human history. We see not only the emergence of the intellectual concept that the governed are governed by their consent, but also the emergence of a governmental structure trying to protect that form of government from itself. We might refer to this as a system of checks and balances --- not quite like the American constitutional checks and balances --- but a system aimed at preventing abuse of power and controlling factions.<br />
<br />
Direct democracy did not mean there were no spheres of influence within Athens. As the <a href="http://csilcox-thebookshelf.blogspot.com/2014/12/john-boardman-et-al-oxford-history-of.html" target="_blank">prior post</a> discussed, aristocratic families remained prominent in Athens, and male members of these families were active in the Assembly and the military. They possessed wealth that others did not. Over the course of the 5th century BCE, Athenian democracy was challenged, and a few times interrupted, by conflict with the Athenian aristocracy, who believed their fortunes were threatened by the judgments of the Assembly and democratic leaders, and who aligned themselves with militaristic, non-democratic Sparta.<br />
<br />
One of the reforms introduced by <a href="http://ancienthistory.about.com/od/riseofdemocracy/g/041811-Cleisthenes.htm" target="_blank">Cleisthenes</a> as a limit on abuse of power --- to deter factionalism and treason and to encourage cohesion and consensus --- was the process of ostracism. To the modern eye, ostracism would be perceived as an arbitrary and capricious deprivation of the rights of citizenship, because the process was designed to "vote people off the island" of democracy, at least for a temporary period not exceeding ten years. "Each year," reports Kagan, "the Athenian assembly voted on the question of whether there should be an ostracism. If the majority voted no, there was none. If they voted yes, it took place in a single day in March. On that day, each citizen could write the name of the man he wanted to remove from the city on a broken piece of pottery --- an <i><a href="https://en.wikipedia.org/wiki/Ostracon" target="_blank">ostracon</a></i>, the scrap paper of antiquity --- and bring it to the Agora. At the end of the day, the <a href="https://en.wikipedia.org/wiki/Archon" target="_blank">archons</a> counted the votes to see if there were six thousand, the number required for some types of important decisions in the Athenian assembly. If six thousand Athenians had voted, the one who had received the most ballots was compelled to leave Attica for a period of ten years. The idea was to allow a popular politician like Cleisthenes, who was confident of majority support, to deter a coup by a hostile faction. The threat to a rival leader, it was thought, would serve as a deterrent to keep him and his faction in line. The institution, a kind of rough-and-ready vote of confidence, seems harsh by modern standards. But it appears to have worked, helping protect Athenian democracy from subversion for almost a century." Recall from the <a href="http://csilcox-thebookshelf.blogspot.com/2014/12/john-boardman-et-al-oxford-history-of.html" target="_blank">prior post</a> the description of the formation of Greek cities from collections of family-controlled communities across Greece. Athens essentially represented a constitutional collection of several families as a polity, and these family or factional rivalries and prejudices remained in the background of the emergent democracy. Ostracism came to be deployed to address these factional rivalries. Members of the maternal side of Pericles' family --- the <a href="http://quatr.us/greeks/history/alcmaeonids.htm" target="_blank">Alcmaeonids</a>, of which Cleisthenes was a member --- were targets of ostracism and exiled under this process. When Pericles was ten, his father took his immediate family into exile, returning early to Athens to fight the invading Persians. Pericles' father was declared a hero of Athens for his effort in several naval battles that resulted in the Persian defeat, and he became a major political figure in Athens.<br />
<br />
This was the constitutional and political environment in which Pericles was raised and later participated as Athens' leading citizen for several decades. It was also the political and constitutional milieu in which Socrates and later Plato found themselves, and it prompted their philosophic discourse on government. As a leader, Pericles had little constitutional power, except the power enabled by a political process that relied on the ability to persuade, in an environment we would characterize today as "free speech." That power of persuasion was also anathema to Socrates and Plato. Persuasion was part of the toolbox of the <a href="http://plato.stanford.edu/entries/sophists/" target="_blank">sophists</a>, who relied upon <a href="https://en.wikipedia.org/wiki/Rhetoric_(Aristotle)" target="_blank">rhetoric</a>; Socrates and Plato were philosophers, who relied on logic and dialectic. Both of these tools can be labeled tools of intellectual reason; however, they did not necessarily reach the same intellectual outcome. Rhetoric in the hands of the <i>hoi polloi</i> was anathema to Socrates and Plato.<br />
<br />
461 BCE was a defining year in the political emergence of Pericles as a leader. <a href="https://en.wikipedia.org/wiki/Cimon" target="_blank">Cimon</a>, a general who led Athens following the ostracism of <a href="https://en.wikipedia.org/wiki/Themistocles" target="_blank">Themistocles</a> and strove to build acceptable relations with rival Sparta, was ultimately ostracized in response to policies that shifted some power in the Athenian democracy to aristocrats. Pericles, then a new, young general, was recruited by Ephialtes, another general, to rally political opposition to Cimon's pro-Spartan, aristocratic policies and to prosecute Cimon in the Assembly. Cimon prevailed in this skirmish, but Pericles made a name for himself. Cimon later fell when Ephialtes took advantage of Cimon's absence --- Cimon was off helping Sparta with thousands of his supporters --- and Cimon's legislative changes that had shifted power to aristocrats were undone. When Cimon returned to Athens, he no longer had the power to persuade the Assembly or any other unit of government. Pericles emerged as a leading general in Athens at the time Cimon was ostracized, the same year Ephialtes was the target of a political assassination (most likely by supporters of Cimon). Pericles' leadership as a general of Athens continued for 32 years, until his death from plague in 429 BCE. This was a period that included a major conflict with Sparta, known as the First Peloponnesian War, and the very beginning of a second major conflict with Sparta (the Second Peloponnesian War), the two divided by a period of peace. Socrates was nine years old in 461 BCE.<br />
<br />
Pericles pursued a policy of naval dominance because conflict after conflict, whether with Persia or with other parts of Greece, including Sparta, revealed that the navy was Athens' military strength. In contrast, Athens' land-based military power was relatively weak. Sparta dominated on land. Athens built walls to protect itself from Spartan aggression. As leaders before him had done, Pericles pursued and contemplated the benefits for Athens of an accord with Sparta. Following the First Peloponnesian War, from 461 to 451 BCE, there ensued a five-year truce between the two city-states. This was followed by the negotiation of a Thirty-Year Peace in 445 BCE, which was cut short by the beginning of the Second Peloponnesian War in 431 BCE. Socrates was 41 years old when Pericles succumbed to plague two years later, in 429 BCE. <br />
<br />
According to Rebecca Goldstein, Socrates and Plato distinguished themselves from the traditional Greek perspective by cementing the self's view of the self --- rather than society's or even history's perspective on the self (<i><a href="https://en.wikipedia.org/wiki/Kleos" target="_blank">kleos</a></i>) --- as the referential benchmark for determining whether one's behavior was virtuous, ethical, or just. (See <a href="http://csilcox-thebookshelf.blogspot.com/2014/06/rebecca-goldstein-plato-at-googleplex.html" target="_blank">June 29, 2014 post</a>). I should care less about what others think of how I behave; the primary question is whether I can live with my behavior. Self-respect is more important than public respect or memory. In the Greek tradition, Kagan notes, happiness lies in "moderate material comfort, good health, long life, virtuous offspring, and opportunity for <i>kleos</i> --- the last two representing man's hopes for immortality preserved in the memory of his family and his polis." <br />
<br />
In a democratic political environment, officials are accountable to the <i>hoi polloi</i> in many ways --- not just in the accounting sense, where one's financial transactions are subject to the review of officials, but, in the case of the generals, in a continuing need to persuade fellow citizens of the wisdom of their views, policies, and proposals for action --- and this accountability means that what others think of us counts. A democratic system of governance does not match well with Socrates and Plato's self-referential perspective on excellence, justice, morality, and virtue. From this perspective it should not be surprising that democracy would not fit well with the general philosophic outlook of either Socrates or Plato. We have seen in the study of social emotions --- shame, sympathy, and empathy --- that the "self" is not truly an insular self, but a complex set of feelings that depends in part on how others see us and how we see ourselves in the eyes of others: a social self. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/martin-nowak-supercooperators-2011.html" target="_blank">September 17, 2012 post</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/02/david-hume-treatise-of-human-nature.html" target="_blank">February 27, 2011 post</a>). With reference to Socrates and Plato, we are not dealing with the selfish self, driven by egotistical impulses, that has characterized discussions of Western economic philosophy (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/martin-nowak-supercooperators-2011.html" target="_blank">January 30, 2010 post</a>). But Plato and Socrates are promoting an egotistical social construct to defend and justify the philosopher's leadership of a new, non-democratic social order: if the philosopher-king can live with the way he or she administers justice, what else counts? <br />
<br />
Pericles made a speech in 431 BCE, two years before his death, known as <a href="http://www1.umn.edu/humanrts/education/thucydides.html" target="_blank">Pericles' Funeral Oration</a>, which is venerated as the Gettysburg Address of the classical Greek period. It is a brief speech about the glory that is Athens. Part of that glory is its mode of governance. "<span style="background-color: white; font-family: 'Times New Roman', Times, serif;">Our form of government does not enter into rivalry with the institutions of others. Our government does not copy our neighbors but is an example to them. It is true that we are called a democracy, for the administration is in the hands of the many and not of the few. <i>But while there exists equal justice to all and alike in their private disputes, the claim of excellence is also recognized; and when a citizen is in any way distinguished, he is preferred to the public service, not as a matter of privilege, but as the reward of merit.</i> Neither is poverty an obstacle, but a man may benefit his country whatever the obscurity of his condition. There is no exclusiveness in our public life, and in our private business we are not suspicious of one another, nor angry with our neighbor if he does what he likes; we do not put on sour looks at him which, though harmless, are not pleasant. While we are thus unconstrained in our private business, a spirit of reverence pervades our public acts; we are prevented from doing wrong by respect for the authorities and for the laws, having a particular regard to those which are ordained for the protection of the injured as well as those unwritten laws which bring upon the transgressor of them the reprobation of the general sentiment." In the italicized text Pericles connects <i>kleos</i> to the individual in a democratic society. Whereas the notion of <i>kleos</i> is something that aristocratic Greeks had historically aspired to, Pericles is saying that in Athens every citizen could aspire to <i>kleos</i>.</span><br />
<span style="background-color: white; font-family: 'Times New Roman', Times, serif;"><br /></span>
The difference in methods of discourse that distinguish philosophers from the sophists --- dialectic versus rhetoric --- is likewise consistent with the contrast that marks Socrates and Plato from historical Greek tradition.<div><br /></div>John Boardman et al., The Oxford History of Greece and the Hellenistic World (2001)<div><br /></div>I divert my course of reading in a manner not planned. Rebecca Goldstein's argument (see <a href="http://csilcox-thebookshelf.blogspot.com/2014/06/rebecca-goldstein-plato-at-googleplex.html" target="_blank">prior post</a>) that Plato remains relevant has prompted me to pull a volume off the bookshelf that has rested there a long time. Purchased a dozen years ago to be consumed on a trip to Greece over several weeks, the <i><a href="http://www.amazon.com/Oxford-Illustrated-History-Hellenistic-Histories/dp/0192854380" target="_blank">Oxford History of Greece</a></i> never saw the light of day on that trip. Instead, I consumed a fictional Greek mystery --- a re-read of John Fowles' <em><a href="http://www.amazon.com/Magus-John-Fowles-ebook/dp/B0081BTOJS/ref=sr_1_1?s=digital-text&ie=UTF8&qid=1404092314&sr=1-1&keywords=the+magus+john+fowles" target="_blank">The Magus</a></em>, one of the great novels of the 20th century.<br />
<br />
I am in search of context to gain a better appreciation of Socrates and Plato. There is nothing in the <a href="http://www.amazon.com/Oxford-History-Greece-Hellenistic-World/dp/0192801376" target="_blank"><i>Oxford History of Greece</i></a> to meditate on (at least yet), nor is there a story to tell (yet). Just some notes and quotes from the book about Socrates and Plato that provide some context. This context will not be found in just one book on the bookshelf, but what follows is a start.<br />
<br />
<i>Socrates</i><br />
<br />
"Socrates (470-399) was an ordinary Athenian citizen belonging to no philosophical school; he may have had an early interest in cosmology, but if so, he abandoned it. He wrote nothing, and our reports of him come from sources (Plato, Xenophon, Aristophanes) that give widely divergent pictures. If our interest is philosophical, however, we have no choice but to follow Plato; and although we have always to remember that the Platonic Socrates is Plato's creation, we can form some idea of what it was about the historical Socrates that led Plato to use him as the main spokesman of Platonic ideas. The most important facts about Socrates were that he lived, uncompromisingly, for philosophy; and that he was put to death on anti-intellectual grounds, the charges being that he introduced new divinities and corrupted the youth. It is plausible that behind this lay unspoken political motives, since Socrates had associated with many of the aristocrats who had overthrown the democracy, but the dislike was in part genuinely anti-philosophical. Socrates remained for Plato the prototype of the person unconditionally committed to philosophy; his conception of philosophy changed, but never his conviction of the importance of Socrates' example. <br />
<br />
"The later cliche about Socrates was that he turned philosophy from science to ethics; but there had already been plenty of ethical and political inquiry. What he did was to make philosophy personal again. He ignored Protagoras' theories about society as much Anaxagoras' theories about matter, and instead went around picking on individuals and addressing to each of them the disconcerting and unpopular question, 'Do you understand what you are talking about?' This naively direct refusal to take at face value claims to philosophical and other expertise marks a return to Heraclitus' kind of concern: scientific and sociological inquiries are rejected until we have the self-knowledge to understand the proper use to make of the results. Until we do, the most urgent task for each person is to turn inwards rather than outwards; and in keeping with this Socrates refused to write down any teachings or speechify in any way. Whereas Heraclitus did think he had access to the truth, Socrates represents himself as ignorant, superior only in argumentative technique and self-awareness; he is, he says, merely the gadfly that stings people out of their complacency. But he has much more intellectual conception of understanding and its requirements than Heraclitus. He argues people into realizing what an undefended mess their view are. ***"<br />
<br />
<i>Plato</i><br />
<i><br /></i>
"Plato (427-347) was an aristocratic Athenian who followed Socrates' example in devoting his life to philosophy, but did not follow him in his rejection oft he permanent written word in favor of personal encounter. However, although he did write, a great deal, he retained some Socratic suspicion of writing: <i><a href="http://classics.mit.edu/Plato/phaedrus.html" target="_blank">Phaedrus</a> </i>274b-277 is a famous passage where he warns us that written words are dead and cannot answer back, where true philosophy is always a live activity and interchange of thought. Plato's early writings are designed to avoid these dangers; he rejects the established media of prose (or verse) exposition for what must have seemed at the time an amazing choice --- the dialogue, which had hitherto been used only for fairly low-grade entertainment. Some of Socrates' other followers, such as Antisthenes and Aeschines of Sphettus, wrote Socratic dialogues, but only with Plato can we see the form put to philosophical use. He employs it to present philosophical arguments in a way that ensures that the listener is stimulated to participate and continue, rather than passively learning off doctrines. Plato never needs speaks in his own person, and this makes a certain detachment inevitable; we have to make what we can of the picture of Socrates arguing. No message is forced upon us, but we are made aware of a problem, and of the need for argument and thought to get further with it.<br />
<br />
***<br />
<br />
"***But a rough grouping of the dialogues forces itself on us: the middle and late dialogues are radically different from the early ones. They are much longer, mostly undramatic, especially in their use of Socrates, and above all are didactic. The stylistic changes reflect a shift away from the personal urgency of Socratic inquiry: from the middle dialogues on, we are in no doubt that Plato does have views of his own which the figure of Socrates serves merely to present. When he gives us a theory of society (in the <i>Republic</i>) or a cosmology (in the <i><a href="http://classics.mit.edu/Plato/timaeus.html" target="_blank">Timaeus</a></i>) or a long set of arguments about the <a href="http://www.britannica.com/EBchecked/topic/182278/Eleatic-One" target="_blank">Eleatic One</a> (in the <i>Parmenides</i>) the dialogue form is serving merely to make the argument ore accessible. *** The dialogue form, and the use of Socrates, become strained to the breaking point as Plato becomes ever more engaged in straightforward philosophical debate, often with contemporary positions. All the same, Plato never wholly abandoned dialogue, and clearly continued to value its detachment, and the avoidance it necessitates of more than a mild degree of technicality and systematization of different positions.<br />
<br />
***<br />
"One of the most disputed questions in recent Platonic scholarship has been whether Plato himself came later to criticize his earlier indiscriminate acceptance of Forms. In the first part of <i><a href="http://classics.mit.edu/Plato/parmenides.html" target="_blank">Parmenides</a></i> young Socrates puts forward what looks like the middle dialogues' conception of Forms, only to have it torn to shreds by the unhistoric, but symbolic figures of Parmenides and Zeno. And in other later dialogues there are many arguments which do in fact undermine some of Plato's earlier uses of Forms. This certainly looks like self-criticism; but Plato draws no explicit morals. The ideas which for a time he held together in passionate conviction are quietly allowed to fall apart again, and in the late dialogues he pursues different interests for their own sake without over-ambitious synthesis.***<br />
<br />
***<br />
<br />
"*** Most strikingly, perhaps, the nature of his interest in ethics and politics changes considerably. In the early dialogues he is concerned with the personal achievement of virtue, and this is still the theme of his most famous middle dialogue, the <i>Republic</i>. In that dialogue his interest has spread sufficiently to society for the account of the just person to be placed against a background of a just society; but it is made clear that this is a society which is <i>ideally </i>just, an ideal which has no practicable political application. However, in the late dialogues we find Plato returning at length and several times to ethical an political questions from a changed perspective, one that has much in common with the formerly despised approach of Protagoras and the other sophists. In the <i>Statesman</i>, the <i>Critias</i>, and the <i>Laws</i> he returns to fifth century questions about the origins of society, takes history and prehistory seriously, and investigates from several angles the issue of what social arrangements actually work and produce a stably functioning real society. The study of ethics and politics is no longer seen from the viewpoint of the individual concerned to become just, but is carried out from the external viewpoint of the investigator, impersonally and historically."<br />
<br />
These select paragraphs suggest, in the same vein that Rebecca Goldstein intimates (see <a href="http://csilcox-thebookshelf.blogspot.com/2014/06/rebecca-goldstein-plato-at-googleplex.html" target="_blank">June 29, 2014 post</a>), that it is difficult to know exactly what Plato's views were. It seems unlikely, however, that Plato's prodigious writings were merely wandering meditations, without some commitment to the views of some of the speakers in his dialogues.<br />
<br />
<i>Context: Pre-Socratic cosmogony, War, and Democracy</i><br />
<i><br /></i>
Benchmark dates for contextual reference: Socrates was born in 470 BCE. Plato was born in 428 BCE. Socrates was 42 years old when Plato was born. Socrates died in 399, executed by drinking a cup of hemlock. Plato was 29 years old when Socrates died. Plato died in 348 BCE at the age of 80, fifty-one years after Socrates died. <br />
<br />
Socrates and Plato were certainly not the first Greeks to "reason" about the nature of the universe, mathematics and its relationship to the physical world, ethics, society, and our knowledge of reality. More than 100 years before Socrates was born, <a href="http://en.wikipedia.org/wiki/Thales" target="_blank">Thales of Miletus</a> was active in contemplating the origins of the physical world, and he was followed a few decades later by two other 6th century Milesians, <a href="http://en.wikipedia.org/wiki/Anaximander" target="_blank">Anaximander</a> and <a href="http://en.wikipedia.org/wiki/Anaximenes_of_Miletus" target="_blank">Anaximenes</a>. Thales is the subject of prior posts (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/11/leonard-mlodinow-euclids-window-2001.html" target="_blank">November 20, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2010/03/david-c-lindberg-beginnings-of-western.html" target="_blank">March 28, 2010</a> posts), and as noted in the latter post, "<span style="background-color: white;"><span style="font-family: Times, Times New Roman, serif;">Milesians of antiquity posed a question that had never been previously asked in either Greek or Middle Eastern culture: "what is the material origin of things---the single and simple underlying reality that can take on a variety of forms to produce the diversity and order behind chaos?" And importantly, Lindberg adds, "in the answers offered by these Milesians we find no personification or deification of nature; a conceptual chasm [that] separates their worldview from the mythological world of Homer and Hesiod. The Milesians left the gods out of the story. What they may have thought about the Olympian gods we do not (in most cases) know; but they did not invoke the gods to explain the origin, nature or cause of things." </span></span><span style="font-family: Times, Times New Roman, serif;">This <i>Oxford History</i> volume adds, "</span>Thales taught that everything is derived from water and that the earth rests on water." Thales left no writings behind, but "perhaps he was attracted to these tenets, as Aristotle conjectures, from seeing that the nutriment of all things contains moisture, and that heat itself comes from this and is sustained by it." I cite this because we can see in Aristotle's "conjecture" an early example of philosophers speaking across the centuries (see <a href="http://csilcox-thebookshelf.blogspot.com/2014/06/rebecca-goldstein-plato-at-googleplex.html" target="_blank">June 29, 2014</a> post) to each other. And it is also an example of how philosophizing acted as a bridge between our senses and intuition and our understanding of the physical world (albeit an incorrect understanding, as we now know). The author of this section of the <i>Oxford History of Greece</i> notes that it is hard to divorce Thales' view of the world from Egyptian and Semitic creation stories, and with respect to Anaximander he observes that much of his cosmogony appears to have been inspired by Iranian cosmology. "The Milesians were unable to free themselves from the preconception of the myth-makers of the pre-philosophical age that something so complex as the present world must have originated from something simple; that the earth is finite in extent and more or less circular, with something different underneath it; that the sky is a physical entity at a definite distance from the earth; that there are immortal sources of energy which are the moving or directing forces in the universe. Their new, philosophical assumptions were that these forces operate in a perfectly consistent way that can be observed in everyday phenomena; that everything can thus be explained from the working of a few universal processes in a single original continuum; and that there is no such thing as creation from nothing or decay from nothing, only change of substance. They tried to account systematically for all the most notable features of the world about us: the movements of the heavenly bodies, phases of the moon, eclipses, lightning, thunder, rain, snow, hail, rainbows, earthquakes, the annual inundation of the Nile."<br />
<br />
Just a decade before Thales was active in Miletus, in another part of the Greek world, the first Greek "lawgiver," <a href="http://en.wikipedia.org/wiki/Solon" target="_blank">Solon</a>, a leader in Athens, introduced his laws and instituted political reforms in 594 BCE, some 125 years before the birth of Socrates.<br />
<br />
"It is vital to insist that this opening of the Greek mind is much more important than the particular forms of government which were produced by the opening. Here there was 'tyranny'; there 'oligarchy'; here 'a constitution'; there 'anarchy'. Common to all of the more flexible societies is turmoil, and common to all is the achievement in the end of some sort of what we are prepared to describe as the constitutional government of the city-state.<br />
<br />
"But the routes were indeed diverse. In Sparta in the early seventh century a great lawgiver, <a href="http://en.wikipedia.org/wiki/Lycurgus_of_Sparta" target="_blank">Lycurgus</a>, it was said, laid down the rules for a system of military training (one could call it education) which turned Sparta into the most efficient military power in Greece, helping it to hold ruthless mastery over the southern half of the Peloponnese, and by stages to acquire a more subtle control over the rest of the peninsula. At the same time he formalized and thus reformed Sparta's social structure and produced a constitution which guaranteed to all Spartans some form of political equality the like of which had not been imagined by Hesiod and was not to be realized elsewhere for many a day.<br />
<br />
***<br />
"It is against this background that we must see the development and, after Lycurgus, the freezing of Spartan institutions. If her position was rare, her solutions made her unique. Most Greeks retained some traces of a state-imposed military training for the young; in Crete for example, many close similarities to Spartan customs can be seen. But only in Sparta, so far as we know, was a child completely robbed of his home and family between the ages of five and thirty and even thereafter compelled to to devote his days to military training and his evenings to the company of his messmates. Most Greeks entered the archaic age with aristocratic attitudes, and in most some faint elements of these attitudes long survived. *** In its constitution Sparta stood apart, but here in a different way. The kings were the military commanders; with the aristocratic council, the Gerousia, they initiated most political and took most judicial decisions. But there was also an assembly of all Spartan citizens which met at fixed times and passed final judgments on most things that mattered--- all Spartan citizens, that is, as defined by the great Lycurgus, all who had survived their training, who had been allotted state land in the conquered territory with helots to work it, and who continued to obey the rules."<br />
<br />
***<br />
"Some states tried a third route to the new world, constitutional, like that of the Spartans, but less idiosyncratic, very much more humane. The setting up of a colony invited, if it did not demand, some conscious thought about the character of the new settlement, some element of self-consciousness even where the desire may only have been to reproduce what had been left at home (a desire that cannot have been profound, since most colonists left home because they did not like what they had experienced there). Thus a new need was added to the instinct for change, or at least dissatisfaction with the existing order, the need to formulate; and (yet again) eastern experience will have shown that formulation was possible.***<br />
<br />
"But all this is shadow. It is only in mainland <a href="http://en.wikipedia.org/wiki/Attica_(region)" target="_blank">Attica</a> that the translation of the desire and the idea into fact can be followed. Attica had survived post-Mycenaean turmoil better than most, but here too there had been economic collapse and only gradual redevelopment. When things settled down the city of Athens was at the head of whatever association Attica may once have been, not, like Sparta, a city of 'equals' surrounded by <i><a href="http://en.wikipedia.org/wiki/Perioeci" target="_blank">perioikoi</a></i> or helots, but the center of an Attica riddled throughout with inequalities. There were aristocrats, free men, and dependents in and around the city as there were in Eleusis, Marathon, or Sunium. It is not the least of the Athenian achievements that she contrived to diminish or delete the distinctions across the country while building up the city as acknowledged capital, preserving at once local pride, national identity, and individual dignity.<br />
<br />
***<br />
"In Athens the first changes came after some twenty-five years. There arrived a moment of crisis, or near-revolution, when it was decided to appoint an arbitrator to produce a second, very different definition. . . Out of the background of discontent (with tyranny) came the choice of a revolutionary leader, Solon, who , fortunately for us, was not only a politician, but a poet, albeit a somewhat self-centered, self-righteous, and just a trifle pompous one.<br />
<br />
"Solon, elected chief magistrate for 594, had one weakness. He did not like killing people." ***<br />
<br />
***<br />
<br />
"***existing debts were cancelled and personal security was forbidden. Share-cropping ceased to exist and no Athenian could henceforth suffer the indignity of enslavement for debt. *** Politically too some element of equality was sought. The assembly won new w\authority, perhaps in some was of which we know nothing. *** Solon's assertion that the assembly was to be the ultimate court of law. An Athenian could appeal to the assembly or a committee of it against a magistrate's verdict in his court.*** All Athenians deserved freedom from the threat of slavery, a guarantee against legal oppression, some voice in the direction of the city. But some Athenians, chief among them Solon's supporters, deserved more in the way of real political power. Solon, no less than Cypselus, had had some big men behind him, and they wanted a reward. The solution was simple, but very radical. Access to major political and military office, the archonship, previously restricted by convention to a limited group of families, the Eupartridae, was to be determined by wealth in land. All Athenians were divided into four classes. To the top class or classes went the top offices, to the lowest, the <i>thetes</i> only membership in the assembly, with consequent judicial influence. So far as can be judged the potential member of 'those with power' was doubled --- no mean change.<br />
<br />
***<br />
<br />
"Solon had opened government to new men, but had done nothing positive to diminish the aristocrat at local level beyond robbing him of legalized mastery over the poorer around him. Now he had either died in the last battle against <a href="http://en.wikipedia.org/wiki/Peisistratos" target="_blank">Pisistratus</a> or thought it prudent to go into exile or, even if he stayed, knew that he had to acknowledge the existence of someone more powerful than himself. Thus the rest either lost their master or realized that he did not matter so much as before. To change allegiance from one master to another may not seem to us a momentous step, but it is a first step towards a sense of being one's own master.<br />
<br />
***<br />
Just before the end of the 6th century BCE, <a href="http://en.wikipedia.org/wiki/Cleisthenes" target="_blank">Cleisthenes</a> introduced further reforms in 508 BCE, just 38 years prior to the birth of Socrates. "The essence of the new system was the recognition that small local units, country villages or townlets, wards of the city, should control their own affairs independent of the local aristocrat. Each chose its mayor and council, and minded its own business."<br />
<br />
***<br />
"The <i>polis</i> was essentially a male association: citizens who were men joined together in making and carrying out decisions affecting the community. The origin of this activity doubtless lay in the military sphere and the right of warriors to approve or reject the decisions of their leaders; the development of the <i>polis </i>is the extension of this practice to all aspects of social life, with the partial exception of religion. Politics, direct participation in the making of rational choices after discussion, was therefore central to all Greek cities. In Athens and Sparta all male citizens participated at least in principle equally; elsewhere particular rights could be confined to certain groups, richer or better born, thereby necessarily creating conflicts and a hierarchy of rights within the citizen body. Nevertheless, the forms of political life, mass citizen assembly, smaller council, and annual executive magistrates were general, though the powers and attributes of the different elements varied widely."<br />
<br />
***<br />
"Even more important to the ordinary Athenian than these central and local government organizations was the <i><a href="http://en.wikipedia.org/wiki/Phratry" target="_blank">phratry</a></i>, the group of <i>phrateres</i>. This is the sole context in Greek of the important linguistic root common to most Indo-European languages, found for instant in the Celtic <i>brathir</i>, German <i>Bruder</i>, English <i>brother</i>, Latin <i>frater</i>, or French <i>frere</i>; in Greek it designates the non-familial type of 'brotherhood' (there was a quite different for blood relationship of brother). These brotherhoods were originally perhaps aristocratic warrior bands, but once again the democratic state had reorganized them to make them open to all: every male Athenian belonged to a phratry, and it was his phratry which dominated his social life." ***<br />
<br />
"This type of association was common in the Greek world, and had developed for different ends in different cities. Sparta is the most striking example: the male citizen body was divided into <i><a href="http://en.wikipedia.org/wiki/Syssitia" target="_blank">syssitia</a></i> or mess groups on which the entire social and military organization of the state depended. Here the normal practices of the Greek world had been transformed to create a military elite. From the age of seven, boys were given a state-organized upbringing, and brigaded into age groups. They lived communally from the age of twelve, taught all sorts of skills useful to self-reliance and survival, and provided with inadequate clothing and food to toughen them. At twenty they joined the <i>syssitia </i>where they must live until the age of thirty, and even thereafter they were required to eat daily those common meals to which they had to contribute from the land allotted to them and formed by state-owned slaves, who were in fact enslaved descendants of the neighboring communities, constantly rebelling and requiring suppression. The theoretical elegance of this solution (soldiers make slaves, slaves make soldiers, slaves need soldiers to suppress them), and the way it built on traditional Greek social customs, much impressed ancient political thinkers, and offered a counter-ideal to the Athenian democracy. The two examples show how differently similar institutions could develop in different states, and produce societies with utterly opposed characteristics."<br />
<br />
By the time Socrates was nine years old, the first <a href="http://www.ancient.eu/timeline/Peloponnesian_War/" target="_blank">Peloponnesian War</a> between Athens and Sparta had begun; it would last for ten years, until 451 BCE. <a href="http://en.wikipedia.org/wiki/Pericles" target="_blank">Pericles</a> was the most influential of the ten generals of Athens at this point. Twenty years of peace would ensue before the <a href="http://atheniandemocracy.weebly.com/the-peloponnesian-war.html" target="_blank">second Peloponnesian War</a> between Sparta and Athens began in 431 BCE. A year later, a plague would consume Athens, and Pericles was among the plague's victims in 429. Plato was born a year later, in the early years of the second war. Twenty years into the second Peloponnesian War, Athenian democracy collapsed and the political system was replaced by a governing oligarchy for one year before democracy was restored in 410. Athens capitulated to Sparta in 404 BCE and the <a href="http://en.wikipedia.org/wiki/Thirty_Tyrants" target="_blank">Thirty Tyrants</a> were installed as leaders of Athens. The Thirty Tyrants maintained power for only one year before democracy was again restored at Athens in 403. Plato was 25 years old at this time. Four years later, in 399 BCE, Socrates would be accused by the democratic leaders of corrupting the youth and executed. Just a few years later, Plato would become active as teacher and philosopher. <br />
<br />
Plato was not in attendance at the death of Socrates, but he would write about it and become the principal source of information about it. Throughout the first half of the 4th century BCE, wars repeatedly arose between Athens, Sparta, Persia, and <a href="http://en.wikipedia.org/wiki/Thebes,_Greece" target="_blank">Thebes</a> (another Athenian rival). And then Macedonia entered the picture from the north, conquering Athens by the mid-4th century BCE. It is at this time that Plato died, in 348 BCE, not long after the birth of Alexander the Great (356 BCE).<br />
<br />
This is roughly the "Greek" historical context from which Plato and Socrates communicate across the ages to us. This context says very little about the impact of external influences, particularly Persian and Phoenician influences, which possibly, if not likely, contributed something to the Greek view of the world in the centuries preceding the 5th century BCE. In terms of context, one cannot ignore the significance of the Athenian experiment with expanding democracy from the time of Solon to Pericles. One can't help but surmise that the freedom to philosophize (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/12/steven-nadler-book-forged-in-hell-2011.html" target="_blank">December 12, 2012 post</a> for a discussion of Spinoza's views on democracy and the freedom to philosophize) and this heretofore unfamiliar form of government, where voices and votes were relatively equal, are related. In contrast, the oligarchic Spartan form of governance and collectivized social system, just to the southwest of Athens during the same period of time, where the government managed and dictated most aspects of social and even personal life, contributed almost nothing to philosophy and culture. It is surprising, then, that Plato should find in Sparta a model, of sorts, for the ideal government and social system led by philosophers that he proffered in <i><a href="http://en.wikipedia.org/wiki/The_Republic_%28Plato%29" target="_blank">The Republic</a></i>. <br />
<br />Rebecca Goldstein, Plato at the Googleplex: Why Philosophy Won't Go Away (2014)<div><br /></div>I can recall a classroom discussion decades ago about the function of literary criticism: was it a formal evaluation of the literary text on its own merits, treating the work as a self-contained aesthetic object? Or was context required: for example, the reader's attention to the author's biography, the culture and historical reference point(s) from which the work emerged, and perhaps the work's social function? I always leaned to the latter, simply because most (if not all) authors are writing to be heard about something in context. Few, if any, authors are writing solely for the sake of form.<br />
<br />
<a href="http://www.rebeccagoldstein.com/" target="_blank">Rebecca Newberger Goldstein</a>, who has written non-fiction works about <a href="http://www.amazon.com/Betraying-Spinoza-Renegade-Modernity-Encounters-ebook/dp/B002JKVXG4/ref=sr_1_3?s=books&ie=UTF8&qid=1401632001&sr=1-3&keywords=rebecca+newberger+goldstein" target="_blank">Spinoza</a>, <a href="http://www.amazon.com/Incompleteness-Proof-Paradox-G%C3%B6del-Discoveries-ebook/dp/B00E9P9FNA/ref=sr_1_5?s=books&ie=UTF8&qid=1401632001&sr=1-5&keywords=rebecca+newberger+goldstein" target="_blank">Gödel</a>, and now a blended work of fiction and non-fiction about Plato confronts this split in <em><a href="http://www.amazon.com/Plato-Googleplex-Philosophy-Wont-Away-ebook/dp/B00F1W0D90/ref=sr_1_1?s=books&ie=UTF8&qid=1401630907&sr=1-1&keywords=rebecca+newberger+goldstein" target="_blank">Plato at the Googleplex</a></em>. She can't avoid a contextual approach: this work is substantially about the history and role of philosophy, and she is trying to communicate a message about philosophers talking across the ages to each other, even when it is impossible to get them in the same room together. Philosophy is that way, after all. But is literature really any different? Both have their origins in human imagination, and for that matter so does scientific inquiry, a point I will return to in a moment. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/07/manjit-kumar-quantum-2008.html" target="_blank">July 30, 2011</a> post).<br />
<br />
Not a biography, <em>Plato at the Googleplex </em>is biographical. It is a very clever book, organized to convince you that Plato, born 2500 years ago, is still very much alive today. The story of Plato or Socratic ideas is really just a tool in a much larger construct that Goldstein is presenting and arguing. Goldstein is disturbed --- deeply, I might say --- that philosophy is being characterized as something like a historical artifact in the mental toolbox of evolving human ideas, now allegedly dominated by scientific inquiry. Previous posts in this blog have noted this: for example, John Searle's statement that philosophers need to forget about Cartesian dualism "and just remind ourselves that mental phenomena are ordinary biological phenomena in the same sense as digestion and photosynthesis." (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/01/john-searle-philosophy-in-new-century.html" target="_blank">January 21, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/michael-gazzaniga-human-2008.html" target="_blank">September 27, 2009</a> posts). This story of Plato and Socratic ideas is the device to remind us that many of the things that we argue about today at work, on television talk shows, in criminal and civil justice institutions, in educational institutions, and in public fora are the same things we have been arguing about in western civilization since the 5th Century BCE.<br />
<br />
It is not enough to understand Plato's dialogues or Socratic ideas based on the text and words alone. It is imperative that we understand the context in which the words were uttered or written. There was the heroic age of Greece glorified by Homer, then came the monument builders, wars and more wars, and in the middle of those wars an intellectual era that marks the beginning of modern philosophy. Philosophers and political leaders and teachers preceded Socrates and Plato, and Socrates and Plato were arguing with those predecessors.<br />
<br />
Modern science is no different. The same is true of Einstein, Bohr, Heisenberg and others engaged in the quantum discussion of the 20th century. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/07/manjit-kumar-quantum-2008.html" target="_blank">July 30, 2011</a> post). There is little difference between philosophical theorizing and scientific theorizing. They are starting points in the discovery of what is real. But ultimately, the test of what is real is dominated by that next part of scientific inquiry that asks, "How can we test this proposition?" When the testing begins, the philosophical inquiry does not come to an end, but it wanes. We can see this in the scientific inquiry into "moral behavior" and "cooperative behavior." (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/martin-nowak-supercooperators-2011.html" target="_blank">September 17, 2012</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012</a> posts). At some point, real facts about human beings start to explain what philosophical inquiry and speculation began poking around at thousands of years ago, and we keep digging into those facts until there is a more solid foundation and a story to tell.<br />
<br />
Goldstein gets this, but she is not sympathetic to the view that some day science will have everything figured out. She takes offense at Lawrence Krauss' remark that the "tension between philosophy and science occurs because people in philosophy feel threatened, and they have every right to feel threatened, because science progresses and philosophy doesn't." This book is a statement that not every philosopher feels threatened and philosophy does help solve problems. Certainly John Searle does not feel threatened (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/01/john-searle-philosophy-in-new-century.html" target="_blank">January 21, 2011 post</a>) and <em>Plato at the Googleplex</em> reveals that Goldstein does not feel threatened.<br />
<br />
The contextual key to understanding Socrates and Plato, Goldstein argues, is the Greek sense of virtue, embodied in the word <a href="http://en.wikipedia.org/wiki/Arete" target="_blank"><em>aretē</em></a>. Prior to Socrates, <em>aretē</em> was an Athenian social construct that was dependent upon reputation (<em><a href="http://en.wikipedia.org/wiki/Kleos" target="_blank">kleos</a></em>). A reputation for excellence, a reputation for being extraordinary, is what gave life "an added substance." "Live so that others will hear of you" is the choice that Achilles makes in <em>The Iliad</em>. Greek myth and song advanced this kleos-centric view. What brought Socrates into conflict with his Athenian community, Goldstein opines, is Socrates' assertion that <em>aretē</em> was entirely independent of social regard and reputation. For Socrates, the moral anchor is not what others may think of him, but whether he can live with his own actions. Others cannot control my life; I must control my life. The quest for knowledge and truth and reality is personal, and it is not a social construct that is imposed upon me. And that non-kleos-centric view brought Socrates to his trial and conviction. Without this "context," we cannot understand why Socrates, the Socratic dialogues, and Plato are worth reading. This same "context" also helps us understand a Socrates of a later era, Baruch Spinoza (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/03/baruch-spinoza-emendation-of-intellect.html" target="_blank">March 12, 2012</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2012/12/steven-nadler-book-forged-in-hell-2011.html" target="_blank">December 17, 2012</a> posts). Goldstein wants us to understand this connection; philosophers, after all, speak to each other across the centuries.<br />
<br />
But if Socrates is speaking to future philosophers, John Searle must be speaking back to this view of Socrates. Searle will not agree that reality stands apart from social construction; for Searle our reality is entirely dependent upon a social construct and, whether we like it or not, that social construct is imposed upon us. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/02/john-searle-making-social-world-2010.html" target="_blank">February 24, 2013 post</a>). We are social animals.<br />
<br />
I have not read Plato or about Plato in years. I would be interested in discovering how Plato addresses the social and moral emotions. We know that Plato and Socrates embrace the power of reason, not unlike Spinoza and other philosophers of the Enlightenment. But some Enlightenment philosophers understood the power of social emotions. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/02/david-hume-treatise-of-human-nature.html" target="_blank">February 27, 2011 post</a>). The social emotions and language (and yes, intelligence (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/11/jeff-hawkins-on-intelligence-2004.html" target="_blank">November 16, 2013 post</a>)) are, in my view, key to describing what makes us human, and these are the raw materials for the social construct that builds a reality in our minds. It is not merely our capacity to reason that defines our species.<br />
<br />
N<a href="http://www.iep.utm.edu/neoplato/" target="_blank">eoplatonism</a> is a different subject, particularly as revealed in histories of the early Christian church. Goldstein only now makes me realize that the collection of memes that are represented by neoplatonism may not actually represent the philosophy or views of Plato, if we can ever really know the philosophy or views of Plato. A surprise to me, although surely not to Plato scholars, is the statement attributed to Plato in the <a href="http://classics.mit.edu/Plato/seventh_letter.html" target="_blank"><em>Seventh Letter</em></a><em> </em>suggesting that he never put his philosophical views to writing: "This much at least, <a href="https://www.blogger.com/null" name="659"></a>I can say about all writers, past or future, who say they know the things <a href="https://www.blogger.com/null" name="660"></a>to which I devote myself, whether by hearing the teaching of me or of others, <a href="https://www.blogger.com/null" name="661"></a>or by their own discoveries-that according to my view it is not possible <a href="https://www.blogger.com/null" name="662"></a>for them to have any real skill in the matter. There neither is nor ever <a href="https://www.blogger.com/null" name="663"></a>will be a treatise of mine on the subject." If this was true --- and remember Plato's dialogues on their face express the views of others who participate in his dialogues --- then how can we ever know Plato's views? Plato is open to interpretation and if we can discern his views and philosophy at all, it is only in context. "When you ask why did some particular question occur to a scientist or philosopher for the first time," writes Goldstein, "or why did this particular approach seem natural, then your questions concern the context of discovery. When you ask whether the argument the philosopher puts forth to answer that question is sound, or whether the evidence justifies the scientific theory proposed then you've entered the context of justification. Considerations of history, sociology, anthropology, and psychology are relevant to the context of discovery, but not to justification. . . . one doesn't diminish a philosopher's achievement, and doesn't undermine its soundness, by showing how the particular set of questions on which he focused, the orientation he brought to bear in his focus, has some causal connections to the circumstances of his life." <br />
<br />
Plato, she believes, was more of a materialist than the dualist that neoplatonists would suggest. Her elucidation of Plato's Myth of the Cave (<em>The Republic</em>) suggests Plato was not, as I have long believed from the neoplatonist understanding of Plato, entirely an adherent to the eternity of abstract Forms as a matter of transcendent reality. <em>Phaedo </em>suggests that Plato supports a dualist's perspective, but <em>Timaeus</em>, she argues,<em> </em>suggests otherwise. For Plato, asserts Goldstein, there was only one "form" and that was mathematics. "The Pythagorean intuition that the form for rendering reality intelligible is supplied by mathematical ratios influenced [Plato] profoundly, ultimately yielding him his conception of the Sublime Braid [truth, beauty, and goodness are all bound up with one another, sublimely], and the means to make good on Socrates' search for the kind of knowledge that is also virtue." Goldstein writes that in Plato's <em><a href="http://plato.stanford.edu/entries/plato-timaeus/" target="_blank">Timaeus</a></em>, "the mathematics inscribed in the heavens' motions, generate the structure of reality." Mathematics is not a component of reality, however; it is a mental tool belonging to our representational capacity that makes information accessible to all humans and enables, as Goldstein writes, "our human reason to penetrate the cosmic reason."<br />
<br />
There is a "form" that is much a part of our brain's mental faculty, which this blog has described in a <a href="http://csilcox-thebookshelf.blogspot.com/2013/11/jeff-hawkins-on-intelligence-2004.html" target="_blank">prior post</a>. There, quoting sociologist Paul Bloom, it is noted, "Our minds have evolved to put things into categories and to ignore or downplay what makes these things distinct. Some categories are more obvious than others: all children understand the categories chairs and tigers; only scientists are comfortable with the categories such as ungulates and quarks. What all categories share is that they capture a potential infinity of individuals under a single perspective. They lump." Bloom says, "We lump the world into categories so that we can learn." He adds, "A perfect memory, one that treats each experience as a distinct thing-in-itself, is useless. The whole point of storing the past is to make sense of the present and to plan for the future. Without categories [or concepts], everything is perfectly different from everything else, and nothing can be generalized or learned." So if we think of "categories" created by our mind to assist our memory as Forms, these Forms or categories are very real to us. But these are not Plato's Forms as we understand them from neoplatonists (and perhaps even Plato himself).<br />
<br />
I cannot intelligibly vouch for or comment on Goldstein's perspective on Plato. First, she informs us that Plato's views on anything may never be known, but then she is bold enough to suggest that we can marshal a perspective on his views that gives more weight to one of his dialogues over another. Plato's breadth of philosophical conversation is astonishing for its day, and that fact alone explains why the philosophic tradition celebrates Plato. He was among the earliest rationalists, reasoning his way to a framework that helps us understand reality. Another philosophic tradition explores whether our knowledge of what is real is subjective and personal, or objective and universal. And there are other philosophic categories --- ethics and aesthetics, for example --- explored by Plato that are also divided by the subjective/objective distinction. Science may or may not resolve this division, and that uncertainty opens the door for Goldstein to question Krauss' judgment and argue the continuing vitality of philosophy as the bridge between our senses, intuition, and scientific knowledge.
Svante Paabo, Neanderthal Man: In Search of Lost Genomes (2014)<br />
<br />
If the prospect of future climate change poses difficult problems for estimating the impact on the extinction of species (see <a href="http://csilcox-thebookshelf.blogspot.com/2014/04/william-nordhaus-climate-casino-2013.html" target="_blank">previous post</a>), retroactively looking at the causes of the extinction of a species long dead before humans recorded history is not easy either. We can probably make some educated guesses in a few cases based on examination of the geological record and what we can find in the chemistry and perhaps biology of dated samples of earthen material and fossils. But we remain hard-pressed right now to figure out why the species <a href="http://humanorigins.si.edu/evidence/human-fossils/species/homo-neanderthalensis" target="_blank"><em>homo neanderthalensis</em></a> became extinct. They overlapped in time and habitat with our own species <a href="http://en.wikipedia.org/wiki/Anatomically_modern_humans" target="_blank"><em>homo sapiens sapiens</em></a>, and now we know, thanks to the research of <a href="http://www.ted.com/talks/svante_paeaebo_dna_clues_to_our_inner_neanderthal" target="_blank">Svante Paabo</a> and his colleagues at the <a href="http://www.eva.mpg.de/genetics/index.html" target="_blank">Max Planck Institute</a>, that <a href="http://www.upi.com/Science_News/2014/04/09/Study-suggests-modern-humans-and-Neanderthals-interbred/8131397058307/" target="_blank">the two species interbred</a>. A small piece of the DNA of <em>homo neanderthalensis</em> lives on in anatomically modern humans. Anatomically modern humans are believed to have emerged approximately 100,000 years ago, probably in southern Africa. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/09/luigi-luca-cavalli-sforza-genes-peoples.html" target="_blank">September 25, 2013 post</a>). This species is believed to have migrated out of Africa approximately 50,000-60,000 years ago (<em>id</em>): first to the Middle East and then to South Asia. But this was not the first species of the <a href="http://en.wikipedia.org/wiki/Homo" target="_blank">genus <em>homo</em></a> to migrate out of Africa.
Modern humans were preceded by <em><a href="http://en.wikipedia.org/wiki/Homo_erectus" target="_blank">homo erectus</a></em>, <a href="http://en.wikipedia.org/wiki/Homo_heidelbergensis" target="_blank"><em>homo heidelbergensis</em></a>, and perhaps <em>homo neanderthalensis</em>.<br />
<em><br /></em>
"According to the fossil record," says Paabo, "Neanderthals appeared between 300,000 and 400,000 years ago and existed until about 30,000 years ago. Throughout their entire existence their technology did not change much. They continued to produce the same technology throughout their history, a history that was three or four times longer than what modern humans have experienced. Only at the end of their history, when they may have had contact with modern humans, does their technology change. Over the millennia, they expanded and retracted with the changing climates in the area that lived in Europe and western Asia, but they didn't expand across open water to other uninhabited parts of the world. They spread pretty much as other large mammals had done before them. In that, they were similar to other extinct forms of humans that had existed in Africa for the past 6 million years and in Asia and Europe for about 2 million years. *** All of this changed abruptly when fully modern humans appeared in Africa and spread around the world in the form of the replacement crowd. In the 50,000 years that followed --- a time four to eight times shorter than the entire length of time the Neanderthals existed --- the replacement crowd not only settled on almost every habitable speck of land on the planet, they developed technology that allowed them to go to the moon and beyond. If there is a genetic underpinning to this cultural and technological explosion, as I'm sure there is, then scientists should eventually be able to understand this by comparing the genomes of Neanderthals to the genomes of people living today."<br />
<br />
Until this statement, which comes late in <em><a href="http://www.theguardian.com/books/2014/feb/19/neanderthal-man-search-lost-genomes-svante-paabo" target="_blank">Neanderthal Man</a></em>, Paabo's story has been about his personal and scientific journey: from a young man training to be a physician who takes an interest in the DNA of dead humans and ancient species, to director of the Department of Evolutionary Genetics at the Max Planck Institute who dissected the Neanderthal genome. The memoir reads a bit like a detective story. Now Paabo is trying to give meaning to what he has found.<br />
<br />
Who is this Replacement Man? It is the anatomically modern human (<em>homo sapiens sapiens</em>), but not the anatomically modern human that chronologically and immediately replaced the archaic human (<em>homo sapiens</em>) approximately 100,000 years ago. This is the anatomically modern human who began spreading across the earth "shortly after 50,000 years ago." (See also <a href="http://csilcox-thebookshelf.blogspot.com/2013/09/luigi-luca-cavalli-sforza-genes-peoples.html" target="_blank">September 25, 2013 post</a>). The oldest modern human bones found in the Levant are approximately 100,000 years old. Further evidence indicates that modern humans and Neanderthals mixed here in the Middle East for about 50,000 years, but there is no evidence that either was dominant. Their stone tools appear to be the same. But "shortly after 50,000 years ago," co-existence was no longer the norm. When humans appeared in an area, Neanderthals disappeared either immediately or shortly thereafter. These modern humans "replaced" Neanderthals. Modern human tools and weapons were more advanced than Neanderthal technology, and the <a href="http://www.lithiccastinglab.com/gallery-pages/2002marchaurignacianpage1.htm" target="_blank">Aurignacian culture</a>, as it is described, produced the first cave art and the first figurines of animals, including mythical creatures. "The 'replacement crowd,'" says Paabo, "thus exhibited behaviors that were only occasionally seen, or not seen at all, among Neanderthals and among the earlier modern humans" who occupied the Levant the previous 50,000 years. "We don't know where the 'replacement crowd' came from. In fact, they could have been the descendants of the same humans who had already been living in the Middle East, simply accumulating the cultural inventions and proclivities that enabled 'replacement,' but it is more likely that they came from somewhere in Africa." Paabo does not report the evidence in support of this thesis, but it would be interesting to know.<br />
<br />
As a result of Paabo's work, we now know the modern human genome contains small amounts of Neanderthal DNA, and thus we know interbreeding occurred between <em>homo neanderthalensis</em> and <em>homo sapiens sapiens</em>. The Neanderthal genome has now been published, and the work of identifying crucial differences between the Neanderthal genome and the human genome is in its incipiency. It is estimated that the total number of DNA sequence positions at which Neanderthals and humans differ is roughly 100,000. One goal of this research is to identify genetic changes that might be relevant for how humans think and behave --- a possible clue to what makes us human --- a subject that has reappeared in this blog more than once. (See <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/michael-gazzaniga-human-2008.html" target="_blank">September 27, 2009</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/michael-shermer-believing-brain-2011.html" target="_blank">June 12, 2011</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2013/03/richard-wrangham-catching-fire-how.html" target="_blank">March 28, 2013</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2013/10/michael-tomasello-origins-of-human.html" target="_blank">October 26, 2013</a> posts). There will be more to come.
William Nordhaus, The Climate Casino (2013)<br />
<br />
As previous posts both adumbrate and expose, the human brain does not always successfully sort fact from fiction, but most of the time it does a pretty good job of comprehending reality, even if that reality is derived from imagination or rests upon probabilities. The brain --- and I am thinking of the human brain in particular, but it could be any animal brain to an extent --- deals with uncertainties. It deals with uncertainties in degrees. Another remarkable feature of the human brain is its ability to anticipate and to project into the future, what Jeff Hawkins called "intelligence," where the brain recognizes a predictable set of patterns, even in the face of uncertainties. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/11/jeff-hawkins-on-intelligence-2004.html" target="_blank">November 16, 2013 post</a>).<br />
<br />
But even if the capacity of the human brain is remarkably successful in comprehending reality and anticipating the future, there are subjects --- like predicting future climate change, with its inherent uncertainty about the risk of harm if average global temperature rises just a few degrees over the current temperature --- that challenge the willingness of some, if not many, to call our predictive capacity remarkable. A previous post discussed this issue. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/08/jared-diamond-collapse-2005.html" target="_blank">August 12, 2012 post</a>). <a href="http://economics.yale.edu/people/william-d-nordhaus" target="_blank">William Nordhaus</a> has spent a career studying and getting his intellect around the risk, uncertainty, and economics of climate change. <em>The </em><a href="http://www.nybooks.com/articles/archives/2013/nov/07/climate-change-gambling-civilization/" target="_blank"><em>Climate Casino</em></a> is Nordhaus' reflection on this subject; it is also a wake-up call for those who can't or do not want to spend time looking into the future with the benefit of a very substantial amount of data that indicates a significant risk of harm from an average global temperature rise of just a few degrees <a href="http://www.eia.gov/oiaf/1605/ggccebro/chapter1.html" target="_blank">due to increased emissions of greenhouse gases</a>. I will return to this last thought in a moment.<br />
<br />
What is not uncertain for economist Nordhaus and like-minded scientists who study climate change is that average global temperature has been increasing since the beginning of the 20th century, which is roughly aligned with the beginning of the industrial era. What is also not uncertain for Nordhaus and others is that humans are already witnessing impacts from the average global temperature increase. Finally, what is not uncertain for Nordhaus and like-minded scientists who study climate change is that human behavior resulting in increased emissions of greenhouse gases is responsible for the slow but steady rise of average global temperatures.<br />
<br />
What is uncertain is the potential future impacts from rising average global temperatures, not the science that connects greenhouse gas emissions to a slow, steady rise of average global temperatures. Nordhaus would be the first to admit how difficult it is to estimate and measure the impacts of climate change. He would also be the first to admit the uncertainty associated with the future <em>rate</em> of increase in greenhouse gas emissions. For example, "Estimating the impacts of climate change on health is yet another difficult task. It requires estimates of climate change by region and year. Then it requires estimates of the impacts of changing climate conditions on health for different diseases. This is challenging because the changes take place well into the future in a world where incomes, medical technologies, and health status are evolving rapidly." With respect to estimating the impacts of sea level rise in the future, Nordhaus notes, "One of the challenges is that sea-level rise is so delayed. While the impacts on farming and health might arrive relatively quickly, the sea level rises slowly for many centuries because of the thermal inertia in oceans and the long delays in melting the giant ice sheets. The long delays pose special challenges because they require envisioning the shape of our landscape and societies deep into the future and taking steps today that will produce the most benefits well beyond the present century." With respect to the prospect for species loss in the future attributable to increased average global temperatures, there are no reliable estimates of the risk of extinction for different species; some climatic ranges will shift, and some might grow, so the number of species in growing ranges would be predicted to increase rather than decline; and these estimates do not account for species loss from habitat destruction, overuse, overfishing, overhunting, and pollution that would occur even in the absence of climate change. And even if you could devise a means of estimating the risk of species loss due to climate change, there are no reliable techniques for valuing ecosystem and species loss.<br />
<br />
<br />
"What should we conclude at the end of this review of the impacts of future climate change?" Nordhaus asks. "The first point to emphasize is the difficulty of estimating impacts. They combine the uncertainties of emissions projects and climate models. Even if we overlook the uncertainties about future climate change, the reactions of human and other living systems to these changes are very poorly understood. In part, reactions of social systems are hard to forecast because they are so complex. In addition, humans increasingly manage their own environment, so that a small investment in adaptation may offset the impact of climate change on human societies. Moreover, climate changes are almost certain to occur in the context of technologies and economic structures that will differ vastly from those today.*** However, we must look through the fuzzy telescope as best we can. A second conclusion involves the estimate economic impacts of climate change from sectors that we can reliably measure, particularly for high income countries of today or the future. The estimates here are that economic impacts from climate change will be small relative to the likely overall changes in economic activity over the next half century to century. *** The loss in income would represent approximately one year's growth for most countries spread over several decades [because] managed systems are surprisingly resilient to climate changes if they have the time and resources to adapt. *** A third major conclusion is that the most damaging impacts of climate change --- in unmanaged and unmanageable human and natural systems --- lie well outside the conventional marketplace: sea-level risk, hurricane intensification, ocean acidification, and loss of biodiversity*** unstable ice sheets and reversing ocean currents."<br />
<br />
There are mitigation strategies for slowing climate change by reducing emissions. They are not inexpensive, and because they are not inexpensive Nordhaus recognizes that humans cannot implement these mitigation strategies in one fell swoop. But since we understand all too well where carbon dioxide emissions come from, it is not that difficult to say that, if humans had the will to mitigate climate change by reducing carbon emissions, we would know what to do. Here are some basic facts:<br />
<ul>
<li>Petroleum emits 0.9 tons of carbon dioxide per $1000 of fuel</li>
<li>Natural gas emits 2 tons of carbon dioxide per $1000 of fuel</li>
<li>Coal emits 11 tons of carbon dioxide per $1000 of fuel</li>
</ul>
Coal has about six times more carbon dioxide emissions per dollar of cost than natural gas and about 12 times more than petroleum. "Coal is a very inexpensive fuel per unit of energy but has the disadvantage that much carbon dioxide is released per dollar of expenditure." This suggests that the "most economical way to reduce energy emissions is to reduce coal use." When looking at emissions from the household perspective (2008 data), automotive travel is the biggest contributor to carbon dioxide emissions at 7.9 tons per household (15.2% of emissions). Space heating contributes less than half that amount at 3.2 tons per household (6.2% of emissions); air conditioning represents 1.3 tons per household (2.5% of emissions). Lighting, electronics, and computers are much smaller contributors. With the exception of automotive travel, the contribution of heating, air conditioning, and other appliances to carbon dioxide emissions could be substantially reduced by changing the fuel mix used to generate electricity, significantly reducing coal in favor of natural gas or renewables. "The results of detailed energy models suggest an important and troubling conclusion. The favorite policies of most countries today are energy efficiency regulations such as those for automobiles and appliances like refrigerators. However, such regulations will not touch the area where reductions are most economical --- electricity generation from coal. While energy-efficiency regulation may be popular, reducing coal use meets with ferocious opposition from coal regions and their hired guns. But careful analyses show that coal is king when it comes to reducing carbon dioxide emissions. *** Significant reductions in emissions cannot be done quickly, or cheaply with today's technologies or those that are ready for large-scale deployment. *** Yet we need to ensure that societies rely on the least expensive approaches. Returning to our examples of refrigerators versus electricity generation, we saw a cost difference of a factor of almost ten."<br />
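Nordhaus' ratios are easy to verify from the figures above. Here is a minimal back-of-the-envelope sketch, assuming only the three quoted numbers (my check, not code from the book):<br />
<pre>
# Back-of-the-envelope check on the figures quoted above:
# tons of CO2 emitted per $1,000 of fuel.
emissions_per_1000_dollars = {"petroleum": 0.9, "natural gas": 2.0, "coal": 11.0}

for fuel in ("natural gas", "petroleum"):
    ratio = emissions_per_1000_dollars["coal"] / emissions_per_1000_dollars[fuel]
    print(f"coal emits {ratio:.1f}x the CO2 of {fuel} per dollar of fuel cost")

# coal emits 5.5x the CO2 of natural gas per dollar of fuel cost  ("about six times")
# coal emits 12.2x the CO2 of petroleum per dollar of fuel cost   ("about 12 times")
</pre>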
<br />
None of this is lost on the Obama Administration. There is a reason that <a href="http://www.nationaljournal.com/energy/energy-secretary-all-of-the-above-is-climate-friendly-20140226" target="_blank">the Administration has been pursuing an "all of the above" strategy for energy</a>: they are driving down natural gas prices by increasing supply, which incentivizes industry and utilities to shift their fuel mix from high-carbon-intensity coal to lower-carbon-intensity natural gas. <a href="http://www.eia.gov/todayinenergy/detail.cfm?id=13731" target="_blank">It is working</a>. Additionally, the Administration is providing limited subsidies (loan guarantees) for nuclear power even in the face of criticism that nuclear electricity generation plants are not cost effective: nuclear energy substitutes for coal as a fuel. The Administration's support for renewable energy fits a similar model.<br />
<br />
Economists recognize that there is a more efficient way to reduce carbon emissions than all these regulatory strategies: tax carbon emissions, or cap emissions and create a market mechanism in which emitters buy and sell allowances to emit under the cap. Either way puts a price on carbon that makes it more expensive to emit carbon dioxide. When that happens, economic actors will reduce their emissions. Despite all the anti-tax rhetoric about a carbon tax, a carbon tax has the serious potential to enable taxing authorities to reduce a number of other taxes in a significant way. <a href="http://www.carbontax.org/" target="_blank">A carbon tax does not have to be accretive to overall taxation.</a> <a href="http://www.fas.org/sgp/crs/misc/R42731.pdf" target="_blank">There are some in the US Congress who have actually looked at a carbon tax this way</a>. The advantage of establishing a market in allowances (cap and trade), Nordhaus explains, is that it ensures that the permitted emissions are put to their most productive use. This type of system has been very successful in reducing sulfur dioxide emissions.<br />
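Why a uniform carbon price targets coal hardest falls straight out of the intensity figures. The sketch below is mine, and the $25-per-ton price is an assumed figure for illustration, not a number from Nordhaus:<br />
<pre>
# Illustration only: the $25/ton carbon price is an assumed figure, not Nordhaus'.
# A uniform price per ton of CO2 raises the effective cost of $1,000 of each fuel
# in proportion to its carbon intensity, so coal is penalized the hardest.
emissions_per_1000_dollars = {"petroleum": 0.9, "natural gas": 2.0, "coal": 11.0}
carbon_price = 25.0  # dollars per ton of CO2 (assumed)

for fuel, tons in emissions_per_1000_dollars.items():
    surcharge = tons * carbon_price
    print(f"{fuel}: ${surcharge:.2f} added per $1,000 of fuel")

# petroleum: $22.50 added per $1,000 of fuel
# natural gas: $50.00 added per $1,000 of fuel
# coal: $275.00 added per $1,000 of fuel
</pre>
No economic actor facing those relative surcharges needs a refrigerator regulation to tell it to burn less coal; that is the sense in which pricing carbon is the more efficient instrument.<br />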
<br />
Nordhaus is right: "reducing coal use meets with ferocious opposition from coal regions and their hired guns." It also meets with deceptive rhetoric from those who simply oppose government intervention in the economy. A recent article in today's Washington Post headlined "<a href="http://www.washingtonpost.com/blogs/erik-wemple/wp/2014/04/07/study-fox-news-botches-climate-change-coverage/" target="_blank"><em>Study: Fox News botches climate-change coverage</em></a>" identifying misleading portrayals of climate science, which, by Nordhaus' view, is the least controversial piece in this discussion. It's the economics stupid; it's not the science. The hard part is to avoid over-estimating the costs of the damage that may ensue, as Nordhaus highlights in <em>The Carbon Casino, </em>so we don't make the mistake of establishing and pursuing mitigation strategies that cost more than we need to pay over the long-run. But that is almost what is happening when we favor imposing more stringent regulations on refrigerators and light bulbs as a carbon mitigation strategy instead of raising the price of carbon, which is more efficient and will motivate economic actors to pursue their own self-interest in a way consistent with the public interest. The other hard part is overcoming our reluctance to demonize coal, and pursue more economically efficient carbon pricing policies to reduce carbon emissions such as carbon taxes or cap and trade. This is politics. Our inability to do that is what is referred to as the <a href="http://www.npr.org/templates/story/story.php?storyId=120883813" target="_blank">tragedy of the commons</a>: where acting in self-interest is not the same as the collective or global interest. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/08/jared-diamond-collapse-2005.html" target="_blank">August 12, 2012 post</a>). This is exactly when governments can help.<br />
<br />
I am reminded in this political debate of <a href="http://www.scientificamerican.com/article/the-believing-brain/" target="_blank">Michael Shermer's analysis</a> of the various biases that form our beliefs, and sometimes false beliefs, about things we do not really understand: "the <strong>anchoring bias:</strong> relying too heavily on one reference anchor or piece of information when making decisions; the <strong>authority bias</strong>: valuing the opinions of an authority, especially in the evaluation of something we know little about; <strong>in-group bias</strong>, in which we place more value on the beliefs of those whom we perceive to be fellow members of our group and less on the beliefs of those from different groups. This is a result of our evolved tribal brains leading us not only to place such value judgment on beliefs but also to demonize and dismiss them as nonsense or evil, or both." Our modern tribal groups are political parties, religious groups, and other social clubs, including television's talking-head panels. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/09/daniel-schacter-seven-sins-of-memory.html" target="_blank">September 12, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/michael-shermer-believing-brain-2011.html" target="_blank">June 12, 2011</a> posts).<br />
<br />
Albert Camus, The Plague (1947)<br />
<br />
Another <a href="http://www.amazon.com/Rereading-Patricia-Meyer-Spacks-ebook/dp/B006LZTL9O/ref=sr_1_2?ie=UTF8&qid=1390770980&sr=8-2&keywords=Rereading" target="_blank">rereading</a> of a great novel decades later.<br />
<br />
<a href="http://plato.stanford.edu/entries/camus/" target="_blank">Albert Camus' </a>plague is a metaphor for everything that the human conscience is typically compelled to resist. I say "typically" with deliberation, because human resistance to the unconscionable is not universal. Camus would know this too well since his native Algeria, the country where the metaphorical events described in <em>The Plague</em> take place, was occupied by Nazi Germany's Vichy French collaborators during World War II as Camus was writing the novel. <br />
<br />
Whatever the origins of human morality may be (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012</a> post), and whether morality and conscience have a strictly biological basis or are co-determined by the interaction of biology and culture (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/12/daniel-kelly-yuck-nature-and-moral.html" target="_blank">December 11, 2013</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/02/david-hume-treatise-of-human-nature.html" target="_blank">February 27, 2011</a> posts), there is a common human trait to resist things that threaten human social stasis, just as the body fights infections. <em>The Plague</em> is a novel about resistance to an <em>amoral</em> threat to a society: a bacterial disease. As depicted in <em>The Plague</em>, the emotional foundation of this resistance is a love that overcomes despair, combined with a belief that there will be a future. It is a love that confronts the near certainty of death. Without love, the social action that organizes the resistance would not be possible and we succumb to the plague. Yet it is ironic where this love emerges from: human separation, what Camus calls <a href="http://saltambique.blogspot.com/2013/09/critique-idea-of-exile-and-kingdom-in.html" target="_blank">exile</a>. The novel's primary characters, Dr. Rieux, Rambert, and Tarrou, are separated from their wives and lovers and think constantly about when they will be reunited with their loved ones. To soften the ache of their separation, they turn their love externally, to humanity, to aid the victims of the plague that will kill them all if they don't resist.<br />
<br />
The opposite of resistance is submission, and we meet Camus' representative of human submission in Father Paneloux. Father Paneloux is a good man, but by faith and religion he is committed to the fatalist belief that the plague is a test imposed by something so powerful that humans cannot resist. This is a submission to what is commonly referred to as god's will. There is no room for a Sisyphus who refuses to abandon resistance in god's kingdom. "Calamity has come on you, my brethren," Paneloux tells his flock, "and, my brethren, you deserved it." And Paneloux launches into a recitation of every instance in religious storytelling where god purportedly inflicted floods, plagues, and calamity on those who deserved punishment for some reason or another. "No earthly power, nay, not even --- mark me well --- the vaunted might of human science can avail you to avert that hand [of god] once it is stretched toward you. And winnowing, you will be cast away with the chaff." You can't resist the plague; submit; you were meant to be punished by this insidious disease. Science cannot help you.<br />
<br />
The <a href="http://csilcox-thebookshelf.blogspot.com/2014/01/william-shakespeare-julius-caesar-1599.html" target="_blank">previous post</a> considers briefly justifications for murder including rebellion against absolute authority that leads to regicide or its modern equivalent, political assassination. The justification acquires its gravitas if the lethal revolt is directed at someone evil, wicked or morally bad or wrong such as genocidal mass murderers like <a href="http://www.historyplace.com/worldhistory/genocide/pol-pot.htm" target="_blank">Pol Pot</a> or <a href="http://www.jewishvirtuallibrary.org/jsource/Holocaust/hitler.html" target="_blank">Adolph Hitler</a>. There is no or little gravitas in justifying a homicide in the case of a leader who is at worst flawed.<br />
<br />
<em><a href="http://www.amazon.com/Plague-Penguin-Modern-Classics/dp/0141185139/ref=sr_1_3?s=books&ie=UTF8&qid=1390772236&sr=1-3&keywords=the+plague" target="_blank">The Plague</a> </em>examines a mass murderer that is indifferent, nature. We often refer to these deadly diseases as acts of god, and we consider ourselves helpless in our capacity to respond to them. The <a href="http://en.wikipedia.org/wiki/Bubonic_plague" target="_blank">bubonic plague</a> of the 14th Century known as the <a href="http://en.wikipedia.org/wiki/Black_Death" target="_blank">Black Death</a>, memorialized in Barbara Tuchman's <em><a href="http://www.amazon.com/Distant-Mirror-Calamitous-14th-Century/dp/0345349571" target="_blank">A Distant Mirror</a></em> killed an estimated 75-200 million people. It is estimated that this plague reduced the human population on earth by 17-22% from an estimated 450 million down to 350–375 million in the 14th century, and reducing Europe's population by an estimated 30-60%. More recently, a 20th century <a href="http://en.wikipedia.org/wiki/1918_flu_pandemic" target="_blank">flu pandemic in 1918</a> killed an estimated 3% - 5% of the world's population. <br />
<br />
Albert Camus' fictional story of a plague that infected Oran, Algeria sometime in the 20th century is entirely metaphorical. Camus began writing his novel sometime in 1941, after Germany had invaded and occupied France and several other European countries in 1940, the war had spread to North Africa, and world war was breaking out across Asia with Japanese aggression. Algeria was never occupied by the Germans or the Italians; instead, the Germans were nominally represented in Algeria by the <a href="http://en.wikipedia.org/wiki/Vichy_France" target="_blank">Vichy French</a>, who submitted to and collaborated with the German Nazis. If we credit an October 1941 entry in his <a href="http://www.amazon.com/Notebooks-1935-1942-Volume-Albert-Camus/dp/1566638720/ref=pd_bxgy_b_text_z" target="_blank">Notebook</a>, his metaphorical plague and Nazi aggression against Jews are linked in Camus' view. <br />
<br />
<em>Plague. Bonsels, pp 144 and 222.</em><br />
<em> 1342. The Black Death in Europe. The Jews are murdered.</em><br />
<em> </em><a href="http://drvitelli.typepad.com/providentia/2010/09/the-man-of-faith-part-2.html" target="_blank"><em>1481. The plague ravages the South of Spain</em></a><em>. The Inquisition says: The Jews. But the plague kills an inquisitor.</em><br />
<br />
Genocidal humans are consigned to a historical dust heap under the category Evil. Hannah Arendt's "<a href="https://www.bu.edu/wcp/Papers/Cont/ContAssy.htm" target="_blank">banality of evil</a>" notwithstanding, humans are less inclined to ascribe indifference to genocide. Humans typically ascribe a hateful purpose lurking behind something we call Evil. Evil is synonymous with malevolence, the state of mind of having ill-will toward other persons or objects. It is not perceived as synonymous with indifference. <br />
<br />
Camus was concerned with a very different kind of evil in the face of a mass murderer, one that juxtaposes with ignorance. "The evil that is in the world always comes of ignorance, and good intentions may do as much harm as malevolence, if they lack understanding," he writes in <em>The Plague.</em> "On the whole, men are more good than bad; that, however, isn't the real point. But they are more or less ignorant, and it is this that we call vice or virtue; the most incorrigible vice being that of an ignorance that fancies it knows everything and therefore claims for itself the right to kill. The soul of the murderer is blind; and there can be no true goodness nor true love without the utmost clear-sightedness." <br />
<br />
Father Paneloux is the standard bearer of good intentions that can do as much harm as malevolence, because he lacks understanding. Paneloux: "The love of god is a hard love. It demands total self-surrender, disdain of our human personality. Yet love of god can reconcile us to suffering and the deaths of children, it alone can justify them, since we cannot understand them, and we can only make god's will ours." Camus' words from the mouth of Father Paneloux are the words that are so common among human religions; words of blind submission, fatalism, that commend us not to resist but to justify and accept the death of children and other innocents. Shortly following this sermon, the plague infects Father Paneloux and he quickly dies. Before he succumbs, the Father declines Dr. Rieux's offer to stay with him as he is fading: "Thanks. But priests can have no friends. They have given their all to god." That is an evil that comes of ignorance. Of course, not all priests are represented by Paneloux. There are many priests over the course of history who have resisted oppression, disease, or calamity in the name of god. But where religion and state are joined at the hip (see <a href="http://csilcox-thebookshelf.blogspot.com/2010/09/jose-saramago-notebook-2010.html" target="_blank">September 28, 2010</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2013/01/jose-saramago-baltasar-and-blimunda-1982.html" target="_blank">January 1, 2013</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2013/03/edward-humes-monkey-girl-evolution.html" target="_blank">March 24, 2013</a> posts), love and justice are not commonplace outcomes.<br />
<br />
Oran, of course, is repaired by those who are not priests: Rieux, Tarrou, Rambert, and Grand, those who do not disdain human personality. The plague has been out-lasted and there are survivors. But Camus would be quick to admit that the plague has not been conquered. Tarrou, speaking to Dr. Rieux toward the end of the novel, confesses his abhorrence of the death penalty and the efforts he has made in the past to resist it. "As time went on I merely learned that even those who were better than the rest could not keep themselves nowadays from killing or letting others kill, because such is the logic by which they live; and that we can't stir a finger in this world without the risk of bringing death to somebody. . . . And today I am still trying to find it; still trying to understand all those others and not to be the mortal enemy of anyone. I only know that one must do what one can to cease being plague-stricken, and that's the only way in which we can hope for some peace or, failing that, a decent death. This, and only this, can bring relief to men and, if not save them, at least do them the least harm possible and even, sometimes, a little good. So that is why I resolved to have no truck with anything which, directly or indirectly, for good reasons or for bad, brings death to anyone or justifies others' putting him to death. *** That, too, is why this epidemic has taught me nothing new, except that I must fight it at your side. I know positively *** that each of us has the plague within him; no one, no one on earth is free from it. And I know, too, that we must keep endless watch on ourselves lest in a careless moment we breathe in somebody's face and fasten the infection on him. What's natural is the microbe. All the rest --- health, integrity, purity (if you like) --- is a product of the human will, of a vigilance that must never falter. The good man, the man who hardly infects anyone, is the man who has the fewest lapses of attention. And it needs tremendous will-power, a never ending tension of the mind, to avoid such lapses. Yes, Rieux, it's a wearying business, being plague-stricken. But it's still more wearying to refuse to be it. That's why everyone in the world today looks so tired; everyone is more or less sick of plague. *** Once I definitely refused to kill, I doomed myself to an exile that can never end." But where would Tarrou stand if the killer was not natural? What if the murderer is not an "innocent murderer"? Where would Tarrou stand against the nihilists? Would his willingness to resist justify killing in the name of peace? Tarrou does not answer because he is not asked. We do not know if he would ever consider joining the armed French resistance. <em>The Plague</em> is, of course, less a novel about the evil of murder than about the evil of indifference.<br />
<br />
William Shakespeare, Julius Caesar (1599)<br />
<br />
There is some irony in the current thinking that the origins of the human propensity to moral or altruistic behavior emerged as a result of group behavior that deprived, ostracized, or exiled individuals who did not respect the group's expectation of reciprocal altruism. Ostracism and exile represented a punishment of those who cheated on the group. In the hunter-gatherer era, we are primarily dealing with those who took food from the group (or more than their share) without contributing (or without contributing proportionately) to securing or preparing that food. Deprivation and ostracism tended to trigger an emotion, shame, and lead the individual to reconnect with the group's expectations. The irony is this: in some cases that exile could be lethal and permanent, death. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>). Today, we would rarely, if ever, think of imposing a "death penalty" for stealing food. This issue ultimately begs for a discussion of when homicide is justifiable. Is it ever justifiable in the name of enforcing an expectation of reciprocal, altruistic behavior? <br />
<br />
Today we think of <a href="http://en.wikipedia.org/wiki/Justifiable_homicide" target="_blank">justifiable homicide</a> almost exclusively in terms of self-defense: where it is reasonable to believe that the offending party posed an imminent threat to the life or wellbeing of another. Justifiable homicide is widely recognized almost everywhere in this way. There are <a href="http://apps.leg.wa.gov/RCW/default.aspx?cite=9A.16.040" target="_blank">statutes</a>, for example, that explain the circumstances when a homicide by a police officer is justifiable. This type of immunity conditionally respects the state's monopoly on violence, which is <a href="http://en.wikipedia.org/wiki/Monopoly_on_violence" target="_blank">one definition</a> of a government. There are also statutes that conditionally immunize lethal conduct by citizens, typically in protecting one's person, immediate family, others in his or her presence, or home. In <a href="http://apps.leg.wa.gov/RCW/default.aspx?cite=9A.16.050" target="_blank">Washington State</a>, for example, "Homicide is also justifiable when committed either: (1) In the lawful defense of the slayer, or his or her husband, wife, parent, child, brother, or sister, or of any other person in his or her presence or company, when there is reasonable ground to apprehend a design on the part of the person slain to commit a felony or to do some great personal injury to the slayer or to any such person, and there is imminent danger of such design being accomplished; or (2) In the actual resistance of an attempt to commit a felony upon the slayer, in his or her presence, or upon or in a dwelling, or other place of abode, in which he or she is." California has a slightly more expansive but similar <a href="http://codes.lp.findlaw.com/cacode/PEN/3/1/8/1/s197" target="_blank">statute</a>, which also recognizes that a homicide may be justifiable --- similar to the protection offered a police officer but extended to private citizens --- "to apprehend any person for any felony committed, or in lawfully suppressing any riot, or in lawfully keeping and preserving the peace." The stand-your-ground provision in <a href="http://www.leg.state.fl.us/statutes/index.cfm?App_mode=Display_Statute&URL=0700-0799/0776/Sections/0776.013.html" target="_blank">Florida</a> is not all that different on paper, with one exception: "A person who is not engaged in an unlawful activity and who is attacked in any other place where he or she has a right to be <em>has no duty to retreat and has the right to stand his or her ground and meet force with force</em>, including deadly force if he or she reasonably believes it is necessary to do so to prevent death or great bodily harm to himself or herself or another or to prevent the commission of a forcible felony." What the stand-your-ground provision did was abolish the obligation to retreat when facing imminent danger; standing your ground was not previously an option. Where controversy further erupts over this issue is in the interpretation of these statutes by police officers investigating a homicide, by judges in their jury instructions, or perhaps by jurors who may bring some bias to their judgment.
There is some data suggesting that homicides are increasing in jurisdictions with stand-your-ground laws like <a href="http://www.chron.com/news/houston-texas/article/Texas-justifiable-homicides-rise-with-Castle-3676412.php" target="_blank">Texas</a> and <a href="http://www.motherjones.com/mojo/2013/09/stand-your-ground-justifiable-homicide-increase" target="_blank">Florida</a>. But I don't think this should be surprising: the law appears to allow someone to say, "If you want to rumble, let's rumble, even to the death." The self-defense law no longer deters conflict; this development in self-defense law no longer seems interested in promoting reciprocal, altruistic behavior. <br />
<br />
Contemplating the assassination of Julius Caesar: is <a href="http://en.wikipedia.org/wiki/Regicide" target="_blank">regicide</a> ever justified by "the abuse of greatness when it disjoins [severs] remorse [compassion] from power"? [<em><a href="http://www.amazon.com/Julius-Caesar-Annotated-Shakespeare-William-ebook/dp/B00FC655QG/ref=sr_1_3?s=books&ie=UTF8&qid=1389564722&sr=1-3&keywords=julius+caesar+and+bloom" target="_blank">Julius Caesar</a></em>, Act II, Scene I]. Purging the Roman Republic of tyranny is the conspirators' purported justification for killing Caesar, and assassination is their means. Julius Caesar was never accused of murder. This was a political squabble over power and the form of government. As the <a href="http://csilcox-thebookshelf.blogspot.com/2013/12/william-shakespeare-richard-iii-1592.html" target="_blank">previous post</a> discussing the murderous Richard III observes, Americans had civil means of driving Richard Nixon from power in 1974, by effectively shaming him into respecting the office to which he was elected and going into exile. But how do Romans before the common era, or 15th century Britons, shame a regent (like Caesar or Richard III) who <a href="http://faculty.history.wisc.edu/sommerville/367/367-04.htm" target="_blank">claims to inherit his power from some divine source</a> into leaving office? They can't, unless they claim the government's mantle of the monopoly of violence by securing the assistance of the military or another army in a coup. <br />
<br />
Shakespeare imagines a brief discussion among the conspirators over whether they should also kill Marc Antony, a potential successor to Caesar, a discussion Brutus quells. "Let Antony and Caesar fall together," says Cassius. No, Antony is but a limb of Caesar, says Brutus: "Let us be sacrificers, but not butchers [of limbs] . . . We shall be called purgers, not murderers. And for Mark Antony, think not of him. For he can do no more than Caesar's arm when Caesar's head is off." Brutus can claim to justify the murder of Caesar on the basis of protecting the people from a tyrant --- an argument not entirely distant from a claim of self-defense of the people of the Roman Republic. But he cannot claim a justification for ridding Rome of Antony. That would be murder.<br />
<br />
How does that distinction resonate in the modern era? There are modern murderers who occupy positions of power, who are simply evil, and who abuse their power: Hitler, Lenin, Stalin, Saddam Hussein, and <a href="http://en.wikipedia.org/wiki/Pol_Pot" target="_blank">Pol Pot</a> are just a few modern examples. Why would we not exculpate someone who slew ("purged") Pol Pot <em>to prevent further acts of murder</em> by the Khmer Rouge and pollution of the <a href="http://en.wikipedia.org/wiki/Killing_Fields" target="_blank">Killing Fields</a> --- one of the greatest of human tragedies? Similarly, if the <a href="http://en.wikipedia.org/wiki/20_July_plot" target="_blank">attempt to kill Adolf Hitler</a> had succeeded, the conspirators may have been punished by the Nazi government that supported Hitler, but to the rest of the world they would have been exculpated. <br />
<br />
The people of Rome or the Senate might have praised Brutus and Cassius for protecting them from a dictatorship, but the slaying of Caesar was met mostly with silence from the people. Caesar was not a murderer as we think of Pol Pot, Hitler or Saddam; he was a warrior empire builder who sought absolute power. There is no evil villain in <em>Julius Caesar</em>; there are only flawed characters. Caesar may not have been empathetic, but he actually enjoyed support from the underclass and even some in the Senate. Brutus and Cassius did not enjoy widespread popular support of the governed. Nor did Brutus and Cassius have a plan to govern Rome once Julius Caesar was purged. Presumably they assumed the Senate would be restored to the authority it held before Caesar accumulated power. Although Antony initially advocated for amnesty for the conspirators, Julius Caesar's designated heir, <a href="http://www.roman-empire.net/emperors/augustus.html" target="_blank">Octavian</a>, disagreed and pursued revenge for the death of Julius Caesar. Ultimately, Brutus and Cassius fled Rome to find sanctuary and raise an army in the eastern part of the empire. During the <a href="http://en.wikipedia.org/wiki/Liberators%27_civil_war" target="_blank">Liberators' Civil War</a> that followed the death of Julius Caesar, both Cassius and later Brutus committed suicide as Antony's forces cornered them. Civil war continued across the Roman Empire for a number of years until ultimately <a href="http://en.wikipedia.org/wiki/Augustus" target="_blank">Caesar Augustus</a> (<a href="http://www.roman-empire.net/emperors/augustus.html" target="_blank">Octavian</a>), claiming divine power, consolidated total power as emperor. Even if the homicide of Julius Caesar could have been justified, it failed to accomplish any justifiable result. The Roman Republic met its demise with the rise of Caesar's Roman Empire. That is the larger tragedy in <em>Julius Caesar</em>. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/03/bill-oreilly-killing-lincoln-2011.html" target="_blank">March 15, 2012 post</a>). <br />
<br />CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-26253891481323183832013-12-19T19:05:00.003-08:002014-01-23T16:59:54.277-08:00William Shakespeare, Richard III (1592)I usually discover something new in <a href="http://www.amazon.com/Rereading-Patricia-Meyer-Spacks-ebook/dp/B006LZTL9O/ref=sr_1_1?ie=UTF8&qid=1387405855&sr=8-1&keywords=Rereading" target="_blank">rereading</a> a book I have not touched in a long time. With the passage of time, there is inevitably a different perspective than the original, one that yields different insights. In some cases the book loses its magic the second time around, and in other cases the book is just as vibrant as it was the first time but for entirely different reasons.<br />
<br />
Decades ago, while a mere high school student reading Shakespeare, and in the wake of the 1970 <a href="http://www.history.com/topics/kent-state" target="_blank">Kent State University shootings</a>, I submitted a paper as part of the Shakespeare course requirements that re-wrote Shakespeare's <em><a href="http://shakespeare.mit.edu/richardiii/full.html" target="_blank">Richard III</a></em> in contemporary terms. I titled it <em>Richard the Third Rate</em>. I wish I could recall how I dealt with the opening lines, "Now is the winter of our discontent made glorious summer by this sun of York; And all the clouds that lour'd upon our house in the deep bosom of the ocean buried." Certainly I modified "[son] of York" in some way to refer to Richard Nixon's "house." And certainly I did not rewrite the entire play, but I do recall the closing: "A chopper, a chopper, my Kingdom for a chopper." That is how Presidents leave their grounds these days and escape: they climb into a helicopter and fly away. And in hindsight this was unexpectedly prescient, because it was just four years later that Richard Nixon <a href="https://www.youtube.com/watch?v=u3U4MhKDxcw" target="_blank">climbed into a chopper</a> and fled Washington, DC after he resigned the Presidency. He resigned his Kingdom for a chopper <a href="http://watergate.info/impeachment/articles-of-impeachment" target="_blank">and avoided an impeachment trial</a>.<br />
<br />
To be sure, the parallels between the two Richards are not strong. By Shakespeare's count, Richard III is directly responsible for the execution of eleven kin, close and distant, as he cleared his path to the British monarchy. With the commencement of US bombing in Cambodia, Richard Nixon merely set in motion events that indirectly connect him to the deaths of four students at Kent State University. Richard Nixon suffered a far different fate than Richard Plantagenet of York, <a href="http://en.wikipedia.org/wiki/Richard_III_of_England" target="_blank">Richard III</a>, king of England for just two short years (1483-1485). While shamed after avoiding a criminal prosecution thanks to a pardon from his successor, Richard Nixon rebuilt his reputation to some considerable degree and lived for 20 more years after relinquishing his kingdom. Richard III's rule was extinguished when he was slain in battle by his enemies, like many of his Plantagenet kin.<br />
<br />
We have a much different means and structure for removing someone from power today, although modern polities are certainly not uniform in the way they approach the transfer of political power. The manner in which Richard III was removed from power certainly persists in a few nations, and battle to the death, execution, and murder were considerably more common in the 14th and 15th centuries. The interesting storyline about Richard III's demise and removal from power is that it was all in the family. <br />
<br />
Clearly, as I read <em>Richard III</em> in 1970, Richard Nixon was part of my mental association. Forty-three years later,<a href="http://en.wikipedia.org/wiki/Kin_selection" target="_blank"> kin selection</a> was on my mind as I turned the pages. One <a href="http://psychology.about.com/od/kindex/g/kin_selection.htm" target="_blank">definition</a> of kin selection is this: kin selection is an evolutionary theory proposing that people are more likely to help those who are blood relatives because it will increase the odds of gene transmission to future generations. The theory suggests that <a href="http://psychology.about.com/od/aindex/g/what-is-altruism.htm">altruism</a> towards close relatives occurs in order to ensure the continuation of shared genes. The more closely the individuals are related, the more likely they are to help one another. That "help" may include sacrificial behavior. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/martin-nowak-supercooperators-2011.html" target="_blank">September 17, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2010/10/oren-harman-price-of-altruism-2010.html" target="_blank">October 13, 2010</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2009/11/bert-holldobler-eo-wilson-superorganism.html" target="_blank">November 4, 2009</a> posts).<br />
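<br />
It is worth pausing on the arithmetic behind this definition. W. D. Hamilton gave kin selection its canonical formal statement, now known as Hamilton's rule: a gene for altruistic behavior can spread when rB > C, where r is the genetic relatedness between actor and recipient, B is the reproductive benefit to the recipient, and C is the reproductive cost to the actor. Full siblings share roughly half their genes (r = 0.5), which is the arithmetic behind J. B. S. Haldane's famous quip that he would lay down his life for two brothers or eight cousins. The rule is a simplified model, not a description of any conscious calculation, but it captures why the predicted "help" --- including sacrificial behavior --- scales with closeness of kin.<br />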
<br />
The House of <a href="http://en.wikipedia.org/wiki/House_of_Plantagenet" target="_blank">Plantagenet</a> obviously did not seriously contemplate increasing the odds of their gene transmission to future generations during their monarchical reign over England in the 14th and 15th centuries. Altruism and self-sacrifice were not in their blood; conspiring against and slaying each other was. "A house divided against itself cannot stand," said Abraham Lincoln during his 1858 campaign against Stephen Douglas, nearly four hundred years after the death of Richard III. Lincoln's remark, which emanates from the book of Mark and had earlier been adapted by Thomas Hobbes in his <em>Leviathan</em>, could very well have been written by William Shakespeare for <em>Richard III</em>.<br />
<br />
The <a href="http://www.britroyals.com/plantagenet.htm" target="_blank">Plantagenet family tree</a> is worth a look. There are some recognizable names from the British royal line. But look a little closer at some in this dysfunctional family:<br />
<br />
The House of Plantagenet came to include, over time, two "cadet" branches: the <a href="http://en.wikipedia.org/wiki/House_of_Lancaster" target="_blank">House of Lancaster</a>, established by the son of <a href="http://www.britroyals.com/plantagenet.asp?id=henry3" target="_blank">Henry III</a>, and the <a href="http://en.wikipedia.org/wiki/House_of_York" target="_blank">House of York</a>, established by the son of <a href="http://www.britroyals.com/plantagenet.asp?id=edward3" target="_blank">Edward III</a>. The cadet House of Lancaster captured the British throne with the accession of <a href="http://www.britroyals.com/plantagenet.asp?id=henry4" target="_blank">Henry IV</a>, and lost the throne to the House of York with the accession of <a href="http://www.britroyals.com/plantagenet.asp?id=edward4" target="_blank">Edward IV</a>. Richard III was a member of the House of York, succeeding Edward IV. These two cadet branches represented the divided House of Plantagenet and ultimately led to the <a href="http://en.wikipedia.org/wiki/Wars_of_the_Roses" target="_blank">Wars of the Roses</a> between these two family subunits.<br />
<br />
<a href="http://www.britroyals.com/plantagenet.asp?id=edward2" target="_blank">Edward II</a>: His invasion of Scotland in 1314 to suppress revolt resulted in defeat at Bannockburn. When he fell under the influence of a new favourite, Hugh le Despenser, <em>he was deposed in 1327 by his wife Isabella (1292–1358), daughter of Philip IV of France, and her lover Roger de Mortimer, and murdered in Berkeley Castle, Gloucestershire</em>. He was succeeded by his son, <a href="http://www.britroyals.com/plantagenet.asp?id=edward3" target="_blank">Edward III</a>.<br />
<br />
<a href="http://www.britroyals.com/plantagenet.asp?id=richard2" target="_blank">Richard II</a>: Richard was born in Bordeaux. He succeeded his grandfather Edward III when only ten, the government being in the hands of a council of regency. His fondness for favourites resulted in conflicts with Parliament, and in 1388 the baronial party, headed by the Duke of Gloucester, had many of his friends executed. Richard recovered control in 1389, <em>and ruled moderately until 1397, when he had Gloucester [14th child of Edward III] murdered and his other leading opponents executed or banished, and assumed absolute power</em>. <em>In 1399 his cousin Henry Bolingbroke, Duke of Hereford (later <a href="http://www.britroyals.com/plantagenet.asp?id=henry4" target="_blank">Henry IV of the House of Lancaster</a>), returned from exile to lead a revolt; Richard II was deposed by Parliament and imprisoned in Pontefract Castle, where he died probably of starvation</em>.<br />
<br />
<a href="http://www.britroyals.com/plantagenet.asp?id=henry6" target="_blank">Henry VI</a>: King of England from 1422, son of Henry V. He assumed royal power 1442 and sided with the party opposed to the continuation of the Hundred Years' War with France. After his marriage 1445, he was dominated by his wife, Margaret of Anjou. <em>He was deposed 1461 in the <strong><a href="http://www.warsoftheroses.com/" target="_blank">Wars of the Roses</a></strong>; was captured 1465, temporarily restored 1470, but again imprisoned 1471 and then murdered. The unpopularity of the government</em>, especially after the loss of the English conquests in France, <em>encouraged Richard, Duke of York, to claim the throne, and though York was killed 1460, his son Edward IV proclaimed himself king 1461</em>.<br />
<br />
<a href="http://www.britroyals.com/plantagenet.asp?id=edward4" target="_blank">Edward IV</a> (<a href="http://en.wikipedia.org/wiki/House_of_York" target="_blank">House of York</a>): He was the son of Richard, Duke of York, and <em>succeeded Henry VI in the <strong>Wars of the Roses</strong></em>, temporarily losing his throne to Henry when Edward fell out with his adviser Richard Neville, Earl of Warwick. Edward was a fine warrior and intelligent strategist, with victories at Mortimer's Cross and Towton in 1461, Empingham in 1470, and Barnet and Tewkesbury in 1471. He was succeeded by his son Edward V.<br />
<br />
<a href="http://www.britroyals.com/plantagenet.asp?id=edward5" target="_blank">Edward V</a>: King of England 1483. Son of Edward IV, he was deposed three months after his accession in favour of his uncle (Richard III), <em>and is traditionally believed to have been murdered (with his brother) in the Tower of London on Richard's orders</em>.<br />
<br />
<a href="http://www.britroyals.com/plantagenet.asp?id=richard3" target="_blank">Richard III</a>: King of England from 1483. The son of Richard, Duke of York, he was created Duke of Gloucester by his brother Edward IV, and distinguished himself in the Wars of the Roses. On Edward's death 1483 he became protector to his nephew Edward V, and soon secured the crown for himself on the plea that Edward IV's sons were illegitimate. He proved a capable ruler, <em>but the suspicion that he had murdered Edward V and his brother undermined his popularity. In 1485 Henry, Earl of Richmond (later Henry VII), raised a rebellion, and Richard III was defeated and killed at Bosworth.</em> After Richard's death on the battlefield his rival was crowned King Henry VII and became the first English monarch of the Tudor dynasty which lasted until 1603.<br />
<br />
<a href="http://www.britroyals.com/tudor.asp?id=henry7" target="_blank">Henry VII</a>: Henry was the son of Edmund Tudor, earl of Richmond, who died before Henry was born, and Margaret Beaufort, a descendant of Edward III through John of Gaunt, Duke of Lancaster. Although the Beaufort line, which was originally illegitimate, had been specifically excluded (1407) from all claim to the throne, <em>the death of the imprisoned Henry VI (1471) made Henry Tudor head of the house of Lancaster</em>. At this point, however, the Yorkist Edward IV had established himself securely on the throne, and Henry, who had been brought up in Wales, fled to Brittany for safety. The death of Edward IV (1483) and accession of Richard III, left Henry the natural leader of the party opposing Richard, whose rule was very unpopular. Henry made an unsuccessful attempt to land in England during the abortive revolt (1483) of Henry Stafford, Duke of Buckingham. Thereafter he bided his time in France until 1485 when, aided by other English refugees, he landed in Wales. <em>At the battle of Bosworth Field, Leicestershire, he defeated the royal forces of Richard, who was killed</em>. Henry advanced to London, was crowned, and in 1486 fulfilled a promise made earlier to Yorkist dissidents to marry Edward IV's daughter, Elizabeth of York. He thus united the houses of York and Lancaster, founding the Tudor royal dynasty. Although Henry's accession marked the end of the <strong>Wars of the Roses</strong>, the early years of his reign were disturbed by Yorkist attempts to regain the throne. <br />
<br />
The Plantagenets are hardly the picture of our altruistic nature. Shakespeare is the chronicler of this blood-stained line of royals (<a href="http://www.amazon.com/Henry-IV-Part-William-Shakespeare-ebook/dp/B000FC1G06/ref=sr_1_2?ie=UTF8&qid=1387417049&sr=8-2&keywords=henry+iv" target="_blank">Henry IV</a>, <a href="http://www.amazon.com/Richard-II-Folger-Shakespeare-Library/dp/0743484916/ref=sr_1_3?ie=UTF8&qid=1387417085&sr=8-3&keywords=Richard+II" target="_blank">Richard II</a>, <a href="http://www.amazon.com/King-Henry-V-William-Shakespeare-ebook/dp/B004TPTIB4/ref=sr_1_2?ie=UTF8&qid=1387417169&sr=8-2&keywords=henry+v" target="_blank">Henry V</a>, <a href="http://www.amazon.com/Henry-Parts-III-Signet-Classics/dp/0451529847/ref=sr_1_3?ie=UTF8&qid=1387417124&sr=8-3&keywords=Henry+VI" target="_blank">Henry VI</a>, <a href="http://www.amazon.com/King-Richard-III-William-Shakespeare-ebook/dp/B00847SY1S/ref=sr_1_1?ie=UTF8&qid=1387417243&sr=8-1&keywords=richard+iii+book" target="_blank">Richard III</a>), and <em>Richard III</em> brings us to the conclusion of their chronicles, beginning in the waning months of the life of Edward IV, with the members of the House of York reminding each other just who killed whom over the latter years of the Wars of the Roses. As one source summarizes this strife, there was division not merely between the two cadet Houses of the same family, but within the House of York itself: "The next round of the wars arose out of disputes within the Yorkist ranks. Warwick and his circle were increasingly passed over at Edward’s court; more seriously, Warwick differed with the King on foreign policy. In 1469 <a href="http://www.britannica.com/EBchecked/topic/119427/civil-war">civil war</a> was renewed. Warwick and Edward’s rebellious brother George, duke of <a href="http://www.britannica.com/EBchecked/topic/119792/George-Plantagenet-duke-of-Clarence">Clarence</a>, fomented risings in the north; and in July, at Edgecote (near Banbury), defeated Edward’s supporters, afterward holding the King prisoner. By March 1470, however, Edward regained his control, forcing Warwick and Clarence to flee to France, where they allied themselves with the French king <a href="http://www.britannica.com/EBchecked/topic/348891/Louis-XI">Louis XI</a> and their former enemy, <a href="http://www.britannica.com/EBchecked/topic/364597/Margaret-of-Anjou">Margaret of Anjou</a>. Returning to England (September 1470), they deposed Edward and restored the crown to <a href="http://www.britannica.com/EBchecked/topic/261839/Henry-VI">Henry VI</a>. Edward fled to the Netherlands with his followers and, securing Burgundian aid, returned to England in March 1471. Edward outmaneuvered Warwick, regained the loyalty of Clarence, and decisively defeated Warwick at <a href="http://www.britannica.com/EBchecked/topic/53678/Battle-of-Barnet">Barnet</a> on April 14. That very day, Margaret had landed at Weymouth. Hearing the news of Barnet, she marched west, trying to reach the safety of Wales; but Edward won the race to the Severn. At <a href="http://www.britannica.com/EBchecked/topic/589269/Battle-of-Tewkesbury">Tewkesbury</a> (May 4) Margaret was captured, her forces destroyed, and her son killed. 
Shortly afterward, Henry VI was murdered in the <a href="http://www.britannica.com/EBchecked/topic/346946/Tower-of-London">Tower of London</a>. Edward’s throne was secure for the rest of his life (he died in 1483)."<br />
<br />
Quoth Shakespeare's Henry VII as the curtain closes on <em>Richard III</em>, "England hath long been mad and scarred herself: The brother blindly shed the brother's blood; The father rashly slaughtered his own son; The son, compelled, been butcher to the sire. All this divided York and Lancaster. Divided in their dire division."<br />
<br />
What Richard III never really enjoyed, but Richard Nixon did, was abiding loyalty. <a href="http://en.wikipedia.org/wiki/John_Dean" target="_blank">John Dean</a> ultimately broke the Nixon clique's conspiracy of silence, but everyone else in the President's inner circle maintained their silence, and Nixon stood by his men. Richard Nixon divided a nation, not his family or followers. Richard III's inner circle peeled away, some refusing to carry out his criminal commands (if Shakespeare's history is accurate), perhaps out of principle, perhaps out of fear of slaughter, and in the end he had few to stand by him as he cried, "A horse, a horse, my Kingdom for a horse." CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-67082470848426897852013-12-11T18:06:00.001-08:002013-12-18T14:26:46.809-08:00Daniel Kelly, Yuck! The Nature and Moral Significance of Disgust (2011)I have an <strong>aversion</strong> to lima beans. aver·sion, <span class="main-fl"><em>noun</em></span> <span class="pr">\ə-<span class="unicode">ˈ</span>vər-zhən, -shən\</span>
<br />
: a strong feeling of not liking something<br />
<em>a</em> : a feeling of <a class="d_link" href="http://www.merriam-webster.com/dictionary/repugnance">repugnance</a> toward something with a desire to avoid or turn from it &lt;regards drunkenness with aversion&gt;<br />
<em>b</em> : a settled dislike : <a href="http://www.merriam-webster.com/dictionary/antipathy">antipathy</a> &lt;expressed an aversion to parties&gt;<br />
<em>c</em> : a tendency to <a class="d_link" href="http://www.merriam-webster.com/dictionary/extinguish">extinguish</a> a behavior or to avoid a thing or situation and especially a usually pleasurable one because it is or has been associated with a <a class="d_link" href="http://www.merriam-webster.com/dictionary/noxious">noxious</a> stimulus<br />
<br />
Yuck! But is this aversion a form of <strong>disgust</strong>? dis·gust <span class="main-fl"><em>noun</em></span> <span class="pr">\di-<span class="unicode">ˈ</span>skəst, dis-<span class="unicode">ˈ</span>gəst <em>also</em> diz-\</span>
<br />
: a strong feeling of dislike for something that has a very unpleasant appearance, taste, smell, etc.<br />
: annoyance and anger that you feel toward something because it is not good, fair, appropriate, etc.<br />
: marked aversion aroused by something highly distasteful : <a href="http://www.merriam-webster.com/dictionary/repugnance">repugnance</a><br />
<br />
I find lima beans distasteful, noxious. I avoid eating them. Apparently, <a href="http://www.outsideonline.com/outdoor-adventure/Lima-Beans.html" target="_blank">I am not the only one</a>. Years ago as a child, I felt the same way toward other food items, but today only the lima bean remains associated with a noxious stimulus of some kind that I cannot define. I know well that others like lima beans and are not harmed by them, but something triggers certain neurons firing in my brain that creates this reaction to the lima bean. Are lima beans <em>disgusting</em>, by which I intend to describe lima beans as an <a href="http://www.thefreedictionary.com/elicitor" target="_blank">elicitor</a> of disgust? By definition, lima beans are disgusting . . . at least to me. <span style="font-size: small;"><strong>dis·gust·ing</strong>, </span><span class="main-fl"><em>adjective</em></span>
<br />
: so unpleasant to see, smell, taste, consider, etc., that you feel slightly sick<br />
: so bad, unfair, inappropriate, etc., that you feel annoyed and angry<br />
<br />
You see in these definitions of disgust and disgusting two distinct feelings. One centers on an aversion, dislike or repugnance toward something that is distasteful, or suffers from a bad smell or appearance; although the definitions do not elaborate on what is distasteful or smells bad, it is easy to think of something associated with the mouth and ingestion such as rotten food or a poison. The second feeling centers on a dislike for another group or behavior considered annoying or unfair by some standard. In both cases, the consequence of the feeling is likely rejection: rejection of the noxious substance; rejection of the other person or group.<br />
<br />
<a href="https://sites.sas.upenn.edu/rozin" target="_blank">Paul Rozin</a> at the University of Pennsylvania and <a href="http://people.stern.nyu.edu/jhaidt/" target="_blank">Jonathan Haidt</a>, formerly at the University of Virginia and now at NYU, have devoted more attention and research to the subject of disgust than anyone else. In their contribution to the 1999 edition of the <em>Handbook of Cognition and Emotion</em> entitled <em><a href="http://books.google.com/books?id=vsLvrhohXhAC&pg=PA429&lpg=PA429&dq=%22disgust:++The+body+and+soul+emotion%22+and+%22chapter+21%22&source=bl&ots=uRCLamT5Ke&sig=76XoZFc4fyx8iQ7leDlwyrXA9dQ&hl=en&sa=X&ei=FRyuUsa7DMHNsQTutYDoDg&ved=0CD4Q6AEwAg#v=onepage&q=%22disgust%3A%20%20The%20body%20and%20soul%20emotion%22%20and%20%22chapter%2021%22&f=false" target="_blank">Disgust: The Body and Soul Emotion</a></em>, they made several key points and arguments:
<ul>
<li>Distaste and disgust are different. Other animals, particularly other mammals, show that they react to ingesting a distasteful substance by rejecting it. Disgust, on the other hand, is uniquely human because, in addition to some biological rejection of a distasteful or foul-smelling substance, the feeling has a cognitive content that is not elicited by sensory properties. Like other emotions, disgust links together cognitive and bodily responses, which can be analyzed as <em>an affect program</em>, in which outputs (behaviors, expressions, physiological responses) are triggered by inputs (cognitive appraisals or environmental events); see the sketch after this list. While the biological outputs that represent disgust have been reasonably stable among human populations over time, there has been an enormous expansion on the cognitive appraisal side, an expansion that varies with history and culture and takes disgust far beyond its animal precursors and well beyond an aversion to lima beans.</li>
<li>For humans, the elicitors of core disgust are generally of animal origin, although there is research that plants and vegetables can elicit core disgust. </li>
<li>The rejection response is now harnessed to the offensive idea that humans are animals, and thus disgust is part of affirming our unique humanity by suppressing every characteristic that we feel to be 'animal'. They call this <a href="http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.186.6114&rep=rep1&type=pdf" target="_blank">animal nature disgust</a> and distinguish it from core disgust. The cognitive notion here includes associating something deemed disgusting with something labeled impure. </li>
<li>There is also another form of disgust they call interpersonal disgust, which is rejection of persons outside one's social or cultural group. Hindu caste behavior is a prime example, but there are hundreds of other examples, racial, religious, and the like, that are readily recognized.</li>
<li>Finally, there is social-moral disgust, where violations of social norms trigger a feeling of disgust. Not all violations of social norms trigger disgust. Bank robbery, they point out, while viewed as "wrong," does not necessarily trigger disgust or rejection. </li>
</ul>
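<br />
Rozin and Haidt's affect-program framing lends itself to a computational caricature: a small, stable set of outputs wired to an extensible, culturally tuned set of input appraisals. The toy Python sketch below is my own illustration of that framing --- not anything proposed by Kelly or by Rozin and Haidt, and every elicitor and function name in it is invented:<br />
<pre>
# A toy model of the "affect program" framing of disgust: fixed biological
# outputs, an extensible set of culturally acquired input appraisals.
# Purely illustrative; the names and elicitors here are invented.

# The output side has been "reasonably stable among human populations":
DISGUST_OUTPUTS = ["gape face", "nose wrinkle", "nausea", "withdrawal"]

# The input/appraisal side starts with core elicitors...
core_appraisals = {"rotten food", "feces", "vomit"}

def culturally_extend(appraisals, new_elicitor):
    """Culture recruits the same program for new elicitors (animal-nature,
    interpersonal, or social-moral disgust)."""
    return appraisals | {new_elicitor}

def affect_program(stimulus, appraisals):
    """Whatever the (possibly learned) input, the outputs are the same."""
    return DISGUST_OUTPUTS if stimulus in appraisals else []

appraisals = culturally_extend(core_appraisals, "caste transgression")
print(affect_program("rotten food", appraisals))          # core disgust
print(affect_program("caste transgression", appraisals))  # interpersonal disgust
print(affect_program("bank robbery", appraisals))         # "wrong" but not disgusting
</pre>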
<br />
There has been much written in recent years about disgust as a moral emotion and the potential nexus between disgust and moral judgments/ethical norms. <a href="http://www.psychology.mcmaster.ca/3bn3/lecturenotes/papers/BorgEtAl08.pdf" target="_blank">Incest avoidance</a> is one example of "moral" behavior at the heart of this discussion. Avoiding consumption of lima beans is not, and I am reasonably certain that the avoidance of lima beans is not considered a "moral behavior" in any culture or society. <a href="http://en.wikipedia.org/wiki/Marc_Hauser" target="_blank">Marc Hauser</a> described disgust as "the most powerful emotion against sin, especially in the domains of food and sex. . . . In the absence of a disgust response, we might well convince ourselves that it is okay to have sex with a younger sibling or eat vomit, act with deleterious consequences for our reproductive success and survival respectively." Incest avoidance is worthy of study --- in contrast to my aversion to lima beans --- because it is virtually universal across cultures, and is therefore almost unique among things humans generally avoid. <a href="http://rabbilubliner.files.wordpress.com/2011/01/will-they-serve-pork-in-heaven-shemini-5766.pdf" target="_blank">Aversion to eating pork</a>, for example, is not universal, and, as with lima beans, a substantial part of the human population likes eating pork and is not harmed by it. In some cultures, the avoidance of eating pork is considered a "moral behavior." <br />
<br />
Disgust did not begin as a moral or even a social emotion. There is common agreement among those who have studied disgust that the emotion's evolutionary origins lie in response to the ingestion of something: spoiled food, toxins; it later evolved as a response to the presence of disease and parasites, which is referred to as <a href="http://www.sas.upenn.edu/sasalum/newsltr/fall97/rozin.html" target="_blank">core disgust</a>. <a href="http://www.cla.purdue.edu/philosophy/directory/?personid=1330" target="_blank">Daniel Kelly</a> documents this body of research in <a href="http://mitpress.mit.edu/books/yuck" target="_blank"><em>Yuck!</em></a> And a common marker of this emotion is the automatic facial expression called <a href="http://www.psychologytoday.com/blog/yuck/201109/what-we-talk-about-when-we-talk-about-disgust-part-1" target="_blank">face gape</a>. As <a href="http://www.sas.upenn.edu/sasalum/newsltr/fall97/rozin.html" target="_blank">one article</a> explains, "<span style="font-family: inherit;">At its root, disgust is a revulsion response -- "a basic biological motivational system" -- that Darwin associated with the sense of taste. Its function is to reject or discharge offensive-tasting food from the mouth (and/or the stomach), and its fundamental indicator, the "gape" or tongue extension, has been observed in a number of animals, including birds and mammals. In humans, the characteristic facial expressions of disgust that coincide with gaping include nose wrinkling and raising the upper lip, behaviors usually accompanied by a feeling of nausea and a general sense of revulsion. Together these behaviors and sensations facilitate the rejection of food that has been put into the mouth." Evolutionarily, disgust began with distaste, but at some point it adapted in humans to protect against infection by pathogens and parasites. Daniel Kelly explains his thesis that the two responses became "entangled." "Other previously puzzling features of disgust also fall into place once its role in parasite avoidance becomes clear. Together, eating and sex constitute two of the most basic evolutionary imperatives. Both behaviors are unavoidable ingredients of evolutionary success, but both involve the crossing of bodily perimeters at various points. By virtue of this, both activities leave those engaging in them highly vulnerable to infection. The upshot is that disgust's role in monitoring the boundaries of the entire body (rather than just the mouth) makes much more sense in light of its connection to infectious disease. Moreover, both feeding and procreating are activities that require those boundaries to be breached. They are highly salient to disgust both because they are universal and unavoidable and because they are two of the most potent vectors of disease transmission." It is this entanglement of distaste and core disgust that is unique to humans. There is evidence that each is independently found in other species.</span><br />
<br />
Importantly, disgust appears to be activated in the cortex. Kelly identifies the <a href="http://en.wikipedia.org/wiki/Insular_cortex" target="_blank">insula</a>, which is not part of that ancient subcortical system of the forebrain or midbrain that Jaak Panksepp documents is associated with basic emotional systems. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>). The word disgust never appears in Panksepp's book and is not even found in his description of the <a href="http://mybrainnotes.com/brain-fear-autism.html" target="_blank">fear system</a> involving the <a href="http://en.wikipedia.org/wiki/Amygdala" target="_blank">amygdala</a> and the <a href="http://www.pnas.org/content/106/12/4870.full" target="_blank">hypothalamus</a>, which, because <a href="http://science.howstuffworks.com/life/fear.htm" target="_blank">fear stimulates flight</a>, sounds like it might be related to an emotion like disgust that stimulates avoidance. On the other hand, Kelly identifies an area of the forebrain, the putamen, as an area of the brain associated with processing disgust, but the <a href="http://en.wikipedia.org/wiki/Putamen" target="_blank">putamen</a> is <a href="http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1689486/pdf/9821359.pdf" target="_blank">involved in emotional facial recognition</a>, so it may not be something disgusting that activates the putamen, but the recognition of face gape. The putamen is primarily associated with the motor cortex, so the connection to the putamen as an emotional processor is not at all clear. I suspect that the putamen, if it is implicated in disgust, is not part of what Rozin and Haidt refer to as inputs (cognitive appraisals or environmental events), but part of our biological output that results in virtually automatic, nearly uniform facial expressions. <a href="http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1689486/pdf/9821359.pdf" target="_blank">Research on patients with Huntington's Disease</a> would seem to confirm this observation. But the insular cortex is involved in maintaining the homeostatic condition of the body; <a href="http://en.wikipedia.org/wiki/Insular_cortex" target="_blank">it maintains an awareness of various parameters of the body</a>; the insula is also believed to <a href="http://www.effective-mind-control.com/insular-cortex.html" target="_blank">process convergent information to produce an emotionally relevant context for sensory experience</a>. More specifically, the anterior insula is related more to olfactory, gustatory, vicero-autonomic, and limbic function, while the posterior insula is related more to auditory-somesthetic-skeletomotor function. The insula has an important role in <a href="http://en.wikipedia.org/wiki/Pain" title="Pain">pain</a> experience and the experience of a number of basic <a class="mw-redirect" href="http://en.wikipedia.org/wiki/Emotions" title="Emotions">emotions</a>, including <a href="http://en.wikipedia.org/wiki/Anger" title="Anger">anger</a>, <a href="http://en.wikipedia.org/wiki/Fear" title="Fear">fear</a>, <a href="http://en.wikipedia.org/wiki/Disgust" title="Disgust">disgust</a>, <a href="http://en.wikipedia.org/wiki/Happiness" title="Happiness">happiness</a> and <a href="http://en.wikipedia.org/wiki/Sadness" title="Sadness">sadness</a>. If disgust is seated in the insular cortex, this would confirm the significance of cognitive appraisal in producing disgust. 
The insular cortex is a mammalian development, so it evolved later than those ancient emotional systems that Panksepp discusses.<br />
<br />
Morality varies across cultures and even within cultures and smaller social groups. <a href="http://en.wikipedia.org/wiki/Westermarck_effect" target="_blank">Edward Westermarck</a> advanced the thesis that there is an innate aversion to sexual intercourse between persons living very closely together from early youth, and, as applied to persons who are closely related, this generates a feeling of horror of intercourse with close kin. But is this aversion really innate? Or is it learned? Or is something innate triggered because, in this case, one must first experience living closely with someone at an early age? Because there is survival value associated with avoiding <a href="http://en.wikipedia.org/wiki/Inbreeding" target="_blank">inbreeding</a>, there is arguably an evolutionary imperative associated with incest avoidance, which should explain in substantial part why incest avoidance is universal across cultures. Inbreeding reduces the fitness of a given population. Incest avoidance seems to have nothing to do with morals. If it is considered moral behavior, it is only because humans have put that label on a form of behavior that likely predates social norms and morals. If it is considered disgusting, it is only because humans have put that label on it. <br />
<br />
Outside the example of the aversion to inbreeding, interpersonal and social-moral disgust is something that is <em>learned</em>. It is not innate. Rozin observes that up to about two years of age, children show no aversion to primary disgust elicitors such as feces. Toilet training may be the learning event that leads to core disgust. Later in childhood development, an event or object that was "previously morally neutral becomes morally loaded." At this point in the learning process, disgust becomes recruited. Disgust "becomes a major, if not the major force for negative socialization in children; a very effective way to internalize culturally prescribed rejections (perhaps starting with feces) is to make them disgusting." <br />
<br />
Kelly theorizes that disgust migrated from being an emotional response to toxins, parasites and disease to a socially shared emotion because face gape is automatic, and the emotion as revealed in the facial gesture was communicated in a way that was empathetically received. Like other facial communications (see <a href="http://csilcox-thebookshelf.blogspot.com/2010/07/dacher-keltner-born-to-be-good-science.html" target="_blank">July 16, 2010 post</a>), there is a reliable causal link between the production of an emotion and its expression that acts as a signal to avoid. And to the extent that group selection (or more narrowly kin selection) is engaged, shared disgust becomes a survival mechanism for the group to avoid toxins, parasites and disease. Kelly posits that the genetic underpinnings of the neural correlates of the emotion and gape face were recruited by human culture to perform several novel functions. And what the emotion qua emotion of disgust has in common with the socially shared emotion is that the object of the emotion's attention is distasteful and/or impure. As a cultural phenomenon, the socially shared emotion becomes connected to social, moral or ethical norms, group identity and avoidance of others. Culture then labels something to be avoided or averted as disgusting. <br />
<br />
Social or ethical norms are not necessarily social or ethical norms because of a common emotional stimulus like disgust, although they could be; common aversions to foods and attitudes toward sexual behaviors within a culture are a few examples that come to mind. Because most social norms are learned and not instinctual, interpersonal and social-moral disgust is little more than a social label for one's attitude toward behavioral transgressions of group rules. Interestingly, the label may not be shared by all within the group. Consider, for example, how societal attitudes toward homosexuality --- a behavior that large segments of many <a href="http://www.yale.edu/minddevlab/papers/DS_Gay.pdf" target="_blank">societies consider "disgusting"</a> --- are rapidly changing.<br />
<br />
Kelly concludes that disgust is "far from being a reliable source of special, supra-rational information about morality" and that we should be extremely skeptical of claims that disgust deserves any "epistemic credit" as a trustworthy guide to justifiable moral judgments or deep ethical wisdom in repugnance. Justifying a moral or ethical rule on disgust can easily slide into dehumanization and demonization, which is itself problematic. There is a ready and recent example that highlights Kelly's concern in <a href="http://www.cnn.com/2013/12/12/world/asia/north-korea-uncle-executed/index.html" target="_blank">today's news</a> that the youthful leader of North Korea had his elder uncle, also in the leadership of North Korea's government, executed for treason. In language that recalls Rozin and Haidt's discussion of animal-nature disgust, the uncle was labeled "despicable human scum" and "worse than a dog," and was said to have betrayed his party and leader. In other words --- particularly with the association to animals and scum (certainly there are parasites and disease in scum) --- the uncle was disgusting. His purported "disgustingness" became a <em>post hoc</em> justification for his execution.<br />
<br />
This entire discussion implicates the relationship between genes and culture, and Kelly devotes an entire chapter to "gene culture coevolution," sometimes referred to as the <a href="http://evolutionwiki.org/wiki/Dual_inheritance_theory" target="_blank">dual inheritance theory</a> (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2010/06/frans-de-waal-ape-and-sushi-master.html" target="_blank">June 17, 2010</a> posts), and its application to the evolution of disgust. Rozin and Haidt concur: "The interaction between biology and culture is clear, because the output side of disgust remains largely ruled by biological forces that originally shaped it, while the input/appraisal/meaning part has been greatly elaborated, and perhaps transformed in some cases."CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-56478961684119212252013-11-27T17:05:00.002-08:002013-11-27T17:07:32.616-08:00Jose Saramago, The Lives of Things (2013, 1995)At a very early age, children learn the difference between artifacts and living things: the inanimate and the animate. In fact, one of the things that the human mind does quite well from an early age is to categorize things. These are often referred to as ontological categories. Children recognize intentionality in animals, and they recognize that artifacts lack intentionality. The animate are characterized by motion and by their ability to communicate in some capacity. Artifacts do not move on their own, nor do they have an ability to communicate. Very early in life humans develop certain expectations about these categories. Yet there are some adults who come to believe that they can talk to artifacts and that the artifacts can listen to them and perhaps respond as if they were living things. Religious icons are an example of these artifacts. When we believe that icons can hear us speak and respond in some way, this violates our expectations of the inanimate. When those violations occur, we have entered the realm of the supernatural, the paranormal. Why does this happen?<br />
<br />
In <a href="http://www.amazon.com/Religion-Explained-Evolutionary-Origins-Religious/dp/0465006965" target="_blank"><em>Religion Explained</em></a>, anthropologist <a href="http://artsci.wustl.edu/~pboyer/PBoyerHomeSite/index.html" target="_blank">Pascal Boyer</a> poses a number of supernatural notions --- many of which are linked to a religious idea associated with a particular religion (not just the predominant religions), and some others that he just makes up --- and what he is interested in is determining whether the listener (or reader) can say that a particular religion has been built up around the idea. They range from "Some people get old and then one day they stop breathing and die and that's that" to "There is one God! He knows everything we do" to "Dead people's souls wander about and sometimes visit people" to "When people die, their souls sometimes come back in another body" to "We worship this woman because she was the only one ever to conceive a child without having sex" to "We pray to this statue because it listens to our prayers and helps us get what we want" to "This mountain over there eats food and digests it. We give it food sacrifices every now and then, to make sure it stays in good health" to "The river over there is our guardian. It will flow upstream if it finds out that people have committed incest." Obviously, the first does not seem like a notion that a religion would build up around because it is part of our everyday experience. But we should recognize a religious affiliation with the others. <br />
<br />
Religious representations, says Boyer, are particular combinations of mental representations that satisfy two conditions. "First, the religious concepts violate certain expectations from ontological categories. Second, they preserve other expectations." A very frequent type of counterintuitive concept is produced by assuming that various objects or plants have some mental properties, that they can perceive what happens around them, understand what people say, remember what happened, and have intentions. A familiar example of this would be that of people who pray to statues of gods, saints or heroes. Not just artifacts but also plants and other natural things can be "animated" in this sense. Boyer reports, "The pygmies of the Ituri forest for instance say that the forest is a live thing, that it has a soul, that it "looks after" them and is particularly generous to sociable, friendly and honest individuals. These will catch plenty of game because the forest is pleased with their behavior."<br />
<br />
What Boyer is getting at is quite consistent with Jeff Hawkins' model of the cortex. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/11/jeff-hawkins-on-intelligence-2004.html" target="_blank">November 16, 2013 post</a>). Boyer describes these as inference systems. We quickly make inferences about something we experience derived from higher level categories --- object vs animal vs plant, for example. In Hawkins' model, our memory of these higher level categories, "concepts," is retained in the higher cortical areas. Below the higher cortical areas, the synapses of the hierarchical structure of the cortex described in <em>On Intelligence</em> contain our memories of narrower categories of more specific objects, animals, and plants and their respective attributes. The hierarchical structure of the cortex described by Hawkins resembles the structure of taxonomy. Taxonomy is a "powerful logical device that is intuitively used by humans in producing intuitive expectations about living things. People use the specific inference system of intuitive biological knowledge to add to the information given." But why then does the brain persistently retain memories of non-real --- supernatural --- concepts?<br />
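<br />
Both Boyer's inference systems and Hawkins' hierarchy amount to a claim about default inheritance down a taxonomy, and a counterintuitive (supernatural) concept is then simply a category with one or two of its inherited defaults overridden. Here is a deliberately crude sketch of that idea --- my own illustration, with the taxonomy, attributes, and function names all invented for the purpose:<br />
<pre>
# Toy illustration of inference from ontological categories, in the spirit
# of Boyer's inference systems; the taxonomy and attributes are invented.

TAXONOMY = {
    "thing":    {"parent": None,       "expects": {"occupies space"}},
    "artifact": {"parent": "thing",    "expects": {"made by humans", "no intentions"}},
    "animal":   {"parent": "thing",    "expects": {"moves on its own", "has intentions", "dies"}},
    "statue":   {"parent": "artifact", "expects": set()},
    "god":      {"parent": "animal",   "expects": set()},  # the anthropomorphic template
}

def inherited_expectations(category):
    """Walk up the taxonomy, accumulating the default expectations."""
    expects = set()
    while category is not None:
        node = TAXONOMY[category]
        expects |= node["expects"]
        category = node["parent"]
    return expects

def counterintuitive_concept(category, violated):
    """A Boyer-style supernatural concept: inherit every default expectation
    of the category, then violate one or two of them."""
    kept = inherited_expectations(category) - violated
    return kept | {"violates: " + v for v in violated}

# A statue that listens violates the artifact default "no intentions":
print(counterintuitive_concept("statue", {"no intentions"}))
# A god who lives forever violates the animal default "dies":
print(counterintuitive_concept("god", {"dies"}))
</pre>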
<br />
Biological inferences are not always valid, Boyer admits. He refers to this as the enrichment of intuitive principles. <br />
<br />
In describing his research about supernatural concepts, Boyer writes, "Our reasoning was that the present explanation of supernatural concepts, on the basis of what we know from anthropology, also implied precise psychological predictions. Cultural concepts are selected concepts. They are the ones that survive cycles of acquisition and communication in roughly similar forms. Now one simple condition of such relative preservation is that concepts are <em>recalled. </em>So [we] designed fairly coherent stories in which we inserted various new violations of ontological expectations as well as episodes that were compatible with ontological expectations. The difference in recall between the two kinds of information would give us an idea of the advantage of violations in individual memory. Naturally, we only used stories and concepts that were new to our subjects. If I told you a story about a character with seven-league boots or a talking wolf disguised as a grandmother, or a woman who gave birth to an incarnation of a god after a visit from an angel, you would certainly remember those themes; not just because they were in the story but also because they were familiar to start with. Our studies were supposed to track how memory stores or distorts or discards <em>novel </em>material." Boyer's research showed that "long-term recall (over months) shows that violations [of expectations] were much better preserved [in memory] than any other material." It is not strangeness per se that is preserved; it has to be a violation of an ontological category. So among the expected attributes of the ontological categories of living animals is that they die. Anthropocentric gods who purportedly live forever violate expectations associated with our ontological category of living things; they are therefore more likely to be preserved in memory. Likewise for statues that talk and listen and respond to human speech or thinking. This is true across cultures; only the details on top of the ontological/conceptual violations vary. <br />
<br />
So concepts that violate our expectations of reality stick in memory. This does not sound particularly surprising. A previous post, not surprisingly connected to a <a href="http://www.nobelprize.org/nobel_prizes/literature/laureates/1998/saramago-bio.html" target="_blank">Jose Saramago</a> publication, emphasized the relationship between storytelling and memory (<a href="http://csilcox-thebookshelf.blogspot.com/2013/02/jose-saramago-small-memories-2009.html" target="_blank">February 26, 2013 post</a>): "This is not the first time that a post in this blog has connected Saramago's work with the subject of memory. In <em><a href="http://www.amazon.com/The-Notebook-Daniel-Hahn/dp/184467701X" target="_blank">The Notebook</a></em> (<a href="http://csilcox-thebookshelf.blogspot.com/2010/09/jose-saramago-notebook-2010.html" target="_blank"><span style="color: #473624;">September 28, 2010 post</span></a>), the Nobelist created a memory bank in blog form. In the posting on his final novel, <em><a href="http://www.amazon.com/Cain-Jose-Saramago/dp/B00CF5QQ2Y" target="_blank">Cain</a></em> (<a href="http://csilcox-thebookshelf.blogspot.com/2011/12/jose-saramago-cain-2011.html" target="_blank"><span style="color: #473624;">December 20, 2011 post</span></a>) I remarked, "I also believe storytelling evolved in part to preserve our memories of things past. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/08/james-gleick-information-2011.html"><span style="color: #473624;">August 15, 2011 post</span></a>). And storytelling, whether historical or fictional or both, enables the construction of both personal and social/group identity." And Saramago is a master at clutching collective memory --- history we call it --- and creating stories --- fiction we call it --- as in <em><a href="http://www.amazon.com/Year-Death-Ricardo-Reis/dp/0156996936" target="_blank">The Year of the Death of Ricardo Reiss</a></em> (<a href="http://csilcox-thebookshelf.blogspot.com/2011/06/jose-saramago-year-of-death-of-ricardo.html" target="_blank"><span style="color: #473624;">June 28, 2011 post</span></a>) and <em><a href="http://www.amazon.com/Baltasar-Blimunda-Jose-Saramago/dp/0156005204" target="_blank">Baltasar and Blimunda</a></em> (<a href="http://csilcox-thebookshelf.blogspot.com/2013/01/jose-saramago-baltasar-and-blimunda-1982.html" target="_blank"><span style="color: #473624;">January 1, 2013 post</span></a>)." <br />
<br />
This finally brings us round to Saramago's collection of six early short stories, <a href="http://www.amazon.com/The-Lives-Things-Jose-Saramago/dp/1844678784" target="_blank"><em>The Lives of Things</em></a><em>. </em>These are stories that stick in memory. Three of the short stories are constructed around artifacts --- "things," a centaur, and a chair that topples an oppressive dictator --- that violate our expectations of the ontological category of these objects. A few sentences from the story "Things" triggered my memory of Pascal Boyer's research:<br />
<br />
"There was a time when the manufacturing process had reached such a degree of perfection and faults became so rare that the Government (G) decided there was little point in depriving members of the public (especially those in categories A, B, and C) of their civil right and pleasure to lodge complaints: a wise decision which could only benefit the manufacturing industry. So factories were instructed to lower their standards. This decision, however, could not be blamed for the poor quality of the goods which had been flooding the market for the last two months. As someone employed at the Department of Special Requisitions (DSR), he was in a good position to know that the Government had revoked these instructions more than a month ago and imposed new standards to ensure maximum quality. Without achieving any results. As far as he could remember, the incident with the door was certainly the most disturbing. It was not a case of some object or other, or some simple utensil, or even a piece of furniture, such as the settee in the entrance-hall, but of an item of imposing dimensions; although the settee was anything but small. However it formed part of the interior of furnishings, while the door was an integral part of the building, if not the most important part."<br />
<br />
The "incident with the door" occurs as the story opens, virtually ordinary in the way Saramago describes it: "As it closed, the tall heavy door caught the back of the civil servant's right hand and left a deep scratch, red by scarcely bleeding." The civil servant decides to have this small wound treated at office infirmary and when explaining to the nurse how he came to be scratched, the nurse responds that this is the third such case that day. This incident presages a wider series of incidents that pits objects against living things. What unfolds is a revolution of objects against living people --- presumably in response to the Government's policy that manufactured goods could be made to lower quality standards; as the title of this volume implies, <em>things (objects) come alive. </em>Ordinary useful everyday objects begin to disappear: a pillar box, a jug, doors, stairs, utensils, clothes, and ultimately entire buildings and blocks of buildings. Nobody sees anyone taking or removing these objects; they seem to disappear of their own volition when no is watching, in the dark. This tale violates our expectations (inferences) associated with the concept of artifacts (objects). Things do not have volition, intentionality. But the tale is likely to be preserved in memory a bit longer than had Saramago not presented these "things" as a metaphor for people whom the government/society's power structure treated as objects that could be made to lower quality standards. <br />
<br />
There are those who contend that a god or other supernatural being (omniscient, eternal) must exist and that our beliefs in these supernatural beings (as well as religion in general) are genetically hardwired. (See <a href="http://csilcox-thebookshelf.blogspot.com/2009/11/richard-powers-generosity-2009.html" target="_blank">November 30, 2009 post</a>). In contrast, I suggest that what is genetically hardwired is our brain's disposition to not discard, or discard only with significant cognitive effort, concepts that violate our expectations of reality. Hence, beliefs in the ability of artifacts and imaginary unseen things to engage in behavior that we associate with living things simply stick, and they are rendered stickier when human cultural institutions reinforce those beliefs or concepts. This is beginning to sound like gene-culture co-evolution, or dual inheritance theory. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2010/06/frans-de-waal-ape-and-sushi-master.html" target="_blank">June 17, 2010</a> posts).<br />
<br />
What I have not been able to understand or reconstruct yet is why the sensory inputs that constantly bombard the lower levels of the cortex during our encounters with the physical world do not overwhelm the sticky, expectation-violating concepts lodged in higher cortical areas. Boyer describes humans as information hungry, in need of cooperation from other humans including the provision of information by other humans, and this cognitive niche is our milieu, our environment. We have a taste for gossip, information about other humans. Because humans are in need of cooperation, they have developed mechanisms for social exchange resulting in the formation of groups and coalitional dynamics. But the human mind is not constrained to consider and represent only what is currently going on in the immediate environment. The human brain spends a considerable amount of time thinking about "what is not here and now." Fiction --- a Jose Saramago story --- is the most salient illustration, says Boyer. "One of the easiest things for human minds to do is to produce inferences on the basis of false premises." Thus "thoughts are decoupled from their standard inputs and outputs." (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/jose-saramago-year-of-death-of-ricardo.html" target="_blank">June 28, 2011 post</a>). "Decoupled cognition," writes Boyer, "is crucial to human cognition because we depend so much on information communicated by others and on cooperation with others. To evaluate information provided by others you must build some mental simulation of what they describe. Also, we would not carry out complex hunting expeditions, tool making, food gathering or social exchange without complex planning. The latter requires an evaluation of several different scenarios, each of which is based on nonfactual premises (What if we go down to the valley to gather fruit? What if there is none down there? What if other members of the group decide to go elsewhere? What if my neighbor steals my tools? and so on). Thinking about the past too requires decoupling. As psychologist Endel Tulving points out, episodic memory is a form of mental "time travel" allowing us to re-experience the effects of a particular scene on us. This is used in particular to assess other people's behavior, to reevaluate their character, to provide a new description of our own behavior and its consequences and for similar purposes. . . . The crucial point to remember about decoupled thoughts," says Boyer, "is that they run the inference systems in the same way as if the situation were actual. This is why we can produce coherent and useful inferences on the basis of imagined premises. . . . Hypothetical scenarios suspend one aspect of actual situations but then run all inference systems in the same way as usual."<br />
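<br />
Boyer's decoupling is easy to caricature in code, and the caricature may make the point concrete: one and the same inference routine runs whether its premises are observed facts or imagined ones. The sketch below is entirely my own toy construction --- the rules and scenarios are invented stand-ins, not anything Boyer proposes:<br />
<br />
<pre>
# A toy illustration of "decoupled cognition": the same inference
# routine runs on factual and on imagined (counterfactual) premises.
# All rules and scenarios here are invented for illustration.

def infer(premises):
    """Derive simple consequences from a set of premise strings."""
    conclusions = set(premises)
    rules = {
        "we go to the valley": "we may find fruit",
        "we may find fruit": "we eat tonight",
        "neighbor steals my tools": "I cannot build a shelter",
    }
    changed = True
    while changed:                      # keep applying rules to a fixpoint
        changed = False
        for cause, effect in rules.items():
            if cause in conclusions and effect not in conclusions:
                conclusions.add(effect)
                changed = True
    return conclusions - set(premises)

# The same system runs "coupled" (on what is actually happening) ...
print(infer({"we go to the valley"}))
# ... and "decoupled" (on a what-if that is not happening now).
print(infer({"neighbor steals my tools"}))
</pre>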
<br />
Thus fantasy --- which includes not only stories like those in Saramago's <em>The Lives of Things</em>, but also religious stories that violate our inferred expectations of reality --- succeeds to the extent that it activates the same inference systems of the brain that are used in navigating reality. "Religious concepts," concludes Boyer, "constitute salient cognitive artifacts whose successful cultural transmission depends upon the fact that they activate our inference systems in particular ways. The reason religion can become much more serious and important [is that it activates] the inference systems that are of vital importance to us: those that govern our most intense emotions, sharpen our interaction with other people, give us moral feelings, and organize social groups. . . . Religious concepts are supernatural concepts that matter; they are practical." Boyer summarizes: "What is 'important' to human beings, because of their evolutionary history, are the conditions of social interaction: who knows what, who is not aware of what, who did what with whom, when and what for. Imagining agents with that information is an illustration of mental processes driven by relevance. Such agents are not really necessary to explain anything, but they are so much easier to represent and so much richer in possible inferences that they enjoy a great advantage in cultural transmission." So the fantastic that violates ontological expectations is not just sticky in our memory for that reason, but because of the facility by which the fantastic is culturally transmitted, its stickiness is enhanced as a matter of group memory and supports the human need for cooperation and social interaction. Will Saramago's stories of the fantastic --- for example, the Iberian peninsula breaking away from Europe and floating out to the Atlantic Ocean in <em>The Stone Raft</em>, a community's near-complete loss of sightedness in <em>Blindness</em>, death taking a holiday in <em>Death with Interruptions</em> --- enjoy a great advantage in cultural transmission? Probably not. These stories are labeled fiction, and we understand the stories as fiction. They likely exploit the same inference systems in the brain that we rely upon to experience and navigate the real world. They may be memorable, but cultural exchange --- save the ephemeral book club meeting --- is not likely to be constructed around these stories. This is likely a result of what John Searle refers to when he mentions "the sheer growth of certain, objective, universal knowledge." (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/01/john-searle-philosophy-in-new-century.html" target="_blank">January 21, 2011 post</a>). Cultural transmission of religious concepts began long before there was any volume of certain, objective, universal knowledge. Religious concepts now compete with objective, universal knowledge for our attention, as evidenced by the Dover, Pennsylvania litigation over teaching "intelligent design" in the schools. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/03/edward-humes-monkey-girl-evolution.html" target="_blank">March 14, 2013 post</a>). 
And while our brain's capacity to retain and rely upon decoupled concepts endures, it might not have emerged in our modern cognitive niche had it not already emerged centuries ago, when we lacked substantial certain, objective, universal knowledge.CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-48866275591441445782013-11-16T17:41:00.002-08:002013-11-20T17:26:51.858-08:00Jeff Hawkins, On Intelligence (2004)The scenario described in the previous post (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/10/michael-tomasello-origins-of-human.html" target="_blank">October 26, 2013 post</a>) of the mass of commuter humanity changing trains in a crowded subway station, silently cooperating to avoid colliding with one another as they cross paths, was intended to introduce the subject of humans reading and understanding the intentions of others as a foundation of human cooperative activity. But there is another characteristic of the human brain besides mindreading that supports this outcome: the human brain constantly anticipates, predicts the future. In this scenario, it predicts (perhaps not perfectly) the immediate behavior of others: where they are directing their motion, where they are turning, whether they are accelerating or slowing down. <a href="http://en.wikipedia.org/wiki/Jeff_Hawkins" target="_blank">Jeff Hawkins</a> labels this <em>intelligence: </em>how the brain predicts behavior and future events is the subject of <em><a href="http://www.amazon.com/Jeff-Hawkins/e/B001KHNZ7C" target="_blank">On Intelligence</a></em>. Hawkins' interest is in understanding human intelligence to build a foundation for improved machine intelligence. The focus of his inquiry is the <a href="http://www.sciencedaily.com/articles/n/neocortex.htm" target="_blank">neocortex</a>, the outermost layers of the human brain, and <a href="http://science.howstuffworks.com/life/inside-the-mind/human-brain/human-memory.htm" target="_blank">memory</a>. What Hawkins offers up is the <a href="http://en.wikipedia.org/wiki/Memory-prediction_framework" target="_blank">memory-prediction framework</a> of intelligence. This differs from a <a href="http://210.101.116.28/W_files/kiss61/1r300130_pv.pdf" target="_blank">computational framework</a>.<br />
<br />
Hawkins is not out to explain what makes us human (compare <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/michael-gazzaniga-human-2008.html" target="_blank">September 27, 2009 post</a>). Nor is he out to explain human consciousness (compare <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html" target="_blank">April 8, 2011 post</a>). But he does briefly touch on these matters. Previous posts in the blog address human imagination and creativity as a hallmark of what makes us "human" (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/11/rita-carter-mapping-mind-rev-2010.html" target="_blank">November 6, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/05/robert-graves-king-jesus-novel-1946.html" target="_blank">May 22, 2011 post</a>), and Hawkins presents a model, discussed below, about the role of the neocortex in imagination, including imagination by false analogy. What he does not touch on is the role of the brain in generating and controlling emotions, the subject of Jaak Panksepp's research (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>), which naturally links to the origins of the moral and social aspects of what makes us human. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>). So while Hawkins does connect the neocortex and thalamus within his memory-prediction framework (see below), he does not elaborate upon the role of the large <a href="http://www.scholarpedia.org/article/Models_of_thalamocortical_system" target="_blank">thalamo-cortical system</a> residing in the human brain, which plays a substantial role in what makes us human and in the <a href="http://willcov.com/bio-consciousness/front/Thalamocortical%20system.htm" target="_blank">biological basis of human consciousness</a>. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html" target="_blank">April 8, 2011 post</a>).<br />
<br />
Prior posts identify the critical role of the <a href="http://en.wikipedia.org/wiki/Hippocampus" target="_blank">hippocampus</a> in <a href="http://people.uncw.edu/galizio/rapfiles/aces/Papers/FortinEichenbaum2002.pdf" target="_blank">memory formation</a>, but ultimately long-term memory is shifted to the <a href="http://en.wikipedia.org/wiki/Cerebral_cortex" target="_blank">cerebral cortex</a> through a process known as <a href="http://inside.bard.edu/~luka/documents/sleepconsolidation.pdf" target="_blank">consolidation</a> that occurs during sleep. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/09/rodrigo-q-quiroga-borges-and-memory-2012.html" target="_blank">September 10, 2013</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/11/rita-carter-mapping-mind-rev-2010.html" target="_blank">November 6, 2011 posts</a>). As a prior post described: "Memories are distributed in the same parts of the brain that encoded the original experience. So sounds are found in the <a href="http://www.ncbi.nlm.nih.gov/books/NBK10900/">auditory cortex</a>, taste and skin sensory memories are found in the <a href="http://en.wikipedia.org/wiki/Somatosensory_system">somatosensory cortex</a>, and sight in the <a href="http://en.wikipedia.org/wiki/Visual_cortex">visual cortex</a>. But procedural --- "how to" --- memories are stored outside of the cortex, in the <a href="http://en.wikipedia.org/wiki/Cerebellum">cerebellum</a> and <a href="http://en.wikipedia.org/wiki/Putamen">putamen</a>, and fear memories are stored in the <a href="http://en.wikipedia.org/wiki/Amygdala">amygdala</a>." Hawkins' thesis is that the cortex is critical to the human capacity to predict events because of the linkage to memory storage in the cortex. In focusing on the neocortex, Hawkins is looking at, evolutionarily speaking, the most recent adaptation in the development of animal neurological systems. The neocortex is unique to mammals, and the human neocortex is larger than the neocortex of any other mammal, facts that suggest the human neocortex is critical to understanding what makes us human. This is just the opposite of Jaak Panksepp's focus on the older parts of the brain, the brain stem and the midbrain. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>). It is not as though Hawkins believes these older parts of the brain are irrelevant to human behavior. "First," Hawkins says, "the human mind is created not only by the neocortex but also by the emotional systems of the old brain and by the complexity of the human body. To be human you need all of your biological machinery, not just a cortex." But Hawkins is ultimately interested in the creation of an intelligent machine, and he believes that in the pursuit of that interest he needs to understand what makes humans "intelligent." He finds that understanding in how the neocortex is structured and proposes a model for how it operates to predict future events. <br />
<br />
Hawkins' model is based on our current knowledge of the structure of the neocortex. That much is known. And here is a graphical representation of that structure:<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiM_f2zjAP3bjNgH4EEXHuaqoVQHEAmHFyxgq9rwAo5f4WoKR0_OTjqZlf8Q0ewlMysmKSoIEPuJHQBblsojIgijrzh99HgdAblHf5b-R5EtosmVeG34T5995_SxfdN-0DWRq42569AV-Xb/s1600/Picture1.png" imageanchor="1"><img border="0" height="303" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiM_f2zjAP3bjNgH4EEXHuaqoVQHEAmHFyxgq9rwAo5f4WoKR0_OTjqZlf8Q0ewlMysmKSoIEPuJHQBblsojIgijrzh99HgdAblHf5b-R5EtosmVeG34T5995_SxfdN-0DWRq42569AV-Xb/s320/Picture1.png" width="320" /></a></div>
<strong><span style="font-family: Arial, Helvetica, sans-serif;">SENSORY INPUT</span></strong><br />
<span style="font-family: inherit;"></span><br />
<span style="font-family: inherit;">Each region of the neocortex is known to consist of four areas, labeled 1, 2, 4 and IT. The graph above represents those four layers, with IT at the top and 4, 2, and 1 below it for one of the regions of the cortex (visual, auditory, somatosensory, motor). The visual cortex layers are labeled, from bottom to top, V1, V2, V4, and IT; the auditory cortex layers labeled A1, A2, A4 and IT; the somatosensory (touch) cortex layers labeled S1, S2, S4 and IT, and similarly for the motor cortex. The arrows are pointed in both directions, indicating that information moves in both directions between the areas. </span><br />
<br />
Neurons fire in a specific pattern in response to a specific sensory stimulus. For the exact same sensory stimulus, the same neurons will fire in the same pattern within this hierarchy. For a different sensory stimulus, different neurons will fire in a different pattern. The brain's capacity to recognize (predict) these patterns is at the heart of memory.<br />
<br />
Recall the discussion in connection with Rodrigo Quiroga's book, <em><a href="http://www.amazon.com/Borges-Memory-Encounters-Human-Brain/dp/0262018217" target="_blank">Borges and Memory</a> </em>(<a href="http://csilcox-thebookshelf.blogspot.com/2013/09/rodrigo-q-quiroga-borges-and-memory-2012.html" target="_blank">September 10, 2013 post</a>): "Each neuron in the retina responds to a particular point, and we can infer the outline of a cube starting from the activity of about thirty of them [retinal neurons]. Next the neurons in the primary visual cortex fire in response to oriented lines; fewer neurons are involved and yet the cube is more clearly seen. This information is received in turn by neurons in higher visual areas, which are triggered by more complex patterns --- for example, the angles defined by the crossing of two or three lines. . . As the processing of visual information progresses through different brain areas, the information represented by each neuron becomes more complex, and at the same time fewer neurons are needed to encode a given stimulus." The arrows representing sensory input from the retinal neurons are the arrows pointing to area V1 of the visual cortex. A particular pattern of neurons firing in V1 leads neurons in V2 to fire, and so on all the way up to IT. As just noted, in each higher area "fewer neurons are involved." In V1, the cells are spatially specific, tiny feature-recognition cells that fire infrequently depending on which of the millions of retinal neurons are providing sensory input; in IT, at the top, the cells are constantly firing, spatially non-specific, object-recognition cells. One way of thinking about this is that certain neurons in V1 fired in recognition of two ears, a nose, two eyes, and perhaps even more details like the texture of skin, facial hair, the color of hair; neurons in IT fired in recognition of an entire head or face. Cells in IT encode for categories; Hawkins calls them "<a href="http://www.nature.com/nature/journal/v435/n7045/full/nature03687.html" target="_blank">invariant representations</a>." In philosophy, these invariant representations might be analogous to <a href="http://en.wikipedia.org/wiki/Theory_of_Forms" target="_blank">Plato's forms</a>. It is here one would find neurons firing in response to things --- rocks, platypuses, your house, a song, Jennifer Aniston or Bill Clinton. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/09/rodrigo-q-quiroga-borges-and-memory-2012.html" target="_blank">September 10, 2013 post</a>). <br />
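<br />
This progressive convergence --- many feature-specific cells at the bottom, a few invariant cells at the top --- can be caricatured in a few lines of Python. The feature names and their groupings below are invented for illustration; they come from neither Quiroga nor Hawkins:<br />
<br />
<pre>
# A minimal sketch of hierarchical convergence: each cortical area
# fires on conjunctions of features from the area below, so ever
# fewer, more abstract units represent the same stimulus.
# The feature names and groupings are invented for illustration.

V1 = {"edge-left-eye", "edge-right-eye", "curve-nose", "curve-mouth"}

def detect(active, conjunctions):
    """An area's units fire when all of their lower-level inputs fire."""
    return {unit for unit, inputs in conjunctions.items()
            if inputs.issubset(active)}

V2_conj = {"eye-pair": {"edge-left-eye", "edge-right-eye"},
           "nose-mouth": {"curve-nose", "curve-mouth"}}
IT_conj = {"face": {"eye-pair", "nose-mouth"}}   # an "invariant representation"

V2 = detect(V1, V2_conj)   # four active units reduced to two
IT = detect(V2, IT_conj)   # two reduced to one: "face"
print(V2, IT)
</pre>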
<br />
Psychologists recognize the same phenomenon, although in different terms. <a href="http://www.amazon.com/Descartes-Baby-Science-Development-Explains/dp/0465007864" target="_blank">Paul Bloom</a> asserts that humans are "splitters" and "lumpers," but for the most part we are lumpers. Borges' Funes was a splitter. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/09/rodrigo-q-quiroga-borges-and-memory-2012.html" target="_blank">September 10, 2013 post</a>). "Our minds have evolved," Bloom says, "to put things into categories and to ignore or downplay what makes these things distinct. Some categories are more obvious than others: all children understand the categories chairs and tigers; only scientists are comfortable with categories such as ungulates and quarks. What all categories share is that they capture a potential infinity of individuals under a single perspective. They lump." Bloom says, "We lump the world into categories so that we can learn." He adds, "A perfect memory, one that treats each experience as a distinct thing-in-itself, is useless. The whole point of storing the past is to make sense of the present and to plan for the future. Without categories [or concepts], everything is perfectly different from everything else, and nothing can be generalized or learned." <br />
<br />
The neocortex consists of six horizontal layers of cells (I-VI), together roughly 2 mm thick (shown below for area V1 of the visual cortex). The cells within each layer are aligned in columns perpendicular to the layers. The layers in each column are connected via <a href="http://en.wikipedia.org/wiki/Axon" target="_blank"><span style="font-family: inherit;">axons</span></a><span style="font-family: inherit;">, making </span><a href="http://en.wikipedia.org/wiki/Synapses" target="_blank"><span style="font-family: inherit;">synapses</span></a><span style="font-family: inherit;"> along the way. "Columns do not stand out like neat little pillars," explains Hawkins, "nothing in the cortex is that simple, but their existence can be inferred from several lines of evidence." Vertically aligned cells tend to become active for the same stimulus. </span><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUHifrwT3uyjabssB9P7RWue9xflM1Eszz0a6wkqG2KhSU8ZfOBtVj-uEl3bjajay2TPpWYy7A8AuSIlRQrdJpEwZkPBX6omObBs_LR6STjZcGakecPxz1l61qo3lAGfwiVmUfL6kHxNzo/s1600/114_Hypercolumns.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="307" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUHifrwT3uyjabssB9P7RWue9xflM1Eszz0a6wkqG2KhSU8ZfOBtVj-uEl3bjajay2TPpWYy7A8AuSIlRQrdJpEwZkPBX6omObBs_LR6STjZcGakecPxz1l61qo3lAGfwiVmUfL6kHxNzo/s400/114_Hypercolumns.png" width="400" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiU01l8jwaR2P1e_tOHMMCADfBbhOGl5rDnXaJmeWtJZxBXHzro3JDScym3EyrknxT_8WLmPA7JYYtjSs1hMRdSBJtH8sDGJF5qWmpBebQNLpyQcJjUhyHTBaPLtaMXxifZ9-vbPgtay69s/s1600/imagesCAKKNYR4.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"></a> </div>
<br />
Again, as was the case with the different areas of a region of the cortex, information is moving both up and down the layers of a given area. Inputs move up the columns; memories move down the columns. "When you begin to realize that the cortex's core function is to make predictions, then you have to put feedback into the model; the brain has to send information flowing back toward the region that first receives inputs. Prediction requires a comparison between what is happening and what you expect to happen. What is actually happening flows up, and what you expect to happen flows down."<br />
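<br />
A minimal sketch of that two-way traffic, with invented transitions standing in for learned memories: each region tries to account for the current input with a stored prediction, a confirmed prediction is absorbed quietly, and only unanticipated input climbs toward the top of the hierarchy:<br />
<br />
<pre>
# A sketch of "what is actually happening flows up, and what you
# expect to happen flows down." Each region remembers transitions it
# has learned and predicts the next input; a confirmed prediction is
# handled locally, a violation climbs to the next region.
# The learned transitions here are invented toy data.

learned = {
    "V1": {"edge": "corner"},      # after an edge, expect a corner
    "V2": {"corner": "square"},
    "V4": {"square": "box"},
}

def perceive(region_order, inputs):
    prev = None
    for current in inputs:
        for region in region_order:
            prediction = learned[region].get(prev)
            if prediction == current:
                print(region, "predicted", current, "- no surprise")
                break
        else:   # no region anticipated this input
            print("no region predicted", current, "- novelty reaches IT")
        prev = current

perceive(["V1", "V2", "V4"], ["edge", "corner", "blob"])
</pre>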
<span style="font-family: inherit;"> </span><br />
Memories are stored in this hierarchical structure. "The design of the cortex and the method by which it learns naturally discover the hierarchical relationships in the world. You are not born with knowledge of language, houses, or music. The cortex has a clever learning algorithm that naturally finds whatever hierarchical structure exists and captures it. When structure is absent, we are thrown into confusion, even chaos. *** You can only experience a subset of the world at any moment in time. You can only be in one room of your home, looking in one direction. Because of the hierarchy of the cortex, you are able to know that you are at home, in your living room, looking at a window, even though at that moment your eyes happened to be fixated on a window latch. Higher regions of cortex are maintaining a representation of your home, while lower regions are representing rooms, and still lower regions are looking at a window. Similarly, the hierarchy allows you to know you are listening to both a song and album of music, even though at any point in time you are hearing only one note, which on its own tells you next to nothing." Critical to this capability is the brain's ability to process sequences and recognize patterns of sequences. "Information flowing into the brain naturally arrives as a sequence of patterns." When the patterns are repeated through a repeated firing of a particular combination of neurons, the cortical region forms a persistent representation, or <em>memory</em>, for the sequence. In learning sequences, we form invariant representations of objects. When certain input patterns repeat over and over, cortical regions "know that those experiences are caused by a real object in the world." <br />
<br />
<span style="font-family: inherit;">One of the most important attributes of Hawkins' model is a concept called <a href="http://en.wikipedia.org/wiki/Autoassociative_memory" target="_blank">auto-associative memory</a>. This is what enables the brain to recall something by sensing only a portion of that memory. In the case of the brain, that input may belong to an entirely different category than what is recalled. Auto-associative memory is part of pattern recognition: the cortex does not need to see the entire pattern in order to recognize the larger pattern. The second feature of auto-associative memory, says Hawkins, is that an auto-associative memory can be designed to store sequences of patterns, or temporal patterns. He says this is accomplished by adding a time-delay to feedback. </span><br />
<span style="font-family: inherit;"></span><br />
<span style="font-family: inherit;">The cortex is linked to the </span><a href="http://en.wikipedia.org/wiki/Thalamus" target="_blank"><span style="font-family: inherit;">thalamus</span></a><span style="font-family: inherit;">. Hawkins says that one of the six layers of cells (L5 - second from the bottom in a given cortical area) within the neocortex is </span><a href="http://www.homodiscens.com/home/ways/agnoscens/hawkins/index.htm" target="_blank"><span style="font-family: inherit;">wired to the thalamus</span></a><span style="font-family: inherit;">, which in turn sends information back to Layer I (the highest layer in a given cortical layer), acting as a delayed feedback important to learning sequences and to predicting. The thalamus is selective in what it transmits back to the cortex because the number of neurons going to the thalamus exceeds the number of neurons back to the cortex by a factor of ten. This requires an understanding of <a href="http://willcov.com/bio-consciousness/review/Reentry%20and%20Recursion.htm" target="_blank">reentrant activity and recursion</a>, which need not be explained here. But Layer 1 (at the top of a given cortical area) is <em>also </em>receiving information from higher cortical areas (e.g. in the case of the visual cortex, layer 1 in V4 from layer 6 in IT; layer 1 in V2 from layer 6 in V4, etc.) So layer 1 now has two inputs: from the thalamus and from the higher cortical area. Layer 1, Hawkins emphasizes, now carries "much of the information we need to predict when a column should be active. Using these two signals in layer 1, a region of cortex can learn and recall multiple sequences of patterns." </span><br />
<br />
Cortical regions "store" sequences of patterns when synapses are strengthened by repeated firing. "If this occurs often enough, the layer 1 synapses [at the top of the region] become strong enough to make the cells in layers 2, 3, and 5 [below] fire, even when a layer 4 cell hasn't fired--- meaning parts of the column can become active without receiving input from a lower region of the cortex. In this way, cells in layers 2, 3, and 5 learn to 'anticipate' when they should fire based on the pattern in layer 1. Before learning, the column can only come active if driven by a layer 4 cell. After learning, the column can become partially active via memory. When a column becomes active via layer1 synapses, it is anticipating being driven from below. This is prediction. If the column could speak, it would say, 'When I have been active in the past, this particular set of my layer 1 synapses have been active. So when I see this particular set again, I will fire in anticipation.' Finally, layer 6 cells can send their output back into layer 4 cells of their own column. Hawkins says that when they do, our predictions become the input. This is what we do, he adds, when we daydream, think, imagine. It allows us to see the consequences of our own predictions, noting that we do this when we plan the future, rehearse speeches, and worry about future events. In Hawkins' model, this has to be part of what <a href="http://www.psych.ucsb.edu/people/faculty/gazzaniga" target="_blank">Michael Gazzaniga</a> refers to as our decoupling mechanism. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/05/robert-graves-king-jesus-novel-1946.html" target="_blank">May 22, 2011 post</a>)<br />
<br />
<span style="font-family: inherit;">This is Hawkins' model of the brain's capacity to predict, intelligence if you will. Of course, it is a more complex than I have regurgitated here. "If a region of cortex finds it can reliably and predictably move among these input patterns using a series of physical motions (such as </span><a href="http://www.scholarpedia.org/article/Human_saccadic_eye_movements" target="_blank"><span style="font-family: inherit;">saccades of the eyes</span></a><span style="font-family: inherit;"> or fondling with the fingers) and can predict them accurately as they unfold in time (such as the sounds comprising a song or the spoken word), the brain interprets these as having a causal relationship. The odds of numerous input patters occurring in the same relation over and over again by sheer coincidence are vanishingly small. A predictable sequence of patterns must be part of a larger object that really exists. So reliable predictability is an ironclad way of knowing that different events in the world are physically tied together. Every face has eyes, ears, mouth and nose. If the brain sees an eye, the saccades and sees another eye, then saccades and sees a mouth, it can feel certain it is seeing a face."</span><br />
<br />
<span style="font-family: inherit;">This begins at a very early age in our post-natal development. The two basic components of learning, explains Hawkins, are forming the classifications of patterns and building sequences. "The basics of forming sequences is to group patterns together that are part of the same object. One way to do this is by grouping patterns that occur contiguously in time. If a child holds a toy in her hand and slowly moves it, her brain can safely assume that the image on her retina is of the same object moment to moment, and therefore the changing set of patterns can be grouped together. At other times, you need outside instruction to help you decide which patterns belong together. To learn that apples and bananas are fruits, but carrots and celery are not, requires a teacher to guide you to group these items as fruits. Either way, your brain slowly builds sequences of patterns that belong together. But as a region of cortex builds sequences, the input to the next region changes. The input changes from representing mostly individual patterns to representing groups of patterns. The input to a region changes from notes to melodies, from letters to words, from noses to faces, and so on. Where before a region built sequences of letters, it now builds sequences of words, The unexpected result of this learning process is that, during repetitive learning, representations of objects move down the cortical hierarchy. During the early years of your life, your memories of the world first form in higher regions of cortex, but as you learn they are re-formed in lower and lower parts of the cortical hierarchy."</span><br />
<br />
Michael Shermer (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/michael-shermer-believing-brain-2011.html" target="_blank">June 12, 2011 post</a>) made the same point in a slightly different way when he referred to "patternicity." According to Shermer, as sensory data flows into the brain, there is a "tendency" for the brain to begin looking for meaningful patterns in both meaningful and meaningless data. He calls this process <a href="http://www.scientificamerican.com/article.cfm?id=patternicity-finding-meaningful-patterns"><em>patternicity</em></a><em>. </em>Shermer asserts that patternicity is premised on "association learning," which is "fundamental to all animal behavior from <a href="http://www.nytimes.com/2011/06/21/science/21brain.html?_r=1&scp=2&sq=Roundworm&st=cse"><span style="color: #473624;"><em>C. elegans</em> (roundworm)</span></a> to <em>homo sapiens</em>." Because our survival may depend on split-second decisions in which there is no time to research and discover underlying facts about every threat or opportunity that faces us, evolution set the brain's default mode in the position of assuming that all patterns are real, says Shermer. A cost associated with this behavior is that the brain may lump causal associations (e.g. wind causes plants to rustle) with non-causal associations (e.g. there is an unseen agent in the plants). In this circumstance, superstition --- incorrect causal associations --- is born. "In this sense, patternicities such as superstition and magical thinking are not so much errors in cognition as they are the natural processes of a learning brain." Religion, conspiracy theories and political beliefs fit this model as well.<br />
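<br />
Shermer's claim that evolution set the brain's default to "assume the pattern is real" is, at bottom, an expected-cost argument, and a few invented numbers make it concrete (the probability and cost figures below are mine, not Shermer's):<br />
<br />
<pre>
# Why a brain might default to believing patterns: when a miss is far
# costlier than a false alarm, believing the rustle is the cheaper bet.
# The probability and cost figures are invented for illustration.

p_real = 0.05            # chance the rustle really is a predator
cost_false_alarm = 1     # energy wasted fleeing from the wind
cost_miss = 1000         # cost of ignoring a real predator

cost_of_believing = (1 - p_real) * cost_false_alarm   # 0.95
cost_of_ignoring = p_real * cost_miss                 # 50.0

verdict = min([("believe the pattern", cost_of_believing),
               ("ignore the pattern", cost_of_ignoring)],
              key=lambda option: option[1])
print(verdict[0])        # believe the pattern
</pre>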
<br />
My surmise is that beliefs and concepts rooted in false analogy become stored in memory in higher cortical areas when they are reinforced over and over through cultural transmission. Something like this may be what Edward O. Wilson means when he refers to epigenetic rules and culture. "Human nature," Wilson says, is the "inherited regularities of mental development common to our species. <em>They are <a href="http://www.pnas.org/content/77/7/4382.abstract" target="_blank"><span style="color: #473624;">epigenetic rules</span></a></em>, which evolved by the interaction of genetic and cultural evolution that occurred over a long period in deep prehistory. These rules are the genetic biases in the way our senses perceive the world, the symbolic coding by which we represent the world, the options we automatically open to ourselves, and the responses we find easiest and most rewarding to make. . ." (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/04/nessa-carey-epigenetics-revolution-2012.html" target="_blank">April 8, 2013 post</a>). Storytelling --- the creation of works of fiction --- may be important to making and reinforcing memories. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/08/james-gleick-information-2011.html" target="_blank">August 15, 2011 post</a>). Thus, when a prediction based on a false analogy is violated and one would normally recognize an error, the error message is transmitted back up to the higher cortical areas for a check. But because the belief based in false analogy resides there in the higher areas, the false analogy may never be corrected. The false analogy becomes an invariant representation. Paul Bloom explains in <a href="http://www.amazon.com/Descartes-Baby-Science-Development-Explains/dp/0465007864" target="_blank"><em>Descartes' Baby</em></a> just how these concepts and beliefs can be rooted in our brains at a very early age, and, as Hawkins describes above, memories formed earlier in life form in the higher regions of the cortex. These false analogies can be difficult to dislodge.<br />
<br />
Hawkins has been helpful in providing a model of the cortex as the part of the brain devoted to our capacity to predict. When tied into the models of other parts of the brain relating to consciousness and emotion discussed elsewhere in this blog, we begin to assemble the whole human brain and to appreciate what makes us "human." (See <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/michael-gazzaniga-human-2008.html" target="_blank">September 27, 2009 post</a> discussing <a href="http://www.epjournal.net/wp-content/uploads/ep07206207.pdf" target="_blank">Michael Gazzaniga's reference to Jeff Hawkins</a>). While Hawkins' interest lies in the intelligent machine, he does not believe a machine can ever become "human." <br />
<br />
And finally, Hawkins confirms why I have held to my instinct that <a href="http://plato.stanford.edu/entries/chinese-room/" target="_blank">John Searle's Chinese Room argument</a> was intuitively correct. The man in the Chinese Room must have been human.<br />
<br />
<br />CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-20408690502564347012013-10-26T12:39:00.002-07:002013-10-27T10:32:10.273-07:00Michael Tomasello, The Origins of Human Communication (2008)Every work morning I board the subway to travel to my office, and approximately halfway on this journey I change trains. Changing trains entails exiting a train into a crowd of people who are looking to board the train I am leaving, walking some 50-70 steps to a staircase while passing many people who are, like myself, leaving the train I just left or heading in the opposite direction, back toward it. I descend stairs to another platform and then walk roughly 20 steps on the platform to wait for an oncoming train that will take me to my destination. During this change of trains, I probably pass within a 5-10 foot radius of 100 or more persons. I do not know these people. Most people are not talking. A few I recognize as having seen previously on this journey, but still I don't know them. I don't talk with them. Some I see only out of my peripheral vision. The amazing part of this brief, everyday journey navigating through a mass of people is that I almost always avoid any physical contact with them, and the same is true for most of them as well. It is easy to think that each individual is merely moving autonomously toward their individual goal, but the reality is that each individual is acting cooperatively with the others to ensure that <em>the others </em>are able to move toward their individual goals by not colliding with them (with modest, likely accidental, exceptions) or inhibiting them as they move. It is like a dance. Occasionally someone crosses diagonally in front of me, but I avoid a collision by slowing down or moving sideways. Avoiding contact with and staying out of the way of other moving persons is a shared intention; in the case of humans, cultural rules have played a key role in enabling individuals to realize their shared intention: <em>e.g.,</em> stay on the right of oncoming persons; follow the person ahead of you. But the individuals are not merely blindly following rules; they are watching the faces and body movements of others <em>and reading their minds. </em><br />
<br />
Place a camera high above this subway station and make a video of the masses transitioning between trains. The flow of people almost seems choreographed; it is not as chaotic as one might think it could be. Now watch this <a href="http://www.nytimes.com/interactive/2012/11/20/science/bug-lovers-video-contest.html#1" target="_blank">video of ants marching</a>. The movement of ants seems just as orderly as my video of humans passing in the subway station. But it is different. <a href="http://worldsciencefestival.com/videos/order_out_of_chaos_ant_communication" target="_blank">Ants communicate differently</a>, <a href="http://www.antark.net/ant-life/ant-communication/pheromones.html" target="_blank">relying on chemicals and touch</a>. They are acting automatically, inflexibly. <a href="http://en.wikipedia.org/wiki/Chemotaxis" target="_blank">Chemotaxis</a> (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>) is at play here. In contrast, humans read the intentions of other humans in their facial expressions, gaze, and motions (see <a href="http://csilcox-thebookshelf.blogspot.com/2010/07/dacher-keltner-born-to-be-good-science.html" target="_blank">July 16, 2010</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/marco-iaccoboni-mirroring-people-2008.html" target="_blank">September 18, 2009 post</a>), even when no words are spoken, and this is nearly unique in the animal kingdom. Apes are known to <a href="http://www.udel.edu/PR/UDaily/2005/oct/chimps111204.html" target="_blank">understand intentions</a> of others, and <a href="http://guardianlv.com/2013/10/empathetic-apes-comfort-each-other-like-humans-says-study/" target="_blank">apes are known to experience empathy</a>. (See <a href="http://csilcox-thebookshelf.blogspot.com/2010/11/frans-dewaal-age-of-empathy-2009.html" target="_blank">November 9, 2010 post).</a> Mirror neurons, which some argue enable us to feel what others are feeling or experiencing, were first discovered in monkeys. (See <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/marco-iaccoboni-mirroring-people-2008.html" target="_blank">September 18, 2009 post</a>). What apes do not do --- and humans do, according to <a href="http://wwwstaff.eva.mpg.de/~tomas/" target="_blank">Michael Tomasello</a> --- is share intentions and goals with others.<br />
<br />
Mindreading introduces us to <em><a href="http://www.iep.utm.edu/theomind/" target="_blank">theory of mind</a></em>. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/michael-shermer-believing-brain-2011.html" target="_blank">June 12, 2011</a> posts.) It is not a "theory" so much as it is a state of awareness: our ability to attribute mental states to others --- mindreading. In my little "everyday vignette" just described, our theory of mind operates almost unconsciously, and I would submit that this ability is close to unique, if not unique, in the animal kingdom. If apes enjoy a theory of mind, it is certainly not as well developed as humans'. It plays out in nearly every other human scenario imaginable because we are social animals. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>). Our theory of mind undoubtedly varies among these scenarios. If our attentiveness to something specific about another person is heightened, there is probably a heightened attentiveness to that person's specific mental state; in my vignette, the subway passenger's theory of mind is likely ascribing generic mental states to the masses of other humans around. <br />
<br />
Michael Tomasello's <em><a href="http://www.amazon.com/Origins-Human-Communication-Nicod-Lectures/dp/0262515202" target="_blank">Origins of Human Communication</a> </em>is not specifically about verbal speech or language, which is the focus of Christine Kenneally's <em><a href="http://www.amazon.com/First-Word-Search-Origins-Language/dp/0670034908" target="_blank">The First Word</a></em>. (See <a href="http://csilcox-thebookshelf.blogspot.com/2009/08/christine-kenneally-first-word-2007.html" target="_blank">August 31, 2009 post</a>). It is about <em>human communication.</em> In Tomasello's view, language is not hardwired genetically into the brains of humans, and while he does not debate whether language is an adaptation or exaptation (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/10/vs-ramachandran-tell-tale-brain-2011.html" target="_blank">October 25, 2011 post</a>), Tomasello treats language as an emergent property, emerging from antecedent forms of human communication --- specifically gestures such as pointing or pantomiming. And Tomasello is armed with a lot of research data to advance his argument. Given that humans do not generally utter a spoken word of language until they are older than one year (14-18 months), Tomasello finds his antecedents in babies and, with an evolutionarily longer gaze, in apes, the genetically closest animals to <em>homo sapiens sapiens</em>. Pointing and pantomiming are gestures human babies use before they begin to speak (with language essentially becoming a substitute for pantomiming). <br />
<br />
Tomasello's thesis is that the "ultimate explanation for how it is that human beings are able to communicate with one another in such complex ways with such simple gestures is that they have unique ways of engaging with one another socially. More specifically, human beings <em>cooperate </em>with one another in species-unique ways involving processes of <a href="http://email.eva.mpg.de/~tomas/pdf/BBS_Final.pdf" target="_blank">shared intentionality</a>." By "simple gestures," Tomasello is referring to the acts of pointing and pantomiming. He notes that apes point and respond to pointing, but the key difference here with humans is that when apes point they are making requests --- demanding action by others. "Bring me that food." Apes possess the ability to follow <a href="http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0036390" target="_blank">gaze direction</a>. Apes want the other to see something and do something. Humans, by contrast, point not only to direct the other human's attention to something, but to share information with others, request help and cooperation, even when there is no benefit to themselves. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/martin-nowak-supercooperators-2011.html" target="_blank">September 27, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2010/10/oren-harman-price-of-altruism-2010.html" target="_blank">October 13, 2010 posts</a> for discussions of direct and indirect reciprocal altruism). "Pointing," says Tomasello, "is based on humans' natural tendency to follow the gaze direction of others to external targets, and pantomiming is based on humans' natural tendency to interpret the actions of others intentionally. This naturalness makes them good candidates as an intermediate step between ape communication and arbitrary linguistic conventions [of humans]." While there are <a href="http://l2c2.isc.cnrs.fr/publications/files/08reb.pdf" target="_blank">some primatologists</a> who credit apes with more cooperative, social behavior than Tomasello acknowledges, what differentiates apes and <span style="font-family: inherit;">humans</span>, he says, is an underlying psychological infrastructure --- made possible by cultural learning and imitation that allows humans to learn from others and understand their intentions. That leads to <a href="http://www.eva.mpg.de/psycho/pdf/Publications_2007_PDF/Shared_intentionality_07.pdf" target="_blank">shared intentionality</a> --- sometimes referred to as "we" intentionality --- collaborative interactions in which participants share psychological states with one another and have shared goals and shared action plans. This brings us to the research observations about human babies. <a href="http://www.eva.mpg.de/psycho/pdf/Publications_2007_PDF/Shared_intentionality_07.pdf" target="_blank">Tomasello</a> (2007):<br />
<br />
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;">"[H]uman adults quite often teach youngsters things by demonstrating what they should do – which the youngsters then respond to by imitating (and internalizing what is learned. Adult chimpanzees do not demonstrate things for youngsters (or at least do this very seldom). Interestingly, when human adults instruct their children in this way (providing communicative cues that they are trying to demonstrate something), 14-month-old infants copy the particular actions the adults used, and they do so much more often than when adults do not explicitly instruct – in which case they just copy the result the adult achieved (Gergely & Csibra, 2006). Furthermore, there is some evidence that 1-year-old infants are beginning to see the collaborative structure of some imitative interactions. Thus, they sometimes observe adult actions directed to them, and then reverse roles and redirect the actions back to the demonstrator, making it clear by looking to the demonstrator’s face that they see this as a joint activity (Carpenter, Tomasello & Striano, 2005). Chimpanzees may on occasion redirect such learned actions back to their partners, but they do not look to their partner’s face in this way (Tomasello & Carpenter, 2005). Thus, chimpanzees’ social learning is actually fairly individualistic, whereas 1-year-old children often respond to instruction and imitate collaboratively, often with the motivation to communicate shared states with others.</span></span></span><br />
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20; font-family: inherit;">
</span></span></span><br />
<div align="LEFT">
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="font-family: inherit;"> </span></span></span></span><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="font-family: inherit;">***</span></span></span></span></div>
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;">
<span style="font-family: inherit;">
</span></span></span></span><br />
<div align="LEFT">
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="font-family: inherit;"> </span></span></span></span><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="font-family: inherit;">"Human children, on the other hand, often are concerned with sharing psychological states with others by providing them with helpful information, forming shared intentions and attention with them, and learning from demonstrations produced for their benefit. The emergence of these skills and motives for shared intentionality during human evolution did not create totally new cognitive skills. Rather, what it did was to take existing skills of, for example, gaze following, manipulative communication, group action, and social learning, and transform them into their collectively based counterparts of joint attention, cooperative communication, collaborative action, and instructed learning – cornerstones of cultural living. Shared intentionality is a small psychological difference that made a huge difference in human evolution in the way that humans conduct their lives. </span></span></span></span></span></div>
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;">
<span style="font-family: inherit;">
</span></span></span></span></span><br />
<div align="LEFT">
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="font-family: inherit;">"In terms of ontogeny, </span><a href="http://email.eva.mpg.de/~tomas/pdf/BBS_Final.pdf" target="_blank"><span style="font-family: inherit;">Tomasello et al (2005)</span></a><span style="font-family: inherit;"> hypothesized that the basic skills and motivations for shared intentionality </span><span style="font-family: inherit;">typically emerge at around the first birthday from the interaction of two developmental trajectories, each representing an evolutionary adaptation from some different point in time. The first trajectory is a general primate (or perhaps great ape) line of development for understanding intentional action and perception, which evolved in the context of primates’ crucially important competitive interactions with one another over food, mates, and other resources (Machiavellian intelligence; Byrne; Whiten, 1988). The second trajectory is a uniquely human line of development for sharing psychological states with others, which seems to be present in nascent form from very early in human ontogeny as infants share emotional states with others in turn-taking sequences (Trevarthen, 1979). The interaction of these two lines of development creates, at around 1 year of age, skills and motivations for sharing psychological states with others in fairly local social interactions, and then later skills and motivations for reacting to and even internalizing various kinds of social norms, collective beliefs, and cultural institutions."</span></span></span></span></span></div>
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;">
<span style="font-family: inherit;">
</span></span></span></span></span><br />
<div align="LEFT">
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="font-family: inherit;">The cooperative <em>homo</em> hunter-gatherer phenomenon is </span><a href="http://news.nationalgeographic.com/news/2010/01/100112-modern-human-behavior/" target="_blank"><span style="font-family: inherit;">believed to have emerged</span></a><span style="font-family: inherit;"> among <em><a href="http://en.wikipedia.org/wiki/Homo_erectus" target="_blank">homo erectus</a>, </em>hundreds of thousands of years before <em>homo sapiens</em>. (See </span><a href="http://news.nationalgeographic.com/news/2010/01/100112-modern-human-behavior/" target="_blank"><span style="font-family: inherit;">November 21, 2012 post</span></a><span style="font-family: inherit;">). Exactly when their social structures emerged is a matter of debate, but as forms of <em>homo</em> cooperation evolved forms of communication would be expected to emerge as well and how those forms of communication might have evolved is what Michael Tomasello explores in <em><a href="http://www.amazon.com/Origins-Human-Communication-Nicod-Lectures/dp/0262515202" target="_blank">The Origins of Human Communication</a></em>. The timing of the emergence of verbal language among <em>homo</em> is also </span><a href="http://en.wikipedia.org/wiki/Origin_of_language" target="_blank"><span style="font-family: inherit;">a matter of debate and no consensus</span></a><span style="font-family: inherit;">, but some put that event as occurring among <em>homo sapiens</em> </span><a href="http://www.atlnightspots.com/worlds-languages-traced-back-to-africa/" target="_blank"><span style="font-family: inherit; font-size: small;">50-70,000 years ago</span></a><span style="font-family: inherit;"> and perhaps </span><a href="http://www.dailymail.co.uk/sciencetech/article-1377150/Every-language-evolved-single-prehistoric-mother-tongue-spoken-Africa.html" target="_blank"><span style="font-family: inherit;">as early as 100,000 years ago</span></a><span style="font-family: inherit;">, but </span><a href="http://machineslikeus.com/news/when-did-language-originate" target="_blank"><span style="font-family: inherit;">perhaps earlier</span></a><span style="font-family: inherit;">. That timing would correlate with what we believe is the evolutionary origins of modern humans, </span><a href="http://en.wikipedia.org/wiki/Homo_sapiens_sapiens" target="_blank"><em><span style="font-family: inherit;">homo sapiens sapiens</span></em></a><span style="font-family: inherit;"><em>, </em>in Africa. Whatever the timing for the origins of human speech, there is a gap of hundreds of thousands of years between the origins of communication and verbal speech. But it is over these hundreds of thousands of years, if not over a million years, of the evolution of cooperation among the species of genus <em>homo, </em>that the psychological infrastructure critical to human eusociality cited by Tomasello developed:</span></span></span></span></span></div>
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;">
</span></span></span></span><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"> </span></span></span><br />
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;">
"[O]ur proposal," Tomasello writes, "is the relatively uncontroversial one that human collaboration was initially mutualistic --- with this mutualism depending on the first step of more tolerant and food-generous participants [see e.g. <a href="http://csilcox-thebookshelf.blogspot.com/2013/03/richard-wrangham-catching-fire-how.html" target="_blank">March 28, 2013 post</a>]. The more novel part of the proposal is that mutualistic collaboration is the natural home of cooperative communication. Specifically, skills of recursive mindreading arose initially in forming joint goals, and then this led to joint attention on things relevant to the joint goal (top-down) and eventually to other forms of common conceptual ground. Helping motives, already present to some degree in great apes outside of communication, can flourish in mutualistic collaboration in which helping you helps me. And so communicative requests for help --- either for actions or for information --- and compliance with these (and perhaps even something in the direction of offering help by informing) were very likely born in mutualistic collaboration. At this point in our quasi-evolutionary tale, then, we have, at a minimum, pointing to request help and a tendency to grant such requests --- with perhaps some offers of help with useful information --- in the immediate common ground of mutualistic collaborative interactions."<br />
<span style="color: #231f20; font-family: TimesNRMT;"><span style="color: #231f20; font-family: TimesNRMT;"><span style="color: #231f20; font-family: TimesNRMT;">
</span></span></span><span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;"></span></span></span><br />
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;">Helping by informing becomes the cornerstone of indirect reciprocity, which Martin Nowak finds makes humans "supercooperators:" the only species that "can summon the full power of indirect reciprocity, thanks to our rich and flexible language." (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/martin-nowak-supercooperators-2011.html" target="_blank">September 17, 2012 post</a>).</span></span></span><br />
<span style="color: #231f20; font-family: TimesNRMT;"><span style="color: #231f20; font-family: TimesNRMT;"><span style="color: #231f20; font-family: TimesNRMT;">
</span></span></span><br />
<div align="LEFT">
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;">Missing from Tomasello's discussion of social cognition and the human communication is the emotional content and perhaps underpinnings of that communication. Tomasello cites inflexible ape vocalization as "tightly tied to emotion," but as prior posts in this blog point out there is uniquely human social behavior anchored in emotions arising in the more ancient part of our brains. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 posts</a>). Jaak Panksepp's discussion (May 19, 2013 post) of the care system, the grief system, and the play system are just examples of the emotional underpinnings of the psychological infrastructure that Tomasello relies upon to build his viewpoint. I do not purport to have read everything that Tomasello has researched and written, so maybe this discussion occurs elsewhere. I note that he does comment in his 2005 article, that "theorists such as Trevarthen (1979), Braten (2000), and especially <a href="http://www.amazon.com/The-Cradle-Thought-Exploring-Thinking/dp/0195219546" target="_blank">Hobson (2002</a>), have elaborated the interpersonal and emotional discussions of early human ontogeny in much more detail than we have here. We mostly agree with their accounts . . ."</span></span></span></div>
<span style="color: #231f20;"><span style="color: #231f20;"><span style="color: #231f20;">
</span></span></span><br />
<div align="LEFT">
There are two final points in this discussion. First, the evidence from babies supports the conclusion that our social cognition and behavior are not innate. Tomasello calls this <a href="http://www.healthline.com/galecontent/ontogenetic-development" target="_blank">ontogenetic</a>, which is the nurture side of the nature versus nurture (post-natal learning) debate. (See <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/marco-iaccoboni-mirroring-people-2008.html" target="_blank">September 18, 2009 post</a>). With respect to language, for example, V.S. Ramachandran believes that what is innate is our "capacity for acquiring rules of language," not language or verbal speech itself. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/10/vs-ramachandran-tell-tale-brain-2011.html" target="_blank">October 25, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2012/02/daniel-j-levitin-this-is-your-brain-on.html" target="_blank">February 15, 2012 posts</a>). Tomasello certainly concurs with the view that "[t]he actual acquisition of language occurs as a result of social interaction." Ramachandran believes that language was enabled by cross linkages in the brain between different motor maps (e.g., the area responsible for manual gestures and the area responsible for orofacial movements). (<em><a href="http://csilcox-thebookshelf.blogspot.com/2012/02/daniel-j-levitin-this-is-your-brain-on.html" target="_blank">Id</a></em>.) Second, Marco Iacoboni's suggestion <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/marco-iaccoboni-mirroring-people-2008.html" target="_blank">(September 18, 2009 post)</a> that the individualistic or competitive models of human behavior leave much to be desired is worth invoking. As a prior post (<a href="http://csilcox-thebookshelf.blogspot.com/2009/09/marco-iaccoboni-mirroring-people-2008.html" target="_blank">id</a>) observed, "Self and other are 'inextricably blended,' says Iacoboni. The sense of self follows the sense of 'us,' which is the first 'sense' of awareness an infant has immediately following its birth as a result of mother-infant interactions. We are social animals first." While competition and individualistic behavior are not necessarily vices and might even be deemed virtues (see <a href="http://csilcox-thebookshelf.blogspot.com/2010/01/bernard-mandeville-fable-of-bees-1723.html" target="_blank">January 30, 2010 post</a>), they are not the endpoint in our understanding and modeling of the human cognitive framework. In the contentious political conversation that now engulfs America, it is not sufficiently recognized that the social and cooperative dimension of the human cognitive framework dominates the competitive one, in that framework as in our evolutionary history, as a key to human survival. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2012/08/jared-diamond-collapse-2005.html" target="_blank">August 22, 2012 post</a>s). If we can cooperatively navigate our way through a subway station without thinking too deeply about what we are doing, we ought to be able to collaboratively solve a public policy issue.</div>
<div align="LEFT">
</div>
CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com1tag:blogger.com,1999:blog-2282031583522712001.post-65316436051497444422013-09-25T18:03:00.000-07:002013-09-25T18:09:28.022-07:00Luigi Luca Cavalli-Sforza, Genes, Peoples and Languages (2001)Information is typically packaged. The smallest unit of information (something like a <a href="http://en.wikipedia.org/wiki/Bit" target="_blank">bit</a>) (see <a href="http://csilcox-thebookshelf.blogspot.com/2009/08/charles-seife-decoding-universe-2006.html" target="_blank">August 23, 2009</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2009/08/seth-lloyd-programming-universe-2006.html" target="_blank">August 17, 2009 posts</a>) has limited meaning (information value) on its own. Aggregating, absorbing, connecting, colliding, and communicating with other units of information expands the information value associated with the package of bits. These packages of information include small subatomic units, electrons, atoms, chemical compounds, photons, waves of sound and light, proteins, genotypes, cells, organs, phenotypes, letters, words, songs, books, and culture. (See <a href="http://csilcox-thebookshelf.blogspot.com/2010/11/matt-ridley-genome-1999.html" target="_blank">November 27, 2010 post</a>).<br />
<br />
Information migrates. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/05/arthur-herman-how-scots-invented-modern.html" target="_blank">May 20, 2012 post</a>). It is in nearly constant motion. And when it is in motion, information can be altered and its meaning changed. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/08/james-gleick-information-2011.html" target="_blank">August 15, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2009/08/charles-seife-decoding-universe-2006.html" target="_blank">August 23, 2009 post</a>s). Sometimes information is degraded by change; sometimes it is enhanced. Information moves with its package; the package migrates, and the information moves along with it. <em><a href="http://www.amazon.com/Genes-People-Languages-Luigi-Cavalli-Sforza/dp/0865475296" target="_blank">Genes, Peoples and Languages</a></em> is about the movement of genetic information in the package of a phenotype and the scientific quest to track the movement and transformation of modern human genes over the course of roughly one hundred thousand years. Along the way, as a result of natural selection and, in some geographic areas, genetic drift, the information in this genetic package was edited and revised from the general population that preceded it: hair texture and color changed, skin color changed, small genetic changes enabled humans to digest milk, immunized them against diseases such as malaria in certain areas, morphologies changed, and so on. <br />
<br />
This research supports the <a href="http://www.sciencedaily.com/releases/2007/05/070509161829.htm" target="_blank">Out of Africa hypothesis</a>: that <a href="http://intl.pnas.org/content/early/2012/10/17/1212380109.full.pdf" target="_blank">modern human origins begin on the African continent</a> approximately 100,000 years ago, likely in southern Africa; that intra-continental African migration ensued northward along East Africa in the thousands of years afterward; and that the first migration of <em>Homo sapiens</em> out of Africa occurred roughly 50-60,000 years ago to the Arabian peninsula and the <a href="http://en.wikipedia.org/wiki/Levant" target="_blank">Levant</a>, likely along the coast, and ultimately to southern Asia (India), <a href="http://blogs.smithsonianmag.com/hominids/2012/08/the-oldest-human-fossils-in-southeast-asia/" target="_blank">southeast Asia</a>, and Oceania (Australia) about 45,000 years ago. At about the same time that modern humans were reaching Oceania, migrations proceeded out of the Levant northward in the direction of Europe, and later in the direction of central Asia and ultimately to North America roughly 15,000 years ago. What should not be forgotten in this focus on modern human migration is that a similar migratory path may have been taken over a million years earlier by <em><a href="http://www.dnalc.org/view/15892-Human-migrations-map-interactive-2D-animation.html" target="_blank">Homo erectus</a></em>. <br />
<br />
<a href="http://en.wikipedia.org/wiki/Luigi_Luca_Cavalli-Sforza" target="_blank">Cavalli-Sforza</a> sees a parallel between genetic evolution and cultural evolution. The units and type of information as well as the means of information transmission in these two circumstances, however, are radically different. Speech acts (including rituals) and language are the means of transmitting cultural information, and Cavalli-Sforza treats linguistic evolution as a type of cultural evolution. But genes and culture do not co-evolve. As mentioned in an earlier post, "Language is a social institution, and social institutions and culture evolve, albeit at a different and faster pace than biological evolution." (<a href="http://csilcox-thebookshelf.blogspot.com/2009/08/christine-kenneally-first-word-2007.html" target="_blank">August 31, 2009 post</a>). Language changes can occur as a result of migration and conquest of another's territory. Cavalli-Sforza documents this in a number of cases. Religion is another attribute of culture that likewise can change as a result of migration and conquest. (See <a href="http://csilcox-thebookshelf.blogspot.com/2010/05/robert-wright-evolution-of-god-2009.html" target="_blank">May 12, 2010 post</a>). And ideas can change as a result of migration and conquest. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/05/arthur-herman-how-scots-invented-modern.html" target="_blank">May 20, 2012 post</a>). "There is a fundamental difference between biological and cultural mutation," writes Cavalli-Sforza. "Cultural mutations may result from random events, and thus be very similar to genetic mutations, but cultural changes are more often intentional or directed to a very specific goal, while biological mutations are blind to their potential benefit. At the level of mutation, cultural evolution can be directed while genetic change cannot." Later he adds, "We must note a significant difference between biological and linguistic mutation. A genetic mutant is generally very similar to the original gene, since one gives rise to another with only a small change. Words vary in more complicated ways. The same root can change meaning. One word can have may unrelated senses. One could try to establish greater similarities between genes and words taking into account all of the peculiarities, but it is not clear that would be useful." The curious aspect of Cavalli-Sforza's discussion of biological and cultural evolution and transmission is the absence of any discussion of the evolutionary debate about whether evolution operates on genes, phenotypes, or groups that has laced this subject for several decades now. (See <a href="http://csilcox-thebookshelf.blogspot.com/2009/11/bert-holldobler-eo-wilson-superorganism.html" target="_blank">November 4, 2009</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2009/11/richard-powers-generosity-2009.html" target="_blank">November 30, 2009</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/martin-nowak-supercooperators-2011.html" target="_blank">September 17, 2012</a> posts). References to Richard Dawkins, memes, and Edward Wilson are not to be found. Cavalli-Sforza's discussion on this subject is disjointed, and one wonders how he would treat the subject of the unit of information on which evolution operates.<br />
<br />CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com1tag:blogger.com,1999:blog-2282031583522712001.post-88773222167037186052013-09-10T19:46:00.000-07:002013-09-15T11:19:56.995-07:00Rodrigo Q. Quiroga, Borges and Memory (2012)Memory is a subject that is no stranger to this blog. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/02/jose-saramago-small-memories-2009.html" target="_blank">February 26, 2013</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/12/howard-jacobson-finkler-question-2010.html" target="_blank">December 2, 2011</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/11/rita-carter-mapping-mind-rev-2010.html" target="_blank">November 6, 2011</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/09/daniel-schacter-seven-sins-of-memory.html" target="_blank">September 20, 2011</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/08/james-gleick-information-2011.html" target="_blank">August 15, 2011</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html" target="_blank">April 8, 2011</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2010/11/matt-ridley-genome-1999.html" target="_blank">November 27, 2010</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2010/09/bobby-thomson-et-al-giants-win-pennant.html" target="_blank">September 9, 2010</a> posts). Rodrigo Quiroga's <a href="http://mitpress.mit.edu/books/borges-and-memory-0" target="_blank"><em>Borges and Memory</em></a> effectively assembles most of the comments and observations in these earlier posts and weaves in the parallel observations of Argentinian writer Jorge Luis Borges about human memory, demonstrating that Borges' astute observations hold close to the empirical research of neuroscience. The artistic tool Quiroga drafts to make his point is a short story Borges wrote in 1942, <em><a href="http://www.srs-pr.com/literature/borges-funes.pdf" target="_blank">Funes the Memorious</a></em>, about a man who could not forget the most trivial facts and data and so sustained an incredible memory. In Borges' words, Funes had an "infinite vocabulary for the natural series of numbers, a useless mental catalogue of all the images of his memory." He was "almost incapable of ideas in general," which means he could not abstract and generalize. In light of Daniel Schacter's characterization of the human mind's near-universal capacity for forgetting (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/09/daniel-schacter-seven-sins-of-memory.html" target="_blank">September 20, 2011 post</a>), sometimes instantly, sometimes only over considerable periods of time --- and the evolutionary benefits of forgetting --- the story of Funes is a unique one, far removed from normal experience. But Quiroga documents that there are real cases of persons like Funes, who are simply incapable of forgetting the slightest details and whose memories are clogged with useless data. He also confirms that these individuals are limited in their ability to generalize and abstract. There is the case of <a href="http://en.wikipedia.org/wiki/Solomon_Shereshevsky" target="_blank">Solomon Shereshevskii</a> of Russia, a patient of Alexander Luria, whose capacity for memorizing meaningless sequences of numbers, formulae, words, and syllables, and retaining them in memory for years, is much like that of the Funes of Borges' imagination.
Shereshevskii's diagnosis included severe <a href="http://en.wikipedia.org/wiki/Synesthesia" target="_blank">synesthesia</a>. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/11/leonard-mlodinow-euclids-window-2001.html" target="_blank">November 20, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/10/vs-ramachandran-tell-tale-brain-2011.html" target="_blank">October 25, 2011 posts</a>). Shereshevskii reportedly told Luria that he could not read or study, because thinking about words beyond their literal meaning made him lose track of what he was reading.<br />
<br />
And there are others; they are often referred to as <a href="http://en.wikipedia.org/wiki/Savant_syndrome" target="_blank">savants</a> and in other cases are regarded as autistic: <a href="http://en.wikipedia.org/wiki/Kim_Peek" target="_blank">Kim Peek</a> (on whom the story <em><a href="http://www.imdb.com/title/tt0095953/" target="_blank">Rain Man</a></em> was based), who was diagnosed with <a href="http://en.wikipedia.org/wiki/FG_syndrome" target="_blank">FG Syndrome</a>; <a href="http://en.wikipedia.org/wiki/Daniel_Tammet" target="_blank">Daniel Tammet</a>, whose incredible ability to recall sequences of numbers (including large numbers such as primes) was ascribed to synesthesia; <a href="http://en.wikipedia.org/wiki/Leslie_Lemke" target="_blank">Leslie Lemke</a>, who suffered from cerebral palsy and was blind but was nevertheless a highly skilled pianist with an enormous memory for music after hearing a song only once; and <a href="http://en.wikipedia.org/wiki/Jill_Price" target="_blank">Jill Price</a>, for whom the inability to forget was a curse.<br />
<br />
We sometimes refer to a person's "<a href="http://en.wikipedia.org/wiki/Eidetic_memory" target="_blank">photographic</a>" memory. But that is not the way memory works, particularly memory based on visual perception. Recall <a href="http://www.usc.edu/programs/neuroscience/faculty/profile.php?fid=27" target="_blank">Antonio Damasio's</a> account of how the brain retains and retrieves memories. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html" target="_blank">April 8, 2011 post</a>). Even large human brains lack the storage capacity for "large files of recorded images of prior events." To solve this problem, Damasio argues, human brains "borrowed the dispositional strategy" from early evolution, which allows us to retrieve those memories without, figuratively speaking, having to film and store those images. Here is how Damasio says this works. He refers to ancient dispositional networks, which we can identify as the subcortical systems discussed by Jaak Panksepp in his account of human emotional systems (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>); these are unconscious, automatic dispositions that are essentially hardwired into our biology. Recall Panksepp's discussion of the Fear System and the link between emotions and memory: "Learning and memory are automatic and involuntary responses (mediated by
unconscious mechanisms of the brain), which in their most lasting forms are
commonly tethered to emotional arousal. Emotional arousal is a necessary
condition for the creation of fear-learning memories." (<a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>). The ancient dispositional networks and the more recent (evolutionarily speaking) mapping networks in the brain, says Damasio, are now connected. A current perception triggers the dispositional network, which directs the brain to reassemble aspects of past perceptions from the part of the cerebral cortex that had been activated when an original perception of an object occurred and where the representation or image was mapped. This occurs in what Damasio refers to as <em><a href="http://bigthink.com/videos/the-convergence-and-divergence-of-memory" target="_blank">convergence-divergence zones</a></em> that record "the coincidence of activity in neurons hailing from different brain sites, neurons that had been made active by the mapping of a certain object." A part of the cerebral cortex is devoted to image space, where images of all sensory types arise and map-making occurs; a separate part of the cerebral cortex is devoted to dispositional space, where the tools exist to reactivate and generate images previously experienced --- affectionately called "<a href="http://mindhacks.com/2005/06/24/evidence-for-grandmother-cells/" target="_blank">grandmother cells</a>" after our ability to recall our grandmother. The contents of dispositions, Damasio says, are always unconscious --- he says they are "encrypted" and "implicit" --- in contrast to the <em>explicit</em> images in the image space created by current perceptions. The "encrypted" dispositions are not themselves images, but merely implicit formulas for how to reconstruct maps in image space. Our "knowledge base" is, by this hypothesis, part of our unconscious brain, stored in code, waiting to be retrieved from what Damasio refers to as "association cortices" (and hence the analogy to <a href="http://webspace.ship.edu/cgboer/psychbeginnings.html" target="_blank">Hume's associationism</a>). <br />
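<br />
Damasio's dispositional scheme, as described above, can be caricatured in a few lines of code. The sketch below is a loose analogy under invented names (IMAGE_SPACE, DISPOSITIONS, and the fragments themselves are all hypothetical), not Damasio's model: a memory is stored as a compact recipe of pointers into a shared image space, and recall reassembles fragments rather than replaying a recording.<br />
<pre>
# A loose, illustrative analogy to the dispositional strategy described above.
# All names and "fragments" are invented for illustration.

IMAGE_SPACE = {}      # hypothetical "image space": feature id -> fragment
DISPOSITIONS = {}     # hypothetical "dispositional space": cue -> recipe

def perceive(key, fragments):
    """Mapping a perception: register fragments once, keep only their ids."""
    ids = []
    for frag in fragments:
        fid = hash(frag) % 10**6     # an opaque, "encrypted" pointer
        IMAGE_SPACE[fid] = frag      # the cortex maps the fragment
        ids.append(fid)
    DISPOSITIONS[key] = tuple(ids)   # an implicit formula, not an image

def recall(key):
    """A cue triggers the disposition, which reassembles the image."""
    return [IMAGE_SPACE[fid] for fid in DISPOSITIONS[key]]

perceive("grandmother", ["grey hair", "her kitchen", "smell of bread"])
print(recall("grandmother"))   # a reconstruction, not a stored recording
</pre>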
<br />
Quiroga concurs in substantial part: "The brain does not reproduce visual stimuli like a digital camera or television, but rather processes their meaning starting from relatively little information and a set of unconscious inferences. Now, if the neurons in the primary visual cortex do not simply copy information detected by the retina, what do they do? This was what <a href="http://www.nobelprize.org/nobel_prizes/medicine/laureates/1981/presentation-speech.html" target="_blank">David Hubel and Torsten Wiesel</a> at the Johns Hopkins University set out to study in the late 1950s. Following a serendipitous event and a spectacular series of experiments that ensued (which earned them the Nobel Prize in Physiology or Medicine in 1981), Hubel and Wiesel discovered neurons in the <a href="http://www.cs.utexas.edu/users/nn/web-pubs/sirosh/pvc.html" target="_blank">primary visual cortex</a> that respond to oriented lines. This information eventually reaches the <a href="http://www.scholarpedia.org/article/Inferior_temporal_cortex" target="_blank">inferior temporal lobe</a>, where 'face cells' --- i.e., neurons that respond to human or monkey faces but not to other images, for example of hands, fruits or houses --- were found in experiments with monkeys. . . Each neuron in the retina responds to a particular point, and we can infer the outline of a cube starting from the activity of about thirty of them [retinal neurons]. Next the neurons in the primary visual cortex fire in response to oriented lines; fewer neurons are involved and yet the cube is more clearly seen. This information is received in turn by neurons in higher visual areas, which are triggered by more complex patterns --- for example, the angles defined by the crossing of two or three lines. . . As the processing of visual information progresses through different brain areas, the information represented by each neuron becomes more complex, and at the same time fewer neurons are needed to encode a given stimulus. In the late 1960s, Polish psychologist Jerzy Konorski wondered if at the end of this process there might be individual neurons that represent an object or a person as a whole. Is there, for example, a neuron that represents the concept of 'my home,' another one that responds to 'my dog,' and another for 'my grandmother'?" Quiroga believed the answer lies in or near the hippocampus, because of a case made famous by <a href="http://www.nytimes.com/2013/05/21/science/still-charting-memorys-depths.html?pagewanted=all" target="_blank">Brenda Milner</a> involving surgery that removed a patient's hippocampus and resulted in an inability to form new explicit memories. <br />
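<br />
The progression Quiroga describes --- many point detectors, fewer line detectors, fewer still shape detectors --- can be mimicked in a toy script. Everything below (the 5x5 "retina," the layer definitions, the bar stimulus) is an invented illustration of the principle, not a model of visual cortex: as the encoded feature grows more complex, fewer units are needed to signal it.<br />
<pre>
# A toy illustration of hierarchical visual coding; nothing here models cortex.
import numpy as np

img = np.zeros((5, 5), dtype=int)
img[2, :] = 1                     # the stimulus: a horizontal bar

# Layer 1: one "retinal" unit per pixel (25 units)
retina = img.flatten()

# Layer 2: "simple cells" tuned to oriented lines (10 units: 5 rows, 5 columns)
row_cells = img.sum(axis=1) == img.shape[1]   # fires on a full horizontal line
col_cells = img.sum(axis=0) == img.shape[0]   # fires on a full vertical line

# Layer 3: a single "shape cell" that fires if any line unit fired
shape_cell = bool(row_cells.any() or col_cells.any())

print(retina.sum(), "of 25 retinal units active")              # raw detail
print(row_cells.sum() + col_cells.sum(), "of 10 line units active")
print("shape cell fired:", shape_cell)                         # one abstract unit
</pre>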
<br />
Quiroga became engaged in research with <a href="http://en.wikipedia.org/wiki/Christof_Koch" target="_blank">Christof Koch</a> at Caltech that studied recordings of the activity of individual neurons in the <a href="http://www.ncbi.nlm.nih.gov/pubmed/15217334" target="_blank">medial temporal lobe</a> that responded not to a generic face, but uniquely to the faces of individual humans: actress Jennifer Aniston, soccer star Diego Maradona, and Mr. T. Medial temporal lobe structures that are critical for long-term memory include the <a href="http://en.wikipedia.org/wiki/Amygdala" title="Amygdala">amygdala</a> and <a href="http://en.wikipedia.org/wiki/Hippocampus" target="_blank">hippocampus</a>, along with the surrounding <a href="http://en.wikipedia.org/wiki/Hippocampal_formation" title="Hippocampal formation">hippocampal region</a> consisting of the <a href="http://en.wikipedia.org/wiki/Perirhinal_cortex" title="Perirhinal cortex">perirhinal</a>, <a href="http://en.wikipedia.org/wiki/Parahippocampal_gyrus" title="Parahippocampal gyrus">parahippocampal</a>, and <a href="http://en.wikipedia.org/wiki/Entorhinal_cortex" title="Entorhinal cortex">entorhinal</a> neocortical regions. The hippocampus is critical for memory formation, and the surrounding medial temporal cortex is currently theorized to be critical for memory storage. The <a href="http://en.wikipedia.org/wiki/Prefrontal_cortex" title="Prefrontal cortex">prefrontal</a> and visual cortices are also involved in explicit memory. The firing of neurons begins in the retina (or, in the case of auditory stimuli, in the cochlea), goes through the primary visual cortex (or other primary cortical areas), passes through higher visual areas in the temporal lobe, and ultimately reaches the hippocampus. The many individual neurons in the retina respond to many discrete visual stimuli, none of which represents the image as a whole; at some point in the brain's pathways from the retina to the hippocampus, the brain assembles these varied discrete stimuli. What Quiroga and others found is that the "<a href="http://www.scientificamerican.com/article.cfm?id=brain-cells-searching-for-jennifer-aniston-neuron" target="_blank">Jennifer Aniston neuron</a>" did not respond to a single image of Jennifer Aniston, but to multiple images of her. Further research found that another specific neuron would respond to a multitude of Star Wars characters --- Luke Skywalker, Yoda, Darth Vader. What this signified is that the unique neuron was associated not with a particular camera image, but with a conceptual image. This "loss of detail" led Quiroga to conclude that the neuron codes for an abstraction of Jennifer Aniston. "We can see Funes," writes Quiroga, "as someone lacking those Jennifer Aniston neurons that encode abstract concepts." He repeats, "We do not process images in our brains in the same way a camera does; on the contrary, we extract a meaning and leave aside a multitude of details. . . . Just like perception, memory is a creative process; when we remember something we do not repeat the experience as it was; rather we relive it in another context, create a new representation, and even change its meaning."
If we integrate Damasio into Quiroga's thinking, the convergence-divergence zones assemble "a cascade of processes that involve memories and emotions," "extracting particular features that help us recognize these people and objects in very different circumstances." (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/11/rita-carter-mapping-mind-rev-2010.html" target="_blank">November 6, 2011 post</a>).<br />
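<br />
The invariance in the "Jennifer Aniston neuron" finding can likewise be caricatured as a simple threshold unit over identity features; the feature axes, weights, threshold, and file names below are all invented for illustration. The point is only that raw images which differ pixel by pixel can still share the identity features that drive a single, concept-level unit.<br />
<pre>
# An invented caricature of a "concept cell" as a threshold unit.
import numpy as np

# hypothetical feature axes: [blonde, female, "Friends" cast, beard, soccer jersey]
aniston_weights = np.array([1.0, 1.0, 1.5, -1.0, -1.0])
THRESHOLD = 2.0

stimuli = {
    "aniston_red_carpet.jpg": np.array([1.0, 1.0, 1.0, 0.0, 0.0]),
    "aniston_tv_still.jpg":   np.array([0.9, 1.0, 1.0, 0.0, 0.2]),
    "maradona_match.jpg":     np.array([0.0, 0.0, 0.0, 1.0, 1.0]),
}

for name, features in stimuli.items():
    fires = aniston_weights @ features > THRESHOLD   # a simple threshold unit
    print(name, "->", "fires" if fires else "silent")
</pre>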
<br />
Quiroga quotes Francis Crick (<em><a href="http://www.amazon.com/Astonishing-Hypothesis-Scientific-Search-Soul/dp/0684801582" target="_blank">The Astonishing Hypothesis</a></em>): "What you see is not what is really there; it is what your brain believes is there. Seeing is an active, constructive process. Your brain makes the best interpretation it can according to its previous experience and the limited and ambiguous information provided by our eyes." This recalls Michael Gazzaniga's "Interpreter" (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/07/michael-gazzaniga-whos-in-charge-free.html" target="_blank">July 25, 2013</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/michael-gazzaniga-human-2008.html" target="_blank">September 27, 2009 post</a>s). A previous post that quotes Crick and Gazzaniga is likewise worth recalling (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/05/robert-graves-king-jesus-novel-1946.html" target="_blank">May 22, 2011 post</a>). This latter post introduces the subject of the brain's capacity to engage in self-deception, to imagine things that don't exist in reality. I think in most cases we do not engage in self-deception: the brain's default mode is to construct a reality as close as possible to the reality that is out there. So what we see, most of the time, is really there, or at least close to it. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/02/robert-trivers-folly-of-fools-2011.html" target="_blank">February 4, 2012 post</a>). Information in our quantum world is probabilistic. It is true that the brain quickly discards much information that is never encoded. As Daniel Schacter has explained --- and Quiroga acknowledges --- our memories can become very fragile if not unreliable, and our "reality" can read more like fiction. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/09/daniel-schacter-seven-sins-of-memory.html" target="_blank">September 20, 2011 post</a>). And that is interesting, because the minds of savants like Funes can recollect and retain incredible detail for considerable periods of time without assigning meaning; in those cases the brain does not filter reality from gibberish.CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-9699823753813097372013-08-27T18:18:00.001-07:002013-09-06T17:31:32.372-07:00Roberto Saviano, Beauty and The Inferno (2012)In <em><a href="http://www.amazon.com/All-Names-Jose-Saramago/dp/B005K5HOQI" target="_blank">All The Names</a></em>, <a href="http://www.nobelprize.org/nobel_prizes/literature/laureates/1998/saramago-bio.html" target="_blank">Jose Saramago</a> tells the story of a civil servant in an unnamed city of an unnamed country charged with maintaining the birth, marriage, and death records of a nation's citizens. The government agency, the Central Registry of Births, Marriages and Deaths, operates a data warehouse combining records of both the living and the dead. Of course, there is a story, a life history, for every one of these citizens and former citizens, but the data warehouse does not record those stories. In a revolutionary moment, Saramago's protagonist, an equally anonymous "Senhor Jose," rebels against his assigned duty and follows his curiosity to uncover the story of one citizen whose data record on an index card prompted his interest in a life beyond data. <br />
<br />
Senhor Jose had a secret hobby that propelled him to look deeper into a person than the data in the data warehouse. He collected news items about famous citizens of his country, and he would supplement his curiosity about the famous by climbing the walls of the Central Registry to collect their birth data, including information about the names of parents, godparents, birthplaces, addresses, and the like. And one night, in the Central Registry, while collecting the index cards of five famous persons, he inadvertently pulls out the card of a sixth person, an unknown woman, not famous, and he becomes obsessed with learning her story. He can't turn to the newspapers. She's not famous, after all. He puts her index card back in the file along with the five cards of the famous, but not before copying the data on the card. Senhor Jose "makes a decision," writes Saramago: he decides to look for the unknown woman. Saramago's discussion of this "decision" assembles in one paragraph a succinct account of free will strikingly similar to the way free will (or the lack thereof) is discussed in two recent postings on the subject (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/07/michael-gazzaniga-whos-in-charge-free.html" target="_blank">July 15, 2013 post</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>):<br />
<br />
"Senhor Jose's decision appeared two days later. Generally speaking, we don't talk about a decision appearing to us, people jealously guard both their identity, however vague it might be, and their authority, what little they may have, and prefer to give the impression that they reflected deeply before taking the final step, that they pondered the pros and cons, that after intense mental effort, they finally made a decision. It has to be said that things never happen like that. Obviously it would not enter anyone's head to eat without feeling hungry, and hunger does not depend on our will, it comes into being of its own accord, the result of objective bodily needs, it is a physical and chemical problem whose solution, in a more less satisfactory way, will be found in the contents of a plate. Even such a simple act as going down into the street to buy a newspaper presupposes not only a desire to receive news, which, since it is a desire, is necessarily an appetite, the effect of specific physico-chemical activities in the body, albeit of a different nature, that routine act presupposes, for example, the unconscious certainty, belief or hope that the delivery van was not late or that the newspaper stand is not closed due to illness or to the voluntary absence of the proprietor. Moreover, if we persist in stating that we are the ones who make our decisions, then we would have to begin to explain, to discern, to distinguish, who it is in us who made the decision and who subsequently carried it out, impossible operations by anyone's standards. Strictly speaking, we do not make decisions, decisions make us. The proof can be found in the fact that, though life leads us to carry out the most diverse actions one after the other, we do not preclude each one with a period of reflection, evaluation and calculation, and only then declare ourselves able to decide if we will go out to lunch or buy a newspaper or look for the unknown woman." Sr. Jose walks out his door and goes to see the street where the woman purportedly lived. Sr. Jose's search to find the woman named on the sixth index card becomes an obsession to learn her story, driven by what Jaak Panksepp calls our Seeking system urges (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>).<br />
<br />
Consider the life of an exceptional journalist, an investigative journalist who wants to tell a story, not just report data. <a href="http://www.robertosaviano.it/" target="_blank">Roberto Saviano</a> made a decision to investigate and expose the <a href="http://en.wikipedia.org/wiki/Camorra" target="_blank">Camorra</a>, one of the Mafia clans in his native Naples area of Italy. Saviano probably did "reflect deeply before taking the final step," to borrow a phrase from Saramago. After all, investigating and publicizing a Mafia sect knowingly undertakes a risk to self-preservation. <a href="http://www.pbs.org/frontlineworld/stories/italy801/interview/saviano.html" target="_blank">He knows</a> that the people he writes about in his book <a href="http://www.amazon.com/Gomorrah-Personal-Journey-International-Organized/dp/B003R4ZGL4" target="_blank"><em>Gomorrah</em></a> are killers and will not hesitate to kill him if it suits them. He knows this; he must have reflected deeply about this. This is, in the words of <a href="http://www.psych.rochester.edu/people/ryan_richard/index.html" target="_blank">Richard Ryan</a> and <a href="http://www.psych.rochester.edu/people/deci_edward/index.html" target="_blank">Edward Deci</a>, "self-determination." (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/07/michael-gazzaniga-whos-in-charge-free.html" target="_blank">July 15, 2013 post</a>). Saviano assented to his course of action in exposing the Camorra, although he may very well have been driven by his Seeking system and perhaps modulated by his Fear system. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>). Jose Saramago described Roberto Saviano as someone who "mastered the art of living." (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/07/jose-saramago-elephants-journey-2008.html" target="_blank">July 17, 2011 post</a>). He was referring to Saviano's <a href="http://www.merriam-webster.com/dictionary/courage" target="_blank">"courage,"</a> the mental persistence to persevere in the face of fear. Courage has to be found in the neocortex of the brain, not in the subcortical emotional systems that Panksepp finds controlling. But yes, Senhor Jose made a decision, one that probably did not involve much reflection, to search for the story of a woman who is more than a piece of data; Roberto Saviano made a decision, one that undoubtedly involved substantial reflection, to confront fear and seek truth. Their intents differ as well: Senhor Jose is out to satisfy his own curiosity and perhaps derive some private reflection on the meaning of his life; Roberto Saviano is not interested in merely satisfying some private need, but in engaging in social communication with a broader public, perhaps to arouse the public to reflection and some collective action.<br />
<br />
"A writer can never be a good person," writes Saviano. "Often he comes to writing precisely because he realizes he cannot be a good person. He ends up writing with a sense of guilt for not being able to change things, in the hope that his indirect actions will multiply in his readers' consciousness, that they might act in his stead or alongside him, creating the ultimate dream of a community of people who understand, feel and walk together. People who live." <br />
<br />
<em><a href="http://www.theguardian.com/books/2010/oct/09/roberto-saviano-beauty-inferno-mafia" target="_blank">Beauty and The Inferno</a> </em>represents Roberto Saviano's further reflection on his infernal life in the wake of the publication of <em>Gamorrah, </em>in which he lives a life guarded by police charged with protecting his life from the threats of the Camorra. The parallels with the life of <a href="http://www.salman-rushdie.com/" target="_blank">Salman Rushdie</a> following the Iranian fatwa issued after publication of <a href="http://en.wikipedia.org/wiki/The_Satanic_Verses" target="_blank"><em>The</em> <em>Satanic Verses</em></a> immediately come to mind, and the parallel is not lost on Saviano who writes an essay about his invitation to join Rushdie for a panel discussion at the Swedish Academy. But <em>Beauty and The Inferno </em>also reads a bit like <em><a href="http://en.wikipedia.org/wiki/Profiles_in_Courage" target="_blank">Profiles in Courage</a></em>. And among all the names that Saviano recognizes for leading a courageous life not dissimilar from his, some of whom suffered a fate we call premature death, include: <a href="http://www.people.com/people/archive/article/0,,20098794,00.html" target="_blank">Joe Pistone</a> who squealed on the American mafia; <a href="http://en.wikipedia.org/wiki/Giancarlo_Siani" target="_blank">Giancarlo Siani</a> who also wrote about the Camorra; <a href="http://en.wikipedia.org/wiki/Uwe_Johnson" target="_blank">Uwe Johnson</a>; <a href="http://en.wikipedia.org/wiki/Gustaw_Herling-Grudzi%C5%84ski" target="_blank">Gustav Herling</a>; <a href="http://en.wikipedia.org/wiki/Varlam_Shalamov" target="_blank">Varlam Shalamov</a>; <a href="http://en.wikipedia.org/wiki/Anna_Politkovskaya" target="_blank">Anna Politkovskaya</a>. Names that may have remained anonymous pieces of data, but Saviano has captured their stories.<br />
<br />
CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-11314756506536184032013-08-12T18:15:00.001-07:002013-08-12T18:15:47.237-07:00George Saunders, Tenth of December (2013)I rarely read short stories. I respect that writing a good short story can be as challenging as, if not more challenging in many respects than, writing a novella or novel. There is less time to dawdle before getting to the point in a short story, but in my experience the fleeting taste one gets from a short story is not as satisfying as the well-developed character, cast of characters, and extended plot of a novel. Several of <a href="http://www.georgesaundersbooks.com/" target="_blank">George Saunders'</a> short stories in <em><a href="http://www.amazon.com/Tenth-December-Stories-George-Saunders/dp/0812993802/ref=sr_1_1?s=books&ie=UTF8&qid=1375839963&sr=1-1" target="_blank">Tenth of December</a></em> left me with that same level of satisfaction (I don't want to say dissatisfaction, because that is not what I mean, but I do mean less than satisfying). I expect a short story to figuratively punch me in the face and get my attention. And one of Saunders' stories, <em><a href="http://www.arts.rpi.edu/~ruiz/AdvancedDigitalImagingSpring2013/ReadingsADI/Saunders%20Escape%20from%20Spiderhead.pdf" target="_blank">Escape from Spiderhead</a></em>, did just that. A world manipulated by <a href="http://www.urbandictionary.com/define.php?term=soma" target="_blank">soma</a> --- Huxley's <em><a href="http://www.amazon.com/Brave-New-World-Aldous-Huxley/dp/0060850523" target="_blank">Brave New World</a></em> --- comes to mind. <br />
<br />
Saunders introduces a brave new world of drug-induced manipulation imaginatively different from Huxley's. Soma leaves Huxley's masses anesthetized; Saunders' imaginary pharma cocktails manipulate and heighten every subcortical emotional system in Jaak Panksepp's lexicon (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a>). The drugs from Saunders' imaginary MobiPak manipulate the Seeking, Lust, Fear, Care, and Rage systems, for example, driving passionate sex among people who would otherwise likely have no interest in one another. Add some Verbaluce to the cocktail, and suddenly your cortical regions are verbalizing your feelings more honestly than your emotional systems can feel them. Veritalk and ChatEase are also available to manipulate communicative capabilities. And so are the drugs to reverse everything that the other drugs manipulated in the first instance.<br />
<br />
And who submits to this life? Very subtly, Saunders unfolds this regime. I nearly missed it when one of the manipulative experimenters begins a conversation: "Do you even know her story? You don't. You legally can't. Does it involve whiskey, gangs, infanticide? I can't say. Can I imply, somewhat peripherally, that her past, violent and sordid, did not exactly include a dog named Lassie and lot of family talks about the Bible while Grammy sat doing macramé, adjusting her posture because the quaint fireplace was so sizzling?" And then a page later: "Mom always looked heartsick when our time was up. It had almost killed her when they arrested me. The trial had almost killed her. She's spent her savings to get me out of real jail and in here." These lab rats are criminals diverted from the penal system into something like a halfway house where the only escape may be suicide. But the suicide that closes this short story --- in stark contrast to Albert Camus' treatment of suicide as a rejection of freedom and a refusal to embrace life passionately --- presents one of the more ennobling characteristics of human behavior: sacrifice in order to avoid harm to another --- altruistic behavior of a type considered in several previous posts (see <a href="http://csilcox-thebookshelf.blogspot.com/2010/10/oren-harman-price-of-altruism-2010.html" target="_blank">October 13, 2010 post</a>). This is the stuff that vividly punches you in the face. It could have been a novel.<br />
<br />CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-4316739206886605102013-07-15T17:28:00.001-07:002013-08-27T18:20:02.873-07:00Michael Gazzaniga, Who's In Charge? Free Will and the Science of the Brain (2011)What do we want to be free from? That is the question that drives <a href="http://www.psych.ucsb.edu/~gazzanig/" target="_blank">Michael Gazzaniga's</a> inquiry about free will and the science of the brain. "We don't want to be free from our experience of life, we need that for decisions. We don't want to be free from our temperament because that also guides our decisions. We actually don't want to be free from causation, we use that for prediction. A receiver trying to catch a football does not want to be free from all the automatic adjustments that his body is making to maintain his speed and trajectory as he dodges tackles. We don't want to be free from our successfully evolved decision-making device. What do we want to be free from?"<br />
<br />
I awaken at 5:30 in the morning. Gradually my mind becomes cognitive (conscious) again. My body is telling me to stay put. Don't get up. Rest some more. I lie still for many more minutes, and inevitably my mind starts to race about what I need to do (or don't need to do) in the coming day. I need to get out of bed and start doing things. But when? I can probably afford to lie in bed until 7am and still do what I need to do over the course of the remaining day. I lie still a while longer. It is now forty-five minutes later. My mind wrestles with whether I should continue to try to rest or get out of bed and start being active. I cannot lie still anymore. I <em>tell myself</em> I will get out of bed and read a chapter of Michael Gazzaniga's book, <em><a href="http://www.amazon.com/dp/0061906115" target="_blank">Who's In Charge?</a></em>, and I do. I had a decision to make and I made it: I got out of bed at 6:15am instead of perhaps 7am. Is that what free will is all about? What was I free from? No one else was dictating that I stay in bed. There was no social rule telling me I had to stay in bed until 7am. My body was not chained to the bed. But was this decision entirely unchained from all causes? Jaak Panksepp has compellingly explained that our Seeking urges begin with <a href="http://books.google.com/books?id=fOuDJNtiCCUC&pg=PA178&lpg=PA178&dq=Panksepp+and+automatic&source=bl&ots=zrU_FwtW75&sig=wpRQ8Cy67MNgqHFkCsL3GoI4XTw&hl=en&sa=X&ei=woHoUZmQH--r4AON_IGgCw&ved=0CFYQ6AEwBg#v=onepage&q=Panksepp%20and%20automatic&f=false" target="_blank">automatic impulses</a> deep in the subcortical areas of our brain (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">previous post</a>), so perhaps my urge to read and learn and cease lying down was something less than volitional? <br />
<br />
Much of what we humans and other animals do in our lives is automatic, unconscious, instinctive. As Panksepp points out, even learning, memory, and habit formation are unconscious. What is learned and habitual is not the stuff of choices and decision trees, and yet <a href="http://plato.stanford.edu/entries/freewill/" target="_blank">choice is what free will is purportedly about</a>. And choice, the hallmark of "who's in charge," is typically assigned to the neocortex of the brain, which resides atop the limbic system and the rest of the forebrain. Yet if we concur with Panksepp that it is the brain's subcortical emotional system that energizes the neocortex, and not the other way around, the role of the neocortex appears to be only regulatory of the urges that come from below the cortex: controlling and inhibiting impulses, instinct, and habit, not initiating behavior in the first instance. <br />
<br />
For Gazzaniga, neuroscience (the study of the brain) does not offer much support for the common understanding of free will; indeed, the evidence from neuroscience is inconsistent with it. Gazzaniga's first point is that there is no single executive decision center in the brain. The brain is composed of distinct modules, and while we may have a unified sense of self and of making decisions, neuroscience does not support our sense that we are making decisions entirely liberated from the environment around us or within us. Our sense of psychological unity, says Gazzaniga, emerges out of a specialized system in the left side of our neocortex, which he calls The Interpreter. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/michael-shermer-believing-brain-2011.html" target="_blank">June 12, 2011 post</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/bruce-m-hood-supersense-why-we-believe.html" target="_blank">June 5, 2011 post</a>). This is the area of the brain in which the human tendency to want to explain things is found, as well as our capacity for imagination. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/05/robert-graves-king-jesus-novel-1946.html" target="_blank">May 22, 2011 post</a>). But as Gazzaniga explains, The Interpreter is slow. It comes to life after the event it seeks to explain has occurred. So what does it mean that we humans build our theories about ourselves after the fact? "This post hoc interpreting process has implications for and an impact on the big questions of free will and determinism."<br />
<br />
Gazzaniga cites <a href="http://www.mitpressjournals.org/doi/abs/10.1162/jocn.2007.19.1.81" target="_blank">research by Hakwan Lau of Columbia University</a> that purports to show how the brain could lead the mind into thinking that the explanation developed by The Interpreter after a certain behavior was an intention occurring <em>before</em> a spontaneous action occurred; in essence, tricking the mind into thinking that the explanation was an intention. Lau studied an area of the brain in the frontal cortex known as the <a href="http://en.wikipedia.org/wiki/Supplementary_motor_area" target="_blank">supplementary motor area (SMA)</a>, which is involved with planning motor actions that are sequences of action done from memory. An area called the <a href="http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2648723/" target="_blank">pre-SMA</a> is involved with creating new sequences of action in memory, which gives one the feeling of the urge to move (perhaps not unlike my getting out of bed in the morning to read Gazzaniga's book). It is the pre-SMA that is activated when humans generate actions of their own choice. Lau applied <a href="http://en.wikipedia.org/wiki/Transcranial_magnetic_stimulation" target="_blank">transcranial magnetic stimulation</a> (TMS) to the pre-SMA, which locally activates nerve cells there. Describing Lau's research, Gazzaniga explains: "When TMS is applied over the pre-SMA <em>after</em> the execution of a spontaneous action, the perceived onset of the intention to act, that moment when you become conscious that you intend to act, is shifted <em>backward</em> in time on the temporal map, and the <em>perceived</em> time of the actual action, the moment when you are conscious that you are acting, is shifted <em>forward</em> in time." In other words, the perceived onset of intention depends, at least in part, on neural activity that takes place after the execution of action. While Lau is careful to say that without further experimentation "one cannot draw the strong conclusion that the experience of having conscious control of a simple motor action [e.g. getting out of bed in the morning] is entirely illusory," he adds that his experimental "results throw doubt on the commonsensical view that the experience of intention, including the experienced onset, is completely determined before an action." Lau adds, "An alternative view that is compatible with the data is that one function of the experience of intention [even if it occurs afterwards] might be to help clarify the ownership of actions, which can help to guide future actions." Gazzaniga concludes, however, that The Interpreter "makes the story fit with the pleasing idea one actually willed the action." Free will is illusory, he says.<br />
<br />
There is certainly no consensus among neuroscientists and psychologists over this research, just as thousands of years of philosophical debate have not achieved consensus about free will. But one truth about free will drawn from the philosophical debate is that it is at least a theoretical construct used to justify the notion of personal responsibility for one's actions. And that leads to the discussion of whether our free will, if it really does exist, is limited to self-control: regulating existing tendencies of human behavior that are selfish or impulsive or emotionally driven. This is where I think Jaak Panksepp is coming from (see <a href="http://csilcox-thebookshelf.blogspot.com/" target="_blank">previous post</a>) when he says, "At primary-process levels of emotional processing there is no free will, there is no 'controlled cognitions.' Neither do the automatic secondary processes of learning and memory functions, that are molded by our wild animal passions developmentally, exhibit free will. That can only emerge from well-sculpted, deeply reflective, cognitive attitudes." Free will is reflected in those "controlled cognitions" that respond to the neocortex being energized by the emotional systems of the subcortical areas of the brain. There is consensus that humans have the ability to deliberate and rationally choose between different courses of action. As Antonio Damasio has documented in <em><a href="http://www.amazon.com/Looking-Spinoza-Sorrow-Feeling-Brain/dp/0156028719" target="_blank">Looking for Spinoza</a></em> and <em><a href="http://www.amazon.com/Descartes-Error-Emotion-Reason-ebook/dp/B00AFY2XVK/ref=sr_1_1?s=books&ie=UTF8&qid=1374757586&sr=1-1&keywords=Descartes+Error" target="_blank">Descartes' Error</a></em>, choosing between different courses of action is not an act of cognition alone, but of cognition and emotion in tandem. (See also <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html" target="_blank">April 8, 2011 post</a>). There are certain emotions linked with feelings of responsibility, such as sympathy and regret, and these emotions do not originate in the cortex, where the brain's "executive control" is said to reside. <br />
<br />
Yet what inspires those controlled cognitions? I submit it is memory and culture and our body's biochemistry. And that brings us to Gazzaniga's chapter on "the social mind," a subject that is covered in many previous posts dealing with mirror neurons, mimicry, moral feelings and related emotions. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/martin-nowak-supercooperators-2011.html" target="_blank">September 17, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2011/12/rebecca-skloot-immortal-life-of.html" target="_blank">December 10, 2011</a> posts). Gazzaniga endorses this point of view. Echoing Christopher Boehm (see November 21, 2012 post), Gazzaniga writes, "If Michael Tomasello and Brian Hare are correct that we have been domesticating ourselves over thousands of years through ostracizing and killing those who were too aggressive, in essence removing them from the gene pool and modifying our social environment, then we have been making rules for groups to live by and enforcing them throughout our evolutionary history." Gazzaniga adds, "The culture to which we belong actually plays a significant role in shaping some of our cognitive processes." And in terms of our biochemistry, Gazzaniga notes, "'Easterners and Westerners also vary in their genetic makeup . . . Much research had already shown that serotonin plays a part in attention, cognitive flexibility, and long-term memory, so [<a href="http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2894665/" target="_blank">researchers</a>] decided that looking into a specific serotonin system polymorphism (a DNA sequence variation), which was known to affect an individual's mode of thinking, could prove fruitful [in accounting for differences in attention across cultures]. They looked at different alleles (genes which have different nucleic acid sequences occupying the same position on a paired chromosome that control the same inherited characteristic) of the 5-HTR1A gene that ultimately controls neurotransmission of serotonin. They found that there was a significant interaction between the type of 5-HTR1A alleles a person had and the culture in which he lived. This interaction affected where that particular person's attention was directed. Those persons possessing the identical DNA sequences in the matched gene pairs (the homozygous G allele), which is associated with the reduced ability to adapt to changes, more strongly endorsed the culturally reinforced mode of thinking than those with the homozygous C allele. . . Summarizing these findings, these researchers concluded, 'The same genetic predispositions can result in different psychological outcomes, depending on an individual's cultural context.'"<br />
<br />
Gazzaniga suggests that what is sorely needed in this discussion is new terminology, which may be another way of saying that the discussion needs to be reframed. For example, we can abandon the framing, debated for thousands of years, that pits causal determinism against free will. This essentially concedes to neuroscientists and others that our actions are determined in many respects by biology and the environment (including culture), and that cognition is not truly independent of biology and the environment. Professors <a href="http://www.psych.rochester.edu/people/ryan_richard/index.html" target="_blank">Ryan</a> and <a href="http://www.psych.rochester.edu/people/deci_edward/index.html" target="_blank">Deci</a> at the University of Rochester and others use a different terminology: <a href="http://www.selfdeterminationtheory.org/" target="_blank">"self-determination"</a> and <a href="http://www.intrinsicmotivation.net/SDT/documents/2006_RyanDeci_Self-RegulationProblemofHumanAutonomy.pdf" target="_blank">"autonomy" (self-regulation)</a>. Self-determination and autonomy are not liberated from causal influences that motivate behavior ("people's autonomy lies not in being independent causes but in exercising their capacity to reflectively endorse or reject prompted actions"). They are not "free" in that sense. Critical to these terms is that our neocortex "assents" to whatever we have been motivated to do after some reflection. This rules out instinctive, habitual, unconscious behavior, and since we arguably assent habitually to much of our behavior each day without much reflection, it focuses on true choices. This cognitive scenario may very well involve only a narrow subset of human life, and I would think it is largely an exercise in self-control.<br />
<br />
Responsibility is a social construct, Gazzaniga says, echoing John Searle's deontological view of how humans construct a social reality. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/02/john-searle-making-social-world-2010.html" target="_blank">February 24, 2013 post</a>). "Responsibility is not located in the brain. The brain has no area or network for responsibility. . . the way to think about responsibility is that it is an interaction between people, a social contract. Responsibility reflects a rule that emerges out of one or more agents interacting in a social context, and the hope that we share is that each individual will follow certain rules." But there are aspects of the brain that do lead to this interaction between people in a social context and support the development of rules for responsibility, and we have identified these in prior posts: the emotional systems and structures of the brain that promote care, grief, play, empathy, sympathy, fear, among others. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/05/jaak-panksepp-and-lucy-biven.html" target="_blank">May 19, 2013 post</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>).CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-91630640293036345682013-05-19T17:15:00.003-07:002013-07-24T17:01:42.625-07:00Jaak Panksepp and Lucy Biven, The Archaeology of Mind, Neuroevolutionary Origins of Human Emotions (2012)At its most fundamental level, the neurosensory system of every animal, including humans, is a physiological system that monitors the entire body, whether the animal is awake or not, and, when awake (including the sleep-wake transition period), reacts to stimuli. The latter function is referred to as the <em><a href="http://en.wikipedia.org/wiki/Arousal" target="_blank">arousal system</a></em>. In the case of <a href="http://en.wikipedia.org/wiki/Vertebrate" target="_blank">vertebrates</a>, four neurotransmitters (chemicals) --- <a href="http://en.wikipedia.org/wiki/Acetylcholine" target="_blank" title="Acetylcholine">acetylcholine</a>, <a href="http://en.wikipedia.org/wiki/Norepinephrine" target="_blank" title="Norepinephrine">norepinephrine</a>, <a href="http://en.wikipedia.org/wiki/Dopamine" title="Dopamine">dopamine</a>, and <a href="http://en.wikipedia.org/wiki/Serotonin" title="Serotonin">serotonin</a> --- stimulate different arousal systems originating in the <a href="http://en.wikipedia.org/wiki/Brain_stem" target="_blank">brain stem</a>, evolutionarily the oldest part of the vertebrate brain, and motivate certain behaviors such as seeking food, fight-or-flight behavior, and sexual activity. There are certain emotional systems that are virtually at the core of the arousal system, at least during periods of wakefulness, and for several decades now, <a href="http://www.vetmed.wsu.edu/research_vcapp/Panksepp/" target="_blank">Jaak Panksepp</a> has been researching and advocating that these emotional systems have their origins in animals evolutionarily older than humans, located in the older subcortical areas of mammalian brains and only later connected with the cortical areas. <br />
<br />
<em><a href="http://books.wwnorton.com/books/978-0-393-70531-7/" target="_blank">The Archaeology of Mind</a></em> begins and ends with vertebrate animals, yet the evolutionary story is older, and to tell the story of what is missing from Panksepp's account I excerpt heavily from <a href="https://www.youtube.com/watch?v=9aivpJMkXdQ" target="_blank">Steven Rose</a>'s <em><a href="http://www.amazon.com/The-Future-Brain-Tomorrows-Neuroscience/dp/B004JZWYA6" target="_blank">The Future of the Brain</a></em>, which outlines the evolution of the brain from unicellular organisms, to eukaryotes, to invertebrate animals and vertebrates. This excerpting is important to a point I wish to make. Rose says this:<br />
<br />
"By the time that cells capable of metabolism and faithful replication, of symbiogenesis and competition appear, all the defining features of life have emerged: the presence of a semi-permeable boundary separating self from non-self; the ability to metabolise -- that is, to extract energy from the environment so as to maintain this self --- and to self-repair, at least to a degree when damaged; and to reproduce copies of this self more or less faithfully. All of these features require something we may term adaptability or behavior --- the capacity to respond to and act upon the environment in such a way as to enhance survival and replication. At its simplest, this behavior requires neither brains nor nervous systems, albeit a sophisticated set of chemical and structural features. What it does require is the property that some would call a program: at its most general way of describing both the individual chemical components of the cell and the kinetics of their interactions as the cell or living system persists through time. ***<br />
<br />
"Built into this program must also be the possibility of modifying its expression, transiently or lastingly, in response to the changing contingencies of the external environment. *** One way of conceiving of this capacity to vary a program is as an action plan, an 'internal representation' of the desired goal-- at its minimum, that of survival at least until replication is achieved. I will be arguing that, in multicellular organisms, such action plans are ultimately what brains are about.<br />
<br />
"Amongst the most basic forms of adaptive behavior drawing on such action plans <em>is goal-directed movement-- of a unicell swimming towards food for instance.</em> [Emphasis added]. Dip a thin capillary tube containing a solution of glucose into a drop of bacteria-rich liquid, and the bacteria collect around the mouth of the capillary from which the glucose diffuses--a phenomenon first noted as long ago as the nineteenth century. Such simple responses engage a series of necessary steps. First, the cell needs to be able to sense the food. In the simplest case the food is a source of desirable chemicals --- perhaps sugars or amino acids-- although it may also be the metabolic waste products excreted by another organism. Indeed the molecule does not have to be edible itself provided it can indicate the presence of other molecules that can be metabolized-- that is, it acts as a signal. *** But signals are only signals if there is recipient who can interpret the message they bear. Cell membranes are studded with proteins whose structure is adapted to enable them to trap and bind specific signaling molecules floating past them, and hence read their message. This chemical detection system is the most basic of all sensory mechanisms.<br />
<br />
"Interpreting the message --- using it to develop a plan of action -- should make it possible for the cell to determine the direction of the gradient and finally to move up it to the source. Moving towards a specific chemical source --- <a href="https://en.wikipedia.org/wiki/Chemotaxis" target="_blank">chemotaxis</a> --- requires that the cell possess some sort of direction indicator or compass. One way of creating such a compass, employed by bacteria, is to swim in a jerky trajectory, enabling the cell to interpret the gradient by comparing the concentration of the attractant chemical at any moment with that a moment before.***"<br />
<br />
If Jaak Panksepp were reading this passage he would certainly connect it to his research on emotions in animal brains. It describes the precursor to what Panksepp regards as the most central emotional system in mammals: the SEEKING system (see below). Rose continues:<br />
<br />
"The molecules trapped by the receptor on the surface membrane serve as signals, but very weak ones. To produce as dramatic a cellular response as turning and moving in the right direction requires that signals are highly amplified. The mechanism by which this is carried out, even in the seemingly simplest of unicells turns out to be the basis on which the entire complex apparatus of nervous systems and brains is subsequently built. The receptors are large proteins, oriented across the lipid membrane, with regions sticking out into the external environment, and also 'tails' which reach into the interior of the cell (the cytoplasm). When the signal receptor binds to the receptor protein its effect is to force a change -- a twist, if you like -- in the complex shape of the receptor. ***<br />
<br />
"One way of speaking of this process, favoured by neurologist Antonio Damasio, even in so limited an animal as Paramecium, is as 'expressing an emotion.' Emotion for Damasio, is a fundamental aspect of existing and a major driver of evolution.<br />
<br />
"*** With multicellularity, 'behaviour' becomes a property of the organism as a whole, to which 'needs' of individual cells are subordinated. The internal representation which makes possible the action plan for organism can be delegated to specific cell ensembles. This requires new modes of communication to be developed. Where previously there were only two classes of signals -- those arriving from the external environment to the cell surface, and those internal to the cell --- there are now three. Signals from the external environment are still registered by sensory cells on the surface and are transmuted by molecular cascades with them, but now the response to those cascades requires that further messages be sent from the sensory cells to other regions of the body, including of course the contractile cells. Sometimes the sensory cells make contact with intermediaries <em>whose task it is to synthesise and secrete the necessary 'messenger molecules.'</em> [Emphasis added]. The messengers can then be distributed through the body either by way of a circulatory system or by diffusion through the extracellular space between the body cells, and are detected as before by specialized receptor proteins on the surface membranes of their targets. When molecules that served such messenger functions were first identified in mammals, they were given the generic name of hormones. It was only later, and to some surprise, that it was discovered that many of the same molecules also serve as intercellular signals in very early multicellular organisms, another powerful example of evolutionary conservation.***<br />
<br />
"It is easy to imagine a sequence whereby neurons evolved from secretory cells. Instead of discharging their contents generally into the surrounding space and circulatory system, the secretory cells could have put out feelers (called 'processes') enabling them to make direct contact with their targets so as to signal rapidly to them and them alone. Messages could be conveyed between the two either electrically or chemically --- by a depolarizing wave or by secreting a messenger molecule across the membrane at the point where the two cells touch. In fact, both phenomena are know to occur.<br />
<br />
"The first step towards such nervous systems can be seen among the large group of <span id="goog_170029980"></span><a href="http://www.blogger.com/">Coelenterates<span id="goog_170029981"></span></a>, believed to be amongst the earliest true multicellular animals. The best known is perhaps the <a href="http://en.wikipedia.org/wiki/Hydra_(genus)" target="_blank">Hydra</a>, a tiny creature that sits at the bottom of streams attached to rocks or water plants, waving its tentacles above its mouth. When a potential source of food brushes past its tentacles, the Hydra shoots out poisonous threads, collects the paralysed victim and thrusts it into its mouth. *** A well fed Hydra is quiescent; when hungry it waves its tentacles vigorously, or moves its location by repeatedly turning head-over-heals, seeking food-rich or oxygen-rich environments <em>(once again, Damasio would regard these acts 'expressing emotions</em>').***<br />
<br />
"What distinguishes a fully-fledged nervous system --- our own for instance --- is a one-way flow of information through the system, from dendrites to axon, from sensory cell to effector. Of course this is mediated via all the feedback loops, but none the less there is a directionality to it that the Hydra's does not possess.<br />
<br />
"Whereas the Hydra's neurons are scattered throughout the body, the next crucial step was to concentrate them within an organized system. *** <em><a href="http://en.wikipedia.org/wiki/Caenorhabditis_elegans" target="_blank">C. elegans</a></em> has a head and tail end, and as it is more important for it to know where it is going than where it has been, many of its sensory cells are clustered at its head end. From these, nerve connections run to clusters of interneurons, pack into groups (ganglia) with short interconnecting processes between the cells within the group and longer nerve tracts leading out along its gut and ultimately to the effectors: contractile, egg- and sperm producing cells. These neurons use many of the neurotransmitters that are found in mammalian brains (notably the amino acid glutamate), an indication of how far back in evolutionary terms these molecules were adapted for signaling functions.***<br />
<br />
"The evolutionary track I have been mapping," writes Rose, "has led from proto-cells to faithfully replicating eukaryotes capable of responding adaptively to patchy environments, from single-celled eukaryotes to multicellular animals with internal signaling systems, and from these to fully-fledged nervous systems capable not merely constructing action plans, but of modifying those plans, at least temporarily, in response to environmental contingencies. But we haven't yet arrived at brains. This must have been the next step along the evolutionary path that led to humans. Concentrating neurons in ganglia is a way of enhancing their interactions and hence their collective power to analyze and respond to incoming stimuli. Locating them at the front end of the organism is the beginning of establishing not merely a nervous system but a brain, though head ganglia or brains only slowly begin to exert their primacy over the other ganglia distributed through the body.*** [Turning to invertebrates] although insect (arthropod) and molluscan neurons are pretty similar to human neurons, and the biochemical motors that drive the system -- their electrically excitable membranes and the neurotransmitters --- work in the same way, the organization of the system is entirely different. In molluscs and arthropods the central ganglion --- the nearest any amongst these huge numbers of species have to a brain --- and the principal connecting pathways between it and other ganglia lie arranged in a ring around their guts. This is a device that can be seen even in earthworms, and it imposes a fundamental design limitation on the complexity of the nervous system.***<br />
<br />
"The development of large brains required two major changes in the construction of nervous systems: the separation of the nerves themselves from the gut, and the concentration of nervous power. It also required the first step towards the development of a bony skeleton. Amphioxus, small sea-floor fish, is an example. Less behaviourally sophisticated than octopus or bee, it has a flexible rod of cartilage, a notochord, running down its back --- the forerunner of the spinal column --- with the merit of providing a bracing device against which muscles can pull. More relevantly for the present argument is that eh major nerves and central ganglion lie in a continuous tube running the length of the creature's body, thus disentangling them from the gut and giving space for growth."<br />
<br />
We have not even discussed Panksepp's research yet, but there is much here in Steven Rose's account of the evolutionary development of the animal nervous system that indicates the system of neurotransmitters and specialized receptors found in vertebrates long preceded the development of the brain stem in vertebrates. And there is a suggestion by Steven Rose that this system was capable of "expressing emotions," although probably not in the same sense that Panksepp intends. But it would be fair to say that human emotional systems and those of other mammals have their origins not only in vertebrate animals older than humans, but also in the earliest forms of life on earth. This is an anthropomorphic view of human emotions as described by Frans DeWaal in <em><a href="http://www.amazon.com/dp/0465041760" target="_blank">The Ape and The Sushi Master</a> </em>(see <a href="http://csilcox-thebookshelf.blogspot.com/2010/06/frans-de-waal-ape-and-sushi-master.html" target="_blank">June 17, 2010 post</a>). To be sure, Panksepp is careful to admonish in his discussion of similarities between the neurological systems of humans and other mammals that "similar does not mean the same." There are similar structures and similar transmitters and receptors in the brain, but their location within the brain may be slightly different or even vastly different, and those differences may result in small or even large differences between humans and other mammals. But in identifying these similarities, Panksepp observes, as the book's subtitle hints, the neuroevolutionary origins of human emotions. Panksepp decries the history of human psychological research that declined to recognize emotions in animals. There is considerable <a href="http://www.nytimes.com/2013/06/30/magazine/want-to-understand-mortality-look-to-the-chimps.html?pagewanted=all&_r=0" target="_blank">research</a> available today rebutting the notion that animals lack emotions.<br />
<br />
Panksepp discusses several emotional systems, but central to nearly all of them is what he has labeled the <a href="http://sourcesofinsight.com/seeking-is-the-granddaddy-of-emotional-systems/" target="_blank">SEEKING system</a>. And in beginning this discussion, we can think back to Steven Rose's reference to the "<em>goal-directed movement --- of a unicell swimming towards food for instance.</em>"<br />
<br />
Panksepp is controversial within the neuropsychiatric community, challenging some of the dogmas of neuroscience and human psychotherapy. One of the dogmas is reflected in this statement from Rita Carter's <em><a href="http://www.amazon.com/Mapping-Mind-Revised-Updated-Edition/dp/0520266285" target="_blank">Mapping the Mind</a> </em>(see <a href="http://csilcox-thebookshelf.blogspot.com/2011/11/rita-carter-mapping-mind-rev-2010.html" target="_blank">November 6, 2011 post</a>): "A huge volume of evidence suggests that consciousness emerges from the activity of the cerebral cortex [and] that the particular type of consciousness that includes the sense of self requires activation in the frontal lobes. Ask yourself this: Where, precisely, do I feel that "I" am centered? If you are like most people, you will point to a position just above the bridge of your nose. It is right behind here that you will find the prefrontal cortex --- the area of the frontal lobe most closely associated with the generation of consciousness. This region is also responsible for our conscious perception of emotion and our ability to attend and focus. Most important of all, it endows the world with meaning and our lives with a sense of purpose. The symptoms of schizophrenia, depression, mania and Attention Deficit Disorder are mainly due to frontal lobe disorder." Carter's sentiment reflects a view that leads psychotherapists to focus on treating the executive, regulatory capacity of the human brain in the frontal cortex in order to overcome these disorders. While Panksepp does not dismiss the role of the prefrontal cortex in the conscious life of humans, he does disagree with the directionality implicit in this statement: for Panksepp, like Antonio Damasio (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html" target="_blank">April 8, 2011 post</a>), "the generation of consciousness" begins with the evolutionarily older parts of the brain --- in the <a href="http://www.innovateus.net/health/what-function-midbrain" target="_blank">midbrain</a>, where neurotransmitters are generated --- as well as the <a href="http://www.hhs.gov/opa/familylife/tech_assistance/etraining/adolescent_brain/Development/limbic/" target="_blank">limbic system</a>, which together are at the foundation of the seven emotional systems he describes in <em>Archaeology of Mind.</em> It is here that the "<a href="http://books.google.com/books?id=uIQQskejGwUC&pg=PA61&lpg=PA61&dq=%22core+self%22+and+Panksepp&source=bl&ots=3NYiQVmky9&sig=u_Ki6T82duJvC1Q2mDi4_TqWJ5k&hl=en&sa=X&ei=7Nq9Ua-hHbe24AOenYDgCA&ved=0CDsQ6AEwAg#v=onepage&q=%22core%20self%22%20and%20Panksepp&f=false" target="_blank">core self</a>" of consciousness emerges, or as Panksepp calls it, the core affective self. The symptoms of certain mental disorders, Panksepp believes, are <em>not </em>"mainly due to frontal lobe disorder" but may have more to do with the imbalanced (excessive or diminished) production of specific chemicals in the more ancient parts of the brain. And as the <a href="http://csilcox-thebookshelf.blogspot.com/2013/04/nessa-carey-epigenetics-revolution-2012.html" target="_blank">previous post</a> suggests, epigenetics provides some explanation in the case of stress disorders caused by early childhood abuse leading to excessive production of cortisol that overwhelms the ability of the limbic system to restore calm. <br />
<br />
The seven emotional systems described by Panksepp (and he does not rule out that there may be more) are these:<br />
<br />
<em>The Seeking System</em>. This does not immediately sound like it describes an emotional system, but Panksepp makes a convincing case that it is one. This is the system "that allows animals to search for, find and acquire resources that are needed for survival. Arousal of this Seeking System produces all kinds of approach behaviors, but it also feels good in a special way. It is not the kind of pleasure we experience when eating a fine meal, or the satisfaction we feel afterwards. Rather it provides the kind of excited, euphoric anticipation that occurs when we look forward to eating that meal . . . the anticipation of sex . . . the thrill of exploration." Panksepp refers to the Seeking System as the primary-process emotional power that makes animals into active agents in their environments. "Among animals in the wild, it is easy to see the Seeking system in action. Resources are not readily available and animals must persistently seek them out in order to survive. They must hunt or forage for food and search for water, find twigs or dig holes to fashion sheltering nests. The Seeking system urges them to nurture their young, to search for a sexual partner, and when animals live in social communities, to also find nonsexual companions, forming friendships and social alliances. . . Although this system vigorously responds to homeostatic needs, to emotional urges and to enticing temptations, it operates more or less continuously in the background, albeit at much lower levels when people and animals are not in any particular need of resources or troubled by problems that urgently require solutions. This system keeps animals constantly exploring their environments so they can remember where resources are." Importantly, in Panksepp's view, it is the Seeking System that is the motivator behind the intellectual pursuits of the neocortex: "the neocortex does not provide its own motivation; the neocortex is activated by subcortical emotional systems . . . the neocortex is the servant of our emotional systems." It is the Seeking System that urges architects, artists, writers, politicians, and scientists to discover new and better ways to solve problems and express themselves. It "energizes all human creativity." Seeking arousal "is an anticipatory gift of nature that provides seemingly infinite opportunities for learning; with the developmental/epigenetic emergence of higher mental processes, it gradually fine-tunes reasonable expectations, working hypotheses, as in the conduct of science." It is intimately connected with learning, which Panksepp describes as an "automatic, unconscious process that enhances our natural proclivity to engage with the world in ever more subtle ways as our minds mature." In contrast, <em>affect </em>(behavioral outcomes connected to arousal of instinctual emotional systems) is never unconscious; it is felt.<br />
<br />
Chemically, the Seeking System is understood to be aroused by <a href="https://en.wikipedia.org/wiki/Dopamine" target="_blank">dopamine</a> transmitters, but <a href="https://en.wikipedia.org/wiki/Glutamate" target="_blank">glutamate</a>, which functions in learning and memory, and <a href="https://en.wikipedia.org/wiki/Neuropeptides" target="_blank">neuropeptides</a> such as <a href="https://en.wikipedia.org/wiki/Orexin" target="_blank">orexin</a> and <a href="https://en.wikipedia.org/wiki/Neurotensin" target="_blank">neurotensin</a> are understood to activate the Seeking System, while <a href="https://en.wikipedia.org/wiki/Dynorphin" target="_blank">dynorphin</a> is believed to deactivate it. The neurons for these transmitters are found in the midbrain: anatomically, the <a href="http://en.wikipedia.org/wiki/Ventral_tegmental_area" target="_blank">ventral tegmental area</a>, the <a href="http://en.wikipedia.org/wiki/Medial_forebrain_bundle" target="_blank">medial forebrain bundle</a>, the <a href="http://en.wikipedia.org/wiki/Lateral_hypothalamus" target="_blank">lateral hypothalamus</a>, and the <a href="http://en.wikipedia.org/wiki/Nucleus_accumbens" target="_blank">nucleus accumbens</a>, then running to the <a href="http://en.wikipedia.org/wiki/Ventromedial_prefrontal_cortex" target="_blank">medial prefrontal cortex</a> via the <a href="http://en.wikipedia.org/wiki/Mesolimbic_pathway" target="_blank">mesolimbic</a> and <a href="http://en.wikipedia.org/wiki/Mesocortical_pathway" target="_blank">mesocortical</a> dopamine pathways. "In all mammals," notes Panksepp, "the nucleus accumbens interacts with the medial frontal cortex to promote simple appetitive learning (and addictions). Because the Seeking System energizes the frontal neocortical regions, especially the medial zones that focus on immediate emotional needs, we are able to devise strategies to obtain rewards and escape sanctions (pain) and other pitfalls." We remember particularly pleasurable experiences, and so the possibility of addiction is created. Dopamine transmitters are associated with drugs of abuse, and when they are overly excited there can be negative consequences from addiction. On the other hand, when the Seeking System is underactive, depressive feelings can emerge. Humans differ from other animals here in one important respect: the dopamine pathways that energize the cortex are linked not only to the frontal cortex but to other sensory-perceptual cortices in the back of the brain.<br />
<br />
<em>The Rage System. </em>The Rage System needs little explanation: it is the foundation of anger and aggression. What it is not deserves some explanation: it probably has little to do with war among societies (group aggression), nor is it about predatory aggression such as seeking food. In contrast to the Seeking System, which is largely a "positive" emotion, the Rage System produces unpleasant affects. The Rage System is connected to dominance systems in species. The Rage System runs from the medial areas of the <a href="http://en.wikipedia.org/wiki/Amygdala" target="_blank">amygdala</a> to the <a href="http://en.wikipedia.org/wiki/Hypothalamus" target="_blank">medial hypothalamus</a> to areas of the <a href="http://en.wikipedia.org/wiki/Periaqueductal_gray" target="_blank">periaqueductal gray</a> (PAG). As with the Seeking System (and all the other emotional systems Panksepp describes), these are the ancient areas of the brain. The chemicals that can promote rage include <a href="http://en.wikipedia.org/wiki/Testosterone" target="_blank">testosterone</a> (known to promote physical aggression in males to a greater extent than in females), <a href="http://en.wikipedia.org/wiki/Substance_P" target="_blank">Substance P</a> (important to pain perception), <a href="http://en.wikipedia.org/wiki/Norepinephrine" target="_blank">norepinephrine</a>, <a href="http://en.wikipedia.org/wiki/Glutamate" target="_blank">glutamate</a>, <a href="http://en.wikipedia.org/wiki/Acetylcholine" target="_blank">acetylcholine</a>, and nitric oxide synthases. The Rage System can be controlled by chemical inhibitors such as <a href="http://en.wikipedia.org/wiki/GABA" target="_blank">gamma-aminobutyric acid</a> (GABA) and <a href="http://en.wikipedia.org/wiki/Oxytocin" target="_blank">oxytocin</a>. <br />
<br />
<em>The Fear System. </em>Similarly, the Fear System needs little explanation. Like the Rage System, it is not a positive emotion; it produces anxiety and stimulates flight, fight, or freezing. The Fear System operates between the PAG and the amygdala, and it is aroused by external and internal stimuli, notably pain, but some responses appear to be innate, caused by hard-wired sensory inputs. Panksepp mentions rats' fear of open spaces, sudden movements and loud noises as examples of innate fear responses. But fear is connected to memory as well, and memory plays a significant role in conditioning fear responses. On memory, Panksepp explains that learning and memory are automatic and involuntary responses (mediated by unconscious mechanisms of the brain), which in their most lasting forms are commonly tethered to emotional arousal. Emotional arousal is a necessary condition for the creation of fear-learning memories.<br />
<br />
<em>The Lust System. </em>The Lust System drives basic mammalian physical impulses (sexual affects) on the one hand and social emotions on the other, which can be both positive and negative. It can drive anti-social behavior (rape, stalking) as well as building families and promoting other forms of well-being. In the male brain the center of primary sexual urges is in the medial regions of the <a href="http://en.wikipedia.org/wiki/Anterior_hypothalamus" target="_blank">anterior hypothalamus</a> (although Panksepp notes that "the precise brain location varies from one species to another"). Testosterone stimulates pleasure in the male, which activates neuropeptides such as vasopressin and promotes sexual ardor, courtship, intermale aggression and possibly jealousy. Testosterone also activates <a href="http://en.wikipedia.org/wiki/Biological_functions_of_nitric_oxide" target="_blank">nitric oxide in the brain</a>, which promotes heightened sexual eagerness. In females, <a href="http://en.wikipedia.org/wiki/Estrogen" target="_blank">estrogen</a> and <a href="http://en.wikipedia.org/wiki/Progesterone" target="_blank">progesterone</a> (the estrus cycle) control sexual arousal, but <a href="http://en.wikipedia.org/wiki/Androgen" target="_blank">adrenal testosterone</a> plays a role in sexual receptivity. The Lust System, Panksepp says, "recruits" the Seeking System's "dopamine-fueled search" for companionship.<br />
<br />
<em>The Care System.</em> The Care System is not universal in the animal kingdom, but nearly all mammals and birds exhibit maternal care for their young. In many fish, the job of tending to a nest of eggs is left to fathers, and the brain circuits that drive this behavior Panksepp calls the Care System. Panksepp notes that researchers learned of the existence of the Care System in mammals when they discovered that blood transfusions <em>from </em>postpartum female rats <em>to virgin rats </em>would lead to maternal behavior in the virgin rats, including nest building, hovering over young, and gathering the young who strayed from the nest. Panksepp concedes that researchers still do not know which chemicals in the transferred blood interact in the brains of virgin rats to cause these behaviors, but given similarities between the urge to provide Care and the urges underlying the Seeking System, brain arousal from dopamine in conjunction with opioids, as well as oxytocin and prolactin, are likely involved. Panksepp hypothesizes that the evolution of the Care System might be traced back to chemicals found in the Lust circuits of reptiles, such as <a href="http://www.jneurosci.org/content/26/25/6749.full.pdf" target="_blank">vasotocin</a>, which has a calming effect and promotes nurturant moods in some birds, and neuropeptides like <a href="http://www.ncbi.nlm.nih.gov/pubmed/15855627" target="_blank">mesotocin</a> that may have evolved into <a href="https://en.wikipedia.org/wiki/Vasopressin" target="_blank">vasopressin</a> and oxytocin, which is recognized as a key maternal chemical. The maternal (and paternal) nurturing behavior must be recognized as a critical factor in the development of social brain systems. <a href="http://www.jneurosci.org/content/21/20/8278.full.pdf" target="_blank">Research shows</a> that both oxytocin and vasopressin strengthen social memories and are believed to promote social bonds among mammals. (See <a href="http://csilcox-thebookshelf.blogspot.com/2010/07/dacher-keltner-born-to-be-good-science.html" target="_blank">July 16, 2010 post</a>).<br />
<br />
<a href="http://en.wikipedia.org/wiki/Michael_Meaney" target="_blank">Research on the Care System in rats</a> also reveals evidence of epigenetic changes leading to more prosocial behavior. Female rats lick their pups during early development and this has been shown to influence the emotional abilities of young rats later in life. Abundantly licked rats grow up to be less anxious, more resistant to stress, and more capable of exhibiting learning and other adaptive behavior later in life. These adult rats have diminished stress hormones (<a href="http://en.wikipedia.org/wiki/Corticotropin-releasing_factor_family" target="_blank">corticotrophin-releasing factor</a> (CRF)) and <a href="https://en.wikipedia.org/wiki/Adrenocorticotropic_hormone" target="_blank">adrenocorticotrophic hormone</a> (ACTH), more GABA receptor cites, which promotes reduced anxiety, and more receptors for glutamate and norepinephrine, which facilitate learning. Emotionally, these animals are less anxious, showing more activity and fearlessness, and better learning and performance in a variety of fear-inducing situations. This research could have been cited by Nessa Carey in <em>The Epigenetic Revolution. </em>(See <a href="http://csilcox-thebookshelf.blogspot.com/2013/04/nessa-carey-epigenetics-revolution-2012.html" target="_blank">April 28, 2013 post</a>). <br />
<br />
<em>The Panic/Grief System. </em>Panic and grief intuitively seem like strange bedfellows, but the common emotional/behavioral link in this "system" is separation anxiety, something that is seen across a number of species. Grief connotes a sadness that arises from social loss; panic connotes a separation from a secure or stable environment. Immediately, one can conjure linkages between what Panksepp labels the Panic/Grief System and the Care System and the Fear System. The Panic System is seen in early childhood development in the anxiety over separation of mother and child ("Born to Cry" is the title of this chapter), but it has also been found to be less active in adults. The Panic/Grief circuits are found in several of the same subcortical areas identified with other systems, including the <a href="http://en.wikipedia.org/wiki/Periaqueductal_gray" target="_blank">PAG</a> and surrounding subcortical regions including the <a href="http://en.wikipedia.org/wiki/Medial_dorsal_nucleus" target="_blank">dorsomedial thalamus</a>, the ventral septal area, the dorsal <a href="http://en.wikipedia.org/wiki/Preoptic_area" target="_blank">preoptic area</a> and the <a href="http://www.ncbi.nlm.nih.gov/pubmed/12600711" target="_blank">bed nucleus of stria terminalis</a>. Previously identified stress neuropeptides such as <a href="http://en.wikipedia.org/wiki/Corticotropin-releasing_factor_family" target="_blank">CRF</a> and <a href="https://en.wikipedia.org/wiki/Adrenocorticotropic_hormone" target="_blank">ACTH</a>, and glutamate (an excitatory neurotransmitter associated with every emotional response) arouse the Grief System. Imbalances in the Grief System are a key factor in a variety of emotional disorders because so much mental illness, Panksepp notes, is rooted in the incapacity to enjoy the security of warm interpersonal attachments. Panic attacks, depression, autism, and a variety of other social phobias are part of the Grief pathologies. The identification of neuropeptides that actually diminish separation distress and mediate the Care System, such as oxytocin and prolactin, and the stimulation of <a href="https://en.wikipedia.org/wiki/%CE%9C-opioid_receptor" target="_blank">mu-opioid receptors</a> in the brain may have a role in treatments of these disorders.<br />
<br />
<em>The Play System. </em>Finally, but not least, something that one might not think of as an emotional system, but Panksepp clearly documents that it is, particularly in mammals: the Play System. "Physical playfulness is a birthright of every young mammal and perhaps of many other animals as well. . . It is now certain that a genetically determined Play network that mediates positive affect exists in mammalian brains, although many details remain to be worked out." The Play System is likewise concentrated in subcortical brain regions, intimately linked to the Seeking System: the urge to play is like a type of hunger, and is not necessarily a social need, although it is linked to social emotional systems. Play is linked to the capacity to laugh, a positive emotional affect. Laughter is not found merely in humans; it has analogues in the noises made by rats and the chirping of birds. Laughter is stimulated early in children, including through mimicry. Like the Seeking System, the Play System is believed to be fueled by <a href="https://en.wikipedia.org/wiki/Dopamine" target="_blank">dopamine</a> --- which is engaged during activity that entails considerable positive anticipation and euphoria --- because dopamine is aroused (correlated) during play. Play activates sensory inputs, such as touch, which go directly to older midline regions of the brain such as the <a href="http://en.wikipedia.org/wiki/Allothalamus" target="_blank">parafascicular complex</a> and the posterior dorsomedial thalamic regions. <br />
<br />
In the foregoing, I have catalogued the suspected chemistries for each of Panksepp's seven emotional systems of the brain, and at the outset I tried to demonstrate that research documents the ancient role of chemicals in the neurological systems of species and their potential link to the development of emotional systems. My objective in this outline is to highlight a point in a previous post about social emotions, including moral emotions. In his book <em><a href="http://www.amazon.com/Moral-Origins-Evolution-Virtue-Altruism/dp/0465020488" target="_blank">Moral Origins</a></em>, <a href="http://dornsife.usc.edu/cf/faculty-and-staff/faculty.cfm?pid=1003114" target="_blank">Christopher Boehm</a> concludes by saying that in a few generations we "may have identified some of the genetic mechanisms that help us to behave egoistically, nepotistically, and altruistically, along with others that make for sympathetic generosity, domination and submission, and a variety of other socially significant behaviors that are relevant to morality, including our shame responses." The earlier post (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>) observed that "Boehm may well be right that we will identify the genetic <em>mechanisms</em> behind moral and immoral behavior in a few generations, but the roadmap of investigation is already before us and it begins with emotions. I say this for two reasons: first, if anything, genes code for our body chemistry; genes may or may not code for specific behavior (moral or otherwise), although I doubt it (see <a href="http://csilcox-thebookshelf.blogspot.com/2009/11/richard-powers-generosity-2009.html" target="_blank"><span style="color: #473624;">November 30, 2009 post</span></a>). But emotions are driven by electro-chemical actions and reactions in our various body systems and ultimately the neurological system leading to our brains, and genes do code for these electro-chemical actions and reactions and genes code for our brain and other body organs. If we want to understand the genetic basis for moral and immoral behavior we will look for the genes tied to these body systems and the chemistry that drives emotions." Panksepp's aggregation of the research on these primary process emotions is a good peek into the links between genes, chemistries, and anatomical structures related to emotions. In addition to linking genes with the chemicals and brain structures that drive these emotional systems, the inquiry contemplated by Boehm would presumably link these seven emotional systems to other more complex emotional systems not considered "primary process" systems, including the social emotions discussed in the <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a> such as embarrassment, shame, guilt, contempt, indignation, sympathy, compassion, awe, gratitude, and pride. <br />
<br />
One cannot read <em>The Archaeology of Mind </em>without feeling that Panksepp believes he has been a voice in the wilderness, advocating neuroscientific research that treats emotional systems as fundamental --- more fundamental than research on the neocortex. While he now believes that Antonio Damasio has joined his crusade with the publication of <em><a href="http://www.amazon.com/Self-Comes-Mind-Constructing-Conscious/dp/030747495X" target="_blank">Self Comes to Mind</a> </em>(see <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html" target="_blank">April 8, 2011 post</a>), in which Damasio gave a tip of the hat to Panksepp's research, Panksepp is skeptical of Damasio's earlier <a href="http://cel.huji.ac.il/courses/structureandprocesses/Bibliography/Somatic_Marker_%202006.pdf" target="_blank">somatic marker hypothesis</a> and the assertion that core consciousness (a higher order mapping process outside the subcortical regions) generates inner emotional feelings of what is happening by synthesizing information from maps about the body and about the environment. As stated earlier, it is the subcortical emotional system that energizes the neocortex, says Panksepp, not the other way around. Fundamentally, Panksepp believes that mental and emotional disorders go hand in hand and are best understood as a chemical problem, an understanding that leads to two important conclusions: (1) chemistry will have a key role in providing treatment, and (2) psychotherapists will have to recognize that treatment must deal with the emotional aspects of the older subcortical parts of the brain. For Panksepp, the key question for all neuroscientists and biological psychiatrists is this: "How are raw affective experiences created in the brain?" The answer, he believes, will clarify the foundational nature of experience in general as well as affective disturbances. For example, for depression Panksepp would ask: Why does depression feel so bad? Why does depression hurt? Why is it so psychologically painful? What does it mean to experience social pain? Few neuroscientists have been willing to ask these questions.<br />
<br />
One cannot conclude a statement about Panksepp's research without noting what he does not ignore but nonetheless does not dwell on: the role of the cortical areas of the brain in human consciousness. When he does acknowledge higher order BrainMind structures, he says this: "Although arousals of the primary process emotional networks of mammalian brains are intensely experienced by humans and other animals, it is especially important to recognize that the <em>secondary</em> processes of the BrainMind, the basic forms of learning, memory, and habit formation are among the most unconscious 'mental' processes of them all. Once we understand this, then many of the bizarre and faulty views from psychology's past may be rectified. For instance, 'free will' is not a figment of our imagination as too many scientists are ready to claim these days. Free will is a higher <em>tertiary-level</em> neurocognitive function that we use on a regular basis (and quite effectively when we are not too emotionally aroused) for future planning actions. This is brought out beautifully in the concept of 'autonomy' and 'self-determination' as developed by <a href="http://www.selfdeterminationtheory.org/SDT/documents/2006_RyanDeci_Self-RegulationProblemofHumanAutonomy.pdf" target="_blank">Ryan and Deci (2006)</a>. However, we cannot readily will ourselves out of underlying emotional turmoil that has been created through the consolidation of maladaptive affective patterns at primary and secondary levels of BrainMind organization. At primary-process levels of emotional processing there is no free will, there is no 'controlled cognitions.' Neither do the automatic secondary processes of learning and memory functions, that are molded by our wild animal passions developmentally, exhibit free will. That can only emerge from well-sculpted, deeply reflective, cognitive attitudes." He adds, "It is surely our vast cerebral 'thinking cap' --- our extensive cortico-cognitive apparatus --- that distinguishes us mentally from our animal ancestors. That adds layers of complexity that cannot be readily addressed with animal models." Michael Gazzaniga would agree. (See <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/michael-gazzaniga-human-2008.html" target="_blank">September 27, 2009 post</a>). But language, "our most unique cerebral skill," Panksepp says, "emerges through emotional guidance. Through language, however, we can uniquely study the extended <em>tertiary-process </em>cognitive affective consciousness of humans. And this is why there continues to be enormous growth in descriptive (i.e., nonneuroscientific) emotion studies in psychology (<a href="http://www.amazon.com/Handbook-Affective-Sciences-Series-Science/dp/0195377001" target="_blank">Davidson et al., 2003</a>)." <br />
<br />
And with that paragraph, I pull the next book off of The Bookshelf. CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-24753652831876180052013-04-28T16:13:00.001-07:002013-06-01T16:03:16.061-07:00Nessa Carey, The Epigenetics Revolution (2012)A recurring theme running through the posts on this blog is the recognition of the unit of information as the fundamental unit of physical nature. Collateral to this theme: the transmission of information and the avoidance of errors in translation during transmission, whether those communications occur at the cellular level, among species, or between individuals of a species. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/03/donella-meadows-thinking-in-systems.html" target="_blank">March 6, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/08/james-gleick-information-2011.html" target="_blank">August 15, 2011</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2010/11/matt-ridley-genome-1999.html" target="_blank">November 27, 2010</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2009/08/charles-seife-decoding-universe-2006.html" target="_blank">August 23, 2009</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2009/08/seth-lloyd-programming-universe-2006.html" target="_blank">August 17, 2009</a> posts). A slight change in a unit of information can potentially change the "meaning" of a larger assembly of units of information --- things akin to what we might label words, a sentence, a paragraph, an entire story. Matthew Ridley uses this kind of analogy when he discusses the genome. (See <a href="http://csilcox-thebookshelf.blogspot.com/2010/11/matt-ridley-genome-1999.html" target="_blank">November 27, 2010 post</a>):<br />
<br />
"Ridley calls the genome a book, the chromosome a chapter, the gene a story, an <a href="http://en.wikipedia.org/wiki/Exon"><span style="color: #473624;">exon</span></a> a paragraph, a <a href="http://en.wikipedia.org/wiki/Genetic_code"><span style="color: #473624;">codon</span></a> a word consisting of three letters, and a <a href="http://en.wikipedia.org/wiki/Base_(chemistry)"><span style="color: #473624;">base</span></a> is a letter, either (in the case of <a href="http://en.wikipedia.org/wiki/DNA"><span style="color: #473624;">DNA</span></a>) an A, C, G, or T (or U in the case of <a href="http://en.wikipedia.org/wiki/RNA"><span style="color: #473624;">RNA</span></a>), for <a href="http://en.wikipedia.org/wiki/Adenine"><span style="color: #473624;">adenine</span></a>, <a href="http://en.wikipedia.org/wiki/Cytosine"><span style="color: #473624;">cytosine</span></a>, <a href="http://en.wikipedia.org/wiki/Guanine"><span style="color: #473624;">guanine</span></a>, and <a href="http://en.wikipedia.org/wiki/Thymine"><span style="color: #473624;">thymine</span></a>, each consisting of one or two aromatic rings and arrangements of carbon, hydrogen, nitrogen, and/or oxygen atoms. These chemical units are the basic units of information that comprise life forms, but alone they do not give rise to life. What gives rise to life is (1) the pairing of these letters along a <a href="http://en.wikipedia.org/wiki/Double_helix"><span style="color: #473624;">double helix </span></a>that makes up DNA, and (2) their subsequent transcription into RNA to form three letter codons, which, (3) are subsequently "translated" into a specific amino acid depending on which three of the four letters are transcribed and their sequence. (4) The particular chain of amino acids creates a <a href="http://en.wikipedia.org/wiki/Protein"><span style="color: #473624;">protein</span></a>. By this process, it is said that "genes" code for "proteins." While the RNA amino acid chains may have been the earliest form of life, "life" as we know it received a boost with the creation of cellular membranes to form the first cells that carried the proteins containing genetic information central for the cell's organization. This development is still not fully understood.<br />
<br />
"This ability of the genes to copy themselves, read and transmit their story, under the right conditions, is the ability to create another life form. . . . [T]hese units of information [have the ability] to communicate among themselves --- an electrochemical means --- to say "Let's stick together," or "Let's avoid each other," and then to store itself as if in memory. This is what we find in the genome, whatever the species."<br />
<br />
"Life began with RNA --- which by itself can replicate itself, and translate and transmit its meaning, as well as catalyze with --- break up or join with --- other chemicals, creating amino acids and proteins. The storage device for these words and paragraphs is DNA. An RNA gene found on chromosome 1 translates the information found in DNA to proteins, which become the primary agent for carrying out the direction specified by the information contained in the genes within a cell." <br />
<br />
A prior post discusses how alterations in the structure of DNA and the chromosome, what we typically refer to as mutations, can lead to changes in species. (See <a href="http://csilcox-thebookshelf.blogspot.com/2010/12/sean-b-carroll-making-of-fittest-2006.html" target="_blank">December 14, 2010 post</a>):<br />
<br />
"And there are a variety of <em>mutations</em> of the genetic code. The most common is the equivalent of a typographical error in the process of copying the genetic code --- a substitution of one of the four letters for another. But there are also mutations involving deletions of code and insertions of repeating code or duplications. Sometimes these mutations actually mean something --- changing something about the phenotype in which the genetic code resides, but many, many times these changes mean nothing --- they don't change a thing about how the gene works. Some genes simply lose their meaning over time because they are no longer used, and these are called <em>fossil genes</em>. And some mutations that do have meaning simply do not survive to live another generation because <em>selection</em> is neither accommodating nor forgiving. When mutations occur repeatedly and have meaning --- in the sense that it changes something about the phenotype in which it resides --- and selection favors the survival of that mutation, then given enough <em>time </em>(many generations, thousands of years) we can find new species evolving. Carroll has reduced his mantra of chance, selection and time to this expression: "i) given sufficient time, ii) identical or equivalent mutations will arise repeatedly by chance, and iii) their fate (preservation or elimination) will be determined by the conditions of selection upon the traits they affect."<br />
<br />
<a href="http://en.wikipedia.org/wiki/Epigenetics" target="_blank">Epigenetics</a> concerns a molecular examination of genetics, and how substitutions of molecules on a piece of DNA can have consequences for the phenotype of which the DNA is a part. This does not involve a change in the arrangement of the genetic letters, or the words, or the paragraph or the chapter, in Ridley's terms. These molecular alterations change the way in which <a href="http://en.wikipedia.org/wiki/Gene_expression" target="_blank">genes are expressed</a>. Broadly speaking, the "environment" is believed to play a role in these alterations, importantly during embryonic development, but at any stage of life. And <a href="http://www.the-scientist.com/?articles.view/articleNo/32637/title/Lamarck-and-the-Missing-Lnc/" target="_blank">recent research</a> is beginning to show that epigenetic changes can be inherited for a generation or more. The change in gene expression can alter the very nature of cells themselves. <br />
<br />
Currently, there are two fairly well known types of epigenetic modification. The first is called DNA <a href="http://en.wikipedia.org/wiki/Methylation" target="_blank">methylation</a>, which involves the addition of a <a href="http://en.wikipedia.org/wiki/Methyl_group" target="_blank">methyl group</a> (a carbon atom bonded to three hydrogen atoms) to one of Ridley's genetic "letters" --- <a href="http://www.worldofmolecules.com/life/cytosine.htm" target="_blank">cytosine</a>. The underlying DNA sequence is not altered by the modification. Cytosine, says <a href="http://www.nessacarey.co.uk/" target="_blank">Nessa Carey</a> in <em><a href="http://www.amazon.com/Epigenetics-Revolution-Rewriting-Understanding-Inheritance/dp/0231161166" target="_blank">The Epigenetics Revolution</a></em>, has been "decorated," not changed. DNA methylation is associated with genes that are turned "off": it can prevent <a href="https://en.wikipedia.org/wiki/Messenger_RNA" target="_blank">messenger RNA</a> molecules from being produced, stopping the DNA transcription machinery from working, and mis-set methylation marks can lead to disorders that the individual suffers from for the rest of his or her life. The second epigenetic modification is called <a href="http://en.wikipedia.org/wiki/Histone_acetylation_and_deacetylation" target="_blank">histone acetylation</a>. Histone acetylation is associated with turning genes on, although that is not always the case. Again, the underlying gene sequence is not altered.<br />
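<br />
One way to picture Carey's point that the cytosine has been "decorated," not changed, is to model the methyl marks as a layer of annotations alongside an untouched sequence. The class below, including its crude silencing threshold, is my own invention for illustration and not a model drawn from Carey's book:<br />
<pre>
# The DNA sequence itself never changes; methylation is carried as a
# separate layer of marks (the "decoration") over particular cytosines.
class Gene:
    def __init__(self, name, sequence):
        self.name = name
        self.sequence = sequence       # the genetic "letters" stay intact
        self.methylated_sites = set()  # epigenetic marks live alongside them

    def methylate(self, pos):
        """Add a methyl mark to a cytosine without altering the sequence."""
        if self.sequence[pos] == "C":
            self.methylated_sites.add(pos)

    def is_silenced(self):
        """Toy rule: once half or more of the cytosines carry marks,
        treat the gene as turned "off" (a deliberate simplification)."""
        cytosines = self.sequence.count("C")
        if cytosines == 0:
            return False  # nothing to methylate, nothing to silence
        return len(self.methylated_sites) * 2 >= cytosines

gene = Gene("example", "ACGTCCGA")
gene.methylate(1)
gene.methylate(4)
print(gene.sequence)       # ACGTCCGA: unchanged, only "decorated"
print(gene.is_silenced())  # True under the toy threshold
</pre>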
<br />
Understanding epigenetic modifications allows us to better understand why "identical" twins are not identical. It leads to a better understanding of certain disorders (cancer, obesity, and others). Nessa Carey describes a wide range of research projects in which science is studying whether epigenetic modifications trigger a particular disorder. I will discuss only one: the hypothesis that early childhood abuse has consequences that persist into adulthood because childhood trauma causes an alteration in gene expression in the brain that is generated or maintained by epigenetic mechanisms. The focus of this research is on a hormone known as <a href="http://en.wikipedia.org/wiki/Cortisol" target="_blank">cortisol</a>, which is produced in response to stress. Research shows that the average level of cortisol production in adults seems to be higher for persons with traumatic childhoods. The <a href="http://en.wikipedia.org/wiki/Hypothalamus" target="_blank">hypothalamus</a> in the brain responds to stress by releasing two hormones: <a href="http://en.wikipedia.org/wiki/Corticotropin-releasing_hormone" target="_blank">corticotrophin-releasing hormone</a> and <a href="http://en.wikipedia.org/wiki/Vasopressin" target="_blank">arginine vasopressin</a>. They stimulate the pituitary, which in turn releases a hormone called <a href="http://en.wikipedia.org/wiki/Adrenocorticotropic_hormone" target="_blank">adrenocorticotrophin</a> into the bloodstream. When cells of the <a href="http://en.wikipedia.org/wiki/Adrenal_gland" target="_blank">adrenal gland</a> take up adrenocorticotrophin, the cells release cortisol. Some of this cortisol makes its way through the bloodstream back to the brain. Receptors in the <a href="http://www.news-medical.net/health/Hippocampus-What-is-the-Hippocampus.aspx" target="_blank">hippocampus</a>, hypothalamus and pituitary all "recognize" cortisol, and the cortisol binds to these receptors, creating a signal to the brain to calm down. This reduces the production of cortisol, and the result is that we are prevented from being overstressed. Adults who suffered traumatic childhoods are actually overstressed and produce too much cortisol: something about the feedback loop that would normally reduce stress is broken. Research on mice suggests that altered DNA methylation of the arginine vasopressin gene leads to increased expression of the hormone and the stimulation of a stress response. This is all very contentious at this time, and more research is underway on the impact of stress and on other brain-related topics such as memory.<br />
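<br />
In engineering terms, the loop Carey describes is a negative feedback controller, and the "broken loop" in overstressed adults corresponds to a weakened feedback term. The numerical sketch below is deliberately crude and entirely my own; the constants are arbitrary rather than physiological:<br />
<pre>
def simulate_cortisol(feedback_strength, steps=200):
    """Toy HPA-axis loop: a constant stressor drives cortisol production up,
    while cortisol binding to brain receptors signals production back down."""
    cortisol = 0.0
    stress_drive = 1.0  # constant stressor, arbitrary units
    for _ in range(steps):
        production = stress_drive - feedback_strength * cortisol
        production = max(production, 0.0)  # production cannot go negative
        cortisol = cortisol + production
    return cortisol

# An intact loop settles near stress divided by feedback strength;
# a weakened loop lets the same stressor drive the level far higher.
print(round(simulate_cortisol(feedback_strength=0.5), 2))   # 2.0
print(round(simulate_cortisol(feedback_strength=0.05), 2))  # 20.0
</pre>
Weakening the feedback term tenfold raises the steady-state level tenfold, which is the qualitative picture of the adult whose calming signal no longer works.<br />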
<br />
Carey reports on research involving honeybees. The research is far from conclusive, but there is interest in the role of histone modifications in the control of honeybee development and activity, and in the role of DNA methylation in changes in honeybee memory. Carey describes <a href="http://www.uphs.upenn.edu/news/News_Releases/2013/02/berger/" target="_blank">research</a> showing that the expression of different epigenetic enzymes varies between different social groups in ant colonies, and the data suggest that epigenetic control of colony members may be the mechanism that has evolved in the social insects.<br />
<br />
In <em><a href="http://www.nytimes.com/2012/05/13/books/review/the-social-conquest-of-earth-by-edward-o-wilson.html?pagewanted=all&_r=0" target="_blank">The Social Conquest of Earth</a>,</em> biologist E.O. Wilson referred to epigenetics in an entirely different way. (<a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012 post</a>) To determine what evolved that made us human, he begins by asking "What is human nature?" He suggests that the place to look for the answer to this question is "in the rules of development prescribed by genes, through which the universals of culture are created." Human nature, he says, is the "inherited regularities of mental development common to our species. <em>They are <a href="http://www.pnas.org/content/77/7/4382.abstract" target="_blank">epigenetic rules</a></em>, which evolved by the interaction of genetic and cultural evolution that occurred over a long period in deep prehistory. These rules are the genetic biases in the way our senses perceive the world, the symbolic coding by which we represent the world, the options we automatically open to ourselves, and the responses we find easiest and most rewarding to make. . . They determine the individuals we as a rule find sexually most attractive. They lead us differentially to acquire fears and phobias concerning dangers in the environment, as from snakes and heights, to communicate with certain facial expressions and forms of body language, to bond with infants; to bond conjugally; and so on across the wide range of other categories of behavior and thought." (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/martin-nowak-supercooperators-2011.html" target="_blank">September 17, 2012 post</a>). Wilson's use of the term epigenetics is more related to the term <em><a href="http://books.google.com/books?id=sUlQTveh_lgC&pg=PA246&lpg=PA246&dq=epigenesis+and+culture&source=bl&ots=Er9akoIMeV&sig=zlw89MZ89AMjIwyUghsYp8TLtTg&hl=en&sa=X&ei=tnmqUYOcMse84AOgroDQAg&ved=0CFYQ6AEwCQ#v=onepage&q=epigenesis%20and%20culture&f=false" target="_blank">epigenesis</a></em>, and this is slightly (although not entirely) different from the study of epigenetics that forms the basis of Nessa Carey's <a href="http://www.amazon.com/Epigenetics-Revolution-Rewriting-Understanding-Inheritance/dp/0231161166" target="_blank"><em>The Epigenetics Revolution</em></a><em>. </em>Nor is Wilson approaching this subject from a singularly genetic angle. He is concerned with the interaction of genes and culture (broadly considered) in the evolution of a social unit. I suspect at the bottom of Wilson's use of the term are the neurochemical actions behind learning and memory that trigger epigenetic change affecting the neurological system, resulting in patterns of behavior. Of interest to Wilson would be social behavior; for example, why do primates groom each other?<br />
<br />
Epigenetics as described by Nessa Carey and epigenetics in the sense intended by Edward Wilson both involve the transmission of information, alterations in the information units transferred, and the effects of these <a href="http://www.cam.ac.uk/research/news/scientists-discover-how-epigenetic-information-could-be-inherited" target="_blank">information transfers</a> (positive, negative or neutral). This area of inquiry is a work in progress, and it promises to enhance our understanding of disease and health. <br />
<br />CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-59403560505509269232013-03-28T17:25:00.000-07:002013-04-14T14:34:47.131-07:00Richard Wrangham, Catching Fire: How Cooking Made Us Human (2009)<a href="http://eowilsonfoundation.org/wilson-the-scientist" target="_blank">Edward O. Wilson's</a> research on <a href="http://www.nature.com/scitable/knowledge/library/an-introduction-to-eusociality-15788128" target="_blank">eusociality</a> led him to identify the nest as a common attribute among the eusocial species. Although not proven, Wilson surmises that a gene has been suppressed among the eusocial species that silences the brain's program for dispersal from the nest, leading to the sustained survival of the eusocial community. (<a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012 post</a>) Humans are included among the eusocial species, but humans disperse; they do not build and congregate in nests. They do, however, build and maintain social communities comprising multiple generations, organized into groups by an altruistic division of labor, which are characteristics of eusocial species. As a surrogate for the nest, Wilson suggests that the campfire served a nest-like function in the development of the genus <em>homo</em>, which strongly suggests that mastery of fire was critical to human eusociality. <br />
<br />
As I read Wilson's <a href="http://www.amazon.com/Social-Conquest-Earth-Edward-Wilson/dp/0871404133" target="_blank"><em>The Social Conquest of Earth</em></a><em> </em>(see <a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012 post</a>)<em>, </em>I was reminded of a book on the bookshelf that addressed this topic, <a href="http://bigthink.com/users/richardwrangham" target="_blank">Richard Wrangham's</a> (see <a href="http://csilcox-thebookshelf.blogspot.com/2010/07/richard-wrangham-and-dale-peterson.html" target="_blank">July 1, 2010 post</a>) <em><a href="http://www.amazon.com/Catching-Fire-Cooking-Made-Human/dp/0465020410" target="_blank">Catching Fire: How Cooking Made Us Human</a>. </em>Wrangham believes that mastery of fire was critical to human evolution, but even more important, mastery of fire enabled early humans to cook their food on a regular basis. According to Wrangham, cooked food is even more significant than mastery of fire for human evolution. Armed with data and concrete examples, Wrangham demonstrates that eating cooked food is linked to two evolutionary changes in the human body: (1) comparatively smaller, more efficient digestive systems (particularly the stomach and the small intestine) that require less energy to digest food and absorb nutrients than those of our predecessors, and (2) larger brains. <a href="http://www.scielo.br/scielo.php?pid=S0100-84551997000100023&script=sci_arttext" target="_blank">Large brains require significant amounts of energy, and that energy is available to the brain only if it is not needed for other activities essential for survival such as eating and digesting</a>. Compared to apes and chimpanzees (and presumably extinct australopithecines and habilines), humans spend a fraction of their daily life eating and digesting food. Apes and chimps spend hours eating plant food or fruit every day. The relative weight of the human gut is roughly only 60% of the relative weight of the gut of apes and chimpanzees.<br />
<br />
The controversial question is: when did the first species among the genus <em>homo </em>begin cooking food? Almost certainly, the benefits of cooked food, and the development of a preference for it, were discovered by accident. Wrangham believes that human cooking begins with <em>homo erectus. </em>There is anthropological evidence cited by Wrangham that cutting meat with primitive stone tools began as early as 2.6 million years ago. Roughly 300,000 years later, a new species, referred to by some as <em><a href="http://en.wikipedia.org/wiki/Homo_habilis" target="_blank">homo habilis</a>, </em>which still had many <span style="background-color: white;">australopithecine</span> characteristics, emerged, and roughly another 500,000 years later the species referred to as <em><a href="http://en.wikipedia.org/wiki/Homo_erectus" target="_blank">homo erectus</a></em> emerged, according to the available fossil record, and lived on the African continent for nearly 1.5 million years (until roughly 300,000 years ago). While it is doubtful that <em>homo erectus</em> had language capacity or skills (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/01/stephen-hawking-and-leonard-mlodinow.html" target="_blank">January 31, 2013 post</a> discussing <em>homo neanderthalensis</em>), what we do know is that the cranial capacity of early specimens of <em>homo erectus</em> was 200cc greater than <em>homo habilis </em>and later specimens 400-500cc greater than <em>homo habilis, </em>representing an increase in brain size of approximately 33-75% over the habilines. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>). That would be the largest incremental percentage increase from species to species within the genus <em>homo</em>. <em>Homo erectus</em> is recognized, in many respects, to be much closer to modern <em>homo sapiens</em> than <em>homo habilis</em>. Combined with some evidence of the use of controlled fire at sites where <em>homo erectus</em> bones have been found, the control of fire and the significant increase in brain size (the energy for which is enabled by decreased energy used in eating and digesting food) lead Wrangham to identify <em>homo erectus</em> as the first human species to favor and consume cooked food on a regular basis. Wrangham also speculates that <em>homo erectus, </em>unlike its predecessors, favored sleeping on the ground (instead of in trees), and the control of fire would have been useful in providing light to see predators at night or keeping predators away. The morphology of <em>erectus</em> is not as suitable for sleeping in trees as its predecessors.<br />
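<br />
The percentage range follows from simple arithmetic if one assumes a habiline cranial capacity of roughly 600cc, a commonly cited estimate (the baseline is my assumption; the text above supplies only the increments):<br />
<pre>
# Assumed habiline baseline of about 600cc (not given in the text above).
habilis_cc = 600
early_increase_cc = 200   # early homo erectus specimens
late_increase_cc = 450    # later specimens: midpoint of the 400-500cc range

early_gain = 100 * early_increase_cc / habilis_cc
late_gain = 100 * late_increase_cc / habilis_cc
print(f"{early_gain:.0f}% to {late_gain:.0f}% larger than habilis")
# 33% to 75% larger than habilis
</pre>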
<br />
Others (<a href="http://www.jstor.org/discover/10.2307/2744104?uid=3739584&uid=2129&uid=2&uid=70&uid=4&uid=3739256&sid=21101931262461" target="_blank">Aiello and Wheeler</a>) have concluded that cooking food is the invention of <em><a href="http://en.wikipedia.org/wiki/Homo_heidelbergensis" target="_blank">homo heidelbergensis</a></em> (the predecessor to <em>homo neanderthalensis</em>), a later species. Aiello and Wheeler believed that brain size was steady among <em>homo erectus</em> until the emergence of <em>heidelbergensis</em> with its larger brain. Wrangham finds the fossil record sufficient to support the view that brain size gradually grew among <em>erectus</em> and believes that the steady increase in size is attributable to improved cooking techniques, and that continued growth in brain size to <em>heidelbergensis </em>and ultimately to <em>homo sapiens</em> is likely similarly associated with improved cooking techniques, not cooking as a novel adaptation or spandrel. <br />
<br />
Wrangham's thesis is this: "An important step in fire's becoming a central part of human lives was to maintain it at night. Suppose some <em>habilines</em> carried a smoldering log by day to protect against predators, then left it at the base of a sleeping tree when they climbed to make a nest at night. It would not have been such a big step to give it extra fuel so that the log would still be burning the next day --- perhaps after seeing this happen first by accident. From there it would have been a smaller step to sitting near the fire to keep it burning, and thereby take advantage of its protection, light, and warmth. Once they kept fire alive at night, a group of habilines in a particular place occasionally dropped food morsels by accident, ate them after they had been heated, and learned that they tasted better. Repeating their habit, this group would have swiftly evolved into the first <em>Homo erectus</em>. The newly delicious cooked diet led to their evolving smaller guts, bigger brains, bigger bodies, and reduced body hair; more running; more hunting; longer lives; calmer temperaments; and a new emphasis on bonding between females and males. The softness of their cooked plant foods selected for smaller teeth, the protection fire provided at night enabled them to sleep on the ground and lose their climbing ability, and females likely began cooking for males, whose time was increasingly free to search for more meat and honey." So despite the relative dearth of evidence of fire dating back to the time of <em>homo erectus</em>, Wrangham believes that the dramatic shift in brain size and tooth size is significant evidence that <em>Homo erectus</em> started the first outdoor cooking kitchen. <br />
<br />
<em>Division of labor by sex. </em>E.O. Wilson also includes altruistic division of labor among the attributes of eusociality (<a href="http://csilcox-thebookshelf.blogspot.com/2012/09/edward-o-wilson-social-conquest-of.html" target="_blank">September 12, 2012 post</a>). Wrangham has a discussion that dovetails with Wilson on this point. First, cooked food liberated males to spend more time hunting for meat in a way that chimps and apes cannot because they spend so much time chewing their food. Fire enabled men to confine their eating time to the hours around dusk and even after dark. Hunting enabled the male to contribute food to his family (including an extended family group), but this effort was ultimately dependent upon a reliable, predictable economic exchange between women and men. Women became foragers and this provided a reliable source of food energy in the event that the men of the group returned with no meat. Women also became primarily responsible for cooking.<br />
<br />
Wrangham argues that while relying on cooked food created opportunities for cooperation, more importantly it exposed female cooks to exploitation because cooking takes time and lone cooks could not easily guard their wares from thieves. This problem was solved, Wrangham believes, by pair-bonds among males and females: a "husband" ensured that the woman's gathered foods were not taken by others and from this evolved "a simple marriage system." The male provided the female (and their children) with meat. Consistent with Boehm's observations (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>), Wrangham observes (based on anthropological evaluation of modern hunter-gatherers) that meat is actually shared among a larger group that includes not only the male's "wife" and children, but also an extended family (and possibly a stranger). The female's distribution of gathered food is largely shared just with her "husband" and their children. In the gathering of food, there may very well be cooperation among women, but the sharing of the gathered food is limited to the immediate family. Presumably sharing meat among a larger group evolved because direct reciprocity is essential to the hunting and killing of the large animal, bringing the meat back to the campfire, and butchering it. CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-16785447130183378302013-03-14T18:19:00.001-07:002013-09-15T11:27:57.489-07:00Edward Humes, Monkey Girl: Evolution, Education, Religion and the Battle for America's Soul (2008)Deception and religion have been joined at the hip for a very long time, perhaps as long as religion has existed in human culture, given that religion has its origins in believing what we can never see or know. <em><a href="http://www.amazon.com/Monkey-Girl-Evolution-Education-Religion/dp/0060885491" target="_blank">Monkey Girl</a></em> is <a href="http://www.edwardhumes.com/" target="_blank">Edward Humes'</a> account of the Dover Township, Pennsylvania school board's effort to introduce the subject of <a href="http://en.wikipedia.org/wiki/Intelligent_design" target="_blank">intelligent design</a> into the high school science curriculum and the litigation that ensued when parents stepped forward and asked a court to enjoin the school board's effort on the ground that it offended the <a href="http://www.law.cornell.edu/wex/first_amendment" target="_blank">First Amendment of the U.S. Constitution</a>. What the 6-week <a href="http://video.pbs.org/video/980040807/" target="_blank">trial</a> in a United States District Court exposed was concerted deceit on the part of groups opposed to the teaching of <a href="http://evolution.berkeley.edu/evolibrary/article/evo_25" target="_blank">natural selection</a> and what Charles Darwin called "<a href="http://animals.about.com/od/d/g/descentwithmodification.htm" target="_blank">descent with modification</a>" in the public school curriculum, because it offended the biblical stories that led them to the belief that god (an intelligent designer) created each of the species separately, and the view of some that these acts of creation began no more than 10,000 years ago. Comparable acts of deceit in the commercial world would be called mislabeling or misbranding or fraud. In court, it is called perjury.<br />
<br />
The drive to engage in the acts of deceit documented by Humes begins with the <a href="http://www.law.cornell.edu/supct/search/display.html?terms=Edwards%20v.%20Aguillard&url=/supct/html/historics/USSC_CR_0482_0578_ZS.html" target="_blank">United States Supreme Court's decision</a> in 1987 that the teaching of <a href="http://en.wikipedia.org/wiki/Creationism" target="_blank">creationism</a> offended the First Amendment's Establishment Clause and could not be taught in public schools. If creationism could not be mandated as a subject of instruction in United States public schools, these <a href="http://www.discovery.org/" target="_blank">groups</a> began to think about branding creationism as something else, something that sounded like it belonged in the science classroom --- intelligent design. Their legal strategy, for example, compelled them to abandon the words "god" and "creator" and relabel god an "intelligent designer." Their legal strategy also compelled them to create a controversy when, at least in the scientific community, no substantial controversy existed: the existence of an intelligent designer would be deemed a serious scientific question and one that demanded that schools "teach the controversy." The lingo of creationism and its relationship to the book of Genesis had to be purged if science students had any chance of being taught an alternate explanation of the creation of species alongside natural selection and descent with modification in the classroom. This was no easy task. To biblical literalists, it was confusing and did not sit well with the hard-core biblical believers who wanted to drive natural selection and "<a href="http://en.wikipedia.org/wiki/Darwinism" target="_blank">Darwinism</a>" from science class because, in their view, it was atheistic. But for the advocates of intelligent design, their difficulties extended beyond the religious motivations of the Dover school board. Not only were the intelligent design advocates ultimately unable to conceal the religious motivations of the school board, but it also turned out there was a long and unambiguous record demonstrating that intelligent design had its intellectual seed in creationism. The very book that the intelligent design advocates wanted the high school students of Dover to have in their classroom, <em><a href="http://www.amazon.com/Pandas-People-Central-Question-Biological/dp/0914513400" target="_blank">Of Pandas and People</a>, </em>had been drafted prior to the Supreme Court's 1987 decision in <em><a href="http://en.wikipedia.org/wiki/Edwards_v._Aguillard" target="_blank">Edwards v. Aguillard</a></em>, and the drafts had used the word creationism. By the time of publication, after the Supreme Court rendered its decision in <em>Edwards</em>, the word creationism had been deleted everywhere and replaced with the term intelligent design. <br />
<br />
At the heart of the lawsuit, known as <em><a href="http://scholar.google.com/scholar_case?case=16465861447416053365&hl=en&as_sdt=2&as_vis=1&oi=scholarr" target="_blank">Kitzmiller v. Dover Area School District</a></em>, was this question: was intelligent design science or religion? For the plaintiffs, intelligent design was on trial; for the defendants and their supporters, traditional science was on trial. After a six-week trial in which the court heard from scientists on both sides of the question, the court found that intelligent design was not science; it was religion. <br />
<br />
The scheme to inject intelligent design --- as opposed to creationism --- into the science curriculum begins with a paper developed by a University of California law professor, <a href="http://www.youtube.com/watch?v=uMKxx_KeTF8" target="_blank">Phillip Johnson</a>, that came to be known as the "<a href="http://en.wikipedia.org/wiki/Wedge_strategy" target="_blank">wedge strategy</a>," because it envisioned hammering a "wedge" into the tree of science by criticizing evolutionary theory --- putting science on the defensive and exploiting religious sentiment that was not only skeptical of evolutionary theory, but was essentially ignorant about natural selection and the body of scientific literature that had substantiated Darwin's natural selection model. The <a href="http://www.antievolution.org/features/wedge.pdf" target="_blank">wedge document</a> was developed by Johnson in collaboration with the Discovery Institute, and essentially outlines not a scientific research program, but a public relations strategy to persuade people that a scientific controversy existed and that the public needed to be made aware of the controversy. The wedge document was never intended to be made public, and it was forthright and honest in expressing the goals behind the wedge strategy, leaving no doubt about its theistic underpinning: <br />
<ul>
<li>"to defeat scientific materialism and its destructive moral, cultural and political legacies.</li>
<li>"to replace materialistic explanations with the theistic understanding that nature and human beings were created by God."</li>
<li>to see initially, within five years, "intelligent design theory as an accepted alternative in the sciences and scientific research being done from the perspective of design theory"; within 20 years, to see "intelligent design theory as the dominant perspective in science"; and to see "design theory permeate our religious, cultural, moral and political life." </li>
<li>"Design theory promises to reverse the stifling dominance of the materialist worldview, and to replace it with a science consonant with Christian and theistic convictions."</li>
</ul>
During the <em>Kitzmiller </em>trial, however, the lawyers for the defendants, <a href="http://en.wikipedia.org/wiki/Denial_of_Peter" target="_blank">like Peter denying Jesus</a>, wanted nothing to do with the wedge document. They argued that it was a "mere fundraising proposal" of little significance. The lawyers for the defendants desperately tried to prevent the plaintiffs' witness, <a href="http://en.wikipedia.org/wiki/Barbara_Forrest" target="_blank">Barbara Forrest</a>, who testified not only about the revisions to the <em>Pandas and People </em>book but also about the wedge document, from testifying at all. <br />
<br />
A central part of the Discovery Institute's strategy was to change the ground rules of science so that it included not only the natural, material world, but also the supernatural, ethereal world. The problem with this project is that it is nothing less than the merger of science and religion. According to the testimony of the plaintiffs' expert at the <em>Kitzmiller</em> trial, "Science is the systematic attempt to provide natural explanations for natural phenomena." The exclusion of the supernatural from science was unavoidable. A scientific theory is testable, and is capable of being proven false. The supernatural is not testable. Judge Jones concluded, "Intelligent design is predicated on supernatural causation. . . . Creationism, intelligent design, and other claims of supernatural intervention in the origin of life or of species are not science because they are not testable by the methods of science. These claims subordinate observed data to statements based on authority, revelation, or religious belief."<br />
<br />
Numerous posts in this blog raise issues that are relevant to the <em>Kitzmiller</em> case:<br />
<br />
<em>Teleology v teleonomy</em>. (<a href="http://csilcox-thebookshelf.blogspot.com/2011/06/michael-shermer-believing-brain-2011.html" target="_blank">June 12, 2011</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2010/05/david-hume-dialogues-concerning-natural.html" target="_blank">May 24, 2010</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2010/03/david-c-lindberg-beginnings-of-western.html" target="_blank">March 24, 2010 post</a>).<br />
<br />
<em>The human propensity for self-deception and deception</em>. (<a href="http://csilcox-thebookshelf.blogspot.com/2012/02/robert-trivers-folly-of-fools-2011.html" target="_blank">February 4, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/08/william-shakespeare-king-lear-1607.html" target="_blank">August 28, 2011</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/05/robert-graves-king-jesus-novel-1946.html" target="_blank">May 22, 2011</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2010/05/robert-wright-evolution-of-god-2009.html" target="_blank">May 12, 2010 post</a>).<br />
<br />
<em>Anthropomorphism, anthropotheism, and anthropodenial</em>. (<a href="http://csilcox-thebookshelf.blogspot.com/2012/03/benjamin-decasseres-spinoza-liberator.html?m=0" target="_blank">March 20, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/michael-shermer-believing-brain-2011.html" target="_blank">June 12, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2010/06/frans-de-waal-ape-and-sushi-master.html" target="_blank">June 17, 2010 post</a>).<br />
<br />
<em>Dualism and materialism</em>. (<a href="http://csilcox-thebookshelf.blogspot.com/2012/12/steven-nadler-book-forged-in-hell-2011.html" target="_blank">December 17, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/02/david-hume-treatise-of-human-nature.html" target="_blank">February 27, 2011</a>, and <a href="http://csilcox-thebookshelf.blogspot.com/2012/03/benjamin-decasseres-spinoza-liberator.html?m=0" target="_blank">September 27, 2009 post</a>).<br />
<br />
At its core, the intelligent design movement, as exposed in the wedge document, is about as un-American as any group can be. The long-run goal is for design theory to permeate not only religious, cultural, and moral life, but also "political life." This is so contrary to the First Amendment of the United States Constitution that one would think the design movement's adherents were really living in modern Iran or some other theocracy. Yet what <em>Monkey Girl </em>reveals<em> </em>is that the intelligent design movement has so little respect for the First Amendment because its adherents believe the government has abandoned religion by recognizing the freedom of atheists, skeptics (agnostics), and pantheists who imagine a universe governed by natural laws (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/01/stephen-hawking-and-leonard-mlodinow.html" target="_blank">January 31, 2013 post</a>), and because they believe the government has abandoned its moorings as a "Christian nation." In contrast, Humes closes out <em>Monkey Girl</em> with a quotation from the 1796 <a href="http://en.wikipedia.org/wiki/Treaty_of_Tripoli" target="_blank">Treaty of Tripoli</a>, signed by founding father President John Adams:<br />
<br />
"As the Government of the United States of America is not, <em>in any sense</em>, founded on the Christian religion, as it has in itself no character of enmity against the religion, or tranquility of Mussulmen; and, as the said States never entered into any war, or act of hostility, against any Mahometan nation, it is declared by the Parties, that no pretext arising from religious opinions, shall ever produce an interruption of the harmony existing between the two countries." (Emphasis added).<br />
<br />
Nor should one forget the <a href="http://en.wikipedia.org/wiki/Jefferson_Bible" target="_blank">Jefferson Bible</a>, in which founding father Thomas Jefferson excised the text pertaining to miracles and other supernatural events. <br />
<br />
I have a proposal that will surely bring the intelligent design movement and creationists running back for the protection of the First Amendment. Congress should pass a law that requires every religious school class to teach the following every Saturday or Sunday: "The Book of Genesis is a story. It was written and later edited by men who could not explain their origins or the origins of the physical universe including other life on earth and life and other material beyond the earth. It's a wonderful story and it even has meaning, but it is just a story. Our origins really did not happen that way; Adam and Eve were not real people, and the other stories that purport to be written history of the Hebrews are merely stories as well. There may be some little historical basis in some of these stories, but they have been gilded, edited, redacted, and revised to fit a collective memory long after the events described in Genesis purportedly took place. And by the way, children, did you not see that Genesis mentions nothing about the dinosaurs and other animals that lived on earth millions of years ago, whose bones we find in the ground today? Children, do you not wonder why Genesis does not mention dinosaurs and other animals that no longer exist? The answer is simple. The men who wrote the stories in Genesis did not know about these animals. They were not as knowledgeable as you are today." Once the law is passed, I am sure there will be a lawsuit. Maybe the ACLU will be the plaintiff.<br />
<br />
In <em>Edwards v. Aguillard</em>, Associate Justice of the Supreme Court Antonin Scalia dissented from the majority's decision that struck down Louisiana's statute that called for the "balanced treatment" of "creation science" and "evolution science" in Louisiana schools. The seven-member majority opinion, authored by Justice Brennan, and the concurring opinions of Justice Powell and Justice White found plenty of evidence that creation science lacked a secular purpose and was religiously inspired. "This is not a hard case," wrote Justice White. The case came before the Supreme Court based on the trial court's grant of a motion for summary judgment, which meant the trial court found enough undisputed evidence presented by the plaintiffs challenging the Louisiana statute to warrant granting a judgment without a full evidentiary trial. Justice Scalia professed to take no position on the merits of "creation science," but he felt that Louisiana deserved a full evidentiary trial before an appellate court such as the US Supreme Court decided whether or not there was a valid secular purpose. One would think, and hope, that Justice Scalia, informed by the full evidentiary record in <em>Kitzmiller, </em>would have recognized, as Justice White did in <em>Edwards</em>, that "this is not a hard case" had the <em>Kitzmiller</em> case made its way to the Supreme Court for judicial review, and that he would recognize that intelligent design deserved the same fate that creation science received in <em>Edwards</em>. CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-65684935709736346782013-02-26T18:20:00.001-08:002013-08-12T16:56:16.478-07:00Jose Saramago, Small Memories (2009)Memory is fragile. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/09/daniel-schacter-seven-sins-of-memory.html" target="_blank">September 20, 2011 post</a>). <a href="http://www.nobelprize.org/nobel_prizes/literature/laureates/1998/saramago-autobio.html" target="_blank">Jose Saramago's</a> honest account of his memories of some events in his life when he was small in <em><a href="http://www.amazon.com/Small-Memories-Jose-Saramago/dp/B00A1A72LY" target="_blank">Small Memories</a> </em>concedes as much. "Sometimes I wonder," he writes, "if certain memories are really mine or if they're just someone else's memories of episodes in which I was merely an unwitting actor and which I found out about later when they were told to me by others who had been there, unless, of course, they, too, had only heard the story from someone else." He refers to memory's "reconstructive powers," and the capacity for memory to be refreshed: "Thanks to some documents I had assumed lost, but which providentially turned up when I was searching for something else entirely, my disoriented memory has finally been able to fit together various disparate pieces of the puzzle and replace what was uncertain and doubtful with what was right and true."<br />
<br />
"We often forget what we would like to remember, and yet certain images, words, flashes, illuminations repeatedly, obsessively return to us from the past at the slightst stimulus, and there's no explanation for that' we don't summon them up, they are simply there. And it is for those memories that tell me that although, at the time, I was basing myself more on intuition than, of course, on any real knowledge of these facts. . ." <br />
<br />
This is not the first time that a post in this blog has connected Saramago's work with the subject of memory. In <em>The Notebook</em> (<a href="http://csilcox-thebookshelf.blogspot.com/2010/09/jose-saramago-notebook-2010.html" target="_blank">September 28, 2010 post</a>), the Nobelist created a memory bank in blog form. In the posting on his final novel, <em>Cain</em> (<a href="http://csilcox-thebookshelf.blogspot.com/2011/12/jose-saramago-cain-2011.html" target="_blank">December 20, 2011 post</a>), I remarked, "I also believe storytelling evolved in part to preserve our memories of things past. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/08/james-gleick-information-2011.html"><span style="color: #473624;">August 15, 2011 post</span></a>). And storytelling, whether historical or fictional or both, enables the construction of both personal and social/group identity." And Saramago is a master at clutching collective memory --- history we call it --- and creating stories --- fiction we call it --- as in <em>The Year of the Death of Ricardo Reis</em> (<a href="http://csilcox-thebookshelf.blogspot.com/2011/06/jose-saramago-year-of-death-of-ricardo.html" target="_blank">June 28, 2011 post</a>) and <em>Baltasar and Blimunda</em> (<a href="http://csilcox-thebookshelf.blogspot.com/2013/01/jose-saramago-baltasar-and-blimunda-1982.html" target="_blank">January 1, 2013 post</a>). <br />
<br />
A series of postings in September 2010 revolved around the subject of memory (<a href="http://csilcox-thebookshelf.blogspot.com/2010/09/james-s-hirsch-willie-mays-life-legend.html" target="_blank">September 9, 2010 post</a>) but more recently a posting observed: "Personal identity is a matter of <a href="http://en.wikipedia.org/wiki/Autobiographical_memory"><span style="color: #473624;">autobiographical memory</span></a>. This is our autobiographical self (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html"><span style="color: #473624;">April 8, 2011 post</span></a>). But our <a href="http://articles.mibba.com/Health/3416/Autobiographical-Memory-What-Is-It"><span style="color: #473624;">autobiographical memories </span></a>are shared, and this facilitates social bonding and the building of relationships. It also influences our story-telling and the stories we tell each other, whether represented as fact or fiction. Cultures are built on the sharing of autobiographical memory, yet at the same time personal identity is strongly influenced by the culture that one personally experiences. While at the outset I said that personal identity owes its existence to cultural or group identity, the reverse is true as well. Cultural identity ultimately owes its existence to the sharing of many personal identities. Autobiographical memories are merged and revised into a collective memory. But as we have seen in prior posts, memory is fluid, constantly changing and redeveloping in incremental ways. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/11/rita-carter-mapping-mind-rev-2010.html"><span style="color: #473624;">November 6, 2011 post</span></a>)."<br />
<br />
Collective or shared memory, I believe, is one subject missing from John Searle's account of the creation of a social world (see <a href="http://csilcox-thebookshelf.blogspot.com/2013/02/john-searle-making-social-world-2010.html" target="_blank">February 24, 2013 post</a>). Memory, observes Searle, is important for intentionality, and therefore collective memory should be just as important for collective intentionality. That evokes the importance of culture in the creation of the human social world. That is not lost on Saramago.<br />
<br />CSilcoxhttp://www.blogger.com/profile/17529541537694303553noreply@blogger.com0tag:blogger.com,1999:blog-2282031583522712001.post-38979424767441702932013-02-24T11:24:00.001-08:002014-02-27T17:03:49.633-08:00John Searle, Making The Social World (2010)The substance of <a href="http://socrates.berkeley.edu/~jsearle/" target="_blank">John Searle's</a> most recent book, <em><a href="http://www.timeshighereducation.co.uk/story.asp?storycode=411537" target="_blank">Making the Social World</a></em>, is largely covered by the last chapter of his 2008 selection of essays, <em><a href="http://www.whatfredhasread.com/book/484" target="_blank">Philosophy in a New Century</a></em>, entitled "<a href="http://ant.sagepub.com/content/6/1/12.abstract" target="_blank">Social Ontology: Some Basic Principles</a>" (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/01/john-searle-philosophy-in-new-century.html" target="_blank">January 21, 2011 post</a>). I refer the reader back to this earlier post for Searle's discussion of <a href="http://reasontostand.org/archives/2011/07/21/language-and-social-ontology-john-searle" target="_blank">status functions, deontic powers, and desire-independent reasons for action</a>. This is a discussion of the language-enabled creation of obligations, permissions, rights, responsibilities, duties, and the like (what Searle calls deontic powers) that Searle tells us are the glue of the human social world and collective action. I wish to cover two topics in this posting: first, the importance of language in making a social world, and second, the significance of human imagination in Searle's model of the social world. <br />
<br />
Language is the foundation of all social institutions, says Searle. "We will not understand an essential feature of language if we do not see that it necessarily involves social commitments, and that the necessity of these social commitments derives from the social character of the communication situation, the conventional character of the devices used, and the <a href="http://plato.stanford.edu/entries/intentionality/" target="_blank">intentionality</a> of speaker meaning. It is this feature that enables language to form the foundation of human society in general." Language, adds Searle, introduces <a href="http://en.wikipedia.org/wiki/Deontological_ethics" target="_blank">deontology</a> into social relations, creating an institutional reality with deontic power. The foundation of Searle's thesis is this: "If a speaker intentionally conveys information to a hearer using socially accepted conventions for the purpose of producing belief in the hearer about a state of affairs in the world, then the speaker is committed to the truth of his utterance." There is no way, Searle comments, that I can say to someone publicly, intentionally, explicitly, "There is an animal coming toward us," without being committed to the truth of the propositional content that there is an animal coming toward us. Both the belief and the statement involve commitments, but the commitment of the statement is much stronger, for if the privately held belief turns out to be false, I am free to revise it. In the case of the statement, however, I am committed not only to revision in the case of falsehood, but also to providing reasons for the original statement; I am committed to sincerity in making it; and I am publicly responsible if it turns out to be false. A speech act is more than just an expression of belief; a speech act is a public performance.<br />
<br />
To appreciate the significance that Searle attaches to language in humans, it is important to understand what Searle believes language added to our prelinguistic capabilities, and therefore to ask: what features do prelinguistic human mentality and language have in common (and what did language contribute over and above our prelinguistic mentality)? The common features, according to Searle, are these:<br />
<ul>
<li><span style="font-family: inherit;"><em>Perception</em>. These are our sensory capabilities. Perception and the object perceived are causally self-referential, says Searle: we experience an object only if the presence of the object caused our sensory experience of the object.</span></li>
<li><span style="font-family: inherit;"><em>Beliefs, desires, intending, and emotions such as hopes, fears and the like. </em>These are the capabilities of the mind by which it is directed at or about objects and states of affairs in the world. This is referred to as intentionality (a concept not limited to "intending"). Beliefs, etc. are not causally self-referential.</span></li>
<li><span style="font-family: inherit;"><em>Intentional action. </em>This capacity is embraces a causal sequence (intention and action are causally self-referential), assuming that action actually occurs. There can be a prior intent to act; it can be an intention that is coincident to acting. All actions require intentions-in-acting, but not all actions require prior intent.</span></li>
<li><span style="font-family: inherit;">(<em>At least) short-term memory. </em>Like intentional action, memory is causally self-referential: we recall something only if we experienced the thing that triggers our present memory of that thing.</span></li>
</ul>
Thus, in the cases of perception, memory, and intentional action, there is a match between the mind and the world. Beliefs, desires, and the like are not necessarily tethered to the world, although they are derivative of perception, memory and intentional action, which are. Because they are not tethered to the world in the same way, beliefs, desires, and the like are much more "flexible" in relating to reality.<br />
<br />
While denying that he is engaging in speculative evolutionary biology, Searle asks us to imagine hominids with the full range of prelinguistic capabilities just noted, but not having language. Evolutionary biology has, in fact, established that this scenario likely existed more than 50,000 to 100,000 years ago, depending on when we determine that language emerged in humans. (See <a href="http://csilcox-thebookshelf.blogspot.com/2013/01/stephen-hawking-and-leonard-mlodinow.html" target="_blank">January 31, 2013 post</a>.) What we are capable of achieving with language that we cannot achieve with our prelinguistic consciousness, says Searle, is the ability to manipulate syntactical elements. Language consists of sentences composed of syntactical elements that can be manipulated; prelinguistic intentional states cannot be so manipulated: "the dog might think that someone is approaching the door but the dog cannot think the false thought that the door is approaching someone." [This may or may not be true for a dog, but I am skeptical that it is necessarily true for the prelinguistic human --- to be discussed below when I touch on imagination.] Importantly, <a href="http://plato.stanford.edu/entries/speech-acts/" target="_blank">speech acts</a> come in five categories: (i) assertives (representing how things are); (ii) directives (orders, commands); (iii) commissives (promises, pledges); (iv) expressives (apologies, thanks); and (v) declarations. The first four speech acts have their analogs in intentional states (corresponding to beliefs, desires, intentions, and emotions such as fear, hope and the like) and are not causally self-referential. Declarations are different. In the case of a declaration, "we make something the fact by declaring it to be the case." Declarations have no prelinguistic analog, and they are causally self-referential: the prelinguistic intentional states "cannot create facts in the world by representing those facts as already existing. This remarkable feat requires language." This has enormous significance for the construction of a social reality (derived from perception, intentional action, and/or memory). But through a declaration we have the ability to declare things to be the case that were not necessarily the case prior to the declaration: that I am the shaman of this tribe, I am the leader of this tribe, these five persons comprise our governing council, this piece of paper shall be legal tender for all debts and obligations public or private. Equally, if not more important for Searle, language creates <a href="http://www.uqtr.ca/~vandervk/SearleOnMeaning.pdf" target="_blank">speaker meaning</a> for those prelinguistic intentional states, and, as noted in the opening paragraph of this post, with respect to those causally self-referential intentional states, language necessarily involves social commitments by declaring what we perceive, intend, or recall to be the case. And so once we have language, we have a deontology --- the ability to establish duties, obligations, rights and the like that are desire-independent. With collective acceptance of these duties, obligations, rights and the like, we can have collective intentionality. <br />
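<br />
Searle's constitutive rule, "X counts as Y in context C," and the peculiar power of a declaration to make something the case by representing it as the case, can be given a toy formalization. The model below is my own illustration, not anything Searle offers:<br />
<pre>
# Toy model of institutional reality built from status-function
# declarations of the form "X counts as Y in context C".
institutional_facts = set()

def declare(x, counts_as, context):
    """A declaration is causally self-referential: the act of declaring
    the fact is what brings the fact into existence."""
    institutional_facts.add((x, counts_as, context))

def holds(x, counts_as, context):
    return (x, counts_as, context) in institutional_facts

# Before the declaration, the paper carries no status function.
print(holds("this piece of paper", "legal tender", "this tribe"))  # False

# The declaration itself creates the institutional fact.
declare("this piece of paper", "legal tender", "this tribe")
print(holds("this piece of paper", "legal tender", "this tribe"))  # True
</pre>
What the model cannot capture, of course, is collective acceptance: in Searle's account the declaration creates a status function only if the community recognizes it, which is exactly the point Hindriks presses below.<br />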
<br />
Not everyone concurs with Professor Searle's view on the importance of language. <a href="http://www.rug.nl/staff/f.a.hindriks/" target="_blank">Frank Hindriks</a>, for example, <a href="http://www.rug.nl/staff/f.a.hindriks/restructuringmsw_final.pdf" target="_blank">surmises</a> that collective acceptance and collective intentionality can arise through gesture (including sanctions):<br />
<br />
<span style="font-family: Cambria;"></span><div align="left">
<span style="font-family: Cambria;"><span style="font-family: Times, "Times New Roman", serif;">"Consider Searle’s example of a wall that decays and turns from a physical structure into an institutional boundary (94--‐96). It is not obvious that any linguistic communication is required in the process. People can observe each other’s behavior including sanctioning behavior such as frowning when someone crosses the boundary. At some point it is true that the stones that are left form a boundary, and this fact involves the obligation not to cross it. The stones form a boundary because the relevant people recognize it as such. These people believe that it is a boundary, and recognize the deontic powers that come with being a boundary. In light of this, it seems fair to say that the collective intentional states that are involved in the constitution of the boundary have the double direction of fit. Language needs not enter, neither to account for the double direction of fit [causal self-reference] nor to explain the normative nature of this institutional fact. I am not sure what Searle has in mind when he mentions conventionally encoded commitments, but see no reason to believe that I have left anything out of the picture that essentially involves linguistic conventions. So it seems that Searle overestimates the role language plays in institutions when he claims language is constitutive of institutions."</span></span></div>
<span style="font-family: Cambria;">
<span style="font-family: Cambria;"></span></span><br />
As a matter of anthropology and evolutionary biology, Hindriks may be closer to the mark as Chris Boehm's <em><a href="http://www.amazon.com/Moral-Origins-Evolution-Virtue-Altruism/dp/0465020488" target="_blank">Moral Origins</a></em> indicates. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>). Searle admits that cooperation among hominids is a characteristic of pre-linguistic humans. But Boehm's thesis is that forms of human organization (egalitarian in nature) emerged as early as 150,000 to 200,000 years ago (if not earlier in other homo species), primarily through sanctioning behavior (subtle or lethal), well before language emerged 50,000 to 100,000 years ago. It may very well be true that the kind of social institutions created by humans for the first time 10,000-35,000 years ago could not have occurred without language, but if Hindriks is correct, as some evidence suggests, then it means that humans were capable of non-linguistic declarations and that "hearer meaning" and collective acceptance in the pre-linguistic world was secured through a punch in the face. And perhaps by virtue of mirror neurons. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/10/vs-ramachandran-tell-tale-brain-2011.html" target="_blank">October 25, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2009/09/marco-iaccoboni-mirroring-people-2008.html" target="_blank">September 18, 2009 posts</a>). <br />
<br />
This brings me to the more disappointing part of Searle's account of the creation of a social reality, although admittedly he does not ignore the subject. Professor Searle is certainly correct when he says that what typically gets communicated in speech acts are intentional states representing the world. A previous post noted the research that truth telling is the default position of the human brain (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/02/robert-trivers-folly-of-fools-2011.html" target="_blank">February 4, 2012 post</a>) and this seems to make common sense as well. But the human capacity to engage in both deception and self-deception cannot be overlooked. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/02/robert-trivers-folly-of-fools-2011.html" target="_blank">February 4, 2012</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/michael-shermer-believing-brain-2011.html" target="_blank">June 12, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/05/robert-graves-king-jesus-novel-1946.html" target="_blank">May 22, 2011 posts</a>). This is missing from Searle's analysis, although he expressly acknowledges that the "one faculty that is left out of [his listing of intentional states], because it does not have a direction of fit, is imagination. . . unlike belief, which has the downward direction of fit, or desire, which has the upward direction of fit, my imagining something commits me neither to believing that what I imagine is the case, nor to wanting it to be the case. Sometimes one fantasizes what one would like to occur, but it is not an essential feature of fantasy or imagination that they are forms of desire. One can fantasize what one fears or hates, as well as what one believes might happen, and indeed what one believes could not possibly happen. There is no responsibility for fitting with imagination. Another feature peculiar to imagining is that it is, or can be, free voluntary action. . . Imagination will have a role in our account of social ontology, because the creation of a reality that exists only because we think it exists requires a certain level of imagination." The mistake here, it seems to me, is a virtual assumption that the truth-telling default position of the human brain is the only position. We know that is not the case. We lie and deceive more frequently than we care to admit, and, importantly for this discussion, we do occasionally engage in acts of deception, declaring what we imagine to be the case even though it is inconsistent with the objective facts. The Catholic Church's denial of the Copernican System is an obvious example. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/12/steven-nadler-book-forged-in-hell-2011.html" target="_blank">December 17, 2012</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2012/12/michael-shermer-science-of-good-and.html" target="_blank">December 5, 2012 post</a>). <br />
<br />
This is why, in discussing Searle's example of the dog who "cannot think the false thought that the door is approaching someone," I questioned whether this was necessarily true for prelinguistic humans. If a prelinguistic human could believe that the earth circles the sun, it is equally possible that a prelinguistic human could have held the false thought that the sun circles the earth. There is no apparent reason why prelinguistic human mental states were not capable of manipulation on account of imagination. Human social reality can be constructed on the basis of false beliefs and by declaring them to be the case, and Searle admits as much, as when "a community believes that someone has divine powers," where the belief goes beyond the fact. He admits that "a whole system of status functions may be based on false beliefs," but he says that from the perspective of institutional analysis it does not matter whether the beliefs are true or false; it only matters whether the people do in fact collectively recognize or accept the system of status functions. OK. So we create a "social" reality that is not made of the same brute physical facts made of "mindless, meaningless, physical particles." I can accept that, but it seems to contradict the purpose of this book, which is "not to allow ourselves to postulate two worlds or three worlds or anything of the sort. Our task is to give an account of how we live in exactly one world, and how all these different phenomena, from quarks and gravitational attraction to cocktail parties and governments are part of that one world."<br />
<br />
Stephen Hawking and Leonard Mlodinow, <em>The Grand Design</em> (2010)<br />
<br />
Imagine that approximately 160,000-200,000 years ago, <a href="http://library.thinkquest.org/J0112388/asteroids.htm" target="_blank">a huge asteroid miles across is hurtling toward the earth</a> as the earth's gravity draws it in toward a cataclysmic event. The asteroid strikes the African continent in an area that is now called Ethiopia, Kenya, Uganda, and Zambia. Many forms of life are destroyed by the impact of this asteroid, including the entire population of an incipient large-brained species among the genus <em>Homo</em>, <em><a href="http://www.becominghuman.org/node/homo-sapiens-0" target="_blank">Homo sapiens</a> (subspecies <a href="http://en.wikipedia.org/wiki/Homo_sapiens_idaltu" target="_blank">idaltu</a>)</em>, that walked upright. Other species, including one which some have labeled <em><a href="http://en.wikipedia.org/wiki/Homo_rhodesiensis" target="_blank">Homo rhodesiensis</a></em>, are destroyed as well. At the time of this catastrophe, relatively small numbers of this incipient species, <em>Homo sapiens</em>, were alive. As a consequence, the species <em>Homo sapiens</em> went extinct, and modern <em>Homo sapiens</em> (subspecies <em>sapiens</em>) never emerged.
To the north of Africa, in the area now known as the Middle East and Europe, the species <em><a href="http://en.wikipedia.org/wiki/Neanderthal" target="_blank">Homo neanderthalensis</a></em>, which is believed by some to have evolved from a common ancestor with <em>Homo sapiens</em>, <em>Homo rhodesiensis</em> (or perhaps <em><a href="http://en.wikipedia.org/wiki/Homo_heidelbergensis" target="_blank">Homo heidelbergensis</a></em>), is not directly impacted by this catastrophe. Only this species within the genus <em>Homo</em> survives.<br />
<br />
One of the mysteries of <em>Homo neanderthalensis</em> is whether this species had the capacity for the spoken word and language (see <a href="http://csilcox-thebookshelf.blogspot.com/2009/08/christine-kenneally-first-word-2007.html" target="_blank">August 31, 2009 post</a>). The earliest humans (of 160,000-200,000 years ago, at the time of our imaginary encounter with the asteroid in Africa) are not believed to have developed language as we know it. At best, early humans and some of their predecessors may have enjoyed some kind of capability for communicating by gesture or perhaps by making sounds, what some have labeled a proto-language. Exactly when the human capacity for language evolved is unclear, but it was <a href="http://www.newscientist.com/data/doc/article/dn19554/instant_expert_6_-_the_evolution_of_language.pdf" target="_blank">at least 50,000 years ago</a> (the time of the out-of-Africa migration), <a href="http://www.nytimes.com/2011/04/15/science/15language.html?_r=3&hp&" target="_blank">if not earlier</a>, in Africa. (See also <a href="http://csilcox-thebookshelf.blogspot.com/2012/02/daniel-j-levitin-this-is-your-brain-on.html" target="_blank">February 15, 2012 post</a>). <em>Homo neanderthalensis</em> was still a viable north-of-Africa species within this time frame, but if <em>Homo neanderthalensis</em> was already out of Africa in this time frame and language first emerged in Africa, this suggests that <em>Homo neanderthalensis</em> <a href="http://cup.linguistlist.org/2012/11/the-origin-of-language-in-gesture-speech-unity-3/" target="_blank">may not have had a language capacity</a>. For purposes of our asteroid story, let us assume that <em>Homo neanderthalensis</em> did not have the capacity for language (despite its large brain), and let us assume that, as a consequence of the catastrophe brought about by our imaginary asteroid, <em>Homo sapiens</em> went extinct and never invaded the European habitat of <em>Homo neanderthalensis</em>, leaving the latter undisturbed by another <em>Homo</em> species.<br />
<br />
We can now imagine a number of alternative "histories" for life on earth that might have evolved over the tens or hundreds of thousands of years after the asteroid catastrophe. One such history witnesses <em>Homo neanderthalensis</em> becoming the dominant hominid species on the earth, migrating across to North America and South America, into southern Asia, and back to Africa. Another history witnesses the extinction of <em>Homo neanderthalensis</em> and the disappearance of the genus <em>Homo</em> entirely from the earth. Other histories involve one of the first two just described <em>plus</em> the emergence of another large-brained species, perhaps an evolved Neanderthal or perhaps another evolving hominid species in Africa. Undoubtedly, many other such "histories" can be imagined, but in the interest of keeping these speculations simple: under any of the above scenarios it is possible, if not likely, that 160,000-200,000 years after our imaginary asteroid struck the earth, the earth would be much different from what it is today. If, under the first scenario, the Neanderthals did not go extinct and survived to today, we still don't know what their capacity for creating social structures and institutions might have been, much less their capacity for creating technology as part of their historical evolution. There is <a href="http://ngm.nationalgeographic.com/2008/10/neanderthals/hall-text/9" target="_blank">some evidence</a> that their capacity for creating social structures would have been quite different from that of <em>Homo sapiens</em>, as Neanderthals may have lived in much smaller social units comprised of extended families, whereas humans lived in somewhat larger groups whose members included individuals from outside the family. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>). It is possible that their brain structures were similar to those of <em>Homo sapiens</em>, given that their cranial capacity is similar to that of humans, but this is by no means certain. What I am getting at, however, is the possibility that Neanderthals may never have had or developed the communications and language skills of humans, and this would have had a significant impact on how they perceived the world around them, including developing ideas, as <em>Homo sapiens</em> later did, of the physical universe beyond the earth or even of the biosphere on and around the earth. In such a circumstance, we can imagine that no species would have developed an idea of or belief in a creator of the universe; there would be no idea of a "god," and there would be no religious institutions. The human social evolution described by Christopher Boehm in <em>Moral Origins</em> (<a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>) might never have occurred either. <br />
<br />
Physicist <a href="http://www.hawking.org.uk/" target="_blank">Stephen Hawking</a> is a <em>Homo sapiens</em> who has spent all his adult life contemplating the physical universe beyond the earth, its origin(s), and the physical laws that describe its behavior. Together with physicist Leonard Mlodinow (see <a href="http://csilcox-thebookshelf.blogspot.com/2011/11/leonard-mlodinow-euclids-window-2001.html" target="_blank">November 20, 2011 post</a>), he considers three questions, assuming that there are laws of nature:<br />
<ul>
<li>What is the origin of the laws?</li>
<li>Are there any exceptions to the laws, i.e., miracles?</li>
<li>Is there only one set of possible laws?</li>
</ul>
The first question is readily recognized as asking whether there was a Creator who established the laws of nature, or whether the laws arose as a result of some spontaneous event.<br />
<br />
The second question examines whether the laws of nature establish a deterministic system or whether there are occasions when the laws are suspended, accomplishing something that physical law would not permit. This question was resolved by Spinoza in the negative (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/12/steven-nadler-book-forged-in-hell-2011.html" target="_blank">December 17, 2012 post</a>), and Hawking does likewise, although he notes that there is a long line of physicists before him who felt differently.<br />
<br />
The third question receives a lot of attention in this book, <a href="http://www.washingtonpost.com/wp-dyn/content/article/2010/09/03/AR2010090302118.html" target="_blank"><em>The Grand Design</em></a>, and draws on <a href="http://www.nobelprize.org/nobel_prizes/physics/laureates/1965/feynman-bio.html" target="_blank">Richard Feynman</a>'s <a href="http://people.ccmr.cornell.edu/~muchomas/8.04/Lecs/lec_FeynmanDiagrams/node3.html" target="_blank">sum over histories</a> approach to quantum mechanics. This is a probabilistic approach to epistemology, attributable to the uncertainty (<a href="http://plato.stanford.edu/entries/qt-uncertainty/" target="_blank">Heisenberg's uncertainty principle</a>) in determining the specific historical pathway that an object takes to its present or future position. For Feynman, the object of interest was a particle that we cannot see. Particles can take an infinite number of paths to reach an endpoint, and each pathway has a probability amplitude associated with it. As others have explained:<br />
<br />
"The crucial point is that these different [probability] amplitudes have a wavelike nature, and as they spread through space they interfere with each other, their respective wave patterns either reinforcing or canceling each other out at various points. And if you sum over all the amplitudes of all the different paths, i.e. you sum-over-histories, then the different amplitudes will reinforce or cancel each other in such a way that the only path that survives this interference process is the one that the particle actually follows."<br />
<br />
I'm not sure we can apply the sum-over-histories approach to the non-quantum world that earthbound hominids live in, such as the alternative histories of hominid evolution I described at the top; it may simply be too difficult mathematically to describe "laws" applicable to animal evolution that would tell us who dies, who survives, and who prevails among those who survive. But the evolutionary pathway is just as probabilistic as the elements of nature. The real point here is the analytical model that Hawking brings to bear on looking at the universe: something he calls <em><a href="http://en.wikipedia.org/wiki/Model-dependent_realism" target="_blank">model-dependent realism</a></em>. This concept goes back to the debate between Niels Bohr and Albert Einstein and what Bohr described as "observer dependent reality." (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/07/manjit-kumar-quantum-2008.html" target="_blank">July 30, 2011 post</a>). Einstein was certain that there was a reality independent of the observer; a tree falls in the woods even though no one has witnessed it falling. For Hawking, our understanding of physical reality is dependent on what the observer perceives and senses, and it is quite possible that different observers will witness the same thing differently. As noted in a prior post, Michael Shermer, borrowing Hawking's concept and applying it to his views of human belief, described a <a href="http://www.scientificamerican.com/article.cfm?id=the-believing-brain" target="_blank"><em>belief-dependent realism</em></a>. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/06/michael-shermer-believing-brain-2011.html" target="_blank">June 12, 2011 post</a>). The ultimate point for Hawking (and Shermer) is this: it is pointless to ask whether a theory or model or belief is real; the only meaningful question is whether the theory or model or belief agrees with observation. It is possible that more than one theory or model explains the way things really are and that observations agree with both of them; Hawking cites particle-wave duality as one example of co-existing models of the same thing. Likewise, Hawking says "no single theory can describe every aspect of the universe." It is likely that we will find that multiple models explain our observations of different aspects of the universe. Shermer, of course, is concerned about how certain tendencies of the human mind (biases) color our observations and cause us to believe something that is not real.<br />
<br />
This is an epistemological issue. It is different from the question of whether something actually exists. Hawking does not deny that the observer and the observed are parts of a world that has an objective existence. <br />
<br />
<em>The Grand Design</em> is devoted in substantial part to explaining that the universe had a beginning and that this beginning could have occurred spontaneously. It then discusses how the microscopic elements that were present at the spontaneous beginning, and shortly thereafter, could yield the universe of complex compounds that we intelligent humans observe today. Hawking asks us to think of the expanding universe as the surface of a bubble, and to imagine the formation of bubbles of steam in boiling water. Many tiny bubbles (in our model, corresponding to alternative universes, each with very different or similar sets of physical laws) appear and then disappear again while still of microscopic size. Since they do not last long, these "universes" (and their different physical laws) do not, unlike the universe that we humans now observe, develop the galaxies and stars needed to create elements heavier than hydrogen, helium, and lithium, such as the carbon that is essential for life and intelligent life. But then, Hawking invites us to consider further, a few bubbles grow large enough that they are safe from collapse; they continue to expand at an ever-increasing rate and form the bubbles of steam we are able to see. These "bubbles" correspond to the beginning of universes in a <a href="http://en.wikipedia.org/wiki/Inflation_(cosmology)" target="_blank">state of inflation</a>. Others have described this scenario as well, such as Steven Weinberg in <em><a href="ftp://89.209.81.27/public/Sci_Library/Phys%20Library/PPop_Popular-level/Weinberg%20S.%20The%20first%20three%20minutes(168s).pdf" target="_blank">The First Three Minutes</a></em>. <em>In the beginning there were tiny bubbles . . . and now there is a very large universe that includes intelligent life.</em> No help from a creator is needed to explain how the universe went from Point A to Point B. But would imagination and storytelling ever have emerged?<br />
<br />
Fyodor Dostoyevsky, <em>The Adolescent</em> (1875)<br />
<br />
The subjects of personal identity and group identity have surfaced in a couple of previous posts in this blog (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/04/irvin-d-yalom-spinoza-problem-2012.html" target="_blank">April 1, 2012</a>, <a href="http://csilcox-thebookshelf.blogspot.com/2011/12/rebecca-skloot-immortal-life-of.html" target="_blank">December 10, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/12/howard-jacobson-finkler-question-2010.html" target="_blank">December 2, 2011 posts</a>), and now personal identity resurfaces in <a href="http://en.wikipedia.org/wiki/Fyodor_Dostoyevsky" target="_blank">Dostoevsky</a>'s <em><a href="http://www.amazon.com/Adolescent-Fyodor-Dostoevsky/dp/0375719008" target="_blank">The Adolescent</a></em>. <br />
<br />
The circumstances underlying the search for personal identity here are not remarkable: Dostoevsky's protagonist and narrator, Arkady Makarovich Dolgoruky, has two fathers, a biological father and an adoptive father. The more unusual circumstance is that neither of these "fathers" has played much of a role in his life through adolescence. Arkady does not really know either father. The adoptive father is a wanderer across Russia who returns home to Arkady's mother once in a while; the biological father is a nobleman who has squandered his wealth and who, like others of his class during the second half of the 19th century, seems to flit back and forth between Russia and other European countries, arranging for someone else to take care of Arkady as he is growing up. Dostoevsky's protagonist is referred to alternately as Arkady Andreevich (the patronymic derived from his biological father) and Arkady Makarovich (the patronymic derived from his adoptive father). From which father does Arkady derive his personal identity? "How can you not feel your father's blood in you?" Arkady is asked at one point, suggesting a blood or genetic basis for identity. Personal identity, as another posting on this blog has observed, is a matter of autobiographical memory, and is not always a matter of genetic identity. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html" target="_blank">April 8, 2011</a> and <a href="http://csilcox-thebookshelf.blogspot.com/2011/12/howard-jacobson-finkler-question-2010.html" target="_blank">December 2, 2011 posts</a>). <br />
<br />
This is Russia between the reigns of <a href="http://en.wikipedia.org/wiki/Nicholas_I_of_Russia" target="_blank">Tsar Nicholas I</a> (1796-1855), who authorized landowners to free their serfs, and <a href="http://en.wikipedia.org/wiki/Nicholas_II_of_Russia" target="_blank">Tsar Nicholas II</a> (1868-1918), the last of the Romanovs. It is a period of great change in Russia, including the beginning of the industrial revolution and the liberal reforms of <a href="http://en.wikipedia.org/wiki/Alexander_II_of_Russia" target="_blank">Tsar Alexander II</a> (1818-1881), among them the emancipation of the serfs. The intellectual air in Russia at the time is full of "ideas," and Dostoevsky, I believe, more than any other Russian writer of his era, brings the exchange and conflict of ideas to dramatic life, not only in <em>The Adolescent</em>, but in his more famous novels, <em>Crime and Punishment</em>, <em>The Devils</em>, and <em>The Brothers Karamazov</em>. <br />
<br />
As <em>The Adolescent</em> opens, Arkady, age 20, fresh from completing the equivalent of high school in Moscow, arrives in Petersburg with an "idea" in his mind. He has developed the naive, if not "adolescent," idea that he can achieve a life of independence of mind and solitude, whereby he is liberated from a life that depends on others. Moving into adulthood, he has to be prepared to no longer be a dependent. But solitude, he believes, requires "power." And to achieve power, his idea is to become a "Rothschild." He is therefore determined to work, earn, and save, with "persistence and continuity," until he is financially independent and can control his own destiny. His family now resides in Petersburg, including his adoptive father, his biological father, his mother, and his sister, as well as a coterie of friends of immediate family members.<br />
<br />
Ideas have a way of not maturing into actualities, which certainly happens here. As the plot progresses, our protagonist is unable to flee the mix of his adolescent friends, family and relatives, and friends of relatives. Instead of a life of solitude, Arkady encounters an ocean of social interactions, social emotions, and feelings (see <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>), and by the end of his narrative he admits that his "idea" is "no longer recognizable." Instead of working to accumulate the wealth he believed he needed to pursue a solitary life, Arkady believes he must now work to support his mother and his sister. Even "Rothschilds," as another posting on this blog describes, cannot live and succeed in solitude. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/06/david-nasaw-andrew-carnegie-2006.html" target="_blank">June 30, 2012 post</a>).<br />
<br />
Arkady's first discovery is that his biological father possesses a moral anchor. Andrei Petrovich is embroiled in litigation over an inheritance. After arriving in Petersburg, Arkady is presented with a document which, he is told, contains information that is inconsistent with his biological father's claim to the inheritance. Arkady is presented with his own moral crisis: Should he deliver the document to his father, deliver it to the other party to the litigation, or conceal it? After all, there is a possibility that Andrei Petrovich's inheritance might trickle down to his mother and even to Arkady. Arkady is assured that the document has no decisive legal significance because his father would win his case even if the court knew of the contents of the document: the testamentary instrument (will) is clear; the document, on the other hand, contains only <a href="http://legal-dictionary.thefreedictionary.com/precatory" target="_blank">precatory language</a> that expresses a hope or a wish and does not create an obligation or a command. The document, then, presents "a matter of conscience." Arkady agonizes over the correct course of action before finally deciding to place the document in his father's hands, who in turn, after the court has ruled in his favor, renounces his claim. Arkady is "dumbstruck, but delighted" by his father's noble choice. Arkady repents his own "cynicism and indifference to virtue" in light of his father's example, comparing his father to the example of Joshua of Nazareth: "this man was dead and is alive again, was lost and is found."<br />
<br />
<em>The Adolescent</em> reaches a crescendo as Arkady (who is recalling and recording the events of his reunion with his fathers and others a year later) recalls a conversation with his biological father in which the latter imagines a life without God. "A calm has come, and people are left alone, as they wished: the great former idea [of god] has left them; the great source of strength that had nourished and warmed them all then is departing, like the majestic inviting sun in Claude Lorrain's painting, but it already seemed like the last day of mankind. And the people suddenly realized that they remained quite alone, and at once felt a great orphancy. My dear boy, I've never been able to imagine people ungrateful and grown stupid. The orphaned people would at once begin pressing together more closely and lovingly; they would hold hands, understanding that they alone were now everything for each other. The great idea of immortality would disappear and would have to be replaced; and all the great abundance of the former love for the one who was himself immortality, would be turned in all of them to nature, to the world, to people, to every blade of grass. They would love the earth and life irrepressibly and in the measure to which they gradually became aware of their transient and finite state, and it would be with a special love now, not as formerly. They would begin to observe and discover such phenomena and secrets in nature as they had never supposed before, because they would look at nature with new eyes, the eyes with which a lover looks at his beloved. *** Every child would know and feel that each person on earth was like a father and mother to him." If we stopped loving god, Andrei Petrovich suggests, we would naturally turn to loving each other and become more appreciative of nature after experiencing only a brief period of orphancy. Profound, but not implausible given what we know about the origins of morality and conscience. (See <a href="http://csilcox-thebookshelf.blogspot.com/2012/11/christopher-boehm-moral-origins.html" target="_blank">November 21, 2012 post</a>). But it may leave one to wonder: who is the adolescent, the son or the father? And then Andrei Petrovich explains, "this is all a fantasy, even quite an incredible one; but I have imagined it only too often, because all my life I've been unable to live without it and not think of it. I'm not talking about my faith: I have no great faith, I'm a deist, a philosophical deist, like all the thousand of us." Arkady realizes his father's love for mankind is genuine ("the falseness I had feared wasn't there") and he realizes that his father is extremely comfortable with himself --- happy. "I wouldn't exchange my yearning [for brotherhood among mankind] for any other happiness," his father confesses. "In this sense, I've always been happy, all my life. And out of happiness I came to love your mama then for the first time in my life." <br />
<br />
There is <a href="http://thelectern.blogspot.com/2010/09/adolescent-dostoevsky.html" target="_blank">more to this novel</a> than what I have focused on, and what I have focused on prefigures themes that later appear in Dostoevsky's final novel, <em>The Brothers Karamazov</em>. As a first-person narrative written as young Arkady's "notes" of events that occurred only several months earlier, the notes have the character of the autobiographical memory that lies at the core of extended consciousness and the autobiographical self. (See <a href="http://csilcox-thebookshelf.blogspot.com/2011/04/antonio-damasio-self-comes-to-mind-2010.html" target="_blank">April 8, 2011 post</a>).