Sunday, November 20, 2011

Leonard Mlodinow, Euclid's Window (2001)

This blog is, in part, an effort to connect dots --- particularly among the books that have come off my Bookshelf. A number of dots connected in my mind as I read Leonard Mlodinow's Euclid's Window.

Euclid's Window opens in Greek antiquity in the port of Miletus on the west coast of what is now Turkey. Mlodinow asserts that a "revolution in human thought, a mutiny against superstition and sloppy thinking," occurred there in the 7th century BCE. Around 620 BCE, Thales of Miletus, whom Mlodinow describes as humanity's "first scientist or mathematician," lived in the city and is purportedly responsible for the systematization of geometry, a methodology that would later be incorporated in Euclid's Elements. This blog first mentioned the scientific contributions of the Milesians in a prior post (see March 28, 2010 post), noting historian David Lindberg's comment: "[I]n the answers offered by these Milesians we find no personification or deification of nature; a conceptual chasm [that] separates their worldview from the mythological world of Homer and Hesiod. The Milesians left the gods out of the story. What they may have thought about the Olympian gods we do not (in most cases) know; but they did not invoke the gods to explain the origin, nature or cause of things." (See March 24, 2010 post). The inference from Lindberg's observation is that when the human mind frees itself from myth, religion, and superstition --- the types of "beliefs" that Michael Shermer wrote about in The Believing Brain (see June 12, 2011 post) --- scientific progress is unshackled.

While Mlodinow does not make the same observation about the Milesians, he does seem to fall into a trap that Lindberg encourages historians to avoid: blaming Christianity entirely for Europe's failure to maintain the scientific progress that the Greeks initiated before the first millennium A.D. (see March 24, 2010 post). Apparently relying on Edward Gibbon, Mlodinow blames the Christians for burning down the greatest library of its era, at Alexandria, Egypt, along with all the scientific and philosophical works it held. This claim may not be true, or not entirely true, but the fact that Mlodinow seems to harbor this belief, as revealed in this reference and other statements he makes, strains his credibility as a writer of science history (at least about science and mathematics in the era of the Dark and Middle Ages). It is true that once the institutions of the Dark and Middle Ages lost touch with Greek scientific inquiry and knowledge, institutional biases developed that made it very difficult for that knowledge to be rediscovered, and among those institutions was the Catholic Church. Yet, as Lindberg documents, those same institutions also played a small role in the rediscovery of Greek science and thought.

As Mlodinow moves from his portraits of the geometers and Euclid to Descartes to Carl Gauss (and Riemann), the reader senses the impending merger of geometry and physics (or perhaps the impending takeover of geometry by physics) with the development of non-Euclidean geometry. Ever since Euclid, Euclid's Fifth Postulate (the parallel postulate) had proven troublesome. Euclid stated a proposition that would determine whether two co-planar lines were parallel, converging, or diverging: take two lines and cross them with a third line; if the sum of the two inner angles on the same side of the crossing line is less than two right angles (180 degrees), then the two lines converge (on that side of the crossing line). The postulate seems intuitively correct. The problem is that the fifth postulate could not be proven the way a theorem would be proven. It was assumed as a fact until non-Euclidean geometry began to address curved surfaces, on which the parallel postulate fails. Geometry began not only to take a hard look at spherical surfaces and the topography of the earth, but to turn its attention to space itself.
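For readers who like to see the postulate mechanically, here is a minimal sketch of my own (not from Mlodinow's book) that applies Euclid's angle test to two crossed lines:

```python
def classify_lines(inner_angle_a, inner_angle_b):
    """Euclid's fifth postulate as a rule: given the two inner angles
    (in degrees) on one side of a line crossing two co-planar lines,
    report whether those lines converge on that side, diverge, or are
    parallel."""
    total = inner_angle_a + inner_angle_b
    if total < 180:          # less than two right angles
        return "converging"  # the lines meet on this side
    if total > 180:          # more than two right angles
        return "diverging"   # they meet on the other side
    return "parallel"        # exactly two right angles: they never meet
```

On a flat plane this classification is exact; on a curved surface --- the insight of non-Euclidean geometry --- the angle test no longer predicts how the lines behave.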

Enter Albert Einstein, relativity, and the influence of gravity on the shape of space. Even Mlodinow's brief discussions of Euclid, Descartes, and Gauss made me recall Rita Carter's discussion of the posthumous study of Einstein's brain in Mapping the Mind (see November 6, 2011 post). Carter reports that researchers at McMaster University in Canada found that Einstein's brain "was different from most in several ways, the most notable being that two sulci (infolds) in the parietal cortex had merged during development, creating a single enlarged patch of tissue where usually there would be a division. In normal people, one of these areas is primarily involved in spatial awareness, while the other does (among other things) mathematical calculation. The merging of these two areas in Einstein's brain," Carter speculates, "may well account for his unique ability to translate his 'vision' of space-time into the most famous mathematical equation of all time, E=mc²." Here Carter has been discussing synesthesia, the phenomenon whereby, because of the close proximity of two parts of the brain, there is a merging of sensory phenomena: e.g., hearing a certain word or number evokes a certain color. Have the brains of certain mathematicians --- those who can develop mathematical theories, or even practical algorithms, that describe physical phenomena or physical space --- developed in a way that facilitates their mathematical skill and insights, in contrast to the brains of most humans? Mlodinow's survey of the history of geometry certainly makes one wonder.

Reading Euclid's Window also reminded me of a quote from Michael Shermer that I mentioned in the June 12, 2011 post: "We are not equipped to perceive atoms and germs, on the one end of the scale, or galaxies and expanding universes, on the other end." Yet as Mlodinow's portrait moves from Einstein to Edward Witten, from relativity and quantum mechanics to the "standard model" and ultimately to string theory, it is clear that some minds are capable of envisioning not only atoms but even smaller particles, and some minds (sometimes the same minds) are capable of envisioning galaxies and expanding universes. Without this capability, Mlodinow would never have had a story to tell. Mlodinow sums this up as follows: "Through Euclid's window we have discovered many gifts, but he could not have imagined where they would take us. To know the stars, to imagine the atom, and to begin to understand how these pieces of the puzzle fit into the cosmic plan is for our species a special pleasure, perhaps the highest. Today, our knowledge of the universe embraces distances so vast we will never travel them and distances so tiny we will never see them. We contemplate times no clock can measure, dimensions no instrument can detect, and forces no person can feel. We have found that in variety and even in apparent chaos, there is simplicity and order."

This is not a deep book. It is written for general readers with an interest in mathematics and the history of science. I began by criticizing Mlodinow for his knowledge of the history of the Dark and Middle Ages, but by the end of the book and its discussion of string theory, I concluded that I wished I had read it before embarking on other, deeper books about string theory.

Wednesday, November 16, 2011

David Liss, A Conspiracy of Paper (2000)

The year is 1719. The scene is London, England. King George I, recently arrived from Germany, sits on the throne. Unlike the French, the English have not yet created a local or national police force to protect their citizens. The entrepreneurial class filled the official void, establishing themselves as "thief-takers," bounty hunters hired to capture criminals. The most notorious of the thief-takers, Jonathan Wild, exploited his status to run an organized gang of thieves who stole property only so that Wild could be hired by the victim, for a fee, to "find" and return the same stolen property.

England is suffering financially at this time under the weight of a national debt swollen by the War of the Spanish Succession. The South Sea Company, a stock company organized in the early 18th century, buys half of the national debt in exchange for its stock, pursuant to a plan to convert that debt to lower-interest debt that would ease the government's financial burden while providing the South Sea Company with steady revenue. The company then pursues a program to drive up the price of its stock, and a speculative frenzy ensues. In 1720 the bubble bursts --- the infamous South Sea Bubble, the first stock market crash --- leading to bankruptcies and other financial problems across Europe.

There is a nascent, unregulated stock market operating out of coffeehouses on and around Exchange Alley, where "stock-jobbers" trade in company stocks. Stock-jobbers are not held in high regard, apparently for all the reasons that, over 200 years later, the United States of America established a Securities and Exchange Commission to regulate this trade.

All of this is true, and against this background David Liss' fictional story of a competing thief-taker, Benjamin Weaver, begins in earnest. Weaver, a Jew in the predominantly Protestant community of England, has assimilated reasonably well. He has recently become a thief-taker, retired from his earlier professions of highwayman and competitive boxer. He now competes with Jonathan Wild for clients, but unlike Wild he forswears the unethical practice of stealing only to later "find" the stolen booty for a fee. Weaver is the grandson of Miguel Lienzo, the protagonist of Liss' third novel, The Coffee Trader, a prequel of sorts to A Conspiracy of Paper. Liss' oeuvre, if we want to call it that, is not generic historical fiction but economic historical fiction. The marketplace is as much a part of his work as the cast of characters and the plot.

A murder has allegedly occurred, and A Conspiracy of Paper is essentially a whodunit. The precise "who" is known early on, but the "official" conclusion is that the death was merely an accident. Others, however, suspect foul play --- a "conspiracy." Suspects abound as to who is really behind it.

This is the period of the English Enlightenment. John Locke is fifteen years in the grave. David Hume (see February 27, 2011 post) is only eight years old. But Isaac Newton is in the golden era of his illustrious life, Bernard Mandeville is editing his Fable of the Bees (see January 30, 2010 post), and George Berkeley is actively trying to undo Locke's view of a materialist world.

None of these Enlightenment philosophers, however, contributes a method of investigation for solving the murder mystery. Instead, a French mathematician and Catholic philosopher, Blaise Pascal, provides the inspiration. Probability theory is invoked, and so is right-brain/left-brain wisdom --- the right brain's capacity to make intuitive hunches, and the left brain's capacity to meditate, analyze, and sort through information. Weaver's friend Elias advises him, "[Pascal's] thinking is precisely what will allow you to resolve this matter, for you must work with probability rather than facts. If you can only go by what is probable, you will sooner or later learn the truth." Weaver responds, "Are you suggesting I conduct this matter by randomly choosing paths of inquiry?" Elias responds, "Not randomly. If you know nothing with certainty, but you guess reasonably, acting upon those guesses offers the maximum chance of learning who did this with the minimal amount of failure. Not acting offers no chance of discovery. The great mathematical minds of the last century, Boyle, Wilkins, Glanvil, Gassendi --- have set forth the rules by which you are to think if you are to find your murderer. You will not act on what your eyes and ears show you, but on what your mind thinks probable."
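Elias's advice reads almost like an algorithm. As a playful sketch (my own construction, not anything from the novel), pursuing the most probable lead first, and striking it off when it fails, looks like this:

```python
def pascalian_inquiry(leads, actual_culprit):
    """Investigate by probability, as Elias counsels: always pursue the
    currently most probable lead; if it proves false, eliminate it and
    move to the next most probable, until the truth is found."""
    beliefs = dict(leads)  # lead -> assumed prior probability
    attempts = 0
    while beliefs:
        attempts += 1
        best = max(beliefs, key=beliefs.get)  # most probable remaining lead
        if best == actual_culprit:
            return best, attempts
        del beliefs[best]  # a failed guess still narrows the field
    return None, attempts


# Hypothetical leads and weights, purely for illustration:
leads = {"Jonathan Wild": 0.5, "the stock-jobbers": 0.3, "a rival": 0.2}
```

Guessing in descending order of probability minimizes the expected number of failed inquiries, which is exactly Elias's claim that reasonable guessing "offers the maximum chance of learning who did this with the minimal amount of failure."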

The murder victim is Weaver's father --- the son of Miguel Lienzo, who migrated from Amsterdam to London with his brother. Weaver's father is a stock-jobber, and Weaver suspects, as information starts to become available, that his father uncovered a conspiracy to manipulate the value of South Sea Company stock.

Later in the novel, as Weaver laments how difficult it has become to bring the investigation of his father's murder to a conclusion, he says, "Your philosophy [referring to Elias] has brought me this far, but I cannot see how it takes me much farther." Elias responds, "If philosophy no longer yields results, perhaps it is not because you have reached your limit to understanding philosophy. I think it is far more probable that philosophy has done what philosophy can do, and you would be wise to trust your instincts as fighter and a thief-taker. . . . Trust your instincts." What would Sherlock Holmes say?

In the end the crime is solved, not because of instinct, but because a crucial piece of information suddenly falls into Weaver's lap --- the revelation of a lie that exposes the identity of the murderer. The revelation is not accidental: Jonathan Wild pushed the information in front of Weaver to help him out. The conspiracy behind the murder of Weaver's father turns out to be vastly different from the one he initially conceived. Probabilities and beliefs did not solve this murder. Factual information did, much in the way that scientific experimentation during the Enlightenment era was undermining long-held beliefs that were products of mental reasoning, faith, and bias.

Decisions are made based on probabilities because we have incomplete information --- uncertainty. Hume essentially made this point (see February 27, 2011 post), and he could not have been the first. Some people have access to more information than others and can act on that superior information to their advantage. Wild is such a person, and his version of thief-taking, in which the taker is also the thief, is just an early version of what we now call insider trading. A Conspiracy of Paper was written and published as the 20th century came to a close and the technology stock bubble burst. Stock market manipulation and insider trading have not disappeared either. Liss is an excellent storyteller, and he is very clever at finding in the annals of economic history, as he also did in The Coffee Trader, episodes of the human past that reverberate in the modern mind.

Sunday, November 6, 2011

Rita Carter, Mapping the Mind (Rev. 2010)

Five postings in 2011 on books and subjects related to the mind and brain, and I was consciously aware of that fact and of a desire to move on to something new. But as I wrapped up V.S. Ramachandran's The Tell-Tale Brain (see previous post), his remark that as neuroscientists map the brain they are "groping their way toward the periodic table of elements" reminded me that a book on The Bookshelf that I had purchased last year, Rita Carter's Mapping the Mind, was waiting to be read. I found this book at the bookshop at the conclusion of the American Museum of Natural History's exhibit on The Brain. What impressed me toward a purchase was the exquisite drawings of the brain, many of which included arrows illustrating the interconnectedness of specific brain regions to explain a specific neuronal process. For those who are not practicing neurologists, a picture can nicely supplement a thousand words.

By the end of Mapping the Mind, I wondered if this should have been the first book I ever read on the brain. Would I have better appreciated all the other material I have read on this subject if I had already read this book? I can't answer that, but studying Rita Carter's text after reading those other books, many of which are discussed or mentioned in prior posts, was facilitated by my prior exposure to the subject. Either way, Mapping the Mind is a good overview of and introduction to the brain, and a good review as well.

Ramachandran's analogy --- that neuroscience's understanding of the brain is moving toward establishing something akin to chemistry's periodic table of elements --- is best left as a metaphor rather than a suggestion of equivalence (as I think the statement was intended). It is fair to say that, like the periodic table of elements, the brain is organized, but it is not sequential in the sense that the elements can be ordered by atomic number or related in their properties as members of one of 18 different groups. While thinking about this, it crossed my mind that evolutionary age would be one way of sequentially organizing the parts of the brain. Antonio Damasio did something like this in Self Comes to Mind (see April 8, 2011 post), describing the sequential evolution of the brain stem, the limbic system, and ultimately the cerebral cortex. But the brain is indeed very complex, as Carter notes in her closing paragraph, when she says that "today's mind voyagers are discovering a biological system of awe-inspiring complexity." One could also try to organize the parts of the brain sequentially by starting with a particular sensory input and following the connections to other parts of the brain through to the conscious awareness and action (or unconscious action, as the case may be) that ensues. In the end, however, that effort would not be particularly useful, given the multi-layer network of sensory inputs that are processed simultaneously, including variable emotional reactions to a particular sensory input that could stimulate different behavioral outcomes.

Both Carter and Ramachandran caused me to question a statement I made in a prior post (June 12, 2011 post) while discussing Michael Shermer's The Believing Brain. I wrote:

"I strongly suspect that if we dissected human brains and a network of connected neurons from a representative sample of humans, we would find a very high level of near identity among brains. There will be some differences due to DNA, and there will be some pathological differences as well, perhaps caused during embryonic development. But I believe that by and large we will find that human brains, neuron by neuron, are organized and folded and layered in substantially identical ways." There is substantial truth in my remark --- I did not say identical, but I did say "near identity" and "substantially identical." These words came at the risk of maybe overstating the case. Carter writes, "Human brains are constructed along fairly standard lines and so we all tend to see the world in a fairly standard way." And as I noted, there are differences due to DNA, pathological injury, and embryonic development. But I overlooked perhaps the largest exception --- experience (nurture) and its impact on memory -- and what is referred to as synaptic plasticity. And it is long-term memory -- enhanced by repeated experiences in some that others do not share --- that gives rise to our autobiographical self and what makes each one of us unique. So while we may "tend to see the world in a fairly standard way," Carter notes:

"Every brain constructs the world in a slightly different way from any other because every brain is different. The sight of an external object will vary from person to person because no two people have precisely the same number of motion cells, magenta-sensitive cells, or straight line cells. . . . An individual's view is formed both by their genes and by how their brain has been moulded by experience. Musicians, for instance, have brains which are physically different from others and which work differently when they play or hear music. . . . Extraordinary individual ways of seeing things may also arise from strange 'quirks' of brain development. Albert Einstein, for instance, had a very oddly constructed brain which might account for his astonishing insights into the nature of space and time."

Carter's treatment of language pretty much follows that of Ramachandran. And her treatment of memory restates much of what Antonio Damasio (see April 8, 2011 post) and Daniel Schacter (see September 20, 2011 post) discuss, but I still learned something new. For example, "Episodes that are destined for long-term memory are not lodged there right away. The process of laying them down permanently takes up to two years. Until then they are still fragile and may quite easily be wiped out. It is this replay from hippocampus to cortex and back again -- a process known as consolidation, that slowly turns fleeting impressions into long-term memories. . . Much of the hippocampal replay is thought to happen during sleep. Recordings from hippocampal cells show them engaging in a 'dialogue' with cortical cells, during which they signal one another, back and forth, in a call-and-reply formation. Some of this is known to take place during the 'quiet' phase of sleep, when dreaming, if it occurs at all, and is vague and instantly forgotten. Until memories are fully encoded in the cortex they are still fragile and may quite easily be wiped out. And even when they are established, they are not fixed. A memory is not, in fact, a recollection of an experience but the recollection of the last time you recalled the experience. Hence our memories are constantly changing and redeveloping. The process by which a memory changes is more or less the same as the consolidation process that lays it down for the first time. As we will see, each time we recall something, it is changed a little because it becomes mixed up with things that are happening in the present. Reconsolidation is a process by which this slightly altered memory effectively replaces the previous one, writing over it, so to speak, rather like re-recording over a rewritable DVD." 
I mentioned this phenomenon in the September 20, 2011 post, which discussed Daniel Schacter's treatment of the consistency bias (whereby the brain infers past beliefs from our current state) and referenced Joseph LeDoux's discussion of reconsolidation. Carter explains it better.

I also learned that not all memories are stored in cortical areas. While long-term memories are initially stored in the hippocampus, as described above, over the course of roughly two years they are transferred to the cortex, and the hippocampus is no longer required for their retrieval. These memories are distributed in the same parts of the brain that encoded the original experience: sounds are found in the auditory cortex, taste and skin sensory memories in the somatosensory cortex, and sights in the visual cortex. But procedural --- "how to" --- memories are stored outside the cortex, in the cerebellum and putamen, and fear memories are stored in the amygdala.

Carter also addresses, albeit briefly, the subject of imagination, which I have touched on in several previous postings (see, for instance, July 30, 2011 post and May 22, 2011 post), and describes its connection to memory. "Our ability to conjure up scenarios which have not actually happened is prodigious. Imaginative capacity runs along a spectrum from the mundane skills required to envisage what your supper might taste like if you combined the onion, mushrooms, and leftover chicken in the fridge with some curry sauce, through to the awe-inspiring visions of artists, writers, and excitable children. Even the humblest of these skills outranks the abilities --- as far as we can tell --- of every other species. . . . At first sight memory and imagination seem quite distinct: the first is concerned, after all, with what happened already whereas the second is all about what has not. But recent studies show that imagination is wholly dependent on memory, because memories are its building blocks. When we imagine something happening we root around in our memory and come up with experiences which seem likely to recur, then combine them, chop them, shake them and blend them until they come out as something entirely different." In my view, this cannot be unrelated to the process of reconsolidation mentioned above, or to some of the biases other writers have described (see September 20, 2011 post and June 12, 2011 post) whereby memories are altered.

Carter inserts a text box entitled "Remembering the Future" by Eleanor Maguire, which perceptively quotes Lewis Carroll's Through the Looking-Glass: "It's a poor memory that only works backwards." Maguire cites an MRI study finding that recalling past experiences and imagining possible future ones activate a common network of brain regions, including the hippocampus. She notes a new theory of 'scene reconstruction,' which allows for the internal rehearsal of events or scenes and underpins the process of creating a simulated event. Maguire writes: "[I]n humans, the use of this scene reconstruction process goes far beyond simply predicting the future, to general evaluation of fitness for purpose and ultimately creativity. For example, a scriptwriter or novelist who is writing a passage in a film or book may play out the whole scene using their construction system, not with the idea of predicting the future, but instead for the purpose of evaluating its aesthetic suitability." This is the sort of discussion I was hoping to find in The Tell-Tale Brain (see previous post). What is missing from her discussion is the identification of the parts of the brain involved in imagination, although the text seems to identify the hippocampus as one candidate for assembling disparate memories. Also missing is the evolutionary basis for this ability. Obviously, the ability to plan for the future has survival value, and our planning capacity is lodged in the frontal cortex, where working memory occurs. The building blocks for understanding creative imagination are before us, and a collateral capacity --- the capacity for deception, including self-deception, by imaginatively rearranging memories and treating them as factual when they are not --- needs to be understood as well. Part of this is reflected in the previous discussions of mental biases.
Carter notes that the subject of belief and non-belief is not unrelated to how the brain treats statements it believes to be true and those it believes to be false. Research, she says, shows that truth-telling appears to be the default position for the human brain, and that telling lies involves extra cognitive effort. I am not sure this is entirely true, as bias mechanisms appear to be short-cuts for resolving conflicts in our memories.