Monday, August 31, 2009

Christine Kenneally, The First Word (2007)

In my worldview, there are three very big subjects for human inquiry: 1) the realm of the very large --- the universe (and whether there is more than one, which would make the word universe a misnomer), its origin and history; 2) the realm of the very small --- the smallest molecular and subatomic units and their behavior; and 3) the human mind --- how it works, consciousness, intentionality. Some might object and cite Schrödinger's interrogatory --- What is Life? --- as a big subject for human inquiry, and I do not disagree, but I submit that if humanity can get its consensus arms around my three big subjects, the subject of "life" will fall into place in large measure. The first two subjects are critical and fundamental to understanding how physical objects and living beings came to be, evolved, and perished; the third subject is really about "us" --- a characteristic that is specially defining of human beings.

The first two books I discussed below, surveying the "new" science of information theory, actually address all three of the very big subjects in one way or another. Christine Kenneally's survey of current research on the evolution of human language, The First Word, fits in the third big box --- the human mind; yet as one reads her survey of language and speech research, one can't help but think about the communications --- the computational activity --- going on everywhere in the realm of the very large and the realm of the very small, as documented by Lloyd and Seife (see August 23, 2009 and August 17, 2009 posts). The constituents of the entire physical world have been computing --- communicating --- for billions of years, so it should come as no surprise that our species communicates. The form of those communications varies from constituent to constituent --- by collision among atomic particles in the realm of the very small, as Lloyd documents --- but how did the human species develop a very sophisticated form of communicating that includes not just speech and written language, but also communicative acts such as gestures, pointing, and other animated conduct? That is the question that Christine Kenneally investigates in The First Word.

I want to suggest to her that she look to Lloyd's treatment of information theory and complexity theory --- whether we call it communicating or computational activity, there must be emergent properties that cause communications to take on additional complexity in the course of evolutionary history, including gestures, sounds, word formation, symbolic representations, and finally meaning. Kenneally does not address this particular angle, but some of the research she reviews lays the foundation for this type of discussion about human language in light of the types of communications studied in non-human species. She does cite Luc Steels for the proposition that "human language ability is an emergent adaptive system that is created by a basic cognitive mechanism rather than by a genetically endowed language module."

Is human language an adaptation that required many evolutionary events? Or is it something hardwired in our species? Noam Chomsky notwithstanding, the sum of the research reviewed by Kenneally supports only the former view. Not only has the human species evolved biologically to enable our unique type of communicating, but the form of human communications, language, and the meaning of words have evolved as well. Language is a social institution, and social institutions and culture evolve, albeit at a faster pace than biological evolution. We seem to be approaching Richard Dawkins' treatment of the evolution of memes here. And while The First Word does not offer a definitive answer to "the search for the origins of human language," Kenneally does endorse a very Dawkins-like worldview:

"Even if researchers can't pinpoint every evolutionary event that led to the language we have today, and even though we don't know exactly what all the bends in the historical road looked like, the principles for further illuminating the path of language evolution are now self-evident. Fundamentally, the appearance of design in biology and in language can be taken as a sign of evolution, not a designer. Additionally, where complex design does exist, it makes sense not to treat the whole as a monolith that simply developed from nothing to something in one or two quick steps. Finally, the most likely scenario is that both evolutionary novelty and derivation played a significant role in the evolution of a phenomenon as complex as language."

Anyone studying the human mind must investigate speech acts as part of their inquiry, and Kenneally's survey of the recent research on the evolution of language is a good place to start. But language is only one piece of the mind's puzzle. How is it that human minds can read other minds without speaking? Speaking, hearing, reading, and intentional gestures are not the only forms of communicating.

Sunday, August 23, 2009

Charles Seife, Decoding The Universe (2006)

This book is subtitled How the New Science of Information Is Explaining Everything in the Cosmos, from Our Brains to Black Holes. The "new science" is over sixty years old by one measure --- 1948 was the year that Claude Shannon of Bell Labs, whom Charles Seife labels the "hero of information theory," recognized "that information could be measured and quantified and that it was intimately linked to thermodynamics." But in 1948, Shannon did not set out to explain black holes or how the mind processes information; he was trying to determine how much information could be carried across a telephone line, or any other communication channel for that matter. Born from this inquiry was the recognition that information could be reduced to a yes-or-no, true-or-false, on-or-off outcome, and that the smallest piece of information is a binary digit --- the 0 or 1 familiar from computer code. The compressed nomenclature for a binary digit is a "bit."

Most communication is information rich, meaning that multiple pieces of information are represented by a stream or string of bits --- 01000111001. A 70,000-word book containing 350,000 letters, writes Seife, with each letter encoded in five bits, contains about 1.75 million bits of information --- less than 0.25% of the capacity of a compact disk, or roughly the space a CD devotes to about 10 seconds of a recorded song. Recorded sound, in other words, contains vastly more information than the written word.
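Seife's back-of-the-envelope arithmetic can be checked in a few lines. This is my own sketch, not code from the book, and the 700 MB data-CD capacity is my assumption:

```python
# Letters packed at 5 bits each: 2**5 = 32 symbols, enough for A-Z.
letters = 350_000                 # letters in a ~70,000-word book
bits_per_letter = 5
book_bits = letters * bits_per_letter   # 1,750,000 bits

# Assumed capacity of a 700 MB data CD, expressed in bits.
cd_bits = 700 * 1024 * 1024 * 8

# The book occupies a tiny sliver of the disk, well under 0.25%.
print(f"book: {book_bits:,} bits")
print(f"fraction of CD: {100 * book_bits / cd_bits:.3f}%")
```

The exact percentage depends on which CD capacity one assumes, but under any reasonable assumption the whole book stays comfortably below Seife's 0.25% figure.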

Seife covers much of the same ground as Seth Lloyd's Programming the Universe (see August 17, 2009 post). Where Lloyd is one of the actors in this "new science," Seife is one of those fine science writers for the general public who can turn scientific investigation into a great story. In Decoding the Universe, Lloyd arrives on the scene only at the very end of Seife's story --- the part where information theory does begin to explain black holes and the future of the cosmos. Sharing Lloyd's description of the universe as a massive information processor, Seife explains that while the universe may be infinite, information processing cannot go on forever: at some point information processing will stop, the gazillions of bits of information that life has stored and preserved will be dissipated (not destroyed) into uselessness, and life in the visible universe (not just human life) will become extinct. Civilization is doomed (a long time from now); the laws of information have sealed our fate.

The storyline here is the renewal of quantum mechanics --- dressed for success as quantum information. Physicists have been pondering how to reconcile or unify --- mathematically and theoretically --- the very large and the very small. Quantum mechanics explains atomic behavior well in the realm of the very small; it does not explain gravity well in the realm of the very large, the domain of the theory of relativity. Some physicists have explored string theory for a solution to the problem of unifying the very small and the very large --- a quantum theory of gravity. But the mathematics underlying string theory has not yet led to a quantum theory of gravity, and the theory has yet to be validated by experiment. Information theory and thermodynamics, which may explain the behavior of gravity and information in black holes, offer a promising alternative direction toward a quantum theory of gravity.

What does any of this mean for you and me? Is there any practical significance to information theory? The answer is clearly yes, and Seife points to the advancements in communications technologies that followed Shannon's 1948 paper. The Department of Defense is closely following developments in quantum computers that use quantum information for cryptographic applications. What mathematicians and physicists consider beautiful mathematical models can and do form the foundation for experimental testing (where we are capable of performing a test), and ultimately those experiments can become the basis for useful information --- not merely practical applications that may benefit humans and other living beings in their everyday lives, but also answers to large questions about our place in the cosmos. So do not dismiss elegant mathematics understood only by a few merely because it is the representation of a theory. As for those larger questions --- is the universe infinite, and is the arrow of time infinite? In the long run, we are all dead.

Monday, August 17, 2009

Seth Lloyd, Programming the Universe (2006)

We contemplate the physical universe in terms of atomic particles --- their energy, mass, charge, spin --- and, macroscopically, products that emerge from combinations of individual particles, such as compounds, liquids, gases, solids, water, organic material, crystals, living beings, and the atmosphere. Take a harder look. At the core of the physical universe is information. Information is physical. It is both visible and invisible. Invisible information is not spirit, ethereal, or non-material; we can't see it because it is simply too small for our human sensory tools to see. Information is exchanged between particles at the atomic level by their collision, not unlike the exchange of information caused by photons striking the lens of an eye, subsequently triggering neural and chemical activity that creates a memory in the mind.

Until I read Programming the Universe, I was uncomfortable with the idea that the mind was a computer, as espoused by Steven Pinker and others. Perhaps I was just uncomfortable with the analogy. Not because of an attachment to Cartesian duality; more likely because of an emotional need to see the mind as something more than an input/output device. Whether we regard the mind as a computer or not, the fact is that the mind processes information, and Seth Lloyd demonstrates that processing -- computing -- information is what the universe -- not just life as we humans know it -- is all about. Our mind is just an evolved manifestation of the computational activity that is happening everywhere else around us.

University of California philosopher John Searle denies the physical nature of information, "except for information that is already in the mind of some conscious agent." (The Mystery of Consciousness (1997)) "Information," he says, "does not name a real physical feature of the real world in the way that neuron firings, and for that matter consciousness, are real physical features of the world." While our mind has a way of creating the illusion that there is a non-physical attribute to our thoughts and that categories of data or information are somehow separate from our consciousness, I find it difficult to understand how Professor Searle succumbs to this illusion even "in part." Lloyd challenges this view.

The physicality of information is not what Lloyd's book is really about, although it is a significant insight. The book is about a scientific revolution in our understanding of information --- how the classical understanding of information that was essential to the development of computer science 60 years ago is gradually being supplemented by our understanding of quantum information, thanks in part to quantum computers that can engage in computational activity beyond the capabilities of classical computers. There is a direct parallel here with classical physics and quantum mechanics: in each case we must understand the relationship between a physical reality we can "see" and a physical reality we cannot see but can know. If this sounds like science fiction or even religion, it is not. If you are a dualist or a closet dualist, this book will make you think twice (if you are prepared to).
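The distinction between a classical bit and its quantum counterpart can be sketched in a few lines. This is my own minimal illustration, not anything from Lloyd's book: a qubit is described by two complex amplitudes, and measurement collapses it to 0 or 1 with probabilities given by the squared amplitudes.

```python
import math

def measure_probs(a, b):
    """Probabilities of reading 0 or 1 from the qubit state a|0> + b|1>.

    A classical bit is definitely 0 or definitely 1; a qubit carries
    two amplitudes whose squared magnitudes must sum to 1.
    """
    p0, p1 = abs(a) ** 2, abs(b) ** 2
    assert math.isclose(p0 + p1, 1.0), "amplitudes must be normalized"
    return p0, p1

# An equal superposition: each outcome occurs with probability 1/2
# (up to floating-point rounding).
amp = 1 / math.sqrt(2)
print(measure_probs(amp, amp))
```

The point of the sketch is only that quantum information is richer than a 0-or-1 answer: the state holds continuous amplitudes, yet any single measurement still yields one classical bit.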

The universe computes by taking measurements. When something is measured, the universe "sees," "hears" -- type in any sensory reaction here, not necessarily limited to human senses and consciousness -- and the universe takes account of information. So atoms colliding in another galaxy and photons hitting the lens of an eye are no different.

Werner Heisenberg taught that measuring one property of an object destroys other, complementary information, which becomes unmeasurable --- the uncertainty principle. Do we really mean "destroy"? No; after all, Schrödinger's cat did not necessarily die in the quantum universe. Welcome to Entropy 101, a significant teaching of Lloyd's work and of others who have mined the field of information theory and quantum information. The first and second laws of thermodynamics are at work here. Information, like energy, cannot be destroyed (or increased) --- it is conserved (the first law) --- and entropy increases (the second law).

Whoever said that entropy is a measure of disorder did the world a disservice. The statement is literally true, but under the second law disorder in the colloquial sense does not increase --- just the opposite. In thermodynamics, entropy classically refers to the dispersal of energy over time and the balance between energy and work --- temperature, pressure, and density tending toward stasis. More generically, what is really meant by entropy is that, over time, the probability of equilibrium, stasis, or more stable outcomes increases. As a measure of information, when entropy decreases, our measurements record less likely physical outcomes; as entropy increases, our measurements record more likely physical outcomes.
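The link between entropy and likelihood traces back to Shannon's 1948 measure, which underlies both Lloyd's and Seife's accounts. A minimal sketch (my illustration, not code from either book): entropy in bits is highest when outcomes are equally likely, and drops as one outcome becomes more probable.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))

# A heavily biased coin is more predictable, so its entropy is
# lower --- about 0.47 bits per toss.
print(shannon_entropy([0.9, 0.1]))
```

This is the sense in which entropy measures likelihood rather than "mess": the more predictable the outcome, the fewer bits of information a measurement of it delivers.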

A complementary treatment of this subject is found in Charles Seife's book Decoding the Universe (2006), written the same year that Lloyd's was published. Seife is one of the best popular science writers, and here I found one of the most accessible discussions of black holes and what happens beyond the event horizon, where information appears to disappear (or does it?). Lloyd's additional contribution comes from connecting information theory to complexity theory --- emergence and self-organizing systems. For those who have followed the work at the Santa Fe Institute and scholars such as Stuart Kauffman on the subject of dynamic living systems, this chapter will be rewarding.