A theme running through several of this year's posts is the role of human imagination in aesthetics (creating works of fiction, rewriting history, creating sacred texts), in science, and in the neuroscience that explains the biology of imagination. We tend to think that scientific research and analysis is all about uncovering what is "real": identifying objectively verifiable facts that are observable to our senses and are widely accepted as true. That is the conventional wisdom, but human imagination plays a key role in uncovering what is real and is a core feature of scientific inquiry. Much of what scientific inquiry is directed at, particularly in physics, is the stuff we cannot see --- it is either too small to be detected by our senses, even aided by the latest technological tools for observing the very small, or it is too far away (in time or space), or too large, to see or comprehend. Inquiry into the very small or the very distant begins with imagination, sometimes called theory, which is ideally both internally consistent and externally consistent with the things we can observe. Consensus over theory does not always develop quickly, and it sometimes involves substantial disputes. In scientific inquiry, theory is typically followed by experiments that seek to confirm or disprove what the imagination has created. Sometimes those experiments take only a few weeks or months; sometimes they follow the development of the theory by decades. Manjit Kumar provides a peek into this process in his book Quantum.
At the beginning of the 20th century, physicists studying light, heat, color and electromagnetism realized that the physical world we think we see does not conveniently coincide with the physical world we do not see. We see a beam of light or feel the radiation of heat, and we sense a continuous emission and absorption of light or heat. But is it continuous? The prevailing wisdom was that radiated energy was emitted as a continuous wave, and changes in heat, energy and color were believed to be explained by changes in the amplitude or frequency of the wave. Max Planck discovered that this was not possible. Energy increases or decreases discontinuously, and, contrary to the prevailing wisdom, Planck was forced to imagine that energy is released or absorbed in packets --- what he called "quanta." The reason we cannot see these packets is that the increase or decrease in energy occurs in steps proportional to an extraordinarily small number, 6.626 × 10⁻³⁴ joule-seconds, now known as Planck's constant. These steps are far too small to be observed directly. Experiments subsequently confirmed Planck's theory. A few years later, Albert Einstein extended Planck's conclusions to light in his 1905 paper on the photoelectric effect: while light depends on wavelength and frequency, all electromagnetic radiation actually travels in a stream of tiny "light-quanta," later called photons.
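Planck's relation E = h × f can be made concrete with a little arithmetic. The sketch below (a minimal Python illustration using the standard value of Planck's constant; the function name is mine) computes the energy of a single quantum at a given frequency, showing why the individual steps are imperceptibly small.

```python
# Planck relation: the energy of one quantum is E = h * f.
h = 6.626e-34  # Planck's constant, joule-seconds

def photon_energy(frequency_hz):
    """Energy (in joules) of a single quantum of radiation at the given frequency."""
    return h * frequency_hz

# Green light at roughly 5.6e14 Hz: each photon carries about 3.7e-19 J,
# far below anything our senses could resolve as a discrete step.
print(f"{photon_energy(5.6e14):.3e} J")
```

A sunlit scene delivers on the order of 10¹⁷ such quanta per second to the eye, which is why light appears perfectly continuous to us.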
Although the notion that at its most fundamental level reality was based on tiny atoms had been around since classical Greece, atomism was not seriously developed in science until the 19th century. Einstein's and Planck's work accelerated modern atomic theory. During the first three decades of the 20th century, the basic structure of the atom and the behavior of electrons were close to being fully worked out. Ernest Rutherford and Niels Bohr were largely responsible for this work. Bohr introduced the idea that when an electron orbiting the atomic nucleus dropped from a higher-energy orbit to a lower-energy orbit, a photon (a quantum of energy) was released and spectral light was produced. This has been referred to as a quantum leap. Beginning with Planck and Einstein and developed further by Rutherford, Bohr, Max Born, Werner Heisenberg, Wolfgang Pauli, Louis de Broglie and others, this conception of physical reality became known as quantum mechanics, as distinct from the classical physics initially developed by Isaac Newton.
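Bohr's orbit-drop picture can be worked out numerically for hydrogen, whose levels follow E_n = -13.6 eV / n². The sketch below (a Python illustration with textbook constants; the function names are mine) computes the wavelength of the photon released when an electron falls between two levels.

```python
# Bohr-model energy levels for hydrogen: E_n = -13.6 eV / n^2.
# A drop from a higher to a lower level releases a photon carrying the difference.
RYDBERG_EV = 13.6057   # hydrogen ground-state binding energy, eV
H_EV = 4.1357e-15      # Planck's constant in eV*s
C = 2.9979e8           # speed of light, m/s

def level_energy(n):
    """Energy of the n-th Bohr orbit in hydrogen, in eV (negative = bound)."""
    return -RYDBERG_EV / n**2

def transition_wavelength_nm(n_high, n_low):
    """Wavelength of the photon emitted when an electron drops between levels."""
    photon_ev = level_energy(n_high) - level_energy(n_low)
    return H_EV * C / photon_ev * 1e9  # meters -> nanometers

# The n=2 -> n=1 drop gives the Lyman-alpha spectral line near 121.5 nm.
print(transition_wavelength_nm(2, 1))
```

The discrete wavelengths this produces are exactly the spectral lines observed in a hydrogen discharge tube, which is what made Bohr's imagined orbits persuasive.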
The subtitle of Kumar's book is "Einstein, Bohr, and the great debate about the nature of reality." The "great debate," ironically, did not dispute the theory of quantum mechanics. Yes, there were debates along the way as the unexplained attributes of quantum theory were worked out, and one of those debates was whether quantum theory left any room for classical Newtonian physics. Another was whether physics was constrained to what could be known and observed. Emerging from the latter debate is Werner Heisenberg's now-famous uncertainty principle, which establishes that it is not possible to simultaneously measure a pair of complementary variables --- in the case of an electron, its momentum and its location. It is only possible to measure one of them accurately at a given point in time. And there have been further debates about the meaning or significance of the uncertainty relations: do they mean that we can never predict causal relations? Or is the principle confined to a problem of measurement? To Bohr, quantum mechanics took on a probabilistic character where "only the probability of a given outcome among a range of possibilities can be predicted."
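The uncertainty principle has a quantitative form: the spread in position times the spread in momentum can never be less than ħ/2. A minimal sketch (in Python, using standard constants; the function name is mine) shows what this costs when we try to pin down an electron at atomic scale.

```python
# Heisenberg's uncertainty relation: delta_x * delta_p >= hbar / 2.
HBAR = 1.0546e-34       # reduced Planck constant, J*s
M_ELECTRON = 9.109e-31  # electron mass, kg

def min_momentum_spread(delta_x_m):
    """Smallest possible momentum uncertainty given a position uncertainty."""
    return HBAR / (2 * delta_x_m)

# Localizing an electron to 0.1 nm (roughly the size of an atom)
# forces a minimum momentum spread, hence a large spread in velocity.
delta_p = min_momentum_spread(1e-10)
delta_v = delta_p / M_ELECTRON
print(f"dp >= {delta_p:.2e} kg*m/s, dv >= {delta_v:.2e} m/s")
```

The resulting velocity spread is on the order of hundreds of kilometers per second: knowing precisely where the electron is leaves us almost entirely ignorant of where it is going, which is the trade-off Heisenberg identified.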
The "great debate" was over the question of whether quantum theory was "complete." Bohr claimed that it was. Einstein had his doubts. Einstein believed that quantum mechanics was observer-dependent, and that the act of measuring physical phenomena interfered with our understanding of reality. In Einstein's view, reality was observer-independent --- yes, trees fall in the forest when there is no one around to watch them fall. From Einstein's perspective, a "complete" theory of physics should be able to describe reality without the uncertainty relations created by the observer of reality: physics ought to be able to explain causal relations without having to rely on probabilistic assessments over a range of possibilities, although Einstein later dropped his criticism of quantum mechanics' focus on probabilities. From the 1920s through the early 1950s, Einstein and Bohr politely and professionally debated the completeness of quantum mechanics and whether it truly described reality. The debate was never fully settled during their lifetimes, nor has it been settled since.
The difference between the two men was even deeper, however. As Kumar explains, for Bohr quantum mechanics was not a description of reality. "There is no quantum world," Bohr declared. "There is only an abstract mechanical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature." For Bohr, physics was an observer-dependent exercise aimed at explaining what we can about nature in our own terms. Einstein disagreed. He believed there was an observer-independent reality, and that physics "has the sole purpose of determining what is."
Einstein, notes Kumar, never put forward his own theory about the nature of reality. He challenged Bohr, not with facts or data, but with a series of "thought experiments," each purporting to undercut some aspect of quantum mechanics. Bohr would parry with a rebuttal that undermined Einstein's thought experiment. These were mind games among brilliant men. Kumar describes a conversation between Einstein and Heisenberg, in which the former probed the latter about the philosophical foundations of his work:
"You assume the existence of electrons inside the atom," said Einstein, "and you are probably right to do so. But you refuse to consider their orbits even though we can observe electron tracks in a cloud chamber. I should very much like to hear more about your reasons for making such strange assumptions." Heisenberg replied, "We cannot observe electron orbits inside the atom, but the radiation which the atom emits during discharges enables us to deduce the frequencies and corresponding amplitudes of its electrons. Since a good theory must be based on directly observable magnitudes, I thought it more fitting to restrict myself to these, treating them, as it were, as representatives of the electron orbits." Einstein reacted: "You don't seriously believe that none but observable magnitudes must go into a physical theory? It is quite wrong to try founding a theory on observable magnitudes alone. In reality the very opposite happens. It is the theory which decides what we can observe."
This is a fascinating conversation. Kumar observes that 100 years earlier, in 1830, Auguste Comte had argued that, "while every theory has to be based on observation, the mind also needs a theory in order to make observations." For Einstein, "A phenomenon under observation produces certain events in our measuring apparatus, which eventually produce sense impressions and help fix the effects in our consciousness." These effects, said Einstein, depend upon theories. Heisenberg appears to be taking the position of David Hume in A Treatise of Human Nature when he says "a good theory must be based on directly observable magnitudes." Hume wrote (see February 27, 2011 post), "Should it be demanded why men form general rules, and allow them to influence their judgment, even contrary to present observation and experience, I should reply, that in my opinion it proceeds from those very principles, on which all judgments concerning causes and effects depend. Our judgments concerning cause and effect are derived from habit and experience; and when we have been accustomed to see one object united to another, our imagination passes from the first to the second, by a natural transition, which precedes reflection [that is, before we even seriously think about what we just experienced] and which cannot be prevented by it." Einstein, on the other hand, argues that it is a theory that decides what we can observe.
But what if the "theory" --- the means by which we explain our observations --- embraces probabilities? In other words, humans are capable of measuring some observations directly and "knowing" those measurements with relative certainty, but we cannot measure every observation, and we are compelled to admit that we can only "probably" know what we have observed in the remaining cases. Does this undermine what we can say is "real," or what we can claim is an objective reality? Even Hume appears to concede this limitation on our knowledge, and that probabilities are central to "beliefs." Would this not have been satisfactory to Einstein? Probably, says Kumar, who concludes that "Einstein accepted that quantum mechanics was the best theory available --- the only one which can be built out of the fundamental concepts of force and material points." But acceptance of quantum mechanics, for Einstein, was without prejudice to continuing the search for a theory that explains everything, a theory that demystified the uncertainties quantum mechanics was willing to leave unexplained. For Bohr, there was no reality beyond what could be observed and measured, and no physical reality beyond what observation and measurement could account for. In this debate, Einstein clearly prevails. Had Bohr prevailed, we would never have spent millions of dollars looking for quarks, bosons, gluons, leptons, and smaller subatomic units.
Imagination leads us in different directions. It can lead us closer to understanding and knowing a confirmable, accepted reality that we do not yet fully see, hear, smell, or feel because it is too small for our technologically assisted senses to observe. It can also lead us to the realm of fantasy, fiction, and reinvented history. The "great debate" is a testament to imagination. Quantum mechanics powerfully explains physical phenomena in a way that has been confirmed by experiment. The "thought experiment" in science is nearly an exercise in pure imagination, not for the purpose of creating fantasy or fiction, but for the purpose of explaining what "is." Einstein understood that if the thought experiment is found wanting in the light of new experimental evidence, then the philosophical position it supports collapses with it.