Peter Pesic. American Scientist. Volume 90, Issue 3. May/Jun 2002.
Quantum mechanics is the heart of modern physics. It is the guiding thread in the maze of atoms and nuclei, the key to understanding the stability of matter. Despite this, quantum theory remains enigmatic, for its basic assumptions seem bewildering: Everything is both wave and particle; uncertainty and probability rule. Even after a hundred years, both experts and the general public remain baffled. Albert Einstein called quantum theory “spooky” and tried to get around it. Paul Dirac thought that it “cannot even be explained adequately in words at all,” which is why physicists rely so much on the abstract mathematical structure of the theory. Yet as eminent a physicist as Freeman Dyson has suggested that, even after one has struggled to master the formal language of quantum mechanics, the best that one can do is to say “I understand now that there isn’t anything to be understood.” Richard Feynman, a master of finding simple ways to understand complex ideas, also threw up his hands. After summing up the basic rules of quantum theory, he confessed that “one might still like to ask: ‘How does it work? What is the machinery behind the law?’ No one has found any machinery behind the law. No one can ‘explain’ any more than we have just ‘explained.’ No one will give you any deeper representation of the situation. We have no ideas about a more basic mechanism from which these results can be deduced.”
There is no getting around the difficulty of quantum theory, which, despite its bizarre character, has passed every experimental test brilliantly. Yet there may be a way to view this theory that draws its strangeness into clearer perspective. For 20 years, I have been studying the role of identity and individuality in the history and foundations of quantum mechanics. These studies lead me to the view that at its heart lies the radical conception that all elementary particles lack individuality. Their identities merge and, in so doing, give rise to the weird world of quantum phenomena.
To be sure, quantum identity has long been a well-accepted aspect of the theory, crucial in understanding the chemical bond and the structure of elements. Still, it has been considered the consequence of other, more abstract assumptions. I want to tell the story the other way around, putting quantum identity at center stage. In more specialized papers, I have given mathematical arguments to justify this position; here, I will present some broader considerations that support my assertion that the weirdness of quantum mechanics is best understood as a consequence of quantum identity.
The Royal Cubit
In seeking a new understanding of quantum theory, one does well to begin with the concept of individuality because it is so deep an issue both for philosophy and physics, reaching back to the earliest Greek speculations about atoms. For Aristotle, individuality was basic, whether for persons, objects or atoms. Yet he thought that individuality was less important than shared membership in a species or genus.
Later philosophers fundamentally reconsidered the nature of individuality. In the 17th century, Gottfried Wilhelm Leibniz argued that no two leaves could ever be exactly the same, even if they came from the same tree. In his view, their uniqueness reflects the distinct history and vantage point of each leaf in the whole universe, rather than any intrinsic individuality in the leaf by itself. Leibniz thought that a rational God would not allow identical individuals, for how could He put one of them here and the other one there, if He could not tell them apart?
Leibniz was convinced that individual uniqueness contradicted the existence of anonymous atoms. At the same time, Isaac Newton advocated the reality of atoms but treated each as distinguishable. For Newton, the distinction between the individualities of two particles is so marked that it is impossible to confuse them. Yet the application of Newtonian mechanics to matter disclosed a radically different aspect of reality: the forces between objects. Newton himself speculated that gravity acts over vast distances not directly but through “the mediation of something else, which is not material.” Nearly two centuries later, in a kindred spirit, Michael Faraday described electricity and magnetism in terms of what he called lines of force or fields. For Faraday, matter was no longer central; instead, lines of force were the true reality.
Faraday extended this concept to the invisible fields that constitute visible light, understood as waves in the lines of force. He was deeply struck that these vibrations were not fixed in place: Stationary fields surround static charges or magnets, but when these sources accelerate, their fields leap free, manifest as waves of light traveling through boundless space. Decades earlier, Thomas Young’s famous diffraction experiment had shown that light was indeed composed of waves. The light in Young’s apparatus could show interference precisely because waves, being processes rather than material objects, are interchangeable and can coincide in space and time.
It became clear to Faraday that, whether stationary or moving, atoms give rise to fields that are completely indistinguishable from one another. This notion bears on the individualities of the atoms themselves, for Faraday’s radical view had a further consequence, although one not drawn at the time: If charge and matter really do not exist except as a way of speaking about the density of field lines, which are themselves indistinguishable, then material particles are really not distinguishable either. Faraday’s vision of a world of fields cast doubt on the kind of individuality appropriate to impenetrable particles.
Faraday’s work began the ongoing dialogue between fields and particles. A few decades earlier, as atomic theory developed, another crucial advance took place: Atoms lost their individuality. In 1800, John Dalton argued that all atoms of hydrogen must be exactly the same in their observed properties, or hydrogen would not really be a distinct element but would instead be a spread of different “hydrogens.”
Faraday’s younger colleague James Clerk Maxwell also argued that all molecules are absolutely identical, “whether they are found on earth, in the sun, or in the fixed stars.” Maxwell could make this assertion because he noted that the same spectral patterns were observed on Earth as in the light of distant stars. The spectra of those stars also showed that atoms had the same properties eons ago, when they first emitted the light. Maxwell concluded that each molecule is stamped with a “royal cubit,” a mark of uniformity that gives it “the essential character of a manufactured article.”
Later experiments showed other marks of the royal cubit, beginning with the discovery of “cathode rays,” luminous electric discharges in evacuated glass vessels, the ancestor of television and video monitors. In 1897, J.J. Thomson succeeded in making cathode rays travel in curved trajectories by passing them through crossed electric and magnetic fields. He found that their ratio of charge to mass had a single, universal value, easily understandable if cathode rays really were streams of particles (electrons, as we now call them), each of which had the same charge and mass. In 1899, he also found identical values for the particles produced when light hits a metal plate (the photoelectric effect). Thomson referred to both of them as “corpuscles,” to emphasize their particulate nature.
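Thomson’s reasoning can be illustrated with the standard textbook analysis of crossed fields (the field strengths and radius below are illustrative values, not Thomson’s actual data). When the electric and magnetic forces balance, the beam passes undeflected and its velocity follows; bending the beam with the magnetic field alone then yields the charge-to-mass ratio.

```python
# Sketch of the textbook crossed-field analysis behind Thomson's e/m
# measurement. All numbers are illustrative, not Thomson's actual data.

E = 2.0e4   # electric field, V/m (illustrative)
B = 1.0e-3  # magnetic field, T (illustrative)
r = 0.114   # radius of the magnetically bent path, m (illustrative)

# With crossed E and B balanced, the beam is undeflected when the
# electric and magnetic forces cancel: eE = evB, so v = E/B.
v = E / B

# With E switched off, the magnetic force bends the beam into a circle:
# evB = m v^2 / r, so e/m = v / (r B) = E / (r B^2).
e_over_m = v / (r * B)

print(f"velocity: {v:.3e} m/s")
print(f"e/m: {e_over_m:.3e} C/kg")  # modern value is about 1.759e11 C/kg
```

Whatever the particular fields used, the measured ratio always comes out the same, which is what convinced Thomson that cathode rays were streams of identical corpuscles.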
Planck’s “Act of Desperation”
Thomson’s experiments with electrons supported the idea of indivisible units of matter but did not directly address their individuality. But that issue surfaced at about the same time in another area of physics: thermodynamics. In principle, Newtonian mechanics relies on following the trajectory of each separate particle. To deal with the vast numbers of atoms in ordinary objects, Ludwig Boltzmann and other physicists of the 19th century developed statistical mechanics. Still, Boltzmann thought that the continuity and distinguishability of each particle’s trajectory was “the first fundamental principle” of mechanics. Here the essential equality of atoms conflicted with the individual determinism of Newtonian mechanics.
In retrospect, one sees that the crisis came in 1900, in the work of Max Planck, although he did not articulate the significance of individuality as he altered it. Planck was trying to reconcile the reversible principles of Newtonian physics with the irreversible laws of thermodynamics. To do so, he considered an oven with a small opening that allows light to escape. Earlier physicists had shown that if the oven is perfectly black, it absorbs and emits all frequencies of light in a manner independent of its material. The color of the emitted light (the black-body radiation) depends only on the oven’s temperature. Planck sought to uncover the fundamental physics that determines the spectrum of this light.
At the same time, his colleagues were measuring the spectrum of black-body radiation with new accuracy (including the previously unmeasured infrared and ultraviolet components), and Planck hoped to test his theory against their observations. What followed Planck later called “an act of desperation.” To make his theory agree with the experiments, he had to require that the energy exchanges in the oven always took place in discrete amounts, which Planck called quanta.
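In modern notation, the law Planck arrived at for the spectral energy density of black-body radiation can be written as follows, with each quantum of light at frequency \(\nu\) carrying the discrete energy \(h\nu\):

```latex
% Planck's law for the spectral energy density of black-body radiation,
% in modern notation: h is Planck's constant, k_B Boltzmann's constant,
% c the speed of light, and T the oven's temperature.
u(\nu, T) = \frac{8\pi h \nu^{3}}{c^{3}}
            \cdot \frac{1}{e^{h\nu / k_{B} T} - 1},
\qquad E = h\nu .
```

The finite constant \(h\) in the exponent is precisely the “desperate” quantum step that Planck could not set to zero without losing agreement with the measurements.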
Planck was disturbed that this assumption made no sense to him. To clarify it, he constructed a new derivation of his result that emphasized the central importance of the atom. Although atomic theory commanded wide acceptance by the end of the 19th century, such eminent scientists as Wilhelm Ostwald and Ernst Mach viewed atoms as theoretical constructs, not physical realities.
Initially one of these doubters himself, Planck became a convinced atomist because of his understanding of probability. He argued that there can be no statistics without discrete objects to count, just as the probabilities of rolling several dice rely on the discrete states given by the faces of a single die. In this way, Planck argued that the understanding of physics rests on the relative probability of states, which must therefore be discrete or atomic.
Planck extended that approach to include not only gases or assemblies of atoms but also light itself. He was sure that light, although immaterial, obeys the laws of thermodynamics. Those laws seemed so general that he could not imagine an exception to them. If so, he reasoned, there must be some basis for counting the different probabilities of the states of light; there must be, so to speak, “atoms” of light, which could only mean discrete states of light energy, or quanta. The crux was that in counting the different ways the light quanta could share the energy, Planck had to treat them as absolutely indistinguishable from one another, for if he did not, he got results that disagreed with experiment.
At that point, Planck did not comment on the curious method of counting he used, perhaps more struck by the necessity of discreteness, the “desperate” move of letting the size of the quantum step, h, be a finite number, not zero. Even in 1909, he presented the strange counting without seeming to realize its strangeness, as if it were a peculiarity of his theoretical approach rather than of light itself. Yet Planck’s integrity kept him from denying what he had asserted, even while he struggled to overcome its paradox.
Planck’s strange counting of light quanta was only the beginning. Radical loss of individuality eventually became a central feature of the completed structure of quantum theory. Thinking about quantum particles in 1952, Erwin Schrodinger emphasized that “you must not even imagine any one of them to be marked ‘by a red spot’ so that you could recognize it later as the same.” Thus, one cannot pick out “a” neutron, mark it and follow its subsequent career, as both Newton and Einstein had assumed. This amazing combination of perfect equality with complete indistinguishability is the very core of quantum identity. No human language has an adequate word for this condition. Referring to it as “lack of individuality” misleadingly implies that quantum particles should have had individualities but are defective. Instead, I have coined a word to express this condition in a positive way: identicality, which means that the members of a species only have identity as instances of that species, without any features that distinguish one individual from another. Identicality includes total indistinguishability and complete equality of all observable traits, whenever or wherever they are measured.
So strange is such a condition that some philosophers prefer to treat quantum particles as if they still had individualities, which are somehow hidden from view. This concealment is logically possible but seems unnecessarily complex. I will take the simpler position that if individuality is totally hidden, it is lost. I leave the reader to consider which view is ultimately right or whether any judgment is possible.
Here is an example of identicality at work, one based on experiments done by Johann Summhammer and Hermann Rauch at the Technical University of Vienna, Anton Zeilinger at the Massachusetts Institute of Technology and others. Inside an evacuated vessel, a stream of neutrons (from a nuclear reactor) is split in two, and both sides are diverted toward silicon crystals, which then deflect the two beams back together. The crystals are separated by several centimeters so that the two paths are quite distinct (the neutron beams are effectively no bigger than a small postage stamp).
However, it should not be possible to distinguish identical neutrons according to which path they take. The characteristic pattern they make when the beams recombine results from the wavelike interference between the neutrons, which reflects the impossibility of determining their individualities, although the neutrons always arrive whole and undivided when they reach the detector on the far side. This strange mix of phenomena is what is usually called “wave-particle duality.”
Rather than trying to think of a neutron as somehow both wave and particle, it is more illuminating to see that both these aspects reflect the neutrons’ identicality. To test this idea, one can alter the experiment slightly. Every neutron has a certain intrinsic quantity called spin. It is possible to use a source that produces neutrons that are polarized, meaning that their spins are all aligned, pointing up, for instance. Such a source produces the same result as before, because the neutrons are still indistinguishable. But if one introduces a special magnet that flips the spin of neutrons traversing just one of the routes, the neutrons on the two paths become distinguishable and the pattern of interference disappears! These characteristic effects of quantum interference also arise with larger systems: namely with atoms, small molecules and even buckyballs (60 carbon atoms arranged in a soccer-ball configuration). In such experiments, the source may be so weak that there is only one particle in the apparatus at any given time. So the issue is not primarily the indistinguishability of, say, two neutrons, but even more the continued identity of any neutron with itself over time and through space. Thus, even a single neutron maintains its identicality.
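The appearance and disappearance of the interference pattern can be sketched with complex amplitudes (a toy model with equal amplitudes on the two paths, not the analysis of any specific apparatus). When the paths are indistinguishable, the amplitudes add before being squared; when the spin flip marks one path, the probabilities add instead and the phase dependence vanishes.

```python
# Toy sketch: equal complex amplitudes for the two neutron paths, with a
# relative phase delta set by the difference in path length.
import cmath
import math

def detection_probability(delta: float, paths_distinguishable: bool) -> float:
    a1 = 0.5                        # amplitude for path 1 (illustrative)
    a2 = 0.5 * cmath.exp(1j * delta)  # amplitude for path 2, phase-shifted
    if paths_distinguishable:
        # Marked paths (e.g. spin flipped on one route): probabilities add.
        return abs(a1) ** 2 + abs(a2) ** 2
    # Indistinguishable paths: amplitudes add first, then square.
    return abs(a1 + a2) ** 2

for delta in (0.0, math.pi / 2, math.pi):
    print(f"phase {delta:.2f}: "
          f"interfering {detection_probability(delta, False):.2f}, "
          f"marked {detection_probability(delta, True):.2f}")
```

With indistinguishable paths the detection probability swings between 1 and 0 as the phase varies; once the paths are marked, it sits at a constant 1/2, which is just what the flipped-spin experiment shows.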
This concept has some subtlety. We can refer to a neutron in room A, as opposed to one in room B (in a different building or even a different galaxy). Yet doing so does not really distinguish the neutrons but only the rooms they are in. If the particles ever escaped confinement, they could never be distinguished from each other.
This applies directly to the amazing images Hans Dehmelt and his coworkers at the University of Washington obtained in 1980 for a single atom, ingeniously penned into a tiny trap and kept far apart from other atoms. Here one can actually see a single atom. Its intermittent twinkling is a direct manifestation of quantum jumps as the atom absorbs light, becomes excited and radiates away its excess energy. Such an atom can stay trapped for months; one became so familiar to its handlers that they even gave it a name (“Astrid”). Yet it remained wild and without individuality, however long they tried to tame it. If you let “Astrid” go free, there is no way to trap it again. The name is human, the atom nameless.
Dehmelt’s coup built on his earlier work trapping a single electron, which, despite being held for 10 months, was ultimately lost, becoming a faceless electron in the crowd. This anonymity cannot be evaded by making the trap ever smaller. The smaller the box, the stronger the forces needed to confine the electron, and the faster and more irregularly the electron will move, as if it wants to avoid being pinned down. This anthropomorphic way of speaking is not correct (an electron has no volition or sensation, of course), but it does give a vivid and intuitive form to what is usually called the Heisenberg uncertainty principle: The more one localizes a particle in space, the more uncertain its momentum becomes. The box is built from electrons identical with the prisoner; as the box becomes ever smaller, it becomes ever more plausible that the prisoner has exchanged places with one of the guards.
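The trade-off between confinement and restlessness can be put in numbers with the back-of-envelope form of the uncertainty relation, Δx · Δp ≥ ħ/2 (the box sizes below are illustrative, not Dehmelt’s actual trap dimensions):

```python
# Back-of-envelope Heisenberg estimate: the tighter the box, the larger
# the minimum momentum spread of the confined electron.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg

def min_momentum_spread(box_size_m: float) -> float:
    """Minimum momentum uncertainty from dx * dp >= hbar / 2."""
    return HBAR / (2 * box_size_m)

for size in (1e-6, 1e-9, 1e-12):  # micron, nanometer, picometer boxes
    dp = min_momentum_spread(size)
    dv = dp / M_E  # corresponding minimum velocity spread
    print(f"box {size:.0e} m: dp >= {dp:.2e} kg*m/s, dv >= {dv:.2e} m/s")
```

Squeezing the box by a factor of a thousand raises the minimum velocity spread by the same factor, which is the quantitative content of the electron seeming to “avoid being pinned down.”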
It is exactly this possibility of exchange that prevents one from treating an atom or a neutron as separate from its brethren. Like identical twins, they are always fooling people by switching places. This metaphor is not exact, since human twins are not as identical as atoms, yet it conveys something helpful.
Confusion of identity leads to weird quantum effects, the “spooky” correlations that Einstein found so paradoxical, as if they were the outrageous pranks played by twins. He insisted that physics must give “a complete description of the individual system,” so that we can “follow the course of individual atoms and forecast their activities.” But this capability would require the identification of each particle as a unique individual, which is impossible.
Einstein’s objections stemmed from his belief that, ultimately, the laws of physics determine perfectly the career of each individual particle, just as Newton had envisioned. But identicality forces one to go beyond the way Newton described identifiable particles, each having an observable position and velocity governed by a mathematical equation. This requirement led to a profoundly new mathematical picture of quantum theory, expressed by Max Born in 1926.
To hide individuality, Born imbued the world with two levels: an outer level of positive numbers, which represent the probability of observing a neutron somewhere in spacetime, and an inner level not accessible to observation but guiding the observable probabilities. On the inner level, there are amplitudes, complex numbers (involving the square roots of negative numbers), which cannot be observed, because our measuring devices register only real, rational numbers. What is more, Born noted that these amplitudes follow strict equations (such as Schrodinger’s equation, or Dirac’s) that determine their unfolding in time with no uncertainty.
However, the probabilities that one observes turn out to be the absolute square of these amplitudes, and these probabilities do not unfold in a determined way. For instance, the probability of a neutron being observed at one time and place does not determine, absolutely, the probability of its being observed later, at another time and place, for that would mean identifying the particle. So there is no master equation for predicting the appearances of the neutron with certainty, even though there is an equation for predicting the probability that you can observe the neutron here or there.
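Born’s two levels can be illustrated with a toy two-state system (units chosen so that ħ = 1; the energies and the state are illustrative, not any specific particle). The inner amplitudes turn with perfect determinism, while the only observable quantity is the squared magnitude required by the Born rule:

```python
# Toy two-level illustration of Born's picture: deterministic inner
# amplitudes, probabilistic outer observations. Values are illustrative.
import cmath
import math

E1, E2 = 1.0, 2.0  # energy eigenvalues (arbitrary units, hbar = 1)

def amplitudes(t: float):
    """Inner level: complex amplitudes evolve with perfect determinism."""
    c1 = cmath.exp(-1j * E1 * t) / math.sqrt(2)
    c2 = cmath.exp(-1j * E2 * t) / math.sqrt(2)
    return c1, c2

def observed_probability(t: float) -> float:
    """Outer level: only |amplitude|^2 is observable (the Born rule)."""
    c1, c2 = amplitudes(t)
    # Probability of finding the system in the equal superposition state:
    overlap = (c1 + c2) / math.sqrt(2)
    return abs(overlap) ** 2

for t in (0.0, math.pi, 2 * math.pi):
    print(f"t = {t:.2f}: P = {observed_probability(t):.3f}")
```

The amplitudes at any time are fixed exactly by the equations, yet what an observer sees is only a probability that oscillates between certainty and impossibility, never a trajectory of an identified individual.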
Mathematics provides only a representation of physical reality but gives much insight into the nature of that reality and dispels certain widespread misunderstandings. For instance, quantum theory is not finally about uncertainty and indeterminism. On the inner level, quantum theory is as certain and determined as are the theories of Newton and Maxwell. The apparent uncertainty emerges only when we try to observe outwardly these inwardly determined things. But we are composed of an immense number of electrons and quarks (constituting our protons and neutrons), which can enter into extremely complex states. It is those complex states that “are” us, in the sense of our apparent overall individuality, and they characterize all our attempts to view the world. When we turn to examine a single atom, it is with devices that are huge by comparison, attuned to our enormous size and crude sensibilities, and we observe only probabilistic events. Yet our minds can enter a realm of certainty through the equations of the theory.
Still, such a double view of the world is disturbing. For Einstein, quantum mechanics abandoned the central project of physics: the complete understanding of observable data through mathematical theory. He would not accept Born’s view that what is observable is not certain and what is certain is not observable. To be sure, Einstein recognized the power and validity of quantum theory, although he did not regard it as complete. His concept of understanding demanded going beyond these apparent limits to knowledge. He thought that each neutron really has its own individuality, which we are confused about, but which God knows perfectly well.
Yet no observer, not even God, can distinguish one neutron from another. It is no wonder that Einstein resisted this idea, for it is genuinely strange. However, as Francis Bacon observed, “there is no excellent beauty that hath not some strangeness in the proportion.” Viewed a different way, the strangeness of identicality might lead us to see its beauty.