The Inner World: The Search for Subatomic Particles

Larry Gilman, Paul Davies, K Lee Lerner. Scientific Thought: In Context. Editor: K Lee Lerner & Brenda Wilmoth Lerner. Volume 2, Gale, 2009.

Introduction

Greek philosophers of the fifth century BC were among the first people to guess that the world is made up of invisibly small particles. They speculated that these ultimate building blocks were atomos, uncuttable, from which we derive our word “atom.” As it turns out, the elementary units of chemistry are not atomos at all, but can be divided into smaller particles, namely protons, neutrons, and electrons. Even protons and neutrons are not elementary, but are made up of smaller particles called quarks. Quarks are presently classed as truly elementary or atomos. Electrons also appear to be truly elementary.

Protons, neutrons, electrons, and quarks are subatomic particles, meaning that they are below (sub) the atomic level. Hundreds of other subatomic particles are not found in atoms. Most of these appear only under certain rare conditions, then rapidly break down into other particles and energy.

The basic structure of the atom and the existence of subatomic particles were not known until the early twentieth century. The search for experimental evidence of certain elementary particles and their properties continues in the early 2000s, over a century after the discovery of the electron in 1897.

Historical Background and Scientific Foundations

Scientific and Cultural Preconceptions

Although ancient Indian thinkers such as Kanada (c.600 BC) proposed atomic theories of matter independently of the Greeks, modern scientific investigation of atoms and subatomic particles grew out of the European intellectual heritage. The philosopher Democritus (c.460-c.370 BC) was the first known Greek exponent of atomism, the belief that the material world is composed of atoms. Democritus had no experimental evidence for atoms—the technology of his day was too limited—but it struck him that atoms could explain many observations. The properties of various substances could, he suggested, be due to the properties of their atoms: sticky substances consisted of atoms covered with tiny hooks, liquids consisted of slippery, round atoms, different flavors were the results of differently shaped atoms coming in contact with the tongue, and so on. Between atoms, he argued, was void—nothingness, empty space. “There is,” he taught, “nothing but atoms and the void.” After Democritus, Greek philosopher Plato (c.427-c.347 BC) and Roman philosopher and poet Lucretius (99-55 BC) also propounded forms of atomism.

For over two thousand years, people did not know how to use atoms to explain observations in an exact way or even to verify that atoms are real. Until the thirteenth century, the dominant authority in European scientific thought was the Greek philosopher Aristotle (384-322 BC), who denied the possibility of both atoms and the void. In 1277, theologians of the Church condemned Aristotle’s strict determinism and asserted that God could create anything, including atoms and void. This opened room for philosophical speculation about multiple worlds, infinity, gravity, projectile motion, and atoms. European scientists proceeded to speculate about atoms, more or less fruitlessly, for the next 500 years.

Only after the scientific revolution of the sixteenth and seventeenth centuries did atomism begin to be used for the quantitative explanation of natural phenomena. For example, English scientist John Dalton (1766-1844) proposed in 1805 that the chemical law of multiple proportions could be explained by the weights of different atoms. The law of multiple proportions is the observation that whenever two or more elements combine to make a compound, the ratio of their combining weights is always a ratio of small whole numbers. For example, 8 grams of oxygen always combine with 1 gram of hydrogen (an 8:1 ratio) to produce 9 grams of water, while 16 grams of oxygen always combine with 1 gram of hydrogen (a 16:1 ratio) to produce 17 grams of hydrogen peroxide. Dalton proposed that this fact could be explained by supposing that compounds like water consist of clusters of atoms. (Today we call these clusters molecules.) Each particle of water, for example, consists of two hydrogen atoms (H) attached to a single oxygen atom (O), which we write as H2O. Since each oxygen atom weighs about 16 times as much as a hydrogen atom, 1 gram of hydrogen has twice as many atoms in it as 8 grams of oxygen—just the right numbers to combine to make 9 grams of water, as recorded by the law of multiple proportions.
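The arithmetic behind Dalton’s reasoning can be sketched in a few lines of code (a minimal illustration, using the rounded modern atomic weights that the example above assumes):

```python
# Illustrate the law of multiple proportions with rounded atomic weights
# (hydrogen ~ 1, oxygen ~ 16). For a fixed 1 gram of hydrogen, the oxygen
# weights that combine with it stand in a small-whole-number ratio.

ATOMIC_WEIGHT = {"H": 1.0, "O": 16.0}

def oxygen_per_gram_hydrogen(n_hydrogen, n_oxygen):
    """Grams of oxygen combining with 1 g of hydrogen, for a molecule
    containing n_hydrogen H atoms and n_oxygen O atoms."""
    return (n_oxygen * ATOMIC_WEIGHT["O"]) / (n_hydrogen * ATOMIC_WEIGHT["H"])

water = oxygen_per_gram_hydrogen(2, 1)      # H2O  -> 8.0 g oxygen per g hydrogen
peroxide = oxygen_per_gram_hydrogen(2, 2)   # H2O2 -> 16.0 g oxygen per g hydrogen

print(water, peroxide, peroxide / water)    # 8.0 16.0 2.0
```

The exact 2:1 ratio between the two oxygen weights is the “small whole numbers” that Dalton’s atomic hypothesis explains.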

Dalton’s theory explained many facts, but there was still no direct evidence for the existence of molecules or atoms and no suspicion of the existence of subatomic particles. The inner structure of the atom itself was to remain mysterious for another century.

In the late 1700s, a curious phenomenon was observed: flashes of light were produced by electrical discharges in glass tubes out of which most of the air had been pumped. In the 1850s, it was discovered that when even more air was removed (by improved pumps), mysterious rays seemed to be emitted by a negatively charged piece of metal (the cathode) at one end of the tube, travel the length of the tube, and cause a green glow on the glass at the far end. The rays were dubbed “cathode rays.” In the 1890s, British physicist J.J. Thomson (1856-1940) began a careful analysis of cathode rays. He hypothesized that they consist of a stream of negatively charged particles. The existence of such particles, electrons, had been suggested by Irish physicist G. Johnstone Stoney (1826-1911) in 1874. By measuring the bending of cathode rays by electric and magnetic fields, Thomson was able to calculate the charge-to-mass ratio of the electron, that is, how much electrical charge there is for a given amount of electron mass. The electron was thus the first subatomic particle to be studied.

In 1887, German physicist Heinrich Hertz (1857-1894) observed the photoelectric effect, in which light knocks electrons free from a metal surface. In 1905, German-born physicist Albert Einstein (1879-1955) showed that the amounts of energy possessed by the electrons liberated by the photoelectric effect could only be explained by assuming that light also consists of particles. The light particle later came to be called the photon. Photons have nonzero energy but, unlike electrons, zero mass.

Light also remained a wave, however, when considered from some points of view, and this double nature was referred to as wave-particle duality.

A fuller understanding of the atom and its constituent particles came with the discovery of the atomic nucleus by New Zealand-born British physicist Ernest Rutherford (1871-1937) in 1911. Rutherford showed that almost all the mass of an atom is concentrated in a tiny lump at its center, the nucleus. Electrons surround the nucleus, accounting for little of the atom’s weight but most of its size. Since electrons were known to be negatively charged and atoms normally have zero charge, Rutherford and his colleagues reasoned that the nucleus must be positively charged. In 1918, he proposed that the nucleus of the simplest and lightest of all atoms, the hydrogen atom, is an elementary particle. In 1920, he dubbed this particle the proton. A proton is 1,836 times as massive as an electron and carries an electrical charge equal in size but opposite in sign.

Yet a puzzle remained: elements heavier than hydrogen did not seem to contain enough hydrogen nuclei (protons) to balance out the charges on their electrons. For example, a calcium atom weighs 40 times more than a hydrogen atom, but its nucleus carries an electrical charge only 20 times that of a hydrogen nucleus. Physicists proposed that these heavier nuclei actually did contain more protons—in the case of calcium, 40 of them—but also electrons to cancel out some of those positive electrical charges. A calcium nucleus, if this theory were correct, would consist of 40 protons (charge +40e, where e is the size of the charge on a single electron) and 20 electrons (charge ‒20e), and so have a net charge of +40e ‒ 20e = +20e. The atom as a whole would be electrically neutral—zero total charge—when this nucleus was surrounded by a cloud of 20 electrons.

This reasonable-sounding theory turned out to be wrong. Only with the discovery of a fourth subatomic particle in 1932 by English physicist James Chadwick (1891-1974), the neutron, was the mystery solved. Now the basic structure of matter could be explained in terms of three subatomic particles: Atoms consist of protons and neutrons (or, in the case of hydrogen, a lone proton) bound together into dense, massive nuclei and normally surrounded by clouds of electrons. The number of electrons in an atom usually equals the number of protons in the nucleus. When the number of electrons attached to an atom is greater or smaller than the number of protons in the nucleus, the atom bears a negative or positive electrical charge. Electrical currents consist of free electrons in motion. Chemical reactions directly involve only the electrons of atoms, never the nuclei. Photons are not a building block of atoms, but can be emitted by atoms. They explain light and other forms of electromagnetic radiation such as radio waves and x rays.
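The bookkeeping that made the discarded proton-electron model so plausible can be checked directly (a sketch, with charges in units of e and masses in proton masses; the neutron’s mass is nearly equal to the proton’s, and the electron’s is neglected here):

```python
# Compare the discarded proton-electron model of the calcium nucleus with
# the modern proton-neutron model. Charges are in units of e; masses are
# in proton masses (electrons are treated as massless for this comparison).

def nucleus(protons, neutrons=0, nuclear_electrons=0):
    charge = protons - nuclear_electrons   # neutrons carry no charge
    mass = protons + neutrons              # electron mass neglected
    return charge, mass

old_model = nucleus(protons=40, nuclear_electrons=20)  # pre-1932 picture
new_model = nucleus(protons=20, neutrons=20)           # post-Chadwick picture

print(old_model)   # (20, 40)
print(new_model)   # (20, 40)
```

Both models give charge +20e and mass about 40, so charge and weight alone could not distinguish them; Chadwick’s discovery of the neutron in 1932 is what settled the question.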

Yet it already was known that the world of subatomic particles could not really be this simple. Observations of radioactive decay, which occurs when individual atomic nuclei spontaneously explode, releasing photons and various high-speed particles, had convinced physicists that an electrically neutral particle with very small mass must also exist—the neutrino (literally, “little neutral one”). The existence of the neutrino was proposed by Austrian physicist Wolfgang Pauli (1900-1958) in 1930, but not proved experimentally until 1956. Experiments on neutrino oscillations have since shown that neutrinos, unlike photons, have a small but nonzero mass, though its exact value has not been measured; it is known, however, that over 50 trillion neutrinos pass through the human body every second. Few of these particles interact with the body’s matter; they pass right through.

A new theory, known as quantum physics, was also lengthening the list of subatomic particles in the 1920s and 1930s. In 1930, British physicist Paul Dirac (1902-1984) proposed for theoretical reasons that a positively charged opposite or antiparticle of the electron must exist, the positron. The positron was detected experimentally in 1932. Antiprotons and antineutrinos were also proposed and, eventually, detected. When particles and their corresponding antiparticles come into contact, they are annihilated and all their mass is released as energy.

As quantum physics was refined, it predicted the existence of more and more subatomic particles and antiparticles. The pion was proposed in 1935 to account for the strong force holding protons and neutrons together in the nucleus (the strong force overpowers the electrical repulsion that would otherwise cause the protons to fly apart, since electrical charges of the same sign repel each other). The muon was first observed in 1937, and the pion in 1947. The W and Z particles were predicted theoretically in the 1960s, to be detected years later. Other particles were observed directly in cosmic-ray collisions in particle chambers. Cosmic rays are very high-energy particles that arrive from outer space; when they strike atoms, large amounts of energy are released, some of which takes the form of particles. Many of these particles are short-lived, breaking down quickly into other particles and photons. By observing the tracks made by these short-lived particles in liquid or vapor, their properties can be deduced.

Starting in the 1950s, physicists stopped relying solely on cosmic rays to produce such collisions. Instead, they began building large machines called particle accelerators that push charged particles (usually protons or electrons) to high energies; in later machines, called colliders, particles moving in opposite directions are smashed head-on against each other to produce even more energetic collisions. By observing these collisions, physicists discovered scores of new subatomic particles.

As the list of particles grew, the question arose of what was really meant by an “elementary” particle. Could there really be so many “elementary” particles? Why wasn’t Nature simpler? Could simplicity perhaps be restored by theorizing that some particles are composed of a smaller number of even more fundamental particles, just as the atom itself had proved not to be atomos, uncuttable? The answer was not known with certainty until the existence of the particle called the quark was demonstrated in experiments carried out from 1967 to 1973. The outline of modern physics’s theory of the elementary particles, in which the quark takes its place as a fundamental building block, is known as the Standard Model.

Quarks

Most of the new particles discovered in the 1930s through the 1950s were of the type called hadrons. Protons and neutrons are hadrons; electrons belong to another, less numerous group called the leptons. In the early 1960s, American physicists Murray Gell-Mann (1929-) and George Zweig (1937-) sought to simplify the long, messy catalog of hadrons by proposing that they are not truly elementary but are composed of smaller building blocks called quarks. Each quark, Gell-Mann and Zweig predicted, would have a charge of +2e/3 or ‒e/3. Two quarks of the +2e/3 type (“up” quarks), combined with a single quark of the ‒e/3 type (a “down” quark), would form a particle with a charge of +1e, namely, a proton. Neutrons and other hadrons would be explained as other combinations of various quarks. By this theory, photons and electrons would remain truly elementary, atomos or indivisible.
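The charge arithmetic behind Gell-Mann and Zweig’s proposal can be verified by summing quark charges (a minimal sketch using the fractional charges quoted above; exact fractions avoid floating-point rounding):

```python
from fractions import Fraction

# Quark charges in units of the elementary charge e, as in the quark model.
CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3)}

def hadron_charge(quark_content):
    """Total charge of a hadron, in units of e, given its quark content
    as a string such as 'uud'."""
    return sum(CHARGE[q] for q in quark_content)

proton = hadron_charge("uud")    # two up quarks + one down quark
neutron = hadron_charge("udd")   # one up quark + two down quarks

print(proton, neutron)           # 1 0
```

The two up quarks and one down quark yield exactly +1e for the proton, while one up and two downs yield the neutron’s zero charge.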

Much as Rutherford and his colleagues used alpha-particle scattering from gold atoms to prove the existence of the atomic nucleus, particle scattering was used again to prove the existence of quarks. This time, however, instead of firing alpha particles (helium nuclei) at atoms, scientists fired electrons at protons. The experiments were carried out from 1967 to 1973 in Menlo Park, California, at the Stanford Linear Accelerator, a U.S. government facility operated by Stanford University. Just as the scattering angles of alpha particles probed the structure of the atom, the scattering angles of electrons probed the structure of the proton. The existence of quarks was proved by data from these scattering experiments.

The Standard Model

In the 1960s and 1970s, even before the existence of quarks was experimentally proved, a mathematical framework describing the particle world began to be developed—the Standard Model of fundamental particles and interactions. The Standard Model accounted for all known subatomic particles and predicted the existence of some that had not yet been detected. The mathematics underlying the Standard Model is complex, but its basic claims are not hard to summarize. The Model says that all matter and forces can be described in terms of two basic types of particles, fermions and bosons.

Fermions are the particles that make up matter. They are subdivided into six leptons (the electron, electron neutrino, muon, muon neutrino, tau, and tau neutrino) and six quarks (the up, down, charm, strange, top, and bottom quarks). All matter, including the protons, neutrons, and other hadrons, is made of fermions.

Bosons are the particles that mediate forces. There are five types of bosons, according to the Standard Model: the photon, the W boson, the Z boson, the gluon, and the Higgs boson. The photon and the gluon are massless; the W, Z, and Higgs bosons have mass.

As of early 2008, the only particle in the Standard Model that had not been definitely observed in the laboratory was the Higgs boson, although signs of it may already have been seen in particle collisions at the Tevatron particle accelerator at Fermilab in Batavia, Illinois. Physicists hoped that the Large Hadron Collider at the CERN laboratory near Geneva, Switzerland, an underground ring-shaped structure 17 miles in circumference and costing over $8 billion, would begin gathering data on the Higgs boson when it was completed, probably in 2008. The Higgs boson is of particular importance in the Standard Model not only because its existence is still unconfirmed, but because it is essential to giving mass and inertia to other particles.

The Standard Model has passed many experimental tests, but is known to be incomplete. For example, it takes no account of the force of gravity, which is extremely weak compared to the other forces: the electromagnetic repulsion between two electrons is about 10^42 times stronger than their gravitational attraction. Another fault in the Standard Model is its apparent inability to account for dark matter. Dark matter is gravitating matter that clumps around the galaxies but does not interact with light and so cannot be observed directly. There are a number of competing theories about the nature of dark matter, which is estimated to be about 85% of all the matter in the universe, but the favored theory at this time is that it consists of some kind of non-baryonic subatomic particle not accounted for by the Standard Model. If this turns out to be correct, it means that the Standard Model describes only a small fraction of the matter in the universe—though it does seem to account for all of the familiar kinds of matter, such as what makes up the stars and planets.
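The weakness of gravity can be checked from the Coulomb and Newton force laws for a pair of electrons (a sketch using standard rounded values of the physical constants; since both forces fall off as 1/r^2, the separation cancels out of the ratio):

```python
# Ratio of the electrostatic repulsion to the gravitational attraction
# between two electrons. Both forces scale as 1/r^2, so the ratio is
# independent of the distance between them.

k = 8.988e9        # Coulomb constant, N m^2 / C^2
G = 6.674e-11      # Newtonian gravitational constant, N m^2 / kg^2
e = 1.602e-19      # electron charge magnitude, C
m_e = 9.109e-31    # electron mass, kg

ratio = (k * e * e) / (G * m_e * m_e)
print(f"{ratio:.1e}")   # roughly 4e42, i.e. on the order of 10^42
```

For two protons the ratio is smaller (the proton is about 1,836 times heavier), but still enormous, which is why gravity can be ignored entirely in particle-physics calculations.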

Modern Cultural Connections

Since its inception, the search for ever more fundamental particles has been one of the archetypal successes of physics, confirmation of the assumption that the universe can be understood by the human mind.

Until the early twentieth century, particles were conceived of as tiny, hard spheres like miniature billiard-balls, assumed to obey Newton’s three laws of motion. But the subatomic particles turned out to be much more subtle and strange. They are not tiny, hard spheres, though in some situations they seem to behave that way, but in fact have a deeply ambiguous nature, both particlelike and wavelike—the ambiguity called wave-particle duality. Furthermore, most physicists have for the past century affirmed what is known as the Copenhagen interpretation of quantum physics (because it was formulated in Copenhagen, Denmark, in 1927 by Niels Bohr [1885-1962] and Werner Heisenberg [1901-1976]), which asserts that the behavior of individual particles is truly random—that is, not bound to occur a certain way because of the previous history of the universe, but unpredictable in its essence. Particle physics thus overturned the simplistic materialism of the nineteenth century. And, thanks to other aspects of quantum physics, the observing mind itself no longer seemed independent of the world, a mere shadow cast by its machine-like motions, but, rather, fundamental to it in some way, at least according to some interpretations.

English physicist Sir James Jeans (1877-1946) said in 1930 that “the stream of knowledge is heading towards a non-mechanical reality; the universe begins to look more like a great thought than a great machine. Mind no longer appears to be an accidental intruder into the realm of matter … we ought rather to hail it as the creator and governor of the realm of matter.” This view has been echoed as recently as 2005 in the prestigious science journal Nature, in which prominent American physicist Richard Conn Henry (1940-) wrote that “The universe is immaterial—mental and spiritual.” Throughout the twentieth century, the influence of the new knowledge about particles and their strange properties was felt not only in science and technology but in philosophy, theology, and art.

However, the effects of the new knowledge were not entirely positive: Understanding the structure of the atomic nucleus made possible the atomic bomb. The splitting of atomic nuclei by collision with neutrons was first observed in 1938. About seven years later, on July 16, 1945, the United States exploded the world’s first atomic bomb. Two decades later, tens of thousands of nuclear weapons had been built. About 27,000 remain in the world today.

Socially, the effect of improving scientific understanding of subatomic particles has been felt primarily through technology. A detailed understanding of photons, electrons, and the properties of atoms has been essential to the electronics revolution, in which the dimensions of electronic devices have been pushed ever farther down into the microscopic scale, even approaching the atomic scale. Scientists hope to use the quantum properties of individual atoms to produce future generations of computers far more powerful than those available today, a field called quantum computing. Applied particle physics in the form of nuclear weapons continues to be a concern worldwide, as the number of nations with the ability to build such weapons slowly increases. As of mid-2007, nine countries were known or widely believed to possess nuclear weapons: China, France, India, Israel, North Korea, the Russian Federation, Pakistan, the United Kingdom, and the United States.

Scientists were still searching for the last particle predicted by the Standard Model, the Higgs boson, as of 2008. After a decade of controversy and the expenditure of over $200 million, construction of the United States’ Superconducting Super Collider (SSC) was stopped in 1993 by the United States Congress. Then-president William Jefferson Clinton lobbied against the shutdown, arguing that it would undermine the United States’ “position of leadership in basic science.” The termination of the SSC project left the Organisation Européenne pour la Recherche Nucléaire (CERN) in Geneva, Switzerland, as the major site of international particle physics research. CERN’s vast Large Hadron Collider was built primarily to search for the Higgs boson and, it was hoped, would reveal particle physics beyond the Standard Model by generating extremely high-energy particle collisions after its completion in 2008.

Because the Standard Model accounts for the strong, weak, and electromagnetic fields and forces (the weak and electromagnetic forces being unified by electroweak theory), it is a powerful theoretical model. However, the discovery of subatomic particles as a means to understanding the forces ruling the quantum world has become inextricably entwined with relativity-dominated cosmological theory regarding the origin and structure of the universe. A Standard Model built on particle physics research may be a step toward a grand unification theory that can unite the incompatible quantum and relativistic theories. Articulation of such a theory would have profound scientific and philosophical impact comparable to, and perhaps surpassing, Newton’s theories of gravitation, Einstein’s theories of relativity, and quantum theory.

Primary Source Connection

Groups of scientists, funded by governments to the tune of billions of dollars, jockey to be the first to discover the Higgs boson, the last particle predicted by the Standard Model of particle physics, which has not yet been definitely observed.

“Quantum Scoop: The Holy Grail of Particle Physics May Already Have Been Found.”

Some call the Higgs boson the Holy Grail of particle physics. As the only undetected element of the field’s theoretical masterpiece—the “standard model”—the Higgs guarantees a Nobel Prize for the experimenters who find it first. Now the European Union has spent an estimated $8 billion to build the world’s largest particle accelerator, the large hadron collider, to finally track it down.

So goes the reasoning, at least, of popular science writers.

In the last month, The New Yorker, the New York Times, and the Boston Globe, among others, have run articles on the LHC, which will be capable of reaching energies seven times greater than any comparable device ever created. All of this coverage has focused on the Higgs.

But what if someone else has already found it?

A rumor flying around physics departments these last few weeks claims that physicists working at the Tevatron, an accelerator located outside of Chicago, have found something new. Originally passed by word of mouth and private e-mail, the rumor made it into the blogosphere May 28, with an anonymous comment on the blog of a particle physicist living in Venice, Italy. Since then, the rumor has spread.

This isn’t the first time a story like this has circulated. Until the LHC opens, the Tevatron remains the largest accelerator in the world. Among its most significant past discoveries is another standard-model particle, the top quark. And in 2009, it will shut its doors forever. Like the LHC, the Tevatron was built with the Higgs in mind, and as time runs out for America’s biggest atom smasher, some nervy experimentalists have jumped the gun. Last summer, two Tevatron groups released some suggestive, but fruitless, graphs, just before the International Conference on High Energy Physics; in January, a new crop of rumors emerged, which were reported in the Economist and New Scientist in March. These other rumors have described “bumps”: anomalies in the data that suggest a new particle but are too small for a definitive identification.

The current rumor, which comes in time for the summer conference circuit, may be different. It claims an experiment at the Tevatron has found a peak twice as high as the previous rumors’ bumps. And unlike the other rumors, this one includes details: the new particle’s mass, for instance, which fits within theoretical bounds on the standard model Higgs. Some versions include a decay chain, which describes what the new particle turned into as the experiment progressed, and which may be consistent with the standard model’s predictions.

Of course, the rumor also claims that no one associated with the experiment will confirm the new findings until they’ve had time to publish, likely within the next few weeks. And until they do, no one can be certain what the Tevatron has—or has not—found.

The hype surrounding the Higgs boson is well-deserved. The standard model, a unified view of physics first presented by John Iliopoulos in 1974, describes everything we know about the smallest building blocks of nature yet observed. It’s the most accurate theory ever developed, in any field. And without the Higgs, it doesn’t make much sense: Based purely on first principles, elementary particles should be massless. Some, like photons, do have zero mass; yet others are surprisingly heavy. Enter the Higgs, which would—in theory—interact with these latter particles to make the difference.

So, if the rumor is true and the standard model Higgs has been found at the Tevatron, the LHC is in big trouble: Immediately, its “guaranteed” success—the final particle of the standard model, not to mention a couple of Nobel Prizes for European scientists—is gone.

The irony is that things look just as bleak for the LHC if the rumor is false, and the Europeans end up finding the standard model Higgs themselves. Physicists have developed such a complete description of elementary particles that, once the final piece of the theory is in place, the chances that the LHC will find anything the standard model doesn’t predict are almost negligible.

Particle theorists talk a big game. They get excited and tell reporters, not to mention government funding agencies, that the Higgs is just the beginning: The LHC, some say, may find examples of a class of particles indicative of a new fundamental property of nature, called supersymmetry. Others say there may be two or three particles, which together perform the job the standard model assigns to the Higgs. The truth is that these alternatives patch up the standard model, should something unexpected happen. If a Higgs-like particle is found, say, but it’s too light to be the standard model boson. Or if it decays in a surprising way. In cases like these, the LHC could indeed produce dramatic new discoveries.

But what happens if the Higgs turns out to be just right? Well, then the standard model predicts that you’d need a machine roughly a quadrillion times more powerful than the LHC to find anything new. With current technology, this would mean an accelerator the circumference of the Milky Way. Though some theorists—proponents, for instance, of string theory—speculate about what such an accelerator might find, few other physicists take them seriously.

In fact, finding the “just right” Higgs would be bad news all around. Surely the European Union wants more for its $8 billion than a single particle. But more importantly, it would provide the final proof of the standard model, which happens to be clunky, boring, and infuriatingly silent on the Big Questions that the final theory of physics was supposed to answer. Questions like: Why is there something, rather than nothing? And where does gravity fit in? If the standard model turns out to be a complete description of particle behavior, as the discovery of the Higgs would suggest, these questions may never be answered.

That’s why particle physicists, and the EU member states that have spent Nepal’s annual GDP to build this accelerator, are hoping that no one, in Chicago or Switzerland, finds the Higgs. The future of high-energy physics lies with the small chance that the standard model is wrong, and something exotic happens at LHC energies. Something, I hope, that will help us understand the why questions that the standard model leaves wide open.

Weatherall, James Owen. Slate. June 4, 2007.

Primary Source Connection

Scientists can use neutrinos (subatomic particles only weakly blocked or deflected by matter) to probe Earth’s core. The following New Scientist magazine article by Celeste Biever describes how scientists in Japan have measured Earth’s natural radioactivity and how this measurement could provide clues to fundamental questions about how Earth formed.

Celeste Biever is a regular contributor to New Scientist and is a freelance science writer based in Boston.

First Measurements of Earth’s Core Radioactivity

Earth’s natural radioactivity has been measured for the first time. The measurement will help geologists find out to what extent nuclear decay is responsible for the immense quantity of heat generated by Earth.

Our planet’s heat output drives the convection currents that churn liquid iron in the outer core, giving rise to Earth’s magnetic field. Just where this heat comes from is a big question. Measurements of the temperature gradients across rocks in mines and boreholes have led geologists to estimate that the planet is internally generating between 30 and 44 terawatts of heat.

Some of this heat comes from the decay of radioactive elements. Based on studies of primitive meteorites known as carbonaceous chondrites, geologists have estimated Earth’s uranium and thorium content and calculated that about 19 terawatts can be attributed to radioactivity. But until now there has been nothing definitive about exactly how much uranium there is in the planet, says geologist Bill McDonough of the University of Maryland in College Park. “There are fundamental uncertainties.”

There is one way to lessen this uncertainty, and that is to look for antineutrinos. These particles are the antimatter equivalent of the uncharged, almost massless particles called neutrinos and are released when uranium and thorium decay to form lead. If antineutrinos are being created deep within the planet they should be detectable, because they can pass through almost all matter.

Now, the KamLAND antineutrino detector in Kamioka, Japan, has counted such antineutrinos. An international team of scientists analysed the data and found about 16.2 million antineutrinos per square centimetre per second streaming out from Earth’s core. They calculate that the nuclear reactions creating these particles could be generating as much as 60 terawatts, but are most likely putting out about 24 terawatts (Nature, vol 436, p 499). “We have made the first measurements of the radioactivity of the whole of Earth,” says John Learned, who heads the KamLAND group at the University of Hawaii in Manoa.

The KamLAND group’s finding is like unwrapping a birthday present, says McDonough. With time, as more antineutrinos are detected, KamLAND may be able to determine once and for all whether radioactivity is entirely responsible for heating Earth or whether other sources, such as the crystallisation of liquid iron and nickel in the outer core, also play a significant role. “[Detecting anti-neutrinos] is the way of the future in terms of hard numbers about the system,” says McDonough.

Antineutrinos could also reveal the radioactive composition of the crust and mantle, which will give geologists clues as to when and how they formed. But to do that, they will have to be able to pin down exactly where the antineutrinos are coming from, and this will require a whole network of detectors. “We are heading towards doing neutrino tomography of the whole Earth,” says Learned. “This is just the first step.”

Biever, Celeste. New Scientist. July 27, 2005: 9.