Three months previously, on September 3, 1949, a Geiger counter mounted in the nose of an American B-29 weather-monitoring plane, which was flying reconnaissance missions in the western Pacific between Yokota in Japan and Eielson Air Force Base in Alaska, began to chatter furiously. Puzzled technicians swarmed to examine the records and soon determined that atomic radiation seemed to be pouring into the sky, from somewhere.
Two days later a second plane, based in Guam, flew over the same route and picked up signs of even more radioactivity: barium, cesium, and molybdenum fission isotopes were found in the upper atmosphere, signatures that suggested that either there had been a nuclear accident somewhere to the east of the plane’s track or someone had exploded an atomic weapon.
It turned out to be the latter. An atomic bomb, known in Russia as First Lightning and elsewhere, eventually, as Joe 1, had been exploded by Joseph Stalin’s Soviet Union five days before, in an experiment conducted at a hitherto unknown and subsequently top-secret nuclear test site at Semipalatinsk, in Kazakhstan. The successful detonation of the bomb, which was modeled on, looked uncannily like, and was in fact slightly more powerful than the plutonium weapon the Americans had dropped on Nagasaki four years previously, stunned the outside world. Few Americans, and few of their allies, thought the Soviets would be able to catch up with the United States in nuclear capability for many more years. But as was discovered only months later, Moscow had had a spy at Los Alamos, Klaus Fuchs; and though debate continues to this day about just how valuable the information was that this brilliant young German-born British physicist passed to the Soviets, it is generally agreed that, perhaps more than any other spy before or since, Klaus Fuchs changed world history.
For by allowing the Soviet Union to construct nuclear weapons, and ultimately to make hydrogen bombs and all the other terrible paraphernalia of the nuclear age, his gift of secrets permitted the Cold War between East and West officially to commence—with the consequence that for the next half century, and perhaps for longer still, the planet lived in the shadow of the very real possibility of nuclear annihilation.
There was another consequence of this development, however, one that to this day is of great significance to the scientific community and, as it happens, has some bearing on the structure of this book. It concerns radioactive pollution.
The bombs themselves, big and small, fission and fusion, launched by missiles or dropped from aircraft or fired from guns, would eventually be made by the United States, the Soviet Union, Britain, France, China, Israel, India, Pakistan, North Korea, and perhaps one day Iran; and the hundreds upon hundreds exploded in the atmosphere over the coming Cold War years would pollute the earth’s air with myriad poisonous and radioactive decay products. Until atmospheric testing was banned for most of the nuclear powers by the Partial Test Ban Treaty of 1963, the world lived under a blanket of increasingly radio-polluted air, with effects likely to last for thousands of years.
Crucially, one of the many products created by atomic explosions is the unstable radioactive isotope of carbon known as carbon-14.
This isotope is already naturally present in the world, created when cosmic-ray neutrons strike nitrogen atoms high in the atmosphere, and it exists in extremely tiny but measurable amounts: for every trillion atoms of normal, nonradioactive carbon-12 in the air, there is roughly one atom of carbon-14.
Plants absorb this carbon during photosynthesis, and animals that consume the plants absorb it, too. So while an animal or a plant is alive, its cells contain both carbon-12 and carbon-14, and in the same ratio as exists in the atmosphere.
However, once the plant or animal dies, its cells stop absorbing carbon—and at that precise moment, the ratio of the two isotopes begins to change, for the simple reason that carbon-14 is unstable, and begins to decay. The isotope has a half-life of 5,730 years, meaning that after that period, half of it will have vanished. After another 5,730 years, half of what remains will have disappeared, and so on and so on. And, it is important to note, the changing ratio of carbon-12 to carbon-14 in the dead animal or plant can be very accurately measured.
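To put the arithmetic of that paragraph into symbols (a standard textbook formulation, offered here only as a gloss on the prose, not drawn from the narrative itself): if a sample held $N_0$ atoms of carbon-14 at death and holds $N(t)$ of them $t$ years later, then

\[
N(t) = N_0 \left(\tfrac{1}{2}\right)^{t/5730},
\qquad\text{and therefore}\qquad
t = 5730 \,\log_2\!\frac{N_0}{N(t)} = \frac{5730}{\ln 2}\,\ln\frac{N_0}{N(t)} .
\]

Setting $N(t) = N_0/2$ gives $t = 5{,}730$ years, one half-life, exactly as described.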
What followed this discovery—first made in 1946 by a University of Chicago chemist named Willard Libby, who would win the Nobel Prize for it—was the realization that by measuring the amount of carbon-14 remaining in a dead creature or plant, it should be possible to date, and with some precision, just when that plant or animal died. Thus was born the technique of carbon dating, and it has been in use ever since, a vital tool of archaeologists and geologists in determining the age of found organic materials.
The technique requires one constant, though: for any age calculation to be accurate, the baseline ratio of naturally occurring carbon-14 to carbon-12 has to be a real baseline; it must, in other words, stay the same as it always has been. The figure accepted by Libby and his colleagues and used as the base was the aforementioned one in a trillion: one atom of carbon-14 for every trillion atoms of carbon-12. With that figure firmly in place, the age calculations could be made, and reliably.
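As a minimal sketch of the calculation this baseline makes possible (the function name, the one-in-a-trillion constant, and the sample value below are illustrative assumptions, not taken from any actual dating software):

```python
import math

HALF_LIFE_YEARS = 5730.0   # half-life of carbon-14 quoted above
BASELINE_RATIO = 1.0e-12   # assumed atmospheric C-14/C-12 ratio: one part in a trillion

def radiocarbon_age(measured_ratio: float) -> float:
    """Years since death, assuming the sample began at the atmospheric baseline ratio."""
    # Solve measured_ratio = BASELINE_RATIO * (1/2) ** (t / HALF_LIFE_YEARS) for t.
    return HALF_LIFE_YEARS * math.log2(BASELINE_RATIO / measured_ratio)

# Example: a bone whose C-14/C-12 ratio has fallen to half the baseline
print(radiocarbon_age(0.5e-12))  # ~5730 years: exactly one half-life has elapsed
```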
But then came the unexpected. As soon as the testing of atomic bombs began in earnest in the 1950s, that baseline figure began to change. The bombs created immense mushroom clouds of lethal chemistry. They thrust, among other things, a sizable amount of extra carbon-14 into the atmosphere, their torrents of neutrons striking atmospheric nitrogen just as cosmic rays do, upsetting the baseline figure and causing the dating calculations suddenly to go awry.
Radiochemists around the world monitored the situation, and as the levels of new carbon kept increasing, test by test and year by year, they kept writing correction algorithms to undo the distortions caused by the bombs. But as more and more bombs produced more and more of the unstable carbon, the situation was fast becoming complicated and irritating; and for a discipline that placed value on near-absolute precision, it threatened to render age determinations so inaccurate as to be useless.
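A toy illustration of why the bomb-made carbon was so troublesome (this is emphatically not the correction machinery the radiochemists actually built, merely the naive formula sketched above fed a sample from a bomb-inflated atmosphere):

```python
import math

HALF_LIFE_YEARS = 5730.0
PRE_BOMB_BASELINE = 1.0e-12  # the pre-bomb atmospheric C-14/C-12 ratio assumed above

def naive_age(measured_ratio: float) -> float:
    """Age computed against the pre-bomb baseline, ignoring bomb-made carbon-14."""
    return HALF_LIFE_YEARS * math.log2(PRE_BOMB_BASELINE / measured_ratio)

# A plant that grew in a test-era atmosphere carrying, say, 50 percent extra carbon-14
# starts out "too hot" and so appears to have died before it was born:
print(naive_age(1.5e-12))  # roughly -3,350 years, an obviously nonsensical result
```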
To address this problem, a decision was reached that would unscramble matters. A date was chosen before which the atmospheric baseline could be trusted, and before which radiocarbon dating could therefore be regarded as accurate. Results for any material that had lived and died after that selected date would have to be regarded with suspicion.
And the date selected—of what is now known as the start of the standard reference year, or the Index Year—was January 1, 1950. Before January 1950 the atmosphere was radiochemically pure. After January 1950 it was sullied, fouled by bomb-created isotopes. So this date, this otherwise unexceptional Sunday when Ho Chi Minh began his campaign in Vietnam, when the Japanese started recalculating how old they were, the day the music died in Grand Central Terminal, would become for scientists a new Year Zero.
The choice of the date was scientifically elegant, logical, and precise. And it would soon spread beyond the world of science alone. For it would have an impact on the entire question of what was meant by the use of the simple word ago.
Science had until this point played no part in deciding where human calendars should begin their count. The fact that these words are being written in the year 2015 has to do, not at all with science, but with the decidedly nonscientific and imprecise concepts of myth, faith, and belief. For, in refining the meaning of ago, most of the Western world would employ the initials BC and AD. It was said that something occurred a number of years “before Christ,” or in the Year of the Lord, “Anno Domini,” as in AD 2015.
But this was, of course, contentious to non-Westerners and to nonbelievers. It was a kind of notation bound to fall foul of those for whom Jesus Christ meant little; and so in recent times other circumlocutions have been offered to help soothe hurt feelings. Most commonly there was BCE, which stood for “before the Christian era” or, for the secular-minded, “before the Common Era.”
Yet even this was still a fudge, still woefully imprecise, still essentially based on myth. And BCE did not appeal to scientists, especially once carbon dating and other, more precise atomic dating techniques had been developed. So they eventually came up with the idea of using the initials BP, “before present.” The Wisconsin ice age, for instance, reached its culmination some twenty thousand years BP.
All that the acceptance of this new notation required was agreement on just when “the present” was. So, in the early 1960s, a pair of radiochemists came up with an answer. They suggested the use of the same standard reference year, the Year Zero moment of January 1, 1950.
Their suggestion seemed logical, neat, appropriate. Everyone, more or less, agreed. So that date is now accepted well-nigh universally among scientists as marking that most ephemeral and fleeting of concepts, the present. And the present begins at the start of January 1950.