A SHORT HISTORY OF NEARLY EVERYTHING

When the skies are clear and the Moon is not too bright, the Rev Robert Evans, a quiet and cheerful man, lugs a bulky telescope onto the back deck of his home in the Blue Mountains, about 50 miles west of Sydney, and does an extraordinary thing. He looks deep into the past and finds dying stars.
Looking into the past is the easy part. Glance at the night sky and what you see is history and lots of it — not the stars as they are now but as they were when their light left them. For all we know, the North Star, our faithful companion, might actually have burnt out last January or in 1854 or at any time since the early 14th century — its light takes 680 years to reach us — and news of it just hasn’t reached us yet.
Stars die all the time. What Bob Evans does better than anyone else who has ever tried is spot these moments of celestial farewell. By day he is a kindly and now semi-retired minister in the Uniting Church in Australia, who does a bit of locum work and researches the history of 19th-century religious movements. But by night he is, in his unassuming way, a titan of the skies. He hunts supernovas.
A supernova occurs when a giant star, one much bigger than our own sun, collapses and then spectacularly explodes, releasing in an instant the energy of 100 billion suns, burning for a time more brightly than all the stars in its galaxy. “It’s like a trillion hydrogen bombs going off at once,” says Evans.
If a supernova explosion happened too close to us, we would be goners, according to Evans.
“It would wreck the show,” as he cheerfully puts it. But the universe is vast and supernovas are normally much too far away to harm us.
And supernovas are significant to us in one decidedly central way. Without them we wouldn’t be here. They are the link between the Big Bang and the creation of our own solar system.
The story of their discovery, and of their role in creating life on Earth, involves two of the most singular figures in 20th-century science, one of them a Yorkshireman whose recent obituary accused him of putting his name to “rubbish”. But for the moment let’s stick with the quiet and cheerful Bob Evans.
Most supernovas are so unimaginably distant that their light reaches us as no more than the faintest twinkle. For the month or so that they are visible, all that distinguishes them from the other stars in the sky is that they occupy a point of space that wasn’t filled before. It is these anomalous, very occasional pricks in the crowded dome of the night sky that Evans finds.
To understand what a feat this is, imagine a dining-room table covered in a black tablecloth and someone throwing a handful of salt across it. The scattered grains can be thought of as a galaxy. Now imagine 1,500 more tables like the first one — enough to fill an Ikea car park — each with a random array of salt across it. Now add one grain of salt to any table and let Bob Evans walk among them. At a glance he will spot it. That grain of salt is the supernova.
Evans’s is a talent so exceptional that Oliver Sacks, in An Anthropologist on Mars, devotes a passage to him in a chapter on autistic savants — quickly adding that “there is no suggestion that he is autistic”. Evans laughs, but he is powerless to explain quite where his talent comes from.
“I just seem to have a knack for memorising star fields,” he told me, with a frankly apologetic look, when I visited him and his wife Elaine in their picture-book bungalow on a tranquil edge of the village of Hazelbrook, out where Sydney finally ends and the boundless Australian bush begins.
“I’m not particularly good at other things,” he added. “I don’t remember names well.”
“Or where he’s put things,” called Elaine from the kitchen.
THE term supernova was coined in the 1930s by the first of our eccentric scientists, a memorably odd astrophysicist named Fritz Zwicky. Born in Bulgaria and raised in Switzerland, Zwicky went to the California Institute of Technology (Caltech) in the 1920s and there at once distinguished himself by his abrasive personality and erratic talents.
He didn’t seem to be outstandingly bright, and many of his colleagues considered him little more than an irritating buffoon. A fitness fanatic, he would often drop to the floor of the Caltech cafeteria and do one-armed press-ups. He was also notoriously aggressive, threatening to kill his closest collaborator, a gentle man named Walter Baade, on at least one occasion.
But Zwicky was capable of insights of the most startling brilliance. In the early 1930s he turned his attention to a question that had long troubled astronomers: the appearance in the sky of occasional unexplained points of light, new stars.
Improbably he wondered if the neutron — the subatomic particle which had just been discovered in England and was thus both novel and rather fashionable — might be at the heart of things.
It occurred to him that if a star collapsed to the sort of densities found in the core of atoms, the result would be an unimaginably compacted core. Atoms would literally be crushed together, their electrons forced into the nucleus, forming neutrons. You would have a neutron star.
Imagine a million really weighty cannonballs squeezed down to the size of a marble and — well, you’re still not even close. The core of a neutron star is so dense that a single spoonful of matter from it would weigh 90 billion kilograms. A spoonful! But there was more. Zwicky realised that after the collapse of such a star there would be a huge amount of energy left over — enough to make the biggest bang in the universe. He called these resultant explosions supernovas. They would be — they are — the biggest events in creation.
Interestingly, Zwicky had almost no understanding of why any of this would happen. And he was held in such disdain by most of his colleagues that his ideas attracted almost no notice. When, five years later, the great Robert Oppenheimer turned his attention to neutron stars in a landmark paper, he made not a single reference to any of Zwicky’s work, even though Zwicky had been working for years on the same problem in an office just down the corridor.
Zwicky was also the first to recognise that there wasn’t nearly enough visible mass in the universe to hold galaxies together, and that there must be some other gravitational influence — what we now call dark matter. But his deductions concerning dark matter wouldn’t attract serious attention for nearly four decades. We can only assume that he did a lot of press-ups in this period.
Supernovas are extremely rare. A star can burn for billions of years, but it dies just once and quickly, and only a few dying stars explode. Most expire quietly, like a camp fire at dawn. In a typical galaxy, consisting of 100 billion stars, a supernova will occur on average once every 200 or 300 years. Looking for a supernova, therefore, is a little bit like standing on the observation platform of the Empire State Building with a telescope and searching windows around Manhattan in the hope of finding, let us say, someone lighting a 21st-birthday cake.
So when the hopeful and softly spoken Evans got in touch with the astronomical community more than 20 years ago to ask if they had any usable field charts for hunting supernovas, they thought he was out of his mind.
In the whole of astronomical history before Evans started looking in 1980, fewer than 60 supernovas had been found. From 1980 to 1996 he averaged two discoveries a year — not a huge payoff for hundreds of nights of peering and peering. Once he found three in 15 days, but another time he went three years without finding any at all. This year he recorded his 36th.
Only about 6,000 stars are visible to the naked eye from Earth, and only about 2,000 can be seen from any one spot. With a 16in telescope such as Evans uses, however, you begin to count not in stars but in galaxies. From his deck, he supposes he can see between 50,000 and 100,000 galaxies.
Before I visited him, I had imagined that he would have a proper observatory in his back yard, with a sliding domed roof and a mechanised chair that would be a pleasure to manoeuvre. In fact, he led me not outside but to a crowded storeroom off the kitchen where he keeps his books and papers and where his telescope — a white cylinder that is about the size and shape of a household hot-water tank — rests in a home-made, swivelling plywood mount.
When he wishes to observe, he carries them, in two trips, to a small sun deck off the kitchen. Between the overhang of the roof and the feathery tops of eucalyptus trees growing up from the slope below, he has only a letterbox view of the sky, but he says it is more than good enough for his purposes.
On a table beside the telescope were stacks of blurry photos with little points of haloed brightness. One he showed me depicted a swarm of stars in which lurked a trifling flare that I had to put close to my face to see. This, Evans told me, was a star in a constellation called Fornax from a galaxy known to astronomy as NGC 1365. (NGC stands for New General Catalogue, where these things are recorded.) For 60m silent years, the light from this star’s spectacular demise travelled through space until one night in August 2001 it arrived at Earth in the form of a puff of radiance, the tiniest brightening, in the night sky. It was, of course, Evans on his eucalypt-scented hillside who spotted it.
“There’s something satisfying, I think,” Evans said, “about the idea of light travelling for millions of years through space and just at the right moment as it reaches Earth someone looks at the right bit of sky and sees it. It just seems right that an event of that magnitude should be witnessed.”
I couldn’t get away from the nagging question: what would it be like if a star exploded nearby? Our nearest stellar neighbour is Alpha Centauri, 4.3 light years away. I had imagined that if there were an explosion there we would have 4.3 years to watch the light of this magnificent event spreading across the sky, as if tipped from a giant can. What would it be like if we had four years and four months to watch an inescapable doom advancing towards us, knowing that when it finally arrived it would blow the skin right off our bones? Would people still go to work? Would farmers plant crops? Would anyone deliver them to the shops?
Back in the town in New Hampshire where I live, I put these questions to John Thorstensen, an astronomer at Dartmouth College. “Oh no,” he said, laughing. “The news of such an event travels out at the speed of light, but so does the destructiveness, so you’d learn about it and die from it in the same instant. But don’t worry because it’s not going to happen.”
The reason we can be reasonably confident of this, Thorstensen explained, is that it takes a particular kind of star to make a supernova in the first place. A candidate star must be 10 to 20 times as massive as our own sun, and “we don’t have anything of the requisite size that’s that close. The universe is a mercifully big place”.
Which brings us to the real significance of supernovas. For a long time the Big Bang theory — the moment of creation — had a gaping hole that troubled a lot of people. It couldn’t begin to explain how we got here.
Although 98% of all matter that exists was created in the Big Bang, that matter consisted exclusively of light gases: helium, hydrogen and lithium. Not one particle of the heavy stuff vital to our own being — carbon, nitrogen, oxygen and all the rest — emerged from the gaseous brew of creation.
But — and here’s the troubling point — to forge these heavy elements, you need the kind of heat and energy thrown off by the Big Bang. Yet there was only one Big Bang and it didn’t produce them. So where did they come from?
THE man who found the answer to that question was a cosmologist who heartily despised the Big Bang as a theory and coined the term sarcastically as a way of mocking it. He was a Yorkshireman called Fred Hoyle, and he was almost as singular in manner as Fritz Zwicky.
Hoyle, who died in 2001, was described in an obituary in Nature as a “cosmologist and controversialist”, and both of those he most certainly was. He was, according to Nature’s obituary, “embroiled in controversy for most of his life” and “put his name to much rubbish”.
Hoyle claimed, for instance, and without evidence, that the Natural History Museum’s treasured fossil of an archaeopteryx was a forgery along the lines of the Piltdown hoax, causing much exasperation to the museum’s palaeontologists, who had to spend days fielding phone calls from journalists all over the world.
He coined the term Big Bang, in a moment of facetiousness, for a radio broadcast in 1952. He pointed out that nothing in our understanding of physics could account for why everything, gathered to a point, would suddenly and dramatically begin to expand in the way Big Bang theory assumes.
Hoyle favoured a steady-state theory in which the universe was constantly expanding and continually creating new matter as it went. He also realised that if stars imploded they would liberate huge amounts of heat — 100m Celsius or more, enough to begin to generate the heavier elements in a process known as nucleosynthesis. In 1957, working with others, Hoyle showed how the heavier elements were formed in supernova explosions. For this work, WA Fowler, one of his collaborators, received a Nobel prize. Hoyle, shamefully, did not.
According to Hoyle’s theory, an exploding star would generate enough heat to create all the new elements and spray them into the cosmos, where they would form gaseous clouds — the interstellar medium as it is known — that could eventually coalesce into new solar systems. With the new theories it became possible at last to construct plausible scenarios for how we got here. What we now think we know is as follows.
About 4.6 billion years ago a great swirl of gas and dust some 15 billion miles across accumulated in space where we are now and began to aggregate. Virtually all of it — 99.9% of the mass of the solar system — went to make the Sun. Out of the floating material that was left over, two microscopic grains floated close enough together to be joined by electrostatic forces. This was the moment of conception for our planet.
All over the inchoate solar system, the same was happening. Colliding dust grains formed larger and larger clumps. Eventually the clumps grew large enough to be called planetesimals. As these endlessly bumped and collided they fractured or split or recombined in random permutations, but in every encounter there was a winner, and some of the winners grew big enough to dominate the orbit around which they travelled.
It all happened remarkably quickly. To grow from a tiny cluster of grains to a baby planet some hundreds of miles across is thought to have taken only a few tens of thousands of years. In just 200m years, possibly less, the Earth was essentially formed.
At this point, about 4.4 billion years ago, an object the size of Mars crashed into Earth, blowing out enough material to form a companion sphere, the Moon. Within weeks, it is thought, the material had reassembled itself into a single clump, and within a year it had formed into the spherical rock that companions us yet.
When Earth was only about a third of its eventual size, it was probably already beginning to form an atmosphere, mostly of carbon dioxide, nitrogen, methane and sulphur. Hardly the sort of stuff we would associate with life, and yet from this noxious stew, life formed. Carbon dioxide is a powerful greenhouse gas. This was a good thing because the Sun was significantly dimmer back then. Had we not had the benefit of a greenhouse effect, Earth might well have frozen over permanently, and life might never have got a toehold. Somehow life did.
For the next 500m years the young Earth continued to be pelted relentlessly by comets, meteorites and other galactic debris, which brought water to fill the oceans and the components necessary for the successful formation of life. It was a singularly hostile environment, and yet somehow life got going. Some tiny bag of chemicals twitched and became animate. We were on our way.
(Extract from "A Short History of Nearly Everything" by Bill Bryson)