Indeed, if you add up all the stuff remaining in the hot early universe, it's mostly composed of photons, the ubiquitous carrier of the electromagnetic force. The cosmos at this stage is a proper plasma, the same state of matter you would find in a lightning bolt or the interior of the sun (please don't go personally looking for this; just take my word for it).
In a plasma you've got protons, neutrons, electrons, and photons, all bouncing around together angrily. And some of the protons and neutrons have hooked up to form helium or lithium, and the energy of the surrounding soup is now too feeble to break them apart.
But the battle between matter and antimatter was a Pyrrhic victory—only one proton per billion survived the great primordial war, leaving the numbers of normal matter severely underpopulated. So the plasma was strongly dominated by the photons, the light. Within this opaque energetic soup, atoms (yes, finally, atoms) would try to form when a stray electron would get caught in an orbital around a proton or other nucleus. But as soon as the bond would form, a home-wrecking photon would slam in to destroy the newfound relationship, slapping the electron away and back into the mix of singles.
Nothing could withstand the domination of the photons. The strong nuclear force was too short-range; in the expanded cosmos, its influence was confined to the nuclei themselves. The weak nuclear force, as exotic and essential as it can be, was always a pushover. And gravity? By far the weakest of the forces, billions upon billions (upon a few more billions) of times weaker than even the so-called weak force, it was hopeless against these interactions.
But gravity did have one thing going for it—it was playing the long game, fighting with guerilla tactics, laying traps and ambushes so that the tyrannical photons would eventually sow the seeds of their own downfall and forever be relegated to cosmic irrelevance.
Gravity won its ultimate victory by fighting dirty. It couldn't hope to free the helpless atoms from the relentless electromagnetic onslaught in one-to-one battles. But Einstein, de Sitter, Friedmann, Lemaître, and all the others had discovered that gravity, and gravity alone, was sufficient to describe the dynamics of the universe at the very largest scales. While other forces may govern small interactions (like the formation of an atom an hour into the history of the cosmos or the rhythm of your heartbeat billions of years later), gravity is the only force that operates at infinite range and affects all things, regardless of size, shape…or electric charge.
That's the key. Photons could win individual battles, but the war was over before electromagnetism even started marshaling its forces. That's because the universe is, on balance, electrically neutral. For every negative charge, there's a corresponding positive charge out there, somewhere. So on average, any large-scale electromagnetic interactions simply cancel out. There's no coordination, no grand strategy, no overarching plan.
And gravity had a plan. Gravity was driving the expansion of the universe—that's the ultimate lesson of general relativity—and expansion makes the density drop for everyone. For matter, it's a simple cubic relationship. Put one particle in a box, you have a density of…one particle per box. Expand the box by doubling each side, and you now have the equivalent of 2 × 2 × 2 = 2³ = 8 boxes, so your density has dropped to one particle per eight boxes.
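For the numerically inclined, here is that box-doubling arithmetic as a tiny illustrative sketch (in Python, with toy numbers of my own choosing rather than anything measured):

```python
# Illustrative sketch: matter density dilutes with the cube of the expansion factor.
# Toy values only; nothing here is a cosmological measurement.

def matter_density(initial_density, expansion_factor):
    """Density after every side of the box grows by `expansion_factor`."""
    return initial_density / expansion_factor**3

rho_start = 1.0  # one particle per box, in arbitrary units
for factor in (1, 2, 10):
    print(f"stretch each side by {factor:2d}x -> density = {matter_density(rho_start, factor):.4f}")
# Doubling each side gives 2 x 2 x 2 = 8 boxes, so the density drops to 1/8 = 0.125.
```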
This goes for matter and radiation in equal measure, so at first blush it seems like a wash. But photons take an extra hit from the expansion of the universe. They get elongated by the stretching of space-time itself—as the universe grows, their wavelengths grow longer as well. In other words, the light redshifts, which is what Hubble observed when he examined the light from distant galaxies in the 1920s.
And here's the kicker: redshifted light has less energy.
It's happening in our universe now, and it happened in the universe long ago. As the universe aged, the photons not only diluted, but also lost energy. It was an endurance race between matter and radiation: who could outlive the other in the inexorably evolving cosmos? This was gravity's wicked plan all along; the radiation was simply born to lose.
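And here is why radiation was born to lose, in the same toy arithmetic: the redshift costs radiation one extra factor of the expansion, so its energy density falls as the fourth power of the stretch while matter's falls only as the cube. The head-start value below is an illustrative stand-in for the roughly billion-to-one photon advantage, not a measured number.

```python
# Toy endurance race between matter and radiation energy densities.
# Matter dilutes as 1/a^3; radiation dilutes as 1/a^4 (the extra 1/a is the redshift).
# The starting head start is an illustrative stand-in, not a measurement.

matter_start = 1.0
radiation_start = 1.0e9  # roughly a billion-to-one advantage for the photons

for a in (1.0, 1.0e3, 1.0e6, 1.0e10, 1.0e12):
    matter = matter_start / a**3
    radiation = radiation_start / a**4
    leader = "radiation" if radiation > matter else "matter"
    print(f"stretch factor {a:8.0e}: radiation/matter = {radiation / matter:.1e} ({leader} leads)")
# The ratio shrinks as 1/a, so no head start is big enough to save radiation forever.
```

Run it and you can watch radiation's enormous lead evaporate; the only question is when, not whether.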
Because radiation had such overwhelming numbers, this was indeed a long struggle, the longest such war for survival that the universe had yet seen. But gravity is gravity is gravity: quiet and unassuming, but relentless.
The universe expanded and cooled. The densities of matter and radiation dropped. The primordial plasma lost its ferocity. Year by year, the photons grew tired, their influence over matter diminishing.
And 380,000 years into the history of the cosmos, when our observable universe was about one-billionth its present volume, radiation gave up the fight for good.
The first atoms were born.
The name “big bang” was coined by the sharp-minded and sharper-tongued astronomer Fred Hoyle, who didn't exactly take a shine to this expanding-universe business.1
Hoyle had a good point—Edwin Hubble's 1929 observations of a general redshifting of light from distant galaxies only suggested an expanding universe. But the “conspiracy” option, where we are the literal center of the universe with all the galaxies literally flying away from us in a predetermined pattern in order to reproduce a straight-line relationship between distance and redshift, was never really under serious consideration.
Why? Because Copernicus, that's why. Hundreds of years ago, “Hey, folks, maybe the universe isn't focused on us, just saying” was a radical, thought-provoking notion worthy of much narrowing of the eyes and muttering under the breath. But the eventual success of that (initially flawed) picture, plus decade after decade of the universe rubbing our noses in it—proving over and over that the fantastic energies and vast scales really don't care about us—led, by the twentieth century, to a much more cautious generation of astronomers.
For them, it was simply prudent to assume that we're not special. Better safe than sorry, I suppose. This central conceit of modern cosmological thinking goes by a few names, like the Copernican principle or the mediocrity principle, and we'll return to the topic in a more sobering discussion later.
What about “tired light,” the concept that it's not expanding space that's sapping the energy from light, shifting it to redder hues, but simply that light loses energy as it travels? One challenge to this idea is that in order to make light grow tired, you must have that light interact with some sort of substance sprinkled in the intergalactic gulfs—say, magical redshifting pixie dust, purely for example. Since the light will bounce off that magical redshifting pixie dust, the light must also scatter. So images of distant galaxies must be slightly fuzzier than closer ones, because their light has had more interactions with the magical redshifting pixie dust. Plus, that same magical redshifting pixie dust must be sprinkled inside our own galaxy too, so stars on the far side of the Milky Way should be redder and fuzzier than our closer neighbors.
The instruments of the first half of the twentieth century didn't have quite the measuring sophistication to conclusively rule out tired light, but the concept never really caught on. There were no known physical mechanisms for making light tired, and it conflicted with everything else we knew about the photons among us. Even Fritz Zwicky, the bolo-rocking astrophysicist who cooked up the idea, tossed as many varieties of models into his paper as he could think of (and crossed a few off the list in the very same paper), in the spirit of “Let's make sure we don't leave any stone unturned before we jump into an expanding universe.”
For those of you with highly skeptical hearts, don't fret. More modern astronomers with instruments of sufficient sophistication have indeed followed up on these lines of thinking and found them to be less than fruitful. Tired light is a tired idea.2
Expanding universe it is, then. But perhaps not necessarily the big bang “primordial atom” as cooked up by the Catholic priest Lemaître in his application of relativity to the universe. After all, the cosmos having an “origin” did seem a bit too close to Genesis for some, and hadn't we moved past this whole using-the-Bible-to-support-our-arguments line of thinking back in the days of Kepler?
Enter Fred Hoyle, an amazingly brilliant astronomer who, as far as I can tell, decided to take up the mantle of Curmudgeon Superior from Galileo and seemed to openly work against his own best interests, burning bridges faster than he could build them. He led vital work into the nature of how stars function, but in this telling of the universe's story, he serves as the devil's advocate against the consensus growing around Hubble and Einstein's cosmological offspring.
And, like before, I have a soft spot in my heart for the die-hard skeptics in history, even when they become so cantankerous that nobody invites them to any parties. They're annoying, but oh so useful.
The ultimate too-cool-for-school kind of guy, Hoyle often took the opposite position to whatever was popular with his fellow scientists. I must say this was an awesome tactic, because (a) science needs healthy debate and skepticism to survive, and (b) he was smart enough for it to pay off most of the time.
But if a cosmic conspiracy and tired light were off the table for cosmological consideration, what possible alternative explanation was there? You couldn't argue with the data—the results of Hubble and company were too squeaky-clean for any charges of shenanigans. But you could always argue against the theory. Not general relativity itself—by the 1930s, Einstein's theory had already trounced any other potential challengers to the title of Explainer of Gravity—but there was one little crack that Hoyle identified. A small one, but big enough for him to drive a wedge into it and force the scientific community to hold up, take a breath, and ask are you sure? before jumping off the cliff.
The big mental hurdle that you have to leap, the metaphysical pill you have to swallow, the elephant in the room that you have to address if you want to take this big bang picture seriously is that the universe has, fundamentally, a finite age. It has a beginning. There is a specific moment, in the countable past, when the universe switched from not existing to actually existing.
If you're religiously minded, that's not such a big deal. But Hoyle wasn't arguing against the so-called big bang (and even though he didn't intend the tag to be derisive, given his cantankerous nature I can't help but see the corner of his lips curl when he coined the phrase during a BBC radio show in 1949) theory on religious grounds. Far from it. “Everything we see in the universe came from, well, somewhere” isn't the most scientific of statements, but on its face it's not malignant.
Instead, Hoyle challenged the cosmic establishment to go all the way to the finish line when they insist that the universe doesn't care about us. If you're going to elevate Copernicus such that his name gets stuck in front of the word “principle,” the thinking goes, then you need to finish what you started.
We are not the center of the universe. We are not special. We do not have a special vantage point on the heavens—our view is, statistically, just like anybody else's. Ergo, from our perspective it looks pretty much the same in every direction. In the jargon, our universe is isotropic.
You can take it one step further and assert/assume that on average, at the largest scales, the universe is generally the same from place to place. It is homogeneous, like the milk you buy from the store. Therefore, nobody is the center. There is no special location in the universe that is wildly different or elevated or distinct from any other. Again, I'm repeating the phrase on average, at the largest scales because this is cosmology: you can only think big about these kinds of questions.
These two ideas combined, that the universe is both isotropic and homogeneous, form the backbone of general relativity's insights into the cosmos, and hence they are often referred to as the cosmological principle. They are the basic assumptions needed to simplify Einstein's nasty equations enough that you can get work done with the mathematics, and they serve as fundamental statements about the nature of our universe and our role in it.
Hoyle and colleagues rattled the cages: you want a boring universe, where nothing is very different from place to place? That's fine, that's great, that's wonderful. So then shouldn't the universe be pretty much the same from time to time as well? In other words, shouldn't our cosmos be the same through space and time together, like in this concept of space-time that everyone is so excited about?
The rebuttal to the big bang's cosmological principle was a perfect cosmological principle, one that rested on the assumption that the universe is eternal and unchanging, that it is indeed homogeneous, through both the vastness of space and the deepness of time. This was the default position just a few decades earlier, before Hubble astounded the world, so why let his results spoil the fun?
This might have ended up just empty words, but like I said, Hoyle had chops. Together with some colleagues he formulated an attractive alternative to the big bang—the steady-state model.3 In this picture, requiring only a small and innocuous alteration to the equations of general relativity, matter is continuously created in the universe, with the rate of creation matching the outward expansion. Thus as the universe continues to grow fatter, its density remains constant—there are always new partygoers joining the big bash as the room gets bigger.
Steady-state cosmology fit the data just fine. Arguments that it seemed too absurd to have matter popping into existence all the time (“Where does your stuff come from?”) were met with sharp rejoinders—the big bang model also posited the spontaneous creation of matter (“Where does your stuff come from, pal?”). It simply stretched the instant, fiery explosion of matter into a long, drawn-out slow burn. A simmer rather than a boil.
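For a sense of just how gentle that simmer is, here is a rough, back-of-the-envelope estimate of my own (using modern ballpark values for the expansion rate and the average density, not figures from Hoyle's paper): keeping the density constant requires creating only on the order of one hydrogen atom per cubic meter per billion years.

```python
# Back-of-the-envelope: how much matter creation would keep the density constant?
# Setting d(rho)/dt = -3*H*rho + creation_rate to zero gives creation_rate = 3*H*rho.
# Values are modern ballpark figures of my own choosing, not Hoyle's originals.

H0 = 2.3e-18           # expansion rate, ~70 km/s/Mpc expressed in 1/s
rho = 9.0e-27          # rough average cosmic matter density, kg per cubic meter
m_hydrogen = 1.67e-27  # mass of a hydrogen atom, kg
seconds_per_billion_years = 3.15e16

creation_rate = 3 * H0 * rho  # kg per cubic meter per second
atoms = creation_rate * seconds_per_billion_years / m_hydrogen
print(f"~{atoms:.1f} hydrogen atoms per cubic meter per billion years")
```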
The match was set between the perfect cosmology of the steady-state picture and the finite-aged cosmos of the big bang. And through the late 1940s and into the 1950s, there was no clear winner.
After 380,000 years of waiting, electrons could finally join their hadronic cousins and form the first atoms. Before this time, the radiation had already diluted to the point that matter was the dominant player, but still it fought its losing, helpless battle, preventing the formation of atoms. Finally, though, it called it quits; matter and radiation would never affect each other on cosmic scales again.
This remarkable event would have just flashed by in a haze known to us only dimly via equations and simulations, like all the other major transitions before it, except that the universe was now, for the first time, transparent. Before the formation of atoms, the universe was filled with hot, dense plasma. Just as the radiation prevented the atoms from forming stable long-term bonds, the thick soup of matter prevented the radiation from traveling freely. A photon would attempt to make a great leap at light speed, only to run smack-dab into a klutzing electron.
But now that neutral hydrogen and helium had formed, deliciously transparent, light had room to move. Over a relatively brief window of time, about ten thousand years or so, the fog of the primordial universe lifted, and a more recognizable, clearer universe became the norm.
For obscure historical reasons, physicists refer to this event by the name recombination, as if this were the second time that atoms got together in the universe, which it wasn't, unless you count being squished into an exotic quark-gluon plasma as “together.” I personally prefer photon decoupling or, less formally, the best fireworks show ever.
The light emitted was literally white-hot, corresponding to a blackbody temperature of about three thousand kelvin, about half the temperature of the surface of the sun.
I know, I know. Blackbody temperature? It's perhaps one of the most confusing terms in all of physics (and that's saying something). It comes from the devices used in the nineteenth century to study the radiation emitted from as-black-as-possible objects, objects that drank in as much of the surrounding radiation as possible and were at a fixed temperature. Perhaps a more descriptive term is thermal radiation, or maybe even warm and/or hot stuff radiation.
All stuff gives off radiation of some form. All that wiggling, jiggling, and rotating at the molecular level releases some of that energy in the form of light. Since some wiggles and jiggles are bigger or smaller than other wiggles and jiggles, the radiation emitted covers a broad spectrum, with a distinct peak depending on the temperature.
For example, you. At a temperature of ninety-eight degrees Fahrenheit, you are emitting all sorts of radiation, most of it in the infrared—which is why infrared goggles are so handy for seeing people in the dark. But you're also giving off a little bit of microwaves (enough to be detectable by a standard household satellite dish) and even visible light (not enough to be seen, but it's there).
The cooler an object is, the longer the wavelength of the majority of light it gives off. The hotter, the shorter. The full description of blackbody (aka thermal) radiation was cracked by Max Planck, a name we already encountered as we tried to come up with a numbering system to describe the earliest moments of the universe. In the process of describing blackbody radiation, he also inadvertently invented quantum mechanics, but that's a story for another chapter.
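If you want to put numbers on that cooler-means-longer rule, Wien's displacement law does it in one line. Here is a small illustrative sketch using standard textbook values for the constant and the temperatures:

```python
# Wien's displacement law: a blackbody's peak wavelength is inversely
# proportional to its temperature. Constants are standard textbook values.

WIEN_CONSTANT = 2.898e-3  # meter-kelvins

def peak_wavelength_nm(temperature_kelvin):
    """Wavelength (nanometers) at which thermal radiation peaks."""
    return WIEN_CONSTANT / temperature_kelvin * 1e9

print(f"human body, ~310 K:          {peak_wavelength_nm(310):6.0f} nm (infrared)")
print(f"recombination glow, 3000 K:  {peak_wavelength_nm(3000):6.0f} nm (just past visible red)")
print(f"surface of the sun, ~5800 K: {peak_wavelength_nm(5800):6.0f} nm (visible)")
```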
At the moment of the separation between radiation and matter 380,000 years into the history of our universe, the cosmos was in almost perfect equilibrium. Radiation and matter were bouncing around ferociously, and those countless interactions created an essentially ideal blackbody scenario. Thus when the light was finally released, it carried that imprint, perfectly mimicking a laboratory device at a temperature of three thousand kelvin.
That primordial light permeated the cosmos. Truly for the first time, the densities had dropped so much that it could travel for countless light-years before interacting with a stray bit of matter. It soaked the universe but was no longer a part of it. And it was bright, like having the surface of the sun surrounding you on all sides. Indeed, this sudden release of radiation generated more photons than all the stars will produce, ever, in the entire future history of the cosmos.
But that event was a long time ago. We aren't bathed in white-hot radiation from the early universe. What happened? The quiet but inexorable expansion of the universe happened. Gravity didn't just win; it rubbed radiation's nose in it. With the continued expansion, the radiation was stretched and stretched, redshifted down just like any other long-distance photon in the universe. The primordial light was still there, bathing the sky, but no longer in the visible range of the human eye.
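To close the loop with a little arithmetic: blackbody radiation stretched by some factor looks exactly like blackbody radiation at the original temperature divided by that factor. Using the standard textbook stretch of roughly 1,100 since recombination (a figure I'm importing, not deriving here), the white-hot glow becomes a faint microwave hiss a few degrees above absolute zero:

```python
# Sketch: what expansion does to the recombination light.
# The stretch factor of ~1100 since recombination is the standard textbook figure.

WIEN_CONSTANT = 2.898e-3   # meter-kelvins

T_recombination = 3000.0   # kelvin, the glow at 380,000 years
stretch_factor = 1100.0    # how much wavelengths have grown since then

T_today = T_recombination / stretch_factor
peak_wavelength_mm = WIEN_CONSTANT / T_today * 1e3

print(f"effective temperature today: {T_today:.1f} K")
print(f"peak wavelength today:       {peak_wavelength_mm:.1f} mm (microwaves)")
```

That roughly 2.7 kelvin is, not coincidentally, the temperature of the cosmic microwave background we measure today.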