Stargazers spot dark energy
Still real, we think
The distribution of galaxies and the time it takes for galactic clusters to form are behind a University of Queensland claim confirming the existence of dark energy.
Dark energy has been invoked in defence of Einsteinian models of the universe ever since the 1990s, when astrophysicists identified the accelerating expansion of the universe.
Since the “inflationary universe” didn’t fit with Einstein’s predictions, either Einstein was wrong or a new form of energy was required.
The great physicist had once recast his equations to include a similar idea, but wasn’t comfortable with the solution and later called it his “greatest blunder”.
The UQ researchers worked with 26 astronomers from 14 institutions in a project conducted at the Anglo-Australian Telescope. The “WiggleZ Dark Energy Survey” mapped the distribution of 200,000 galaxies.
This, according to Professor Michael Drinkwater from the UQ School of Mathematics and Physics, is “the first individual galaxy survey to span such a long stretch of cosmic time”.
Generally, supernovae are used as the basis of galactic measurement. This survey instead mapped the distance between galactic clusters, since pairs of galaxies have a statistical preference for being separated by a distance of 490 million light years.
This measurement was used to confirm measurements made using supernova brightness.
The researchers worked with NASA’s Galaxy Evolution Explorer. Along with Professor Drinkwater, the WiggleZ survey was led by Swinburne University’s Dr Chris Blake, Professor Warwick Couch and Professor Karl Glazebrook. ®
The "quick" answer
Back when the universe was extremely young and extremely hot, the sheer energy in radiation kept hydrogen fully ionised. No sooner would an electron condense onto a proton than a photon came along and smacked it out again. The result was a system much like two balls connected by a spring rolling down opposite sides of a hill: gravity made all the protons try to fall together, but radiation pressure pushed them apart again. So you got waves set up, and those waves had a very characteristic wavelength.
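The ball-and-spring picture above can be sketched numerically. This is purely my own toy illustration, not the real photon-baryon fluid equations: a single mass pulled back by a "spring" (standing in for radiation pressure resisting gravitational infall) oscillates, which is the sense in which the early plasma rang with sound waves of a fixed period.

```python
# Toy sketch: a harmonic oscillator as a stand-in for the photon-baryon
# "spring". All parameter values are illustrative, not cosmological.

def oscillate(k_spring=1.0, mass=1.0, x0=1.0, dt=0.01, steps=1000):
    """Semi-implicit Euler integration of x'' = -(k/m) * x.

    x plays the role of an overdensity: gravity pulls it in,
    "radiation pressure" (the spring) pushes it back out.
    """
    x, v = x0, 0.0
    xs = []
    for _ in range(steps):
        v -= (k_spring / mass) * x * dt  # restoring force
        x += v * dt
        xs.append(x)
    return xs

xs = oscillate()
# The overdensity sloshes back and forth with a fixed period
# (2*pi*sqrt(m/k) here). In the real universe the oscillation was
# frozen in at last scattering, imprinting one characteristic scale.
```

The one feature that carries over to cosmology is that the period is set by the system, not the initial conditions, which is why a single characteristic wavelength comes out.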
Then the universe cooled enough that suddenly electrons could condense into hydrogen, an event known as "last scattering". The radiation left over from the big bang could then run free, and we see it as the "cosmic microwave background", which is almost but not quite a perfect blackbody -- the little ripples in it tend to clump together at exactly that wavelength. This is seen extremely clearly; google for "WMAP power spectrum" and the wavelength is given by that first, beautifully mapped peak.
So far so good, but what does that mean for galaxies? Well, after the photons decoupled the newly-formed hydrogen could relax -- but the imprint of the waves was left on it, just as you see the imprint of waves left on a wet beach at low tide. That hydrogen is what later formed the galaxy clusters, then the galaxies, then us, with little overdense pockets collapsing faster and further. But that extra collapse is overlaid on that same old characteristic wavelength.
So we've got an absolute standard ruler for the universe's evolution: the wavelength of those pre-last scattering waves.
The key point is that the universe is growing, so that wavelength is stretched. If we can map the wavelength back through time then we know *precisely* how the universe evolved -- and that lets us tell the difference between models with and models without dark energy. Even more, it will eventually let us tell between different dark energy models and will let us actually rule out Einstein's cosmological constant completely (or confirm it, of course).
WiggleZ, despite its fucking horrible name, is the first big galaxy survey to report back with the sensitivity to measure this along different time slices. Earlier surveys like the 2dF and the Sloan Digital Sky Survey certainly detected those waves, but they did it by lumping about 5 billion years of evolution together to beat down statistical noise. The next results I'd expect perhaps from LOFAR, which runs in the Netherlands and has apparently just extended its reach to Ireland, making it the longest-baseline radio telescope ever. In the future, assuming computers keep actually getting better, the Square Kilometre Array in either Australia or South Africa will blow everything else away.
I'm looking forward to SKA and its insane daily petabytes of data. And the jump in global hard drive prices that will cause.
Hmmm. I think I started rambling there. Sorry. Anyway, the key point is that if we map out how a characteristic wavelength evolved when the universe was, say, 5, 7, 9, 11 and 13 billion years old, and tie that to the same wavelength when it was about 380,000 years old (at the CMB), then we know a *lot* about the universe's evolution, and therefore about its composition.
Hope that cleared a bit up instead of adding to the confusion.
Creation Part 2
In the beginning was the Singularity and it knew not time or space nor of what it was made.
And lo the Singularity was touched by the hand of God and as a mighty blow space and time did exist.
The Singularity found that it was no longer in one place but scattered over 100bn light years of space. The Singularity was discomforted by the lack of self comfort and tears condensed from the void forming all that is visible and the stars which spread across the firmament. But part of the Singularity was so thin it kept to itself and hid amongst the brightness of the stars. Because this part eschewed the light it became dim and was transformed into dark matter and dark energy.
"Do the models (and this result) account for dark matter as well as dark energy?"
They're fitting the so-called Lambda CDM model -- the Lambda is the cosmological constant, while the CDM is "cold dark matter". The model contains a cosmological constant, cold dark matter, hydrogen (and helium if necessary for accuracy; for these purposes you can just assume everything's hydrogen without error so long as you get the total mass of it all right), radiation and neutrinos.
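To make the component list above concrete, here's a small sketch of the Lambda CDM energy budget: each ingredient dilutes differently as the universe grows, so which one dominates depends on the epoch. The density parameters are illustrative textbook-ish round numbers, not the values fitted in this survey.

```python
def densities(a, omega_r=8e-5, omega_m=0.3, omega_lambda=0.7):
    """Energy densities at scale factor a (a=1 today), in units of
    today's critical density, for the main Lambda-CDM ingredients."""
    return {
        "radiation": omega_r / a**4,   # photons + neutrinos: dilutes fastest
        "matter": omega_m / a**3,      # CDM + baryons (mostly hydrogen)
        "lambda": omega_lambda,        # cosmological constant: never dilutes
    }

early = densities(1e-4)  # deep in the past: radiation wins
today = densities(1.0)   # now: the cosmological constant wins
```

That crossover structure -- radiation early, matter in the middle, Lambda late -- is what the fit to the galaxy data is pinning down.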
I'm not sure what you mean by "it confirms the results of the mechanized supernova surveys, nothing more", though. In a strict sense, yes, it does -- but it's not as susceptible to systematic errors. That's because it's a purely geometric measure, while supernova surveys assume that Type Ia supernovae are "standardisable candles" even though we don't actually understand their progenitors very well. Even so, surely the fact that two independent data sets agree should help persuade you rather than the opposite?