A cosmology that dispenses with 70% of the stuff in the universe has understandably attracted scrutiny.
Dark energy and dark matter are theoretical inventions that explain observations we cannot otherwise understand.
On the scale of galaxies, gravity appears to be stronger than we can account for using only particles that are able to emit light. So we add dark matter particles as 25% of the mass-energy of the Universe. Such particles have never been directly detected.
On the larger scales on which the Universe is expanding, gravity appears weaker than expected in a universe containing only particles, whether ordinary or dark matter. So we add dark energy: a weak anti-gravity force that acts independently of matter.
Brief history of dark energy
The idea of dark energy is as old as general relativity itself. Albert Einstein included it when he first applied relativity to cosmology exactly 100 years ago.
Einstein mistakenly wanted to exactly balance the self-attraction of matter with anti-gravity on the largest scales. He could not imagine that the Universe had a beginning, and did not want it to change in time.
Almost nothing was known about the Universe in 1917. The very idea that galaxies were objects at vast distances was debated.
Einstein faced a dilemma. The physical essence of his theory, as summarised decades later by the physicist John Archibald Wheeler, is:
Matter tells space how to curve, and space tells matter how to move.
That means space naturally wants to expand or contract, bending together with the matter. It never stands still.
This was realised by Alexander Friedmann, who in 1922 kept the same ingredients as Einstein but did not try to balance the amounts of matter and dark energy. That led to models of universes that could expand or contract.
Further, the expansion would always slow down if only matter was present. But it could speed up if anti-gravitating dark energy was included.
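This point can be made concrete with the acceleration form of Friedmann's equation, in which the sign of the cosmic acceleration is set by the combination of density and pressure, rho + 3p. The sketch below uses illustrative units (an assumption for simplicity, absorbing the constant 4*pi*G/3) and is not a full cosmological calculation: matter has zero pressure, while dark energy has pressure p = -rho, which flips the sign.

```python
# Sign of the cosmic acceleration, proportional to -(rho + 3*p),
# in illustrative units where the constant prefactor is 1 (an assumption).
def acceleration(rho_matter, rho_dark_energy):
    rho = rho_matter + rho_dark_energy  # total energy density
    p = -rho_dark_energy                # matter: p = 0; dark energy: p = -rho
    return -(rho + 3 * p)

print(acceleration(1.0, 0.0))  # matter only: negative, so expansion slows
print(acceleration(0.3, 0.7))  # 70% dark energy: positive, expansion speeds up
```

With matter alone the result is always negative (deceleration); once dark energy dominates, the negative pressure term wins and the expansion accelerates.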
Since the late 1990s many independent observations have seemed to demand such accelerating expansion, in a Universe with 70% dark energy. But this conclusion is based on the old model of expansion that has not changed since the 1920s.
Standard cosmological model
Einstein's equations are fiendishly difficult. And not simply because there are more of them than in Isaac Newton's theory of gravity.
Unfortunately, Einstein left some basic questions unanswered. These include on what scales does matter tell space how to curve? What is the largest object that moves as an individual particle in response? And what is the correct picture on other scales?
These issues are conveniently avoided by the 100-year-old approximation, introduced by Einstein and Friedmann, that on average the Universe expands uniformly. Just as if all cosmic structures could be put through a blender to make a featureless soup.
This homogenising approximation was justified early in cosmic history. We know from the cosmic microwave background, the relic radiation of the Big Bang, that variations in matter density were tiny when the Universe was less than a million years old.
But the universe is not homogeneous today. Gravitational instability led to the growth of stars, galaxies, clusters of galaxies, and eventually a vast cosmic web, dominated in volume by voids surrounded by sheets of galaxies and threaded by wispy filaments.
In standard cosmology, we assume a background expanding as if there were no cosmic structures. We then do computer simulations using only Newton's 330-year-old theory. This produces a structure resembling the observed cosmic web in a reasonably compelling fashion. But it requires including dark energy and dark matter as ingredients.
Even after inventing 95% of the energy density of the universe to make things work, the model itself still faces challenges.
Further, standard cosmology fixes the curvature of space to be uniform everywhere, and decoupled from matter. But that's at odds with Einstein's basic idea that matter tells space how to curve.
We are not using all of general relativity! The standard model is better summarised as: Friedmann tells space how to curve, and Newton tells matter how to move.
Enter backreaction
Since the early 2000s, some cosmologists have been exploring the idea that, while Einstein's equations link matter and curvature on small scales, the average expansion of a lumpy universe need not be exactly homogeneous.
Matter and curvature distributions start out near uniform when the universe is young. But as the cosmic web emerges and becomes more complex, the variations of small-scale curvature grow large and average expansion can differ from that of standard cosmology.
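A toy calculation illustrates why this matters (the numbers here are invented for illustration, not taken from any simulation): once voids expand faster than denser "wall" regions, the volume-averaged expansion rate picks up a variance term, and the average no longer behaves like a single homogeneous Friedmann model.

```python
# Toy sketch of inhomogeneous averaging: a fast-expanding void region and a
# slower, denser wall region. All numbers are arbitrary, for illustration only.
def volume_average(values, volumes):
    return sum(v * w for v, w in zip(values, volumes)) / sum(volumes)

H_void, H_wall = 1.2, 0.6      # local expansion rates (arbitrary units)
vol_void, vol_wall = 0.8, 0.2  # voids dominate the volume today

H_avg = volume_average([H_void, H_wall], [vol_void, vol_wall])
# The variance of the local rates about the mean is the kind of term
# that sources backreaction in averaged cosmologies:
variance = volume_average([(H_void - H_avg) ** 2, (H_wall - H_avg) ** 2],
                          [vol_void, vol_wall])
print(H_avg, variance)
```

A homogeneous model has zero variance by construction; here the variance is nonzero, so the averaged expansion history can drift away from the Friedmann law as structure grows.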
Recent numerical results come from a team in Budapest and Hawaii that used standard Newtonian simulations. But they evolved their code forward in time by a non-standard method to model the backreaction effect.
Intriguingly, the resulting expansion law fit to Planck satellite data tracks very close to that of a non-standard model, known as the timescape cosmology. It posits that we have to calibrate clocks and rulers differently when considering variations of curvature between galaxies and voids.
In the next decade, new galaxy surveys and satellite experiments will have the power to test whether cosmic expansion follows the homogeneous law of Friedmann, or an alternative backreaction model.
To be prepared, it's important that we don't put all our eggs in one cosmological basket, as the astrophysicist Avi Loeb has argued. In Loeb's words:
To avoid stagnation and nurture a vibrant scientific culture, a research frontier should always maintain at least two ways of interpreting data so that new experiments will aim to select the correct one. A healthy dialogue between different points of view should be fostered through conferences that discuss conceptual issues and not just experimental results and phenomenology, as often is the case currently.
What can general relativity teach us?
While most researchers accept that the backreaction effects exist, the real debate is about whether this can lead to more than a 1% or 2% difference from the mass-energy budget of standard cosmology.
Any backreaction solution that eliminates dark energy must explain why the law of average expansion appears so uniform despite the inhomogeneity of the cosmic web, something standard cosmology assumes without explanation.
Since Einstein's equations can in principle make space expand in extremely complicated ways, some simplifying principle is required for their large-scale average. What that principle should be remains an open question.
Any simplifying principle for cosmological averages is likely to have its origins in the very early Universe, given it was much simpler than the Universe today. For the past 38 years, models of cosmic inflation have been invoked to explain the simplicity of the early Universe.
While successful in some respects, many such models have since been ruled out by observations. Those that survive give tantalising hints of deeper physical principles.
Many physicists still view the Universe as a fixed continuum that comes into existence independently of the matter fields that live in it. But, in the spirit of relativity, in which space and time only have meaning when they are relational, we may need to rethink basic ideas.
Since time itself is only measured by particles with a non-zero rest mass, maybe spacetime as we know it only emerges as the first massive particles condense.
Whatever the final theory, it will likely embody the key innovation of general relativity, namely the dynamical coupling of matter and geometry, at the quantum level.
A recently published essay by the authors explores these ideas further.
Alan Coley is Professor in Mathematics at Dalhousie University and David Wiltshire is Professor of Theoretical Physics at the University of Canterbury.