Decoding cosmological data could shed light on neutrinos, modified gravity

This Hubble Ultra-Deep Field image of the distant universe contains approximately 10,000 galaxies. Image credit: NASA and the ESA

(PhysOrg.com) -- Today’s most powerful telescopes collect huge amounts of data from the most distant regions of the universe – yet much of the information is simply discarded because it involves small length scales that are difficult to model. To make better use of data from cosmological surveys, a team of scientists has developed a new technique that renders otherwise unusable data tractable by "clipping" some of the highest-density peaks, which present the greatest challenge to models. This data could provide a way to address some unsolved problems in physics, including estimating the neutrino mass and investigating theories of modified gravity.

The scientists, Fergus Simpson, Alan Heavens, and Catherine Heymans from the University of Edinburgh, and J. Berian James from the Dark Cosmology Centre in Copenhagen, Denmark, and the University of California, Berkeley, have published their study in a recent issue of Physical Review Letters.

“The pattern formed by galaxies in our Universe is often referred to as the cosmic web, as it bears some resemblance to the structures seen in an intricate spider web,” Simpson told PhysOrg.com. “Within the detailed nature of this pattern are encoded various pieces of information regarding the composition of the Universe, the conditions in the early Universe, and the laws of gravity. However, when we try to study the fine detail on ‘small’ scales (around 100 million light years or less), it appears to be very unpredictable since the Universe is particularly lumpy on these scales, so the physics becomes very complex and nonlinear. In other words, we don't know how to decode that information, and it's particularly frustrating since most of the useful information is buried in these smaller scales.”

In an attempt to decode this small-scale data, the researchers developed the density “clipping” technique, which makes the data accessible to modeling.

“By applying a simple correction to the very densest regions of a simulated patch of the Universe, just 0.1% of the volume, we found that this removes most of this unpredictable behavior,” Simpson said. “We have now demonstrated that a great deal of information from these smaller scales can be successfully extracted.”
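To give a concrete picture of what "clipping" means here, the short sketch below caps the densest cells of a gridded overdensity field at a threshold chosen so that only about 0.1% of the volume is affected. It is a simplified illustration of the general idea rather than the exact transformation used by Simpson and colleagues, and the toy field, grid size, and clip fraction are all placeholders.

import numpy as np

def clip_density_field(delta, clip_fraction=0.001):
    # `delta` is an overdensity field (rho / rho_mean - 1) on a grid.
    # Find the value that only `clip_fraction` of the cells exceed,
    # then cap every cell at that threshold.
    threshold = np.quantile(delta, 1.0 - clip_fraction)
    return np.minimum(delta, threshold)

# Toy example: a lognormal-like field standing in for a simulated volume.
rng = np.random.default_rng(0)
delta = np.exp(rng.normal(size=(64, 64, 64))) - 1.0
clipped = clip_density_field(delta)   # the densest ~0.1% of cells are capped

The point of the operation is that the capped field is far better behaved on small scales, so its statistics can be compared against models that would fail badly on the original, unclipped peaks.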

He explained that this extra data could prove useful for scientists studying a wide range of areas, such as calculating a better estimate of the neutrino mass.

“It is these ‘small’ cosmological scales that neutrinos are thought to have influenced in the early Universe, at a time when they were travelling very close to the speed of light,” he said. “The extent of their influence depends on how much time they spent at these very high speeds, which in turn is determined by their mass. So it's possible that our technique will allow the neutrino masses to be determined from the distribution of galaxies.

“Neutrinos can tell us about fundamental physics as well as cosmology. In the Standard Model of particle physics, neutrinos do not have mass, so neutrino masses tell us about extensions of the Standard Model. In principle, measurements from cosmology can be significantly more precise than laboratory experiments.”
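For context, two standard results from linear cosmological perturbation theory (background physics, not findings of this particular paper) connect the summed neutrino mass to the matter distribution: the neutrino density parameter scales with the mass sum, and neutrino free streaming suppresses the small-scale matter power spectrum by an amount roughly proportional to the neutrino mass fraction,

\Omega_\nu h^2 \approx \frac{\sum m_\nu}{93\ \mathrm{eV}}, \qquad \frac{\Delta P}{P} \approx -8\, f_\nu, \quad \text{where } f_\nu = \frac{\Omega_\nu}{\Omega_m}.

A measured suppression of small-scale clustering can therefore, in principle, be translated into a bound on the summed neutrino mass, which is why recovering information from these scales matters.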

In addition, the small-scale data enables researchers to better understand the relationship between galaxies and dark matter, which could potentially lead to methods for investigating theories of modified gravity with observations.

“The discovery that the expansion of the Universe is accelerating has led many cosmologists to wonder if this is an indication that the laws of gravity need to be modified,” Simpson said. “If there is some new gravitational physics, it is expected to change the rate at which dark matter clusters together. A major difficulty in measuring the dark matter's behavior is that we don't know how the distribution of galaxies (which is what we can measure directly) relates to the distribution of dark matter. In our study we demonstrated that our technique allows the relationship between galaxies and dark matter (‘galaxy bias’) to be determined with much greater accuracy. Once the galaxy bias is known, we can determine how fast the dark matter has been clustering, and see whether that matches our expectation from Einstein's laws of gravity.”
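To make the notion of galaxy bias concrete, the toy sketch below fits the simplest linear relation, delta_g ≈ b × delta_m, between a galaxy overdensity field and a matter overdensity field defined on the same grid. It only illustrates the definition of bias; the study itself determines the bias from the statistics of the clipped field (its title refers to the bias and bispectrum), and the fields and numbers here are made up.

import numpy as np

def linear_bias_estimate(delta_g, delta_m):
    # Least-squares fit of the simplest linear-bias relation
    # delta_g ≈ b * delta_m over all grid cells.
    dg, dm = delta_g.ravel(), delta_m.ravel()
    return float(np.dot(dg, dm) / np.dot(dm, dm))

# Toy example: a mock galaxy field built from a matter field with b = 1.5.
rng = np.random.default_rng(1)
delta_m = rng.normal(size=(32, 32, 32))
delta_g = 1.5 * delta_m + 0.1 * rng.normal(size=delta_m.shape)
print(linear_bias_estimate(delta_g, delta_m))   # prints a value close to 1.5

In real surveys the matter field is not directly observable, which is exactly why pinning down the bias with a technique like clipping is valuable: once b is known, the measured galaxy clustering can be converted into the dark matter clustering it traces.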

More information: Fergus Simpson, et al. “Clipping the Cosmos: The Bias and Bispectrum of Large Scale Structure.” Physical Review Letters 107, 271301 (2011). DOI: 10.1103/PhysRevLett.107.271301

