Getting the most out of cosmic maps
by Clarence Oxford
Los Angeles CA (SPX) Jan 22, 2025

A University of Michigan-led research initiative is paving the way for cosmology to fully utilize the capabilities of telescopes and instruments tackling the universe's biggest questions.

The project highlights a novel computational method that extracts significantly more information from cosmic maps depicting the clustering and distribution of galaxies across the universe. Such maps are critical tools for exploring dark energy, dark matter, and other enigmatic cosmic phenomena.

The dark side of cosmology

While instruments like the Dark Energy Spectroscopic Instrument (DESI) are providing valuable insights, scientists acknowledge the need for even more advanced tools to uncover the universe's secrets. Some researchers are already designing next-generation equipment, but Minh Nguyen and his team are concentrating on maximizing the value of current and future data.

"As we move to bigger and better telescopes, we might also be throwing away more information," said Nguyen, a Leinweber Research Fellow in the U-M Department of Physics. "While we're collecting more data, we can also try to get more out of the data."

Nguyen collaborated with colleagues at the Max Planck Institute for Astrophysics (MPA) to develop LEFTfield, a computational framework that enhances the analysis of the universe's large-scale structure.

"In the early universe, the structure was Gaussian-like the static you would see on old TV sets," Nguyen explained. "But because of the interplay between dark energy and dark matter, the large-scale structure of the universe today isn't Gaussian anymore. It's more like a spider web."

Dark energy propels the universe's expansion, while matter, both observable and dark, exerts gravitational forces opposing this expansion. Remarkably, the majority of the universe's mass and energy consists of these elusive dark components. Cosmic maps serve as a gateway to studying the forces shaping the universe's intricate, weblike structure.

With LEFTfield, Nguyen and his team demonstrated an ability to glean even more insights from existing maps. Their findings, published in Physical Review Letters, earned a 2024 Buchalter Cosmology Prize.

Out of LEFTfield

The innovation lies in how LEFTfield processes data compared to traditional methods.

"With a standard analysis, you basically cannot work with the data as is. People have to compress it," Nguyen said. "That reduces the complexity of the analysis and makes it easier to make theoretical predictions, but the trade-off is you lose some information."

Standard methods rely on computational models that simplify the data by grouping galaxies into pairs or triplets, streamlining calculations but discarding certain details. Nguyen's approach, however, bypasses this compression by treating cosmic maps as 3D grids. Each voxel, a 3D data unit akin to a pixel, retains uncompressed information about galaxy distribution and density.
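To make the grid picture concrete, here is a minimal NumPy sketch that bins a toy catalog of galaxy positions into a 3D voxel grid of density contrasts. The box size, grid resolution, and random positions are assumptions chosen purely for illustration; this is not the LEFTfield implementation.

```python
import numpy as np

# Minimal sketch of the field-level idea: bin galaxy positions into a
# 3D voxel grid and keep the full density-contrast field instead of
# compressing it into a few summary statistics. (Illustrative only;
# not the LEFTfield code. Positions here are random toy data.)
rng = np.random.default_rng(0)
box_size = 1000.0                                  # toy box side, Mpc/h
n_grid = 64                                        # voxels per side
galaxies = rng.uniform(0.0, box_size, size=(100_000, 3))

counts, _ = np.histogramdd(galaxies, bins=n_grid,
                           range=[(0.0, box_size)] * 3)
delta = counts / counts.mean() - 1.0               # density contrast per voxel

# A compressed analysis would reduce delta to pair or triplet summaries;
# field-level inference keeps the whole 64^3 grid.
print(delta.shape, float(delta.std()))
```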

This approach preserves data fidelity in a way standard methods cannot, enabling a deeper understanding of the non-Gaussian features of the universe.

"I love the idea of field-level inference because it is, in principle, the actual thing we want to do," said Shaun Hotchkiss, host of the online seminar series Cosmology Talks, which recently featured Nguyen and co-author Beatriz Tucci. "If we've measured the density field, why compress the information inside of it? Of course, field-level inference is therefore more difficult to do, but this hasn't stopped Bea and Minh, and shouldn't stop the community."

To evaluate LEFTfield's capabilities, the team calculated sigma-8, a parameter measuring the universe's clumpiness. Compared to traditional methods, LEFTfield improved the precision of sigma-8 measurements by a factor of 3.5 to 5.2.
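For context, sigma-8 is conventionally defined as the rms density fluctuation after smoothing the matter field in spheres of radius 8 Mpc/h (megaparsecs scaled by the dimensionless Hubble parameter h). The sketch below estimates a sigma-8-like number from a gridded density-contrast field using a spherical top-hat filter; it is a toy calculation under the same assumed grid as above, not a survey-grade measurement.

```python
import numpy as np

# Toy sigma_8-style estimate: smooth a density-contrast grid with a
# spherical top-hat of radius R = 8 Mpc/h in Fourier space, then take
# the rms of the smoothed field. (Illustration only; a real sigma-8
# analysis involves a calibrated power spectrum and survey modeling.)
def smoothed_rms(delta, box_size, radius=8.0):
    n = delta.shape[0]
    k1d = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)   # wavenumbers
    kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
    k = np.sqrt(kx**2 + ky**2 + kz**2)
    kr = np.where(k > 0, k * radius, 1.0)                  # avoid 0/0 at k = 0
    window = np.where(
        k > 0, 3 * (np.sin(kr) - kr * np.cos(kr)) / kr**3, 1.0
    )                                                      # top-hat W(kR)
    smoothed = np.fft.ifftn(np.fft.fftn(delta) * window).real
    return smoothed.std()

# With the toy grid from the previous sketch:
# sigma8_like = smoothed_rms(delta, box_size=1000.0)
```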

"That's like going from DESI to the successor of DESI," Nguyen said. "Typically, moving between two generations of surveys would take 10 to 20 years."

However, integrating LEFTfield with specific instruments and accounting for the effects of noise and tool idiosyncrasies remain key challenges. Despite this, Nguyen believes LEFTfield will play a pivotal role in future research.

"It really opens the fast track to get insights into dark energy, dark matter and general relativity-the theory that this is all based on," Nguyen said.

The team included Fabian Schmidt, an MPA cosmologist and group leader; staff scientist Martin Reinecke; and Andrija Kostic, who contributed as a Ph.D. student and postdoctoral researcher. Nguyen, now a research fellow at the Kavli Institute for the Physics and Mathematics of the Universe in Tokyo, recently completed his U-M fellowship.

Research Report: How Much Information Can Be Extracted from Galaxy Clustering at the Field Level?

Related Links
University of Michigan
Understanding Time and Space

