Astronomers Conduct Largest Cosmological Computer Simulation to Date
October 23, 2023
A group of astronomers from around the world has conducted one of the largest cosmological computer simulations ever run. It follows not only dark matter but also ordinary matter (planets, stars, and galaxies), offering insights into how our universe may have evolved.
The FLAMINGO simulations trace the evolution of all the components of our universe (ordinary matter, dark matter, and dark energy) according to the laws of physics. As the simulation progresses, virtual galaxies and galaxy clusters emerge. Three papers describing the work have been published in Monthly Notices of the Royal Astronomical Society: one presenting the methods, one presenting the simulations, and one examining how well the simulations reproduce the large-scale structure of the universe.
Instruments such as the European Space Agency's recently launched Euclid space telescope and NASA's JWST collect vast amounts of data on galaxies, quasars, and stars. Simulations like FLAMINGO help scientists interpret these data by connecting theoretical predictions for our universe to the observations.
The prevailing theory holds that a handful of numbers, the 'cosmological parameters' (six in the simplest version of the model), describe the properties of our universe as a whole. These parameters can be measured very precisely in several different ways.
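For illustration only, the snippet below lists the six free parameters of the simplest (flat Lambda-CDM) version of the model together with rough, commonly quoted values; these are generic textbook figures, not numbers taken from the FLAMINGO papers.

```python
# Illustrative only: the six free parameters of the simplest (flat Lambda-CDM)
# model, with rough, commonly quoted values. Not taken from the FLAMINGO papers.
lcdm_parameters = {
    "omega_b_h2": 0.0224,   # physical baryon (ordinary matter) density
    "omega_c_h2": 0.120,    # physical cold dark matter density
    "theta_star": 0.0104,   # angular scale of the sound horizon
    "tau":        0.054,    # optical depth to reionization
    "A_s":        2.1e-9,   # amplitude of primordial fluctuations
    "n_s":        0.965,    # spectral index of primordial fluctuations
}

for name, value in lcdm_parameters.items():
    print(f"{name:12s} = {value}")
```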
One method relies on the properties of the cosmic microwave background (CMB), the faint background glow left over from the early universe. However, the values obtained this way conflict with those measured using techniques that exploit the bending of light by the gravity of galaxies (lensing). These 'tensions' could signal the demise of the standard model of cosmology, the cold dark matter model.
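The lensing-versus-CMB discrepancy is often summarized with the parameter S8, which the third paper listed under 'More information' below revisits. A common definition, given here for reference rather than quoted from the article, is:

```latex
% S_8 combines sigma_8, the amplitude of matter fluctuations on scales of
% 8 Mpc/h, with Omega_m, the total matter density parameter.
S_8 \equiv \sigma_8 \sqrt{\Omega_m / 0.3}
```

Lensing surveys tend to prefer a somewhat lower value of S8 than the CMB does, which is what the tension refers to.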
Computer simulations may be able to reveal the cause of these tensions by exposing possible biases in the measurements. If none of these biases suffice to explain the discrepancies, the theory is in serious trouble.
So far, however, the computer simulations used to compare with such observations track only cold dark matter. Research leader Joop Schaye of Leiden University notes that the contribution of ordinary matter can no longer be neglected, because it could be similar in size to the deviations between the models and the observations.
The first results suggest that both neutrinos and ordinary matter are essential for making accurate predictions, but they do not by themselves remove the tensions between the different cosmological observations.
Simulations that also include ordinary, baryonic matter are far more challenging because they demand much more computing power. This is because ordinary matter, which makes up only sixteen percent of all the matter in the universe, feels not only gravity but also gas pressure, and outflows driven by active black holes and supernovae can eject it from galaxies far into intergalactic space.
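The sixteen percent figure can be checked quickly from commonly quoted density parameters; the values below are approximate textbook numbers, not ones taken from this article.

```python
# Rough check of the quoted ~16% baryon fraction, using approximate,
# commonly quoted density parameters (not values from the FLAMINGO papers).
omega_baryon = 0.049   # ordinary (baryonic) matter, as a fraction of the critical density
omega_matter = 0.315   # total matter (baryons + cold dark matter + massive neutrinos)
print(f"Baryon fraction of all matter: {omega_baryon / omega_matter:.0%}")  # ~16%
```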
The strength of these intergalactic winds depends on explosions in the interstellar medium and is very difficult to predict. On top of that, the contribution of neutrinos, subatomic particles with a very small but not precisely known mass, is important, but their motion had not been simulated alongside the other components before.
The team of astronomers has conducted a series of computer simulations tracking the formation of structures in dark matter, ordinary matter, and neutrinos. Ph.D. student Roi Kugel from Leiden University explains, 'The impact of galactic winds was gauged using machine learning through the comparison of the predictions from different simulations of relatively small volumes with the observed masses of galaxies and the distribution of gas in galaxy clusters.'
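The calibration idea can be sketched in a few lines of code. The example below is purely illustrative: it trains a simple Gaussian-process emulator on made-up outputs of hypothetical small-volume simulations and inverts it against a made-up 'observed' galaxy mass. It is not the team's actual pipeline, data, or parametrization.

```python
# Minimal sketch of emulator-based calibration with hypothetical numbers.
# A Gaussian process learns how a feedback parameter (wind strength) maps to a
# predicted galaxy stellar mass, and the parameter is then tuned to match an
# "observed" value.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical training set: wind strength vs. mean stellar mass (log10 Msun)
# from a handful of small-volume simulations.
wind_strength = np.array([[0.5], [1.0], [1.5], [2.0], [2.5]])
log_stellar_mass = np.array([10.9, 10.7, 10.5, 10.3, 10.1])

emulator = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
emulator.fit(wind_strength, log_stellar_mass)

# Hypothetical observation to reproduce.
observed_log_mass = 10.6

# Scan the parameter and keep the wind strength whose emulated prediction
# lies closest to the observation.
grid = np.linspace(0.5, 2.5, 201).reshape(-1, 1)
prediction = emulator.predict(grid)
best = grid[np.argmin(np.abs(prediction - observed_log_mass)), 0]
print(f"Calibrated wind strength (illustrative): {best:.2f}")
```

The point of an emulator like this is that the expensive simulations only need to be run a limited number of times, while the parameter space can then be explored cheaply.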
Using a supercomputer, the researchers then simulated the model that best matches the calibration observations in a range of cosmic volumes and at different resolutions. They also varied the parameters of the model, including the strength of the galactic winds, the mass of the neutrinos, and the cosmological parameters, in simulations of slightly smaller but still very large volumes.
The largest simulation uses 300 billion resolution elements (particles with the mass of a small galaxy) in a cubic volume with edges of ten billion light years. This is believed to be the largest cosmological computer simulation with ordinary matter ever completed. Matthieu Schaller, of Leiden University, said, 'To make this simulation possible, we developed a new code, SWIFT, which efficiently distributes the computational work over 30 thousand CPUs.'
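For a sense of scale, here is a quick back-of-the-envelope calculation using only the figures quoted above; the even split of particles across cores is a simplification, since SWIFT's actual distribution of work is more sophisticated.

```python
# Back-of-the-envelope numbers based on the figures quoted in the article.
n_particles = 300e9      # resolution elements in the largest run
n_cpus = 30_000          # CPU cores used by the SWIFT code
box_edge_ly = 10e9       # edge of the cubic simulation volume, in light years

particles_per_cpu = n_particles / n_cpus
volume_ly3 = box_edge_ly ** 3

print(f"Particles per CPU core: {particles_per_cpu:,.0f}")      # ~10 million
print(f"Simulated volume: {volume_ly3:.1e} cubic light years")  # ~1.0e+30
```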
The FLAMINGO simulations open a new virtual window on the universe that will help make the most of cosmological observations. In addition, the large amount of (virtual) data creates opportunities to make new theoretical discoveries and to test new data analysis techniques, including machine learning.
Using machine learning, astronomers can then make predictions for random virtual universes. By comparing these with large-scale structure observations, they can measure the values of cosmological parameters. Moreover, they can measure the corresponding uncertainties by comparing with observations that constrain the effect of galactic winds.
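As a toy illustration of that inference step (with made-up numbers and a deliberately simplistic one-parameter 'emulator', not the analysis performed in the papers):

```python
# Toy inference sketch with hypothetical numbers.
# For a grid of virtual universes differing in one cosmological parameter
# (sigma_8), compare an emulated clustering statistic with an "observed" value
# and report the best fit and a rough 1-sigma range.
import numpy as np

sigma_8_grid = np.linspace(0.75, 0.85, 101)
predicted_amplitude = sigma_8_grid          # stand-in emulator: identity mapping

observed_amplitude = 0.80                   # hypothetical measurement
observed_error = 0.02                       # hypothetical 1-sigma uncertainty

chi2 = ((predicted_amplitude - observed_amplitude) / observed_error) ** 2
best = sigma_8_grid[np.argmin(chi2)]
within_1sigma = sigma_8_grid[chi2 <= chi2.min() + 1.0]
half_width = (within_1sigma.max() - within_1sigma.min()) / 2
print(f"Best-fit sigma_8 (illustrative): {best:.3f} +/- ~{half_width:.3f}")
```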
More information: Joop Schaye et al, The FLAMINGO project: cosmological hydrodynamical simulations for large-scale structure and galaxy cluster surveys, Monthly Notices of the Royal Astronomical Society (2023). DOI: 10.1093/mnras/stad2419
Roi Kugel et al, FLAMINGO: Calibrating large cosmological hydrodynamical simulations with machine learning, Monthly Notices of the Royal Astronomical Society (2023). DOI: 10.1093/mnras/stad2540
Ian G McCarthy et al, The FLAMINGO project: revisiting the S8 tension and the role of baryonic physics, Monthly Notices of the Royal Astronomical Society (2023). DOI: 10.1093/mnras/stad3107
Journal information: Monthly Notices of the Royal Astronomical Society
Provided by Royal Astronomical Society