
The Hubble Space Telescope captured this deep image of the galaxy cluster ACO S 295. Studying how galaxies and clusters are distributed throughout the universe helps astronomers better constrain the general properties of the cosmos. Credit: ESA/Hubble & NASA, F. Pacaud, D. Coe
Key Takeaways:
- Cosmology relies on precise measurements of cosmological parameters to distinguish between models of the universe and validate theories, but current methods are expensive and computationally intensive.
- A new AI-driven method, SimBIG, uses simulated universes to train a model that extracts cosmological information from galaxy surveys with higher precision than previous methods, drawing on both large- and small-scale data.
- This AI approach improves the accuracy of parameter estimates, particularly for dark matter, dark energy and neutrinos, and offers cost savings and efficiency by working with data from existing surveys.
- “The advantage of machine learning is that you can just throw these really powerful algorithms at raw data, and they could pick out details that aren’t something you would have thought of before,” said Emily Hunt.
For almost as long as humans have existed, we have tried to make sense of the cosmos. What started as philosophical reflection has, with the advent of the telescope and the ability to peer ever deeper into space (and ever further back in time), become a flourishing field of research.
Today scientists are trying to understand the properties that govern the behavior of our universe. These properties are characterized mathematically as so-called cosmological parameters, which feed into our models of the cosmos. The more precisely these parameters can be measured, the better we can distinguish between models and validate, or rule out for good, theories, including Einstein’s general theory of relativity. Because different models can make very different predictions both for the earliest moments of our universe and for its possible fate, that differentiation is vital.
To date, some of the biggest challenges include more tightly constraining parameters such as those that determine the precise amount and nature of dark matter, the source of dark energy and the repulsive force it exerts, and exactly how neutrinos behave.
These questions are at the forefront of cosmology. But there is a catch: pinning down these cosmic parameters is an expensive business.
Cutting Costs
“There are so many experiments and astronomical surveys built just to measure these six to 10 [major cosmological] parameters, and they cost multiple billions of dollars,” says Shirley Ho, professor of astrophysics at the Flatiron Institute in New York.
In a paper published last August in Nature Astronomy, Ho and her colleagues harnessed artificial intelligence (AI) to calculate five of the main cosmological parameters that govern dark matter, dark energy and neutrinos to a higher degree of precision than ever before. They did so not only to show that AI can make these calculations more precise but also to show that the method is more cost-effective. “It’s quite interesting, what they have [done] in the paper,” says Emily Hunt, an astronomer at the Max Planck Institute for Astronomy in Germany who was not involved in the research.
Measuring cosmological parameters generally requires studying how galaxies and their individual properties are distributed, using surveys. Previous approaches used simplified models of the universe and compared them with the survey data. This mostly allows a comparison of “blurry,” generalized models with the survey data, ignoring the smaller details because of the expense and difficulty of developing high-resolution models. Trying to look at the universe this way, however, is “like wearing really bad glasses,” says Ho.
Insights from AI
The new study used a framework called Simulation-Based Inference of Galaxies, or SimBIG, to extract cosmological information. The scientists first trained the model by presenting it with 2,000 simulated universes, each with different cosmological parameters. The researchers then introduced noise to make the artificial universes more like what we can actually observe in our own, given the intrinsic uncertainties in data collected by galaxy surveys. The noise mimicked the natural imperfections introduced by telescope instruments and the atmosphere (such as bright stars and objects so close to each other that their signals blend together).
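To give a flavor of that general recipe, here is a minimal, hypothetical sketch of training on noisy simulations. The toy simulator, parameter ranges, noise level and random-forest regressor below are all illustrative stand-ins chosen for this sketch; they are not the SimBIG pipeline or its neural model.

```python
# Toy sketch of the simulation-based-inference idea: train a model on many
# noisy simulated "universes" whose parameters are known, then use it to
# estimate parameters from an observation. Everything here is illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def toy_simulator(params, n_points=512):
    """Hypothetical stand-in for a cosmological simulation: returns a smooth,
    power-spectrum-like curve controlled by two made-up parameters."""
    amplitude, tilt = params
    k = np.linspace(0.01, 1.0, n_points)   # wavenumber-like grid
    return amplitude * k ** tilt

n_sims = 2000                               # matches the "2,000 simulated universes"
theta = rng.uniform([0.5, 0.5], [2.0, 2.0], size=(n_sims, 2))  # parameters drawn from a prior
clean = np.array([toy_simulator(t) for t in theta])

# Inject noise so the simulations resemble real, imperfect survey data.
noisy = clean + rng.normal(scale=0.05, size=clean.shape)

# Train a regressor to map noisy observations back to the parameters.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(noisy, theta)

# "Observe" one held-out universe and try to recover its parameters.
true_params = np.array([1.3, 1.1])
observation = toy_simulator(true_params) + rng.normal(scale=0.05, size=512)
print("estimated parameters:", model.predict(observation.reshape(1, -1))[0])
print("true parameters:     ", true_params)
```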
Over time the AI model learned to extract hidden features. Whereas previous approaches could examine only the large-scale distribution of galaxies, the new AI model learned to exploit small-scale differences in that distribution, such as the distances between individual pairs of galaxies, to estimate the desired cosmological parameters to a higher degree of accuracy.
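As a rough illustration of what "small-scale" information looks like, the sketch below histograms the separations between pairs of mock galaxy positions. The positions, units and bin choices are assumptions for this example, and the study's model learns its own features rather than this simple pair-count statistic.

```python
# Toy pair-separation statistic: count how many galaxy pairs fall in each
# separation bin. Positions here are random mock points, not survey data.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
positions = rng.uniform(0.0, 1000.0, size=(5000, 3))   # mock 3-D positions (assumed Mpc units)

separations = pdist(positions)                          # all pairwise distances
counts, edges = np.histogram(separations, bins=50, range=(0.0, 200.0))

# The smallest-separation bins carry the small-scale clustering signal that
# traditional large-scale analyses tend to smooth over.
for lo, hi, c in zip(edges[:-1], edges[1:], counts):
    if hi <= 20.0:
        print(f"{lo:6.1f} - {hi:6.1f} Mpc : {c} pairs")
```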
The trained model was then presented with more than 100,000 galaxies from the Sloan Digital Sky Survey’s Baryon Oscillation Spectroscopic Survey (BOSS). (This is only a small fraction of the full BOSS catalog, less than 10 percent.) The AI was then able to use information at both large and small scales in the real data to constrain the cosmological parameters more precisely than previous methods could. “The advantage of machine learning is that you can just throw these really powerful algorithms at raw data, and they could pick out details that aren’t something you would have thought of before,” says Hunt.
The AI-driven approach is crucial because we do not have another universe to study, Ho says. “Our best bet is actually increasing precision as much as possible to squeeze as much information as possible from the existing universe that we have observed,” she adds.
Future surveys will be able to gather ever more information about the universe. And AI-based insights such as those offered by this study could prove useful for solving some of today’s greatest cosmological puzzles, including the so-called Hubble tension, the discrepancy in the measured rate at which our current universe is expanding. This rate is given by the parameter H0, called the Hubble constant. “If you have a much better bound [on H0], then you can really nail down whether there is a true [Hubble] tension or discrepancy,” says Ho.
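For a sense of the numbers involved, the short calculation below applies Hubble's law with two commonly quoted H0 estimates (early-universe versus local distance-ladder measurements); these particular values are representative published figures used here for illustration, not numbers from the study.

```python
# Back-of-the-envelope view of the Hubble tension using Hubble's law, v = H0 * d.
# The two H0 values are representative published estimates, quoted for illustration.
H0_early = 67.4   # km/s/Mpc, inferred from the cosmic microwave background
H0_local = 73.0   # km/s/Mpc, from nearby supernovae and Cepheid distance ladders

distance_mpc = 100.0                       # a galaxy 100 megaparsecs away
v_early = H0_early * distance_mpc          # predicted recession velocity, km/s
v_local = H0_local * distance_mpc

print(f"Recession velocity at {distance_mpc:.0f} Mpc: {v_early:.0f} vs {v_local:.0f} km/s")
print(f"Relative discrepancy: {(H0_local - H0_early) / H0_early:.1%}")
```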
Opportunities and Challenges Ahead
To catalyze further machine-learning research in astronomy, Ho and co-author Liam Parker (also at the Flatiron Institute) have collaborated with other astronomers to curate a collection of large-scale astronomical data. The effort combines hundreds of millions of publicly available data sets from major astronomical surveys in a bid to “enable the development of large multimodal models specifically targeted at scientific applications,” the authors write in an abstract available on the arXiv preprint server.
While the future of AI applications in astronomy looks promising, experts urge caution when deploying these tools. “With every leap forward, some things jump ahead and then other things need to catch up,” says Hunt. Understanding the limitations and uncertainties of machine-learning models is crucial when applying their results to real-world problems, particularly when trying to unlock the cosmos.