At the summit of Haleakala on the Hawaiian island of Maui sits the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS1, or PS1). Part of the Haleakala Observatory, operated by the University of Hawaii, Pan-STARRS1 relies on a system of cameras, telescopes, and computing facilities to perform optical imaging of the sky, as well as astrometry and photometry of known objects.
In 2018, the University of Hawaii at Manoa's Institute for Astronomy (IfA) released the PS1 3pi survey, the world's largest digital sky survey, which covered three-quarters of the sky and comprised three billion objects. Now a team of IfA astronomers has used this data to create the Pan-STARRS1 Source Types and Redshifts with Machine Learning (PS1-STRM), the world's largest three-dimensional astronomical catalog.
Their work is described in a study published in the August 31st issue of the Monthly Notices of the Royal Astronomical Society. The study was led by Robert Beck, a former postdoctoral researcher in cosmology at the IfA (now a professor at Eötvös Loránd University in Hungary), and included members of both institutions as well as Stanford Health Care's Platform Services.
PS1 at dawn. The mountain in the distance is Mauna Kea, about 130 kilometers to the southeast. Photo credit: pan-starrs.ifa.hawaii.edu
Novel computer tools
As they describe in their study, the team first cross-matched publicly available spectroscopic measurements with the 2,902,054,648 objects examined in the PS1 3pi survey, which provided definitive object classifications and distances. These were then fed to a machine-learning algorithm, which sorted the objects into stars, galaxies, quasars, or "unsure" (and also derived refined estimates of the distances to the galaxies).
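The cross-matching step described above can be illustrated with a toy sketch. The snippet below performs a simplified, flat-sky nearest-neighbour match between a photometric and a spectroscopic catalog; the function name, the matching tolerance, and the coordinates are illustrative assumptions, and a production pipeline such as the one behind PS1-STRM would use proper spherical geometry and spatial indexing (e.g. k-d trees) instead.

```python
import numpy as np

def crossmatch(phot_radec, spec_radec, max_sep_deg=1.0 / 3600):
    """Match each spectroscopic source to its nearest photometric source.

    A simplified flat-sky nearest-neighbour match. Coordinates are
    (RA, Dec) pairs in degrees; max_sep_deg defaults to 1 arcsecond.
    """
    matches = []
    for i, (ra, dec) in enumerate(spec_radec):
        # Approximate angular separation, correcting RA by cos(Dec).
        dra = (phot_radec[:, 0] - ra) * np.cos(np.radians(dec))
        ddec = phot_radec[:, 1] - dec
        sep = np.sqrt(dra**2 + ddec**2)
        j = int(np.argmin(sep))
        if sep[j] <= max_sep_deg:
            matches.append((i, j))  # (spec index, phot index)
    return matches
```

The matched pairs would then carry the spectroscopic labels (class and redshift) over to the photometric sources, forming a labeled training set.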
As Beck recently described the process in a University of Hawaii press release:
"Using a state-of-the-art optimization algorithm, we leveraged the spectroscopic training set of nearly 4 million light sources to teach the neural network to predict source types and galaxy distances, while also correcting for light extinction by dust in the Milky Way."
The machine-learning approach they used, known as a “feedforward neural network,” went a long way toward allowing the team to pinpoint the properties of various objects and sort them by type and photometric redshift. Overall, the process achieved a classification accuracy of 98.1% for galaxies (with distance estimates accurate to within about 3%), 97.8% for stars, and 96.6% for quasars.
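To make the “feedforward neural network” idea concrete, here is a minimal structural sketch in Python/NumPy. Everything in it is an assumption for illustration only: the layer sizes, tanh activation, random weights, and feature count bear no relation to the actual trained PS1-STRM network, which learned its weights from millions of spectroscopically labeled sources.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class FeedforwardClassifier:
    """Toy feedforward network: photometric features -> class probabilities.

    Purely illustrative of the architecture described in the article;
    the sizes and (untrained) random weights are not PS1-STRM's.
    """
    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def predict_proba(self, X):
        h = np.tanh(X @ self.W1 + self.b1)      # hidden layer
        return softmax(h @ self.W2 + self.b2)   # output class probabilities

CLASSES = ["galaxy", "star", "quasar", "unsure"]
net = FeedforwardClassifier(n_in=5, n_hidden=16, n_out=len(CLASSES))
X = rng.normal(size=(3, 5))   # 3 fake sources with 5 photometric features each
proba = net.predict_proba(X)
labels = [CLASSES[i] for i in proba.argmax(axis=1)]
```

In the real catalog, a separate regression output provides the photometric redshift estimate for sources classified as galaxies.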
Overdensity map of galaxies between redshift 0.1 and 0.2 in the PS1-STRM catalog. The Galactic plane and regions with missing sources have been masked. Photo credit: STScI
Biggest 3D map ever!
To date, the largest and most detailed 3D maps of the universe have been created by the Sloan Digital Sky Survey (SDSS), first published in 2012. That survey combines data from the Sloan Foundation 2.5-m telescope and the NMSU 1-m telescope at the Apache Point Observatory (New Mexico) with the Irénée du Pont Telescope at the Las Campanas Observatory (Chile).
The latest data release (DR16), the fourth release of the fourth phase of the SDSS (SDSS-IV), contains SDSS observations through August 2018. The final release (DR17) is scheduled for July 2021 and will include all remaining spectroscopic observations as well as all final data products and catalogs. In total, the SDSS catalog covers a third of the sky and contains spectra for over 3 million objects.
In comparison, the PS1-STRM doubles the area examined, increases the number of objects tenfold and covers certain areas that the SDSS overlooked. István Szapudi, an IfA astronomer and co-author of the study, noted:
“Already now, a preliminary version of this catalog, covering a much smaller area, enabled the discovery of the largest void in the universe, the possible cause of the Cold Spot. The new, more accurate, and larger photometric redshift catalog will be the starting point for many future discoveries.”
IR map of the entire sky, showing the plane of the Milky Way and the Galactic bulge, full of stars and dust. Photo credit: SDSS
This latest map of the universe is a testament to how astronomical instruments and methods have matured in a short period of time. In particular, it shows how the value of large datasets gathered by multiple telescopes can be multiplied by applying machine-learning techniques, improved data sharing, and complementary observations.
Ken Chambers, the Pan-STARRS director and an IfA associate astronomer who also co-authored the study, said this is just the beginning. "As Pan-STARRS collects more and more data," he said, "we will use machine learning to extract even more information about near-Earth objects, our solar system, our galaxy and our universe."
The Pan-STARRS 3D catalog (approximately 300 GB in size) is now available at the Mikulski Archive for Space Telescopes (MAST). Science users can query the catalog via the Space Telescope Science Institute's (STScI) MAST CasJobs SQL interface, or download the entire package as a set of machine-readable tables.
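As a rough illustration of what a CasJobs-style query might look like, the snippet below assembles an SQL string selecting galaxies with photometric redshifts around a sky position. The table and column names here are assumptions for illustration only, not the verified PS1-STRM schema; consult the MAST PS1-STRM documentation for the actual tables and columns.

```python
# Illustrative only: the table and column names below are assumptions,
# not the verified PS1-STRM schema at MAST.
def build_strm_query(ra, dec, radius_deg, max_rows=1000):
    """Assemble a CasJobs-style SQL string selecting galaxies
    with photometric redshifts near a sky position (degrees)."""
    return (
        f"SELECT TOP {max_rows} objID, raMean, decMean, class, z_phot "
        "FROM catalogRecordRowStore "  # hypothetical table name
        "WHERE class = 'GALAXY' "
        f"AND raMean BETWEEN {ra - radius_deg} AND {ra + radius_deg} "
        f"AND decMean BETWEEN {dec - radius_deg} AND {dec + radius_deg}"
    )

query = build_strm_query(150.0, 2.2, 0.1)
```

Such a string would be submitted through the CasJobs web interface (or its batch-job tools) rather than run locally.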
Further reading: University of Hawaii News, MNRAS