Asian Scientist Magazine (October 10, 2022) — From the Princess Leia hologram in the internationally acclaimed Star Wars franchise to Bubs the robot in the popular Korean film Space Sweepers, what space-themed sci-fi story would be complete without a matrix of futuristic technology? In Star Trek, Starfleet starships featured an arsenal of computers – even artificial intelligence units – alongside handheld personal access display devices.
But these fanciful technologies are not mere fantasies of an unreachable future. In fact, supercomputing is an integral part of how modern astronomy works. Cosmology, the branch of astronomy devoted to unraveling the origins and evolution of the universe, is particularly data-intensive and requires sophisticated computing resources to piece together disparate clues from outer space.
“The integration of astronomy and supercomputing has accelerated the rate at which discoveries can be made. We can process data much faster, detect much weaker signals through leaps in sensitivity, and create images at higher resolution than ever before,” said Dr Sarah Pearce, Deputy Director of Astronomy and Space Science at Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO), in an interview with Supercomputing Asia.
Pearce also leads the Australian arm of the Square Kilometer Array (SKA) project, an international effort to build the world’s largest radio telescope, with a total collecting area of roughly one square kilometer – one million square meters.
Technological behemoths are essential to these astronomical missions, because one small step in high performance computing (HPC) can lead to a giant leap in understanding the cosmos and the creation of our universe.
Twinkling stars and other space objects like asteroids are not only fascinating features that dot the sky, but they also hold many secrets about the fundamental forces of our universe, its immense history, and its dynamic evolution. Even the apparent vacuum of outer space should fool no one, as invisible gravitational waves and radio emissions fill the void with a colossal mishmash of signals.
“The sensitivity and design of the SKA telescopes will allow the detection of extremely faint signals emitted shortly after the birth of the universe nearly 14 billion years ago,” Pearce explained. “Like a time machine, these technologies allow us to see when and how the first stars and galaxies formed.”
Origins aside, the universe still holds surprises for astronomers. For one thing, it is not only expanding but doing so at an accelerating rate. The gravitational pull between galaxies should slow this expansion, so a puzzling component called dark energy is thought to be counteracting that force.
To test whether such theories have any weight, astronomers are taking stock of the masses of many galaxies and their gravitational disturbances in the path of radio waves. These galactic exploration missions also involve searching for hydrogen gas emissions, believed to fuel star formation.
Sampling huge numbers of galaxies is essential to revealing subtle differences in emission wavelengths and distortions in radio signals. As a result, scientists are harnessing supercomputers to calibrate, transform, and analyze all of this data as quickly as possible, performing billions of calculations in the blink of an eye. These measurements can then be used to build models to simulate the cosmological past.
For example, researchers led by Dr. Masato Shirasaki at the National Astronomical Observatory of Japan wound back the cosmic clock and reconstructed the early universe, running 4,000 simulated universes on the 3.087-petaFLOPS ATERUI II supercomputer.
During the Big Bang, the universe exploded from nothing to a trillion times its size in a split second. This cosmic inflation has influenced how galaxies and other matter are distributed in space. To trace this phenomenon, the team stripped the simulated galaxy distributions of their gravitational effects to reduce interference, effectively rewinding them to see which best reflected the state of the early universe.
“This new method allows us to verify inflation theories using only one-tenth the amount of data,” Shirasaki told Supercomputing Asia. “Since less data is needed, it may also shorten the observation time required for future galaxy survey missions.”
Searching for signals
To uncover answers to the great mysteries of the universe, scientists are designing machines that can match this galactic scale and decipher its cacophony of signals. Unlike their optical counterparts, radio telescopes like SKA can detect invisible waves and aren’t blocked by molecular dust, effectively peering into the “dark” regions where stars and planets are born.
The SKA Low Frequency Telescope in Western Australia is expected to comprise over 130,000 antennas across 512 stations, while the South African contingent will include 197 dishes to cover the mid-frequency range.
“SKA will receive up to 10 billion data streams simultaneously,” Pearce pointed out. “The supercomputers in our scientific processing facilities will be essential for tracking data from receivers 24/7.”
Such comprehensive equipment can speed up survey missions by capturing multiple large portions of the sky in parallel and with unprecedented sensitivity. But to paint a picture from the radio data, supercomputers must correlate and synchronize the signals from the antennas, multiplying the streams from each pair of antennas to generate data objects called visibilities.
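In essence, a visibility is the time-averaged cross-correlation of the signals from one antenna pair. The sketch below illustrates the idea for a single toy baseline in Python with NumPy; the signal model, sample count, and noise levels are illustrative assumptions, not SKA parameters.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy model: two antennas observe the same sky signal; the second
# stream carries an extra geometric phase, and each antenna adds its
# own independent receiver noise.
n_samples = 4096
sky = rng.normal(size=n_samples) + 1j * rng.normal(size=n_samples)
phase = np.exp(1j * 0.7)  # assumed phase difference between antennas
noise_a = 0.5 * (rng.normal(size=n_samples) + 1j * rng.normal(size=n_samples))
noise_b = 0.5 * (rng.normal(size=n_samples) + 1j * rng.normal(size=n_samples))
antenna_a = sky + noise_a
antenna_b = sky * phase + noise_b

# A visibility for this antenna pair: multiply one stream by the
# conjugate of the other, then average over time.
visibility = np.mean(antenna_a * np.conj(antenna_b))

# The correlated sky signal survives the averaging, while the
# uncorrelated receiver noise averages toward zero:
print(abs(visibility))       # close to the sky power, 2.0
print(np.angle(visibility))  # close to -0.7 rad, the baseline phase
```

This is why correlation is so effective for radio astronomy: signals common to both antennas add coherently, while each antenna's independent noise cancels out in the average.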
“The difficulty is that within these visibilities, the image of the sky is mixed with responses from antennas and other radio signals such as those from telecommunications devices,” Pearce noted.
From oscilloscopes to supercomputers
Supercomputers use advanced data analytics to disentangle spatial signals from all the noise, including accounting for minor differences in the instruments used and any “spikes” that appear around bright stars. Thanks to iterative calculation loops, machines can convert radio waves into astronomical images with unequaled quality and resolution.
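One widely used family of such iterative loops is CLEAN-style deconvolution, which repeatedly finds the brightest point in the image, subtracts the telescope's point spread function (PSF) there, and records the removed flux as a source. The one-dimensional Python sketch below is a simplified illustration of a Hogbom-style CLEAN loop, not SKA's actual pipeline; the PSF shape, image size, and source positions are made-up toy values.

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.2, n_iter=500, threshold=1e-3):
    """Toy 1-D Hogbom-style CLEAN loop.

    Repeatedly locate the brightest pixel in the residual image,
    subtract a scaled, shifted copy of the PSF there, and record
    the removed flux as a model component.
    """
    residual = dirty.astype(float).copy()
    model = np.zeros_like(residual)
    center = len(psf) // 2
    for _ in range(n_iter):
        peak = int(np.argmax(np.abs(residual)))
        if abs(residual[peak]) < threshold:
            break  # remaining residual is treated as noise
        flux = gain * residual[peak]
        model[peak] += flux
        # Subtract the PSF centered on the peak, clipped at the edges.
        lo = max(0, peak - center)
        hi = min(len(residual), peak + center + 1)
        residual[lo:hi] -= flux * psf[center - (peak - lo):center + (hi - peak)]
    return model, residual

# Two point sources blurred by a triangular PSF -- a toy "dirty image".
psf = np.array([0.25, 0.5, 1.0, 0.5, 0.25])
true_sky = np.zeros(32)
true_sky[8], true_sky[20] = 2.0, 1.0
dirty = np.convolve(true_sky, psf, mode="same")

model, residual = hogbom_clean(dirty, psf)
# model now concentrates the flux back at pixels 8 and 20.
```

The small gain per iteration is the design choice that makes the loop robust: removing only a fraction of the peak each pass keeps overlapping sources from contaminating one another, at the cost of more iterations – which is exactly the kind of repetitive arithmetic supercomputers excel at.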
From filtering out spurious signals to stitching together smaller images to create detailed representations, these complex computing tasks all take place in real time and are performed over thousands of radio frequencies. Such a feat, Pearce noted, is only possible because of the power of HPC resources available today.
“Distant galaxies, only glimpsed by very long observations today, will be routinely observed in a fraction of the time. Astronomers using SKA telescopes will encounter more data than ever before in the history of radio astronomy,” she added.
SKA also builds on long-standing CSIRO precursor projects, including the Australian Square Kilometer Array Pathfinder (ASKAP) and the Murchison Widefield Array. The computational backbone of these observatories is Galaxy, a real-time supercomputing service for telescopes and astronomy research. Installed at the Pawsey Supercomputing Research Centre in Australia, this 200-teraFLOPS Cray XC30 system is equipped with Nvidia K20X “Kepler” graphics processing units and Intel Xeon E5-2690 host processors.
SKA’s HPC facilities will have a collective computing capacity of 500 petaFLOPS and archive more than 600 petabytes of data each year. In addition, the alliance of SKA centers around the world will be connected via a high-end fiber optic network capable of transmitting data at speeds of seven to eight terabits per second, approximately 100,000 times faster than current average broadband speeds.
Collective ambitions, universal futures
By linking local efforts into a global collaboration, Pearce envisions a more cooperative model for the future of astronomy.
“The concept of open science is deeply rooted in our philosophy,” Pearce said. “After a period of ownership, SKA’s huge datasets will become accessible to anyone who wants to analyze them, greatly increasing the potential for new discoveries.”
Traditionally, a single astronomer or a small team would apply for time on a telescope for their own research. Today, scientists and engineers from approximately 100 organizations in 20 countries participate in the development of SKA, harnessing shared technological resources as a vehicle for progress in space science.
From unlocking the secrets of dark matter to mapping the magnetic fields that permeate the universe, HPC systems are poised to supercharge the next generation of astronomical observation. By capturing snapshots of spacetime, these innovations can enable science teams to weave together compelling stories that transform our understanding of the origin and fate of the universe.
This article first appeared in the print version of Supercomputing Asia, July 2022.
Copyright: Asian Scientist Magazine. Image: Unsplash