Aurora Exascale system to advance dark matter research

Scientists have unlocked many atomic secrets through physics research into the interactions of particles such as quarks, gluons, protons, and neutrons in an atom’s nucleus. However, little is known about a mysterious form of matter that has been named dark matter.

This is one of the research challenges being pursued by scientists preparing for the upcoming Aurora exascale supercomputer, built by Intel and Hewlett Packard Enterprise and housed at the US Department of Energy’s Argonne National Laboratory. Supported by the Argonne Leadership Computing Facility (ALCF) Aurora Early Science Program (ESP), a team based at the Massachusetts Institute of Technology (MIT) is using advanced machine learning (ML) and cutting-edge physics simulations to probe the mysteries of dark matter while also providing insight into fundamental particle physics. The ESP research team includes co-principal investigators William Detmold and Phiala Shanahan of MIT, along with researchers from New York University and collaborators in Europe and at Argonne National Laboratory. The hope is that by moving from today’s petascale systems to the Aurora exascale system, researchers will be able to simulate the entire nucleus of an atom, not just a single proton.

“One of our biggest challenges in dark matter research is a computational challenge. Our ML and Lattice Quantum Chromodynamics (LQCD) calculations are computationally intensive and currently use around 100 TB of data per simulation. Running on the Aurora exascale supercomputer will allow us to work with ten times as much data in our calculations. In addition, we will be able to tackle parts of the dark matter puzzle that are simply not possible on existing petascale supercomputers,” says Detmold.

Dark matter research extends to underground mines

Detmold says: “Scientists now understand that protons are made up of quarks and gluons, which are the basic building blocks of the universe. Protons make up 99% of visible matter but only about 20% of the mass content of the universe. The term dark matter has been applied to the remaining unknown matter, which is invisible to the naked eye and has not been detected by current instruments, but is inferred from its gravitational effects.”

Many experiments searching for dark matter have been carried out using detectors constructed from materials such as sodium iodide and xenon. In one experiment, the search for clues about dark matter goes deep underground into a former gold mine in Lead, South Dakota. The mine is now a research facility called the Sanford Underground Research Facility (SURF). Once the facility is complete, researchers from around the world will work there on the LUX-ZEPLIN (LZ) experiment. A stainless steel container will be filled with layers of water and gadolinium-loaded linear alkylbenzene scintillator, with an inner cylinder containing liquid xenon. Sensors around the acrylic tanks are designed to detect the tiny flashes of light that a dark matter particle colliding with a nucleus could produce.

The ESP team’s research on dark matter is based on the theory of quantum chromodynamics, which describes how quarks interact with each other inside an atom’s nucleus. The team uses ML and LQCD supercomputer simulations to determine the constituents of atomic nuclei and their potential interactions with dark matter. LQCD is a set of sophisticated numerical techniques that calculate the effects of the strong force of particle physics, which binds quarks and gluons into protons and neutrons, and in turn binds those into atomic nuclei.

Some parts of the LQCD calculations can only be performed on large-scale supercomputers. Detmold says, “In our research, we cannot look at the entire universe. We take a small region around the nucleus of an atom for our calculations. The size of that region influences the result. Our calculations start with a small box, expand the region to larger boxes, and extrapolate the results to the infinite box size limit.”
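To make the idea concrete, here is a minimal sketch of that kind of infinite-volume extrapolation in Python. The exponential fit form, the box sizes, and the data values are illustrative assumptions, not the team’s actual ansatz or results.

```python
# Illustrative sketch only: extrapolate an observable measured in finite
# boxes of size L to the infinite-volume limit. The data and the simple
# exponential correction form are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def finite_volume_model(L, O_inf, c, m):
    # Generic ansatz: the observable approaches its infinite-volume value
    # O_inf with corrections that fall off exponentially in the box size L.
    return O_inf + c * np.exp(-m * L)

# Hypothetical measurements of some observable at several box sizes.
box_sizes = np.array([3.0, 4.0, 5.0, 6.0])
observable = np.array([1.32, 1.21, 1.17, 1.155])

params, _ = curve_fit(finite_volume_model, box_sizes, observable,
                      p0=[1.15, 1.0, 1.0])
O_inf, c, m = params
print(f"Extrapolated infinite-volume value: {O_inf:.3f}")
```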

Artist’s impression of a nucleus in a lattice QCD computation. Numerical calculations use a space-time lattice (grid) to determine the properties and interactions of nuclei, including their potential interactions with dark matter.

Using the power of the future Aurora supercomputer, the team wants to use its calculations of nuclear matrix elements to understand how a nucleus reacts when struck by a particular type of dark matter. “Our goal is to be able to assess what happens if a dark matter experiment sees a nucleus interact and recoil. With our LQCD and ML software, we want to be able to take a potential dark matter model, perform calculations to predict the strength of its interactions with the nucleus, and then compare that number to a real experiment,” says Detmold.
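As a rough illustration of that comparison, the toy snippet below shows how a predicted interaction strength might scale with a model coupling and an LQCD-supplied nuclear matrix element before being checked against an experimental limit. The scaling, function name, and numbers are hypothetical placeholders, not the team’s actual formulas.

```python
# Schematic sketch (not the team's code): in a given model, the predicted
# dark matter–nucleus interaction strength grows with the square of the
# model coupling and the square of the nuclear matrix element from LQCD.
# All values here are made-up placeholders in arbitrary units.

def predicted_cross_section(coupling, matrix_element, prefactor=1.0):
    """Toy scaling: sigma ~ prefactor * coupling**2 * |matrix_element|**2."""
    return prefactor * coupling**2 * abs(matrix_element)**2

sigma_predicted = predicted_cross_section(coupling=1e-3, matrix_element=0.05)
print(f"toy predicted cross section: {sigma_predicted:.2e} (arbitrary units)")
print("this number would then be compared with the limit an experiment reports")
```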

MIT machine learning software developed for dark matter research

The MIT team developed their own ML software for dark matter research to tackle some of the most difficult computational tasks. Their software is designed to accelerate the HPC algorithms used in certain parts of the LQCD calculation. The team’s ML algorithms are built to take advantage of other software tools such as the USQCD libraries, TensorFlow, HDF5, and PyTorch.
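For readers curious how these tools can fit together, here is a minimal sketch of reading lattice configurations from an HDF5 file into PyTorch. The file name and dataset layout are assumptions for illustration; the team’s actual data formats may differ.

```python
# A minimal sketch, assuming configurations have been exported to an HDF5
# file containing a dataset named "configurations" (hypothetical layout).
import h5py
import torch

def load_configurations(path, dataset="configurations"):
    """Read lattice configurations from HDF5 and hand them to PyTorch."""
    with h5py.File(path, "r") as f:
        data = f[dataset][...]        # load the full dataset into memory
    return torch.as_tensor(data)      # wrap as a tensor for ML code

# Example usage (requires a file laid out as assumed above):
# configs = load_configurations("ensemble.h5")
# print(configs.shape)
```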

The ML software uses a self-learning approach in which a model generates samples of typical quark and gluon configurations and then learns from those samples to generate new samples more accurately. These ML-generated configurations can be used for a variety of other physics calculations in addition to the dark matter research that is the focus of the ESP project.
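The snippet below is a minimal, self-contained sketch of that self-learning loop, with a toy scalar-field action and a simple Gaussian model standing in for the team’s far more sophisticated machinery: the model proposes configurations, scores them against the action, and updates itself so its own samples become more typical.

```python
# Hedged sketch of "self-learning" sampling: a model proposes field
# configurations, the known action scores them, and gradient updates make
# the model's proposals more typical. Toy problem only, not the ESP code.
import torch

L = 4                                   # tiny 2D toy lattice, L x L sites
mass2 = 1.0                             # toy mass-squared parameter

def toy_action(phi):
    # Free scalar-field action: nearest-neighbor kinetic term + mass term.
    kinetic = sum(((phi - torch.roll(phi, 1, dims=d)) ** 2).sum(dim=(-2, -1))
                  for d in (-2, -1))
    return 0.5 * kinetic + 0.5 * mass2 * (phi ** 2).sum(dim=(-2, -1))

# The "model": an independent Gaussian at each site with learnable parameters.
mu = torch.zeros(L, L, requires_grad=True)
log_sigma = torch.zeros(L, L, requires_grad=True)
opt = torch.optim.Adam([mu, log_sigma], lr=0.01)

for step in range(500):
    eps = torch.randn(64, L, L)                       # batch of 64 proposals
    phi = mu + torch.exp(log_sigma) * eps             # reparameterized samples
    log_q = (-0.5 * eps ** 2 - log_sigma).sum(dim=(-2, -1))  # up to a constant
    loss = (log_q + toy_action(phi)).mean()           # reverse-KL self-training
    opt.zero_grad()
    loss.backward()
    opt.step()

print("trained log-sigma mean:", log_sigma.mean().item())
```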

The researchers use convolutional models that are ubiquitous in ML, but those are typically the two-dimensional convolutions used for image classification and generation tasks. LQCD calculations involve all three dimensions of space plus time, so the team needs convolutions that work in four dimensions. Convolutions are generally not optimized to run in four dimensions, so the team is working to verify that their ML software will run well on the future Aurora supercomputer.
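A simple way to see the problem: deep learning frameworks such as PyTorch ship 1D, 2D, and 3D convolutions but no built-in 4D convolution, so a space-time convolution has to be assembled by hand. The naive construction below, built from 3D convolutions, is purely illustrative and is not the team’s optimized implementation.

```python
# Illustrative only: assemble a 4D "valid" cross-correlation from conv3d.
import torch
import torch.nn.functional as F

def conv4d(x, weight):
    """Naive 4D convolution built from PyTorch's 3D convolution.

    x:      (batch, in_channels, T, X, Y, Z)
    weight: (out_channels, in_channels, kT, kX, kY, kZ)
    """
    kT = weight.shape[2]
    T_out = x.shape[2] - kT + 1
    out = None
    for dt in range(kT):                  # slide the kernel along the time axis
        contrib = torch.stack(
            [F.conv3d(x[:, :, t + dt], weight[:, :, dt]) for t in range(T_out)],
            dim=2)
        out = contrib if out is None else out + contrib
    return out

# Tiny smoke test on a random four-dimensional "lattice".
x = torch.randn(1, 1, 6, 6, 6, 6)         # one channel, 6^4 sites
w = torch.randn(4, 1, 3, 3, 3, 3)         # four output channels, 3^4 kernel
print(conv4d(x, w).shape)                 # -> torch.Size([1, 4, 4, 4, 4, 4])
```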

Detmold notes that caution should be used when creating ML tools. “When we do calculations in basic science and physics, we need to be careful that machine learning does not change the nature of the problem we are solving. We focus on implementing exact algorithms, as well as understanding the uncertainties introduced.”

Preparation for Aurora

While waiting for Aurora, the team mainly works on the Theta supercomputer at the ALCF, but also on other supercomputers, notably “Summit” at Oak Ridge National Laboratory, “Frontera” at the Texas Advanced Computing Center, and “Marconi” at Cineca in Italy.

Aurora will integrate new Intel compute engines, including the “Ponte Vecchio” Xe HPC GPU accelerator and the “Sapphire Rapids” Xeon SP processor, as well as the DAOS storage system that Intel has championed. Detmold notes that the dark matter ML tools work best on GPUs, but today’s supercomputers use different GPUs, requiring tedious programming changes. Aurora will use the Data Parallel C++ (DPC++) programming language as part of the Intel-led, cross-industry oneAPI initiative, designed to unify and simplify application development across diverse computing architectures. Detmold says HPC researchers need a tool like oneAPI to save time.

The architecture of the future Aurora supercomputer is designed to optimize deep learning, and its software stack will operate at scale. ALCF postdoctoral researcher Denis Boyda is working on the Theta supercomputer to ensure that MIT’s software can scale properly. “Our team’s large computations can become a bottleneck, and Aurora’s DAOS storage architecture is designed to reduce bottleneck issues,” says Detmold.

According to Detmold, “Moving from petascale to exascale supercomputers will allow us to do research on a different scale and look at a different kind of problem. For example, a petascale system allows us to do research on a single proton, such as determining the mass of the proton. With an exascale system, the team can consider simulating the helium nucleus and possibly carbon. Even with an exascale supercomputer, researchers will not be able to do dark matter research on a nucleus as large as xenon. Our team has developed specific machine learning software to accelerate these calculations in order to speed up our research on dark matter.”

The ALCF is a DOE Office of Science user facility.

Linda Barney is the founder and owner of Barney and Associates, a technical/marketing writing, training, and web design company in Beaverton, Oregon.
