Cronos supercomputer powers Insight at world’s second largest electricity supplier

Planning for electricity today and tomorrow is a topic that affects more than six billion people on the planet right in their homes. The needs are vast, the issues complex and the solutions varied. For example, industry analysts estimate that global energy consumption will almost double by 2050[1], with electricity's share of demand rising from 19% today to 30% by mid-century. By 2036, renewable energy sources, mainly wind and solar, are expected to provide nearly half of the world's energy supply[2]. According to Carbon Tracker, onshore and offshore wind energy alone could capture more than ten times the world's 2019 energy consumption. Nuclear energy today generates around 10% of the world's electricity and more than half of the carbon-free electricity in the United States. Statistically, it is among the safest[3], lowest-carbon[4] and most efficient[5] sources of electricity generation in use today.

Figures like these illustrate the scale of the energy challenges facing the world's electricity producers, as well as the information and knowledge needed to keep existing facilities operating safely and efficiently.

Atos BullSequana X

At Électricité de France (EDF), the world's second-largest electric utility, headquartered in Paris, France, business analysts and engineers use high performance computing (HPC) to understand and predict customers' energy consumption, to design new generation facilities, and to simulate components and systems for safety and regulatory compliance, particularly at nuclear power generation facilities.

“Concerning our facilities, our primary concern is safety,” said Alain Martin, head of Scientific Information Systems at EDF. “A lot of things are difficult or complex to measure in a nuclear power plant, so we simulate the evolution of various components using calculations. These include the components of the reactor pressure vessel (RPV), the steam generator and the primary pump, among others. The higher the resolution of these simulations, the more detail we have about component condition, operational efficiency and reliability, all of which are key factors in ensuring safety.”

Every few years, EDF upgrades and replaces its HPC systems to meet the need for ever-higher-resolution simulations and, now, to support emerging studies in artificial intelligence (AI) that will give a better understanding of its resources and operations. EDF's Gaia HPC cluster, built around Intel Xeon Gold 6140 processors, was installed in 2018. At the end of 2020, EDF began deploying Cronos, an HPC cluster delivering 4.3 petaflops on Linpack (7.1 petaflops theoretical peak).
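As a quick check on these figures, the ratio of measured Linpack performance (Rmax) to theoretical peak (Rpeak) gives a cluster's HPL efficiency. Only the two petaflop numbers below come from the article; the ratio is derived from them and lands around 61%, a typical value for CPU-based clusters:

```python
# Relating the Cronos figures quoted above: measured Linpack (Rmax)
# versus theoretical peak (Rpeak). Only the two petaflop values are
# from the article; the efficiency ratio is computed from them.
rmax_pflops = 4.3    # measured Linpack performance
rpeak_pflops = 7.1   # theoretical peak

efficiency = rmax_pflops / rpeak_pflops
print(f"HPL efficiency: {efficiency:.0%}")  # ~61%
```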

Cronos is one of the 100 fastest supercomputers in the world[6]. Built by Atos, it is a BullSequana X system with 3,400 Intel Xeon Platinum 8260 processors. The new system will help EDF manage existing electrical installations while predicting future needs and designing new resources to help the company become CO2-neutral by 2050.

In addition to running simulations that support safety and nuclear regulatory compliance, the HPC system also helps engineers design systems and understand risks for other types of power plants, such as solar, offshore wind and hydropower. Complex, high-precision multiphysics calculations are needed to design efficient, reliable and safe installations. Predictive maintenance analysis and component end-of-life tracking also help keep facilities running at peak performance and maximize their useful life.

“We use many types of open-source and proprietary codes for studies in mechanics, hydraulics, neutronics, physics and multiphysics,” said Cyril Baudry, scientific information systems architect at EDF. “We are very familiar with these codes, such as computational fluid dynamics (CFD). They are parallelized, highly scalable codes capable of handling large datasets. Over the decades we have been operating facilities, our datasets have grown every year, requiring ever more computing resources to process them within a reasonable time frame and gain insight.”

“We are running a lot of parallel workloads, so more cores gives us more capacity to run simulations,” Baudry explained.
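Baudry's point about core counts can be sketched with a minimal, hypothetical example: a batch of independent simulation cases farmed out across worker processes, where throughput grows with the number of cores available. `run_case` is a stand-in for illustration, not an EDF code:

```python
# Minimal sketch of an embarrassingly parallel batch run. run_case is a
# hypothetical placeholder for one independent simulation case; real HPC
# codes like those at EDF use MPI-style parallelism at far larger scale.
from multiprocessing import Pool

def run_case(param: float) -> float:
    """Placeholder for one independent simulation case."""
    return param ** 2

if __name__ == "__main__":
    cases = [0.5 * i for i in range(100)]
    with Pool(processes=4) as pool:  # typically one worker per core
        results = pool.map(run_case, cases)
    print(len(results))  # number of completed cases
```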

Memory bandwidth and processor clock speed have been the factors limiting EDF engineers' high performance workloads. Cronos's processors offer more memory channels and higher memory bandwidth than the previous generation. Using processors with fewer cores per socket also gives Cronos higher maximum frequencies than larger parts would allow.
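A simple roofline-style estimate illustrates why memory bandwidth matters here. The numbers below are assumed round figures for illustration only, not Cronos specifications: dividing peak compute rate by memory bandwidth gives the arithmetic intensity above which a kernel stops being limited by the memory bus.

```python
# Roofline "ridge point" with assumed, illustrative per-socket numbers;
# these are NOT Cronos specifications.
peak_gflops = 2000.0   # assumed peak compute rate, GFLOP/s
mem_bw_gbs = 140.0     # assumed memory bandwidth, GB/s

# FLOPs per byte above which the kernel can saturate the floating-point
# units instead of waiting on memory:
ridge = peak_gflops / mem_bw_gbs
print(f"compute-bound above ~{ridge:.1f} FLOPs/byte")
```

Codes below the ridge point gain more from extra memory bandwidth than from extra cores, which is why the memory-channel count matters as much as the FLOP rating for many simulation workloads.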

Cronos went into production in the first quarter of 2021. Users worked on optimizing their codes and running simulations, while developing emerging workloads using machine learning.

“We are just starting to explore machine learning workloads,” Baudry concluded. “We are exploring which AI techniques to use for predictive maintenance, power consumption planning, cybersecurity and social media analytics, to name a few. This is a new area for us.”

Simulating and forecasting future energy needs, designing and managing new nuclear, wind, solar and hydropower installations, and analyzing the evolution of power plant components all require large-scale supercomputing. With Cronos, EDF has the resources its analysts and engineers need for traditional data analysis, and to leverage the capabilities that AI and machine learning can bring to scientific insight, operational efficiency and business leadership.

Électricité de France (EDF) is at the heart of global energy supply concerns, both in capacity planning and in production. The second-largest electricity utility in the world, EDF has more than 120 gigawatts of generating capacity, 90% of which is CO2-free. EDF designs, builds and manages different types of electrical installations, including nuclear (it operates 58 nuclear reactors), offshore wind, hydro and solar. It relies on high-performance computing to analyze and understand many of the pressing issues it faces in delivering energy to the global community safely and efficiently.

Ken Strandberg is a technical storyteller. He writes articles, white papers, seminars, online training, video and animation scripts, and technical marketing and interactive materials for emerging technology companies, Fortune 100 companies and multinational corporations. Mr. Strandberg's technology areas include software, high-performance computing, industrial technologies, design automation, networking, medical technology, semiconductors and telecommunications. He can be contacted at [email protected].


