Supercomputer predicts impact of Newcastle takeover on Manchester United and Man City finishes https://funwithjustin.com/supercomputer-predicts-impact-of-newcastle-takeover-on-manchester-united-and-man-city-finishes/ Sat, 09 Oct 2021 15:53:08 +0000

Newcastle United are expected to emerge as Premier League contenders in the coming seasons after their seismic £305million takeover was approved.

A consortium backed by the Saudi Arabia Public Investment Fund (PIF) struck a deal with former owner Mike Ashley, putting the club under new ownership.

The green light was given in the middle of the week and the news sent shockwaves through the Premier League scene.

Part of this came amid protests from human rights groups – as the Saudi-backed PIF is overseen by Crown Prince Mohammed bin Salman – while the deal is expected to be a game-changer on the pitch once the Magpies are finally able to flex their new financial muscle.

It will almost certainly not come soon enough to transform this season, but from January there will be a new player in town when it comes to high-level transfers.

But where does that leave Manchester City and Manchester United in the title race this campaign?

According to FiveThirtyEight’s supercomputer model, Pep Guardiola and his star-studded City squad are favorites to defend the trophy they have won three times in the past four seasons.

City are currently given a 50 percent chance of finishing top of the table, while Liverpool (26%) and Chelsea (18%) follow some way behind. United have a 5% chance of winning a first Premier League crown since 2013, according to the same data.

These predictions are broadly replicated when evaluating the race for the top four.

City have a 95% chance in this regard, while Liverpool (87%) and Chelsea (81%) are backed to join them in the top four – and United (48%) are expected to fend off challenges from Tottenham (14%), Everton (13%) and Arsenal (12%).

The supercomputer predicts that 84 points will be enough for City to reign supreme, once again.

Lower down the table, Newcastle headed into the international break still winless after seven matches – sitting 19th with three points.

With their hands tied in the transfer market until January – at least – Steve Bruce will have to work with what he has, unless the new Saudi-backed owners pull the plug and seek their own managerial appointment in the days and weeks to come.

Who will win the Premier League title this season? Give your opinion in the comments section below.

The same FiveThirtyEight data as above expects Newcastle to stay in the division.

The Magpies are expected to end the season in 17th place, although it could turn out to be a nervous finale, with an expected points total (35) similar to those of Burnley and Watford – both of whom the supercomputer expects to join Norwich City (29 points) in dropping to the Championship.

Newcastle have a 57% chance of survival, but that should change when January rolls around.

FiveThirtyEight’s forecast is built on its SPI ratings, which have an offensive and defensive component and are used to estimate the likely outcome of each individual game. Those match-level probabilities are then accumulated to project how many points each team will have at the end of the season.
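As a rough illustration of how rating-based forecasts of this kind work in general (a minimal sketch only – the ratings below are invented and FiveThirtyEight’s actual SPI model is considerably more sophisticated), offensive and defensive ratings can be turned into simulated match results and accumulated into expected season points:

import math
import random

# Hypothetical offensive/defensive ratings (goals expected to score / concede per match).
# These numbers are invented for illustration; they are not FiveThirtyEight's SPI values.
RATINGS = {
    "Man City":   (2.4, 0.8),
    "Liverpool":  (2.3, 0.9),
    "Chelsea":    (2.1, 0.9),
    "Man United": (1.9, 1.1),
    "Newcastle":  (1.1, 1.7),
}

def poisson(lam):
    # Knuth's method for drawing a Poisson-distributed goal count.
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def simulate_match(home, away):
    # Blend each side's attack with the opponent's defence to get expected goals.
    off_h, def_h = RATINGS[home]
    off_a, def_a = RATINGS[away]
    goals_h = poisson((off_h + def_a) / 2)
    goals_a = poisson((off_a + def_h) / 2)
    if goals_h > goals_a:
        return 3, 0   # home win
    if goals_h < goals_a:
        return 0, 3   # away win
    return 1, 1       # draw

def expected_points(n_runs=10_000):
    totals = {team: 0 for team in RATINGS}
    for _ in range(n_runs):
        for home in RATINGS:
            for away in RATINGS:
                if home != away:
                    ph, pa = simulate_match(home, away)
                    totals[home] += ph
                    totals[away] += pa
    return {team: pts / n_runs for team, pts in totals.items()}

if __name__ == "__main__":
    for team, pts in sorted(expected_points().items(), key=lambda kv: -kv[1]):
        print(f"{team:12s} {pts:5.1f} expected points (five-team mini-league)")

Repeating the full 38-game fixture list many thousands of times in this way is what turns individual match probabilities into the headline title and relegation percentages quoted above.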

European supercomputer project receives RISC-V test chips https://funwithjustin.com/european-supercomputer-project-receives-risc-v-test-chips/ Fri, 08 Oct 2021 10:06:42 +0000

The EPI (European Processor Initiative) project has 28 partners from 10 European countries and aims to make the EU independent in HPC chip technologies and HPC infrastructure. 43 of the EPAC1.0 RISC-V test chips were delivered to EPI by GlobalFoundries, and the first functional tests were successful.

The European Processor Accelerator (EPAC) combines several specialized accelerator technologies for different application areas. Built in a 22nm process, the 1GHz chips have an area of 26.97mm² with 14 million placeable instances – the equivalent of 93 million gates – including 991 memory instances.

The test chip contains four vector-processing micro-tiles, each consisting of an Avispado RISC-V core designed by SemiDynamics and a vector processing unit (VPU) designed by the Barcelona Supercomputing Center and the University of Zagreb.

Each tile also contains a Home Node and L2 cache, designed by Chalmers and FORTH respectively, to provide a coherent view of the memory subsystem.

The chip also includes two additional accelerators: the Stencil and Tensor (STX) accelerator designed by Fraunhofer IIS, ITWM and ETH Zürich, and the variable precision processor (VRP) from CEA LIST. All of the on-chip accelerators are connected to a very high-speed on-chip network and to EXTOLL’s SERDES technology.

The chips were made in GF’s 22FDX low-power technology and are packaged in an FCBGA with 22 × 22 balls.

The initial bring-up was successful and EPAC ran its first bare-metal program, sending out the traditional “Hello World!” greeting in different languages to the EPI consortium and to the world.

EPI says that it will continue to develop, optimize and validate the different IP blocks and demonstrate their functionality and performance, thus creating an EU HPC IP ecosystem and making it available to the processor and accelerator industry and to universities for next-generation HPC systems.

www.european-processor-initiative.eu/

Supercomputer prediction of Leeds United’s Premier League arrival https://funwithjustin.com/supercomputer-prediction-of-leeds-uniteds-premier-league-arrival/ Thu, 07 Oct 2021 15:00:00 +0000

FiveThirtyEight has released its latest predictions for how the Premier League table will finish.

The website uses statistical analytics to tell stories on several different topics, including sports, and more specifically the Premier League.

Based on the data presented so far, Leeds United are expected to finish 14th with 44 points, with a goal difference of -14.

The model expects Leeds to comfortably push back the threat of relegation, nine points above the drop zone – a finish that would guarantee the Whites a third consecutive season of Premier League football.

Marcelo Bielsa’s side are expected to finish above Crystal Palace, Southampton, Newcastle United, Burnley, Watford and Norwich City with the last three constituting the trio of teams that are expected to be relegated to the Championship.

While the data doesn’t expect Leeds to have any problems in their quest to avoid the drop, FiveThirtyEight still gives the Whites a 15% chance of being relegated.

The model also gives Leeds only a 6% chance of replicating their ninth-place finish from last season, while a return to Champions League football remains a distant possibility at 2%.

In the rest of the division, Manchester City are set to retain their Premier League crown, with Liverpool, Chelsea and Manchester United making up the remainder of the top four.

Brighton and Hove Albion are expected to continue their strong start to the season, finishing in eighth place, ahead of Aston Villa, West Ham United and Leicester City.

Newly promoted Brentford also appear set to build on their impressive first seven games of their first Premier League campaign by securing a solid 11th place finish.

Getting up to date on the proton – ScienceDaily https://funwithjustin.com/getting-up-to-date-on-the-proton-sciencedaily/ Wed, 06 Oct 2021 20:52:21 +0000

Scientists are developing a revolutionary theory to calculate what happens inside a proton traveling at the speed of light.

For more than 2,000 years, scientists believed that the atom was the smallest particle possible. Then they discovered that it had a nucleus made up of protons and neutrons surrounded by electrons. After that, they discovered that the protons and neutrons themselves have a complex inner world filled with quarks and antiquarks held together by a superglue-like force created by the gluons.

“Protons, along with neutrons, make up over 99% of the visible universe, which means everything from galaxies and stars to us,” said Yong Zhao, a physicist at the US Department of Energy’s (DOE) Argonne National Laboratory. “Yet there is still a lot we don’t know about the rich inner life of protons or neutrons.”

Zhao co-wrote an article on an innovative method for calculating the quark and gluon structure of a proton moving at the speed of light. The team’s creation is named large-momentum effective theory, or LaMET for short, and it works in conjunction with a theory called lattice quantum chromodynamics (QCD).

The proton is tiny – about 100,000 times smaller than an atom – so physicists often model it as a dimensionless point. But these new theories can predict what happens inside a proton moving at the speed of light as if it were a three-dimensional body.

The concept of momentum is vital not only for LaMET but also for physics in general. It is equal to the speed of an object multiplied by its mass.
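That everyday definition holds for slow objects; for a proton approaching the speed of light it is the relativistic form that matters, and it is what lets the momentum grow without bound even though the speed never exceeds c:

\[ p_{\text{classical}} = m v, \qquad p_{\text{relativistic}} = \gamma m v = \frac{m v}{\sqrt{1 - v^{2}/c^{2}}}, \qquad \gamma \to \infty \ \text{as}\ v \to c. \]

This is why physicists can meaningfully speak of the “infinite momentum” limit discussed below.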

More than half a century ago, says Zhao, a simple quark model by physicists Murray Gell-Mann and George Zweig revealed part of the internal structure of the proton at rest (zero momentum). From this model, scientists represented the proton as composed of three quarks and predicted their essential properties, such as electric charge and spin.

Subsequent experiments with protons accelerated to nearly the speed of light demonstrated that the proton is even more complex than originally thought. For example, it contains countless particles that interact with each other, not just three quarks linked by gluons. And gluons can briefly transform into quark-antiquark pairs before annihilating each other and reverting to a gluon. Particle accelerators like the one at DOE’s Fermi National Accelerator Laboratory produced most of these results.

“When you accelerate the proton and hit it with a target, that’s when the magic happens to reveal its many mysteries,” Zhao said.

About five years after the simple quark model rocked the physics community, a model proposed by Richard Feynman represented the proton moving at near light speed as a beam carrying an infinite number of quarks and gluons moving in the same direction. He called these particles “partons.” His parton model inspired physicists to define a set of quantities that describe the 3D structure of the proton. Researchers could then measure these quantities in particle accelerator experiments.

Previous calculations with the best theory available at the time (lattice QCD) provided illuminating details on the distribution of quarks and gluons in the proton. But they had a serious flaw: they could not accurately distinguish between fast and slow partons.

The difficulty was that lattice QCD could only calculate properties of the proton that do not depend on its momentum. But applying Feynman’s parton model to lattice QCD requires knowing the properties of a proton with infinite momentum, which would mean all of the proton’s constituent particles moving at the speed of light. Partially filling this knowledge gap, LaMET provides a recipe for calculating parton physics from lattice QCD at large but finite momentum.
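Schematically, and in the way the LaMET literature usually writes it (the matching kernel C and the exact form of the corrections are beyond the scope of this article), the lattice-calculable quasi-distribution at a finite proton momentum P_z is related to the physical light-cone parton distribution q(y, μ) by

\[ \tilde{q}(x, P_z) = \int_{-1}^{1} \frac{dy}{|y|}\, C\!\left(\frac{x}{y}, \frac{\mu}{y P_z}\right) q(y, \mu) \;+\; \mathcal{O}\!\left(\frac{\Lambda_{\mathrm{QCD}}^{2}}{x^{2} P_z^{2}}, \frac{M^{2}}{P_z^{2}}\right), \]

where the power corrections shrink as P_z grows – which is why “large but finite momentum” on the lattice is enough to recover parton physics.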

“We have developed and refined LaMET over the past eight years,” Zhao said. “Our article summarizes this work.”

Running on supercomputers, lattice QCD calculations with LaMET generate new and improved predictions about the structure of a proton moving at the speed of light. These predictions can then be put to the test at a unique new facility called the Electron-Ion Collider (EIC). This facility is under construction at DOE’s Brookhaven National Laboratory.

“Our LaMET can also predict useful information about extremely difficult to measure quantities,” Zhao said. “And with sufficiently powerful supercomputers, in some cases our predictions might even be more accurate than can be measured at the EIC.”

With a better understanding of the 3D quark-gluon structure of matter from theory and EIC measurements, scientists are poised to obtain a much more detailed picture of the proton – and to enter a new era of parton physics.

Former IBM Senior Vice President Robert Picciano Joins CognitiveScale as CEO https://funwithjustin.com/former-ibm-senior-vice-president-robert-picciano-joins-cognitivescale-as-ceo/ Wed, 06 Oct 2021 17:05:18 +0000

CognitiveScale Inc., a venture-backed artificial intelligence startup, today announced the arrival of Robert Picciano, former senior vice president of IBM Corp., as CEO.

Picciano led IBM’s Cognitive Systems business unit until his departure from the company in 2020. The Cognitive Systems unit is responsible, among other product lines, for the Power Systems family of data center servers. The servers are based on IBM-designed central processing units, which include optimizations for running AI models.

Picciano is credited with leading the Cognitive Systems unit to eight quarters of growth during his tenure at the helm. The executive also played a key role in the delivery of the Summit and Sierra supercomputers that IBM had developed for the US Department of Energy. The two systems, which are both based on the company’s Power Systems servers, rank among the fastest supercomputers in the world.

CognitiveScale also announced today that co-founder and current CEO Akshay Sabhikhi will assume the role of Chief Operating Officer.

The startup announced a number of other changes to its management team as well. Mike McQuaid joins as Director of Revenue after serving as Senior Vice President of Global Sales at Hitachi Vantara Corp., a leading supplier of data center storage equipment and business software. McQuaid will be responsible for directing CognitiveScale’s sales and go-to-market activities.

In addition, CognitiveScale has appointed Vice President of Engineering Gopal Krishnan as the new Senior Vice President of Engineering and Delivery, and Bart Peluso joins as vice president of product marketing. Peluso previously led product marketing at Blue Prism Ltd., a publicly traded provider of robotic process automation software.

“The addition of such respected, successful and innovative leaders is great for the entire CognitiveScale community,” CognitiveScale Executive Chairman Manoj Saxena said in a statement.

CognitiveScale provides a software platform, Cortex, that makes it easier for businesses to build and deploy AI models. The first component of the platform is a tool called Cortex Fabric which, according to the startup, enables AI software to be developed at up to 70% lower cost than alternative technologies. The tool is complemented by Cortex Certifai, an AI explainability product that promises to provide organizations with insight into how their neural networks generate decisions.

CognitiveScale also offers a range of prepackaged machine learning applications for industry verticals. They are intended for companies in sectors such as healthcare, banking, insurance and retail. CognitiveScale customers include Dell Technologies Inc., Morgan Stanley, Wells Fargo & Co. and other large companies.

Since its launch, CognitiveScale has raised $ 40 million in funding, according to Crunchbase. The startup’s backers include, among other big names, Intel Capital, IBM Corp. and the M12 venture capital arm of Microsoft Corp.

Photo: SiliconANGLE

How Climate Models Got So Accurate They Won A Nobel Prize https://funwithjustin.com/how-climate-models-got-so-accurate-they-won-a-nobel-prize/ Tue, 05 Oct 2021 22:52:32 +0000

Climate modellers are having a moment.

Last month, Time magazine listed two – Friederike Otto and Geert Jan van Oldenborgh of the World Weather Attribution project – among the 100 most influential people of 2021. Two weeks ago, Katharine Hayhoe of Texas Tech University was a guest on the popular late-night talk show Jimmy Kimmel Live! And on Tuesday, climate modeling pioneers Syukuro Manabe and Klaus Hasselmann shared the Nobel Prize in physics with theoretical physicist Giorgio Parisi – recognition, said Thors Hans Hansson, chair of the Nobel Committee for Physics, that “our knowledge of the climate rests on a solid scientific foundation, based on a rigorous analysis of observations.”

Climate modelers are experts in Earth or planetary sciences, often trained in applied physics, mathematics, or computer science, who use physics and chemistry to build equations, feed them into supercomputers, and apply them to simulate the climate of the Earth or other planets. Models have long been viewed by climate change deniers as the soft underbelly of climate science. Being necessarily predictive, they were considered essentially unverifiable – flawed inputs producing unreliable results.

A 1990 National Geographic article put it this way: “Critics say modeling is in its infancy and cannot even reproduce the details of our current climate. The modelers agree, and note that the predictions necessarily fluctuate with each refinement of the model.”

However, more recent analyses, looking back over decades, have found that many of the older models were remarkably accurate in their predictions of global temperature increases. Now, as computing power increases and more refinements are added to modeling inputs, modelers are more confident in defending their work. As a result, says Dana Nuccitelli, author of Climatology versus Pseudoscience: Exposing the Failed Predictions of Global Warming Skeptics, “There has certainly been a change from outright denial of climate science; because the predictions turned out to be so accurate, it becomes harder and harder to deny the science at this point.”

That 1990 article quoted Manabe – widely regarded as the father of modern climate modeling – as saying that, in some of the early models, “all kinds of crazy things happened … sea ice covered tropical oceans, for example.” But in a seminal 1970 paper, the first to make a specific projection of future warming, Manabe projected that global temperatures would increase by 0.57 degrees Celsius (1.03 degrees Fahrenheit) between 1970 and 2000. The actual warming recorded was remarkably close: 0.54°C (0.97°F).

A 2019 article by Zeke Hausfather of the University of California, Berkeley, Henri Drake and Tristan Abbott of the Massachusetts Institute of Technology, and Gavin Schmidt of the NASA Goddard Institute for Space Studies analyzed 17 models dating back to the 1970s and found that 14 accurately predicted the relationship between rising greenhouse gases and global temperatures. (The estimates of two were too high and one was too low.) That’s because the fundamental physics has always been sound, says Nuccitelli, research coordinator at the Citizens’ Climate Lobby.

“We’ve understood for decades the basic science that if you put a certain amount of carbon dioxide into the atmosphere, we’ll get some warming,” he says. “These forecasts in the 1970s were remarkably accurate, but they also used fairly simplified climate models, in part because of our level of understanding of climate systems, but also because of computational limitations at the time. It is certainly true that climate models have come a long way.
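As a back-of-the-envelope illustration of that basic science (a sketch only: it uses the widely cited simplified expression for CO2 radiative forcing together with an assumed climate sensitivity parameter, not any of the models discussed in this article):

import math

def co2_forcing_wm2(c_ppm, c_ref_ppm=280.0):
    # Simplified CO2 radiative forcing: dF = 5.35 * ln(C / C0), in W/m^2.
    return 5.35 * math.log(c_ppm / c_ref_ppm)

# Assumed equilibrium climate sensitivity parameter in K per (W/m^2),
# roughly equivalent to ~3 K per CO2 doubling; chosen here purely for illustration.
SENSITIVITY_K_PER_WM2 = 0.8

for ppm in (280, 350, 420, 560):
    forcing = co2_forcing_wm2(ppm)
    warming = SENSITIVITY_K_PER_WM2 * forcing
    print(f"{ppm:3d} ppm CO2: forcing {forcing:4.2f} W/m^2 -> ~{warming:3.1f} K at equilibrium")

Full climate models replace that single sensitivity number with explicit physics for the atmosphere, oceans, ice and land, which is where the refinements described below come in.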

The more things change …

In the realm of climate modeling, “What hasn’t changed over the years is the overall assessment of the extent of global warming as we increase CO2,” says Hayhoe, who is also chief scientist for The Nature Conservancy and author of Saving Us: A Climate Scientist’s Case for Hope and Healing in a Divided World. “What has changed is our understanding at increasingly smaller spatial and temporal scales – our understanding of feedbacks in the climate system, our understanding, for example, of the real sensitivity of the Arctic.”

As this understanding has grown, she says, it has enabled the development of what she calls “the cutting edge of climate science today” – the attribution of individual events, the specialty for which Otto and van Oldenborgh were recognized in Time, which for the first time is able to establish strong links between climate change and specific weather events, such as heat waves in the western United States or the amount of rain dumped by Hurricane Harvey.

“We couldn’t do this without models,” says Hayhoe, “because we need the models to simulate a world without people. And we have to compare an Earth without people to the Earth we live on with humans and carbon emissions. And when we compare these two Earths, we can see how human-induced climate change has altered the duration, intensity, and even damage associated with a specific event. “

In Hayhoe’s case, the actual act of modeling is “looking at thousands of lines of code, and it’s so intense that I often do it at night, when people aren’t emailing and the lights are out and I can focus on that bright screen in a dark room. Then I blink and suddenly it’s half past four in the morning.”

Much of the work, she says, involves trying to find things wrong with the models, to make sure they reflect reality. “If it doesn’t quite match up, we have to look more carefully because there is something we didn’t quite understand. “

While such deviations can be model flaws, they can sometimes reflect errors in the observations. For example, a series of studies in 2005 found that satellite data which appeared to show no warming in the lower atmosphere, or troposphere – and which had been used to cast doubt on global warming models – was itself wrong. The models, backed by data from weather balloons, had been right all along.

The irony, says Michael Mann, distinguished professor of atmospheric science at Penn State University and author most recently of The New Climate War, is that “climatologists have been dismissed as alarmists for the predictions we made, but the predictions, if anything, have been shown to be too conservative, and we are seeing even greater impacts than we expected.”

The apparent imminent collapse of the system that drives the currents of the Atlantic Ocean is, he says, an example. “It’s something we anticipated, but it’s not just happening, it’s happening sooner than expected,” he notes. Manabe, he points out, was one of those who first raised the possibility decades ago. “It just underscores that what’s going on in climate science is the worst thing that can really happen to a climate modeler: to see your worst predictions come true. “

Modelers recognize that the science is not perfect; even today, uncertainties remain, and not just of a single kind.

“Do we have all the physical processes in the model? And if we have them in there, are they represented correctly or not?” Hayhoe asks rhetorically. “Then there is a second source of uncertainty called parametric uncertainty.” Moreover, she says, some processes take place at such small scales – among cloud particles, for example – that they cannot be measured directly but must be inferred, which obviously adds some uncertainty. “However, the greatest uncertainty,” she says, “does not lie in the physics, but in our own collective behavior and how prepared we are to allow global levels of greenhouse gases to rise.”

“If we didn’t know that carbon emissions produce all of these impacts on us, that it’s not just a curiosity about rising global temperature but also about our food, our water, our health, our homes – then we would not act,” says Hayhoe.

“That’s why I do what I do, and that’s why models are so important, because they show what’s going on right now that we’re responsible for, and what’s going to happen in the future. I look forward to the day when we can use climate models just to understand this amazing planet, but right now those models are telling us: ‘Now is the time to act! And if we don’t, the consequences will be serious and dangerous.’”

From thunderstorms to lunar terrains, computer modeling research tackles real-world problems – The Badger Herald https://funwithjustin.com/from-thunderstorms-to-lunar-terrains-computer-modeling-research-tackles-real-world-problemsthe-badger-herald/ Tue, 05 Oct 2021 14:00:00 +0000

In an age when computers are commonplace, two professors at the University of Wisconsin are harnessing their power to run real-world scenario simulations.

Professor Leigh Orf has been using computer models to better understand thunderstorms for over 30 years. He works to understand how thunderstorms behave, how they work and how to better predict them. Orf said there’s still a lot scientists don’t understand about what happens in a thunderstorm, but using supercomputers to model them can add new clarity to their inner workings, even at very small scales.

“I am using an atmospheric model that simulates a cloud with very, very high fidelity, high quality and high resolution. So all the small-scale, complex things that take place inside the cloud are revealed in the simulation, ”Orf said.

Orf said past events are often the inspiration for simulations in his work. Orf also pointed out that observations gathered in the field are very important, because a storm’s atmospheric and physical information cannot be collected any other way. For one particular storm, he took the atmospheric data collected just before the storm formed and ran it as a simulation, which then successfully reproduced the thunderstorm and tornado that occurred during the real event.

Visualization of the 2011 El Reno tornado simulated on a supercomputer
Courtesy of Leigh Orf

“I was able to study the simulation independently of the actual event. But I can also compare it to the actual event because it is based on an actual event, ”Orf said.

Orf’s work has recently been published on the cover of Science magazine. He, along with colleagues at Stanford University, found answers to an unknown atmospheric phenomenon by running a simulation of the event. Orf said these computer models can sometimes provide information about real-world scenarios and allow scientists to discover things on the computer before they are discovered in the atmosphere.

He said that a growing area that will likely become more important in the future of computer modeling is machine learning. As more data is collected from simulations, processing that data will be a vital task for artificial intelligence.

“We’re going to have robots at some point and self-driving cars, and that stuff will be AI,” Orf said. “So this is an area that I would say is really interesting and important.”


Self-driving cars are precisely one of the things Professor Dan Negrut studies with computer models. As the technical lead of the Simulation-Based Engineering Lab, Negrut is currently working on two projects that involve running computer simulations of an autonomous car and a lunar rover to determine how they would perform in unique field situations.

Negrut said computer models offer a safer and more cost-effective way to analyze the reactions of autonomous vehicles. The simulations allow researchers to create a virtual environment in which they can test the vehicle millions of times with no real consequences from running the tests.

“It’s really hard to generate these kinds of fake scenarios in the real world. How would you take a Tesla and really hit it with a patch of ice? What if you wanted to do it ten times? What if you wanted to change the ‘brain’ of the vehicle and hit it again?” said Negrut.

For the lunar rover experiment, this advantage is even more relevant. Negrut said it is essential to test the capabilities of the rover before the very expensive process of sending it into space. In this case, running computer simulations means scientists can test the rover on different terrains and obstacles in an environment – including a difference in gravity – that looks exactly like that of the moon, without having to send it there multiple times.

As part of a larger project to bring humans to the moon, the VIPER rover will be tasked with determining where the water is on the moon and how difficult it will be to extract it, Negrut said.

Negrut said that running a computer simulation essentially comes down to solving a large number of equations representing the physical system you are working with. For something like the rover project, with so many components, the number of equations needed to represent the environment can run into the billions, he added.
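As a toy example of what “solving equations representing the physical system” means in practice (a generic sketch, not the lab’s actual vehicle or rover code), here is a single mass on a damped spring reduced to two first-order equations and stepped forward in time:

def step(x, v, dt, k=10.0, c=0.5, m=1.0):
    # One semi-implicit Euler step for m * a = -k * x - c * v.
    a = (-k * x - c * v) / m
    v_new = v + a * dt
    x_new = x + v_new * dt   # use the updated velocity (symplectic Euler)
    return x_new, v_new

x, v = 1.0, 0.0              # initial displacement (m) and velocity (m/s)
dt, t_end = 0.001, 5.0
for _ in range(int(t_end / dt)):
    x, v = step(x, v, dt)
print(f"displacement after {t_end} s: {x:.4f} m")

A full vehicle-on-terrain or rover simulation follows the same recipe, only with millions to billions of coupled equations describing bodies, contacts and soil particles.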


Orf said that a key aspect of computer modeling is being able to write code that harnesses the full power of the computer. Programming the computer to do what you want is not an easy process, especially when supercomputers are involved, Orf said.

“Supercomputers are really just a ton of computers all networked together. Imagine a warehouse full of really powerful desktops,” Orf said. “How do I write a program that runs on all of them, all at the same time? It is not easy. And even to gain access to these computers, you have to prove that you have code that will run effectively on them.”
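In practice, “running on all of them at the same time” usually means message passing: each process works on its own slice of the problem and the pieces are combined at the end. Below is a minimal sketch using the mpi4py bindings; it is a generic pattern, not Orf’s cloud-model code.

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()        # this process's ID
size = comm.Get_size()        # total number of processes across all nodes

# Each rank handles its own slice of a 1D domain (a stand-in for a slab of atmosphere).
n_total = 1_000_000
n_local = n_total // size
start = rank * n_local
local_sum = sum((start + i) * 0.001 for i in range(n_local))

# Combine the partial results on rank 0.
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print(f"{size} processes cooperated; combined result = {total:.1f}")

Launched with something like mpiexec -n 1024 python model.py, the same script runs as 1,024 cooperating copies spread across the machine.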

Orf said that using computer modeling for meteorological research requires a very specific skill set. Someone who is only computer literate may not know enough physics to determine whether the model is reasonable, while an observational meteorologist may not have the knowledge to run the computer programs necessary to create a model in the first place.

Orf and Negrut both use computer modeling to help improve safety and people’s lives. Whether it’s ensuring that self-driving cars behave in predictable ways or helping people better prepare for dangerous storms, computers contribute to our society on much deeper levels than we realize.

Supercomputers reveal how X chromosomes fold and turn off https://funwithjustin.com/supercomputers-reveal-how-x-chromosomes-fold-and-turn-off/ Mon, 04 Oct 2021 19:13:29 +0000

RNA particles invade an X chromosome of a mouse in a new visualization of the inactivation of the X chromosome. Credit: Los Alamos National Laboratory

Using supercomputer-driven dynamic modeling based on experimental data, researchers can now probe the process that deactivates an X chromosome in female mammalian embryos. This new ability helps biologists understand the role of RNA and chromosome structure in the X-inactivation process, leading to a better understanding of gene expression and opening new avenues for drug treatments for genetic disorders and diseases.

“This is the first time that we have been able to model all of the RNA spreading around the chromosome and shutting it down,” said Anna Lappala, visiting scientist at Los Alamos National Laboratory and polymer physicist at Massachusetts General Hospital and the Harvard Department of Molecular Biology. Lappala is first author of the article published on October 4 in the Proceedings of the National Academy of Sciences. “From the experimental data alone, which are 2D and static, you don’t have the resolution to see an entire chromosome at this level of detail. With this modeling, we can see the processes regulating gene expression, and the modeling is based on 2D experimental data from our collaborators at Massachusetts General Hospital and Harvard.”

The model, considered 4D because it shows movement, including time as a fourth dimension, runs on Los Alamos supercomputers. The model also incorporates experimental data from mouse genomes obtained using a molecular method called 4DHiC. The combined molecular and computational methodology is a first.
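For a rough sense of the kind of coarse-grained polymer representation such chromosome models build on (an illustrative sketch only, not the 4DHiC methodology or the Los Alamos code), a chromosome can be treated as a chain of beads whose compaction is tracked with the radius of gyration:

import math
import random

def random_walk_chain(n_beads, bond_length=1.0):
    # One bead per coarse-grained chromatin segment, placed by a 3D random walk.
    beads = [(0.0, 0.0, 0.0)]
    for _ in range(n_beads - 1):
        cos_t = 2.0 * random.random() - 1.0
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        phi = 2.0 * math.pi * random.random()
        x, y, z = beads[-1]
        beads.append((x + bond_length * sin_t * math.cos(phi),
                      y + bond_length * sin_t * math.sin(phi),
                      z + bond_length * cos_t))
    return beads

def radius_of_gyration(beads):
    # Rg measures how compact the chain is; a shrinking Rg signals compaction.
    n = len(beads)
    cx = sum(b[0] for b in beads) / n
    cy = sum(b[1] for b in beads) / n
    cz = sum(b[2] for b in beads) / n
    return math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2
                         for x, y, z in beads) / n)

chain = random_walk_chain(5000)
print(f"radius of gyration of a 5,000-bead chain: {radius_of_gyration(chain):.1f}")

Models of the kind described in this article add experiment-derived contacts, the swarming RNA particles and time stepping on top of such a chain, which is what makes supercomputers necessary.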

In the visualization, RNA particles swarm over the X chromosome. The tangled, spaghetti-like strands twist and change shape, and then the particles rush in and penetrate the depths of the chromosome, deactivating it.

“The method allows us to develop an interactive model of this epigenetic process,” said Jeannie T. Lee, professor of genetics at Harvard Medical School and vice chair of molecular biology at Massachusetts General Hospital, whose lab provided the experimental data that underlie the model.

Epigenetics is the study of changes in gene expression and hereditary traits that do not involve mutations in the genome.

“What is missing in the field is a way for a user who is not math-savvy to interactively enter a chromosome,” Lee said. She compared using the Los Alamos model to using Google Earth, where “you can zoom anywhere on an X chromosome, pick your favorite gene, see the other genes around it, and see how they interact.” This ability could provide insight into how diseases develop, for example, she said.

Based on the work in this article, Los Alamos is currently developing a Google Earth-style browser where any scientist can upload their genomic data and dynamically visualize it in 3D at various magnifications, said Karissa Sanbonmatsu, structural biologist at Los Alamos National Laboratory, corresponding author of the article and project leader in the development of the computational method.

In mammals, a female embryo is conceived with two X chromosomes, one inherited from each parent. Inactivation of X shuts down the chromosome, a crucial step for the survival of the embryo, and variations in X inactivation can trigger a variety of developmental disorders.

Los Alamos’ new model will facilitate a deeper understanding of gene expression and related issues, which could lead to pharmacological treatments for various diseases and genetic disorders, Lee said.

“Our main goal was to see the chromosome change shape and see the levels of gene expression over time,” said Sanbonmatsu.

To understand how genes are turned on and off, said Sanbonmatsu, “it is really helpful to know the structure of the chromosome. The hypothesis is that a tightly structured, compacted chromosome tends to turn genes off, but there isn’t a lot of smoking-gun evidence about it. By modeling moving 3D structures, we can get closer to the relationship between structural compaction and gene deactivation.”

Lee compared the structure of the chromosome to origami. A complicated shape similar to an origami crane provides a lot of surface area for gene expression and might be biologically preferred for staying active.

The model shows a variety of substructures in the chromosome. When closed, “it’s a piecemeal process in which some substructures are kept but some are dissolved,” Sanbonmatsu said. “We see early, middle and final stages, through a gradual transition. This is important for epigenetics because this is the first time that we have been able to analyze the detailed structural transition in an epigenetic change.”

The modeling also shows genes on the surface of the chromosome that escape X-chromosome inactivation, confirming earlier experimental work. In the model, they cluster together and apparently interact or work together on the surface of the chromosome.

In another look at the modeling, “as the chromosome changes from an active X, when it is still quite large, to a compact inactive X that is smaller, we notice that there is a core of the chromosome that is extremely dense, but the surface is much less dense. We’re also seeing a lot more movement on the surface,” Lappala said. “Then there is an intermediate region that is neither too fast nor too slow, where the chromosome can rearrange.”

An inactive X can activate later in a process called age-related activation of the inactive X. “It’s associated with problems in blood cells in particular that are known to cause autoimmunity,” Lee said. “Some research attempts to pharmacologically activate inactive X to treat neurological disorders in children by giving them back something that is missing on their active X chromosome. For example, a child might have a mutation that can cause disease. We believe that if we can reactivate the normal copy on the inactive X, then we would have epigenetic treatment for this mutation. ”



More information:
4D chromosomal reconstruction elucidates the spatio-temporal reorganization of the X chromosome in mammals, PNAS (2021). doi.org/10.1073/pnas.2107092118

Provided by the Los Alamos National Laboratory

Citation: Supercomputers Reveal How X Chromosomes Fold and Deactivate (2021, October 4). Retrieved October 4, 2021 from https://phys.org/news/2021-10-supercomputers-reveal-chromosomes-deactivate.html

Texas Advanced Computing Center Marks Two Decades of Powerful Discoveries https://funwithjustin.com/texas-advanced-computing-center-marks-two-decades-of-powerful-discoveries/ Fri, 01 Oct 2021 21:57:04 +0000

October 1, 2021 – Twenty years ago, a handful of computing experts with a hand-me-down Cray cluster began to build the Texas Advanced Computing Center, or TACC, at the University of Texas at Austin into a research organization that today sits at the pinnacle of academic supercomputing.

On September 30, the center and its oldest partners – the National Science Foundation (NSF) and Dell Technologies – celebrated this milestone with remarks on the growing importance of advanced computing and the role of TACC in scientific and technical discoveries.

“Two decades ago, UT made a big bet on TACC and supercomputing. It’s an investment that has paid off,” said Jay Hartzell, president of UT Austin. “And, given the proliferation of data science, AI, and machine learning across fields and across society, there is no limit to the impact TACC will have over the next 20 years.”

Throughout its history, TACC has fueled many notable discoveries, helped society, and enabled new approaches to answer humanity’s oldest questions.

  • Astronomers used TACC systems to analyze the data behind, and confirm, the very first image of a black hole from the Event Horizon Telescope.
  • TACC devoted more than 30% of its computing resources to supporting more than 50 COVID-19 research teams, leading to the first atomistic model of SARS-CoV-2 and daily pandemic forecasts that continue to guide national, state and local policy decisions.
  • TACC supercomputers helped confirm the first observation of gravitational waves by the Laser Interferometer Gravitational-Wave Observatory (LIGO) detectors. The discovery opened a new window on the universe and led to a Nobel Prize in physics in 2017.
  • Physicists calculated the behavior of ‘magic angle’ twisted graphene using TACC systems and came up with a theory that, a decade later, led to superconducting materials that could enable quantum computing and more efficient electrical transmission.

Since June 2001, the center has grown from a dozen employees to nearly 200, with emerging expertise in data science and artificial intelligence, life sciences, science gateways and STEM education.

The center now operates two of the most powerful university supercomputers in the United States: Frontera, 10th fastest in the world; and Stampede2, currently 35th – and over a dozen advanced computer systems in total. Tens of thousands of academics and students from across the United States use TACC supercomputers each year to advance all fields of science, from astronomy to zoology, and from nanoscale to cosmic scale.

A visualization of Hurricane Ike created on TACC’s Ranger supercomputer in 2008. The effort represented one of TACC’s first forays into urgent computing for natural hazards. [Credit: Fuqing Zhang and Yonghui Weng, Pennsylvania State University; Frank Marks, NOAA; Gregory P. Johnson, Romy Schneider, John Cazes, Karl Schulz, Bill Barth, The University of Texas at Austin]

“TACC’s growth has been remarkable and is a testament to the people who work here and the organizations that have supported us, including UT Austin, the UT System, the National Science Foundation, the O’Donnell Foundation and Dell Technologies – our longest and most consistent champions,” said Dan Stanzione, executive director of TACC and associate vice president for research at UT Austin. TACC resources have also supported emergency responders, for example after the Deepwater Horizon oil spill.

“TACC’s resources have been of extraordinary service to science, ranging from its resource contribution to the COVID-19 HPC Consortium, to its cultivation of new talent through the Frontera Computational Science Fellowships,” said Margaret Martonosi, NSF assistant director for Computer and Information Science and Engineering. Support for TACC has expanded in recent years to include federal agencies such as the Department of Defense, the National Institutes of Health, the National Oceanic and Atmospheric Administration and the Defense Advanced Research Projects Agency, as well as the State of Texas, the city of Austin, Microsoft and even Tito’s Vodka. Throughout its history, the center has forged close partnerships with technology companies, including Dell Technologies, to design systems and develop tools for the academic research community.

“At Dell Technologies, we are extremely proud to stand alongside UT and TACC as we continue to set the bar for high performance computing,” said Michael Dell, President and CEO of Dell Technologies.

The computing community has grown tremendously over the past two decades, encompassing entire new disciplines, from the digital humanities to computational oncology and deep learning.

“Supercomputing has become essential to research in all areas of science, engineering and medicine,” said Dan Jaffe, vice president for research at UT Austin. “TACC has not only dramatically increased its computing capabilities, but also grown as a research supporter and partner for the many researchers around the world who use it. I can’t wait to see what upcoming improvements to the machines and the TACC ecosystem bring in terms of new discoveries and even more impactful contributions to society.”

Read the announcement on the TACC website: https://www.tacc.utexas.edu/-/tacc-marks-two-decades-of-powering-discoveries


Source: Texas Advanced Computing Center

GENCI supports the development of quantum computing in France https://funwithjustin.com/genci-supports-the-development-of-quantum-computing-in-france/ Fri, 01 Oct 2021 20:53:57 +0000

PARIS, October 1, 2021 – The development and use of quantum technologies is a major and growing issue at the national, European and international levels. In this context, GENCI (Grand Équipement National de Calcul Intensif, France’s national agency for intensive computing) is contributing to the adoption of quantum technologies to help France achieve the objective set by the President of the Republic on January 21, 2021, when the National Quantum Plan was announced: to “place France in the first circle of countries mastering quantum technologies.”

Committed alongside CEA, Inria, CNRS and the CPU, as part of the national strategy on quantum technologies, to hosting and making available in France the first European infrastructure coupling supercomputers with various quantum acceleration devices, GENCI is also engaged in a tailor-made program called PAck Quantique (PAQ) with the Île-de-France Region and Le Lab Quantique to promote synergies between manufacturers, start-ups and academic consortia and to facilitate the adoption of this major technological breakthrough.

In this spirit, GENCI is pleased to announce its support as sponsor and co-organizer of the BIG Quantum Hackathon organized by QuantX. This commitment was made possible by a shared conviction: the success of quantum technologies will depend as much on scientific excellence as on entrepreneurship.

This support strengthens the partnership between GENCI and Le Lab Quantique, as well as other stakeholders, with a view to future collaborations and projects that promote the development of the quantum ecosystem in France and could eventually lead to industrial programs built around concrete use cases.

The BIG Quantum Hackathon

The BIG Quantum Hackathon by QuantX, the first competition of its kind, aims to bring together the entire quantum computing value chain and demonstrate its ability to meet real-life challenges. During the event, the business community – represented by industrial and financial companies, VC/PE investors and consulting groups – will join forces with quantum computing specialists from academia and with hardware and quantum software vendors to tackle a set of long-standing problems in different fields: chemistry, machine learning, optimization and numerical simulation, to name a few. By setting a milestone for collaborative problem solving, the BIG Quantum Hackathon will help participants explore the real impact of their work, the current viability of QC solutions, existing business interests, priorities, and the effort required in the coming years.

The BIG Quantum Hackathon is organized under the high patronage of Cédric O, the Minister of State for Digital Transition and Electronic Communications, and Neil Abroug, the head of the National Quantum Strategy. It is supported by BPIfrance, the public investment bank; Inria, the National Institute for Research in Digital Science and Technology; GENCI, the national agency for high performance computing; Quantonation, a venture capital fund dedicated to deep physics startups with a focus on the emerging and disruptive field of quantum technologies; and Le Lab Quantique, a Paris-based think tank and network of bottom-up initiatives.

The Big Quantum Hackathon will take place October 2-5, in two phases. The technical phase will take place on October 2 and 3 at Inria. The application phase dedicated to companies will be held on October 4 and 5 at Quantonation.

About GENCI

Created by the French government in 2007, GENCI is a large-scale research infrastructure and public operating body that aims to democratize the use of digital simulation through high performance computing, combined with artificial intelligence, in support of French scientific and industrial competitiveness.

GENCI has three missions:

– Implement the national strategy aimed at providing French open scientific research with high-performance computing, storage and massive data processing resources associated with AI technologies, in conjunction with the three national computing centers;

– Support the creation of an integrated HPC ecosystem at national and European level;

– Promote digital simulation and HPC to academic research and industry.

GENCI is a civil-law company (société civile) owned 49% by the French State, represented by the Ministry of Higher Education and Research, 20% by the CEA, 20% by the CNRS, 10% by the universities represented by the Conference of University Presidents and 1% by Inria.

About QuantX

QuantX is an association of École Polytechnique (X) alumni with more than 150 quantum computing specialists around the world: representatives of industrial and university groups, startups, investors and consulting firms, the media and government. By bringing the quantum community together, QuantX aims to accelerate the growth of the QC ecosystem, improve knowledge transfer, help discover new applications, train future engineers and quantum experts, and motivate future generations to contribute to the quantum computing world. QuantX carries the vision that the success of quantum technologies depends on scientific excellence and entrepreneurship. After a series of events connecting the various players, QuantX looks forward to its next milestone event, the BIG Quantum Hackathon, the largest business-tech competition in the field.


Source: GENCI
