Computer History Museum honors Cerebras Systems

When Cerebras Systems unveiled its first chip at Hot Chips in August 2019, the hardware community wasn’t quite sure what to think. Attendees were understandably skeptical of the new “wafer-scale” technology, let alone an estimated power envelope of around 15 kilowatts for the chip alone. In the three years since, the company – under the leadership of founder and CEO Andrew Feldman – has won over early critics with a series of impressive milestones. Cerebras signed a multi-laboratory contract with the Department of Energy just a month after its Hot Chips debut and now has systems deployed at premier government labs and commercial sites around the world.

The new Cerebras Wafer-Scale Engine exhibit at the Computer History Museum in Mountain View, California.

Today at 2:30 p.m. PT, Cerebras will be honored by the Computer History Museum (CHM), which will unveil a new exhibit featuring the Cerebras Wafer-Scale Engine (WSE) at its iconic location in Mountain View, California. Roughly the size of a dinner plate, the Cerebras WSE-2 packs 2.6 trillion transistors and 850,000 AI-optimized cores. Powered by the WSE-2, the Cerebras CS-2 system tackles AI models with billions or even trillions of parameters.

“There are far more transistors in this single Cerebras chip than in the 100,000 computer objects in the Museum’s permanent collection combined,” said Dag Spicer, senior curator at the Computer History Museum.

Tune in here today, Wednesday, August 3, from 2:30 to 3:15 p.m. PT to watch a live chat between Cerebras Systems CEO Andrew Feldman and Dan’l Lewin, President and CEO of the Computer History Museum.

The stream link will be live at 2 p.m. PT.

Cerebras starts with a 300 mm wafer and cuts out the largest square possible.

The WSE-2, the second-generation Wafer-Scale Engine, is manufactured by TSMC on its 7 nm node. It measures 46,225 mm², more than 50 times larger than competing chips. Launched in 2021, the WSE-2 offers twice the transistor count, core count, memory, memory bandwidth, and fabric bandwidth of the first-generation product, with only a modest increase in its power footprint (23 kW versus 20 kW). The next chip, planned for a 5 nm process, will pack more cores to handle the rapidly growing computational needs of AI, Feldman said.
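For a rough sense of scale, a back-of-the-envelope comparison against a large conventional accelerator die (assuming a reference die of roughly 826 mm², on the order of NVIDIA’s A100; the article does not name a specific competitor) gives

\[
\frac{46{,}225\ \text{mm}^2}{\approx 826\ \text{mm}^2} \approx 56,
\]

which is consistent with the “more than 50 times larger” figure.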

Cerebras has customers in North America, Asia, Europe and the Middle East. Key customers include GlaxoSmithKline, AstraZeneca, TotalEnergies, nference, Argonne National Laboratory, Lawrence Livermore National Laboratory, Pittsburgh Supercomputing Center, Leibniz Supercomputing Centre, National Center for Supercomputing Applications, Edinburgh Parallel Computing Center (EPCC), National Energy Technology Laboratory and Tokyo Electron Devices.
