Nuclear fusion simulator among boffinry tools picked for monster Summit supercomputer

As Uncle Sam builds another nuke test number-cruncher

In November, the US government announced it will build Summit, a $325m supercomputer capable of performing 300 quadrillion calculations per second if you redline it.

When installed at the Oak Ridge National Laboratory in 2017 and powered up by 2018, it will be the fastest publicly known computer in the world, at least measured against its rivals as they stand today.

Today, The Register has learned of 13 science projects approved by boffins at the US Department of Energy to run on the 300-petaFLOPS Summit. These software packages, selected for the Center for Accelerated Application Readiness (CAAR) program, will be ported to the massively parallel machine in the hope that they will make full use of the supercomputer's architecture.

They range from astrophysics, biophysics, chemistry, and climate modeling to combustion engineering, materials science, nuclear physics, plasma physics and seismology.

The machine will be built from about 3,500 nodes powered by IBM Power9 processors and Nvidia Volta GPUs, glued together using NVLink – there's more information on Nvidia's 200Gbps interconnect on our sister site, The Platform. Mellanox will provide the networking gear.
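For a sense of scale, the quoted 300-petaFLOPS peak spread over roughly 3,500 nodes implies each node would need to deliver somewhere in the region of 86 teraFLOPS. A quick back-of-the-envelope check (our arithmetic from the publicly stated figures, not an official spec):

```python
# Rough per-node throughput implied by Summit's headline numbers.
# 1 petaFLOPS = 10**15 floating-point operations per second.
peak_flops = 300 * 10**15   # 300 petaFLOPS peak, per the DoE announcement
node_count = 3500           # approximate node count, per the same announcement

per_node_tflops = peak_flops / node_count / 10**12  # convert to teraFLOPS
print(f"~{per_node_tflops:.1f} TFLOPS per node")    # ~85.7 TFLOPS per node
```

That per-node figure would have to come from the combined grunt of the Power9 CPUs and Volta GPUs in each box.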

According to a piece of paper we've seen, here are the 13 CAAR projects* for Summit:

  • Plasma physics code XGC: Led by Dr C.S. Chang, Princeton Plasma Physics Laboratory. This simulates and investigates electrostatic turbulence and effects present in torus-shaped fusion systems.
  • Seismology application SPECFEM: Led by Prof Jeroen Tromp, Princeton University. Models seismic wave propagation through the Earth.
  • Climate simulation code ACME: Led by Dr David Bader, Lawrence Livermore National Laboratory. Used to study changes to areas of snow and ice on Earth, among other things.
  • Relativistic chemistry code DIRAC: Led by Prof Lucas Visscher, Free University of Amsterdam. Calculates molecular properties and simulates quantum chemistry.
  • Astrophysics simulation code FLASH: Led by Dr Bronson Messer, Oak Ridge National Laboratory. Used to study thermonuclear-powered supernovae and other high-energy density physics.
  • Plasma physics code GTC: Led by Dr Zhihong Lin, University of California-Irvine. Also used to perform particle simulation of fusion devices.
  • Cosmology simulation code HACC: Led by Dr Salman Habib, Argonne National Laboratory. Used to power the world’s largest cosmology simulation, apparently.
  • Electronic structure application LS-DALTON: Led by Prof Poul Jørgensen, Aarhus University. Good for investigating molecular electronic structures.
  • Biophysics simulation code NAMD: Led by Prof Klaus Schulten, University of Illinois at Urbana-Champaign. Simulates large biomolecular systems.
  • Nuclear physics application NUCOR: Led by Dr Gaute Hagen, Oak Ridge National Laboratory. Used for running complex nuclear physics math.
  • Computational chemistry code NWCHEM: Led by Dr Karol Kowalski, Pacific Northwest National Laboratory. A set of computational chemistry tools.
  • Materials science application QMCPACK: Led by Dr Paul Kent, Oak Ridge National Laboratory. Performs electronic structure calculations of molecular, quasi-2D and solid-state systems.
  • Combustion engineering code RAPTOR [slides PDF]: Led by Dr Joseph Oefelein, Sandia National Laboratories. Framework for modeling advanced combustion engines.

Interestingly, the Summit supercomputer will be powered by Power9 chips from IBM, which openly licensed its processor technology to Chinese companies via the OpenPower Consortium. Meanwhile, the US government has banned Intel and others from shipping high-end chips to China's supercomputer builders.

Who else is using supercomputers these days besides physicists and the like? The oil and gas industry: Petroleum Geo-Services in Norway just bought a five-petaFLOPS Cray for processing ultra-high-resolution seismic data, and French energy giant Total is upgrading its Pangea super to a 6.7-petaFLOPS machine using kit from SGI.

Finally, Intel and Cray are building a 180-petaFLOPS computer called Aurora for the US Department of Energy. And Uncle Sam's National Nuclear Security Administration (NNSA) – which simulates nuclear stockpile degradation, storage, and maintenance – will be powering up a $174m Cray-made Trinity machine at the Los Alamos National Laboratory (LANL) in 2016.

Drop by The Platform, there's a link below, for more on that NNSA development. ®

Trinity will model nuclear stockpiles and serve a broader array of classified and unclassified scientific and technical computing projects ...

Updated to add

* Since the publication of this story, the CAAR website has updated with a little more information about the individual projects.
