Stanford super runs million-core calculation

'Sequoia' focuses its attention on fluid dynamics problem

Stanford University engineers are claiming a record for the year-old Sequoia supercomputer, after running up a calculation that used more than a million of the machine’s cores at once.

The work was conducted by the university’s Center for Turbulence Research, seeking a model for supersonic jet noise that’s more sophisticated than “wow, that’s loud”. The predictive simulations the center conducts contribute to designing quieter engines by providing input into components such as nozzle shape.

The trick is getting the models to run quickly enough: and it was the search for speed that set the researchers to work making their code run across so many cores in parallel.
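Why chasing speed means chasing core counts: the return on adding cores is capped by whatever fraction of the code stays serial, as Amdahl's law makes plain. A quick illustrative calculation (the numbers below are made up for the example, not Stanford's figures):

```python
def amdahl_speedup(serial_fraction, cores):
    """Amdahl's law: overall speedup on `cores` processors when
    `serial_fraction` of the runtime cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# A perfectly parallel code scales linearly...
print(amdahl_speedup(0.0, 1_000_000))   # 1,000,000x

# ...but even 0.1 per cent of serial work caps a million cores
# at roughly a 1000x speedup.
print(amdahl_speedup(0.001, 1_000_000))
```

Hence the weeks of tuning: at a million cores, even tiny serial bottlenecks or load imbalances dominate the runtime.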

Sequoia has more than 1.5 million cores and 1.6 petabytes of storage, which when it was first installed made it the world’s most powerful supercomputer. However, in normal use, that power is spread across a bunch of different workloads.

As Stanford’s announcement notes: “CFD [computational fluid dynamics] simulations test all aspects of a supercomputer. The waves propagating throughout the simulation require a carefully orchestrated balance between computation, memory and communication. Supercomputers like Sequoia divvy up the complex math into smaller parts so they can be computed simultaneously. The more cores you have, the faster and more complex the calculations can be.”
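The "divvying up" Stanford describes is domain decomposition: each core owns a patch of the grid and trades a thin layer of "ghost" cells with its neighbours each step. A minimal pure-NumPy sketch of the idea on a 1-D diffusion problem (the ghost exchange here is done with array copies rather than real message passing, and all names are illustrative, not Stanford's code):

```python
import numpy as np

def serial_step(u, alpha=0.1):
    """One explicit diffusion step on the whole grid (ends held fixed)."""
    new = u.copy()
    new[1:-1] = u[1:-1] + alpha * (u[:-2] - 2 * u[1:-1] + u[2:])
    return new

def decomposed_step(u, n_parts, alpha=0.1):
    """Same update, grid split into n_parts subdomains. Each part
    computes on its chunk plus one-cell ghost copies of its neighbours'
    edges -- the communication a real MPI code would perform."""
    chunks = np.array_split(u, n_parts)
    out = []
    for i, c in enumerate(chunks):
        left = chunks[i - 1][-1:] if i > 0 else np.empty(0)
        right = chunks[i + 1][:1] if i < n_parts - 1 else np.empty(0)
        p = np.concatenate([left, c, right])      # chunk + ghost cells
        upd = p.copy()
        upd[1:-1] = p[1:-1] + alpha * (p[:-2] - 2 * p[1:-1] + p[2:])
        # drop the ghost cells before stitching the pieces back together
        out.append(upd[len(left): len(upd) - len(right)])
    return np.concatenate(out)
```

Because each subdomain sees exactly the neighbouring values it needs, the decomposed result matches the serial one: the parts can run simultaneously, so long as the ghost exchange keeps computation, memory and communication in the "carefully orchestrated balance" the announcement mentions.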

If anything’s wrong with any of the million bits of code running at once, the simulation is at best slowed down. The researchers first spent weeks “ironing out the wrinkles”, the university said, before kicking off the final simulation, starting with “full-system scaling” before finally watching the CFD simulation reach the million-core target.

The work, led by research associate Joseph Nichols, was described by the centre’s director Parviz Moin like this:

“Computational fluid dynamics (CFD) simulations, like the one Nichols solved, are incredibly complex. Only recently, with the advent of massive supercomputers boasting hundreds of thousands of computing cores, have engineers been able to model jet engines and the noise they produce with accuracy and speed.”

One of its outputs is shown below. Exhaust temperatures are shown in red, noise in blue, and the grey object at the left is a new nozzle design, with chevrons in the nozzle designed to reduce noise. ®

Stanford's jet-noise simulation needed a million cores

Image courtesy of the Center for Turbulence Research, Stanford University
