US nuke boffins: Multicore CPU gains stop at eight

More core ≠ Moore's Law

US nuke boffins say they have seen the future of multicore computing, and it is troubled.

Researchers at Sandia National Laboratories say their projections indicate that performance gains flatten out badly after four cores and cease altogether at eight; beyond that point, performance actually worsens as more cores are packed onto a processor chip.

According to the Sandia boffins, the issue is that as the number of cores increases, access to the information to be processed becomes more difficult. Performance gains extrapolated from Moore's famous Law can't be sustained.

"Multicore gives chip manufacturers something to do with the extra transistors successfully predicted by Moore's Law," says Sandia's Arun Rodrigues. "The bottleneck now is getting the data off the chip to or from memory or the network."

Rodrigues and his colleagues ran simulations which indicated that a 16-core unit would actually perform "barely as well" as a dual-core one.

"To some extent, it is pointing out the obvious — many of our applications have been memory-bandwidth-limited even on a single core," says Rodrigues. "However, it is not an issue to which industry has a known solution, and the problem is often ignored."
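The flattening the Sandia researchers describe can be illustrated with a toy model: cores scale compute linearly, but they all share a fixed pool of off-chip memory bandwidth, so a memory-bound workload stops scaling once that bandwidth is saturated, and contention can then drag throughput back down. The numbers and the simple contention penalty below are illustrative assumptions only, not Sandia's simulation parameters.

```python
def throughput(cores, per_core_flops=10.0, mem_bandwidth=40.0, contention=0.02):
    """Achievable throughput for a memory-bound workload (arbitrary units).

    Compute capacity grows linearly with core count, but all cores share
    one fixed off-chip memory bandwidth; a crude per-core contention
    penalty models the cost of many cores fighting over the same bus.
    """
    compute_bound = cores * per_core_flops        # ideal linear scaling
    memory_bound = mem_bandwidth                  # shared, fixed off-chip bandwidth
    overhead = 1.0 - contention * (cores - 1)     # contention penalty grows with cores
    return min(compute_bound, memory_bound) * max(overhead, 0.0)

for n in (1, 2, 4, 8, 16):
    print(f"{n:2d} cores -> {throughput(n):.1f}")
```

With these made-up numbers the curve climbs through four cores, then declines as contention eats into the fixed memory bandwidth, mirroring the "16 cores barely as well as two" result in spirit.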

The Sandia boffins say that multicore systems have successfully taken supercomputing into the petaflop (a quadrillion floating-point operations per second) era, but they aren't going to break the exaflop barrier without something new.

"The [chip design] community didn't go with multicores because they were without flaw," says Mike Heroux, another Sandia computing brainbox. "The community couldn't see a better approach. It was desperate. Presently we are seeing memory system designs that provide a dramatic improvement over what was available 12 months ago, but the fundamental problem still exists."

The Sandia lab is partnered with the Oak Ridge lab, home to the world-number-two-ranked "Jaguar" petaflop machine, on an exaflop push called the Institute for Advanced Architectures. The Sandia boffins say that the Jaguar is based on their Red Storm design, and that they have a "large investment in message-passing programs" which "may help solve the multicore dilemma".

The US Department of Energy, which operates Sandia, Oak Ridge and other American nuke labs, has traditionally been interested in supercomputing in order to simulate atomic warhead performance. The idea is that very accurate sims using huge amounts of computing power will allow the US nuke arsenal to be maintained in reliable condition without the use of live tests.

There's more on the multicore research from Sandia here. ®
