Power-mad HPC fans told: No exascale for you - for at least 8 years

And here's why...

I recently stumbled upon a transcript of a recent interview with HPC luminaries Jack Dongarra (University of Tennessee, Oak Ridge, Top500 list) and Horst Simon (deputy director at Lawrence Berkeley National Lab). The topic? Nothing less than the future of supercomputers. These are pretty good guys to ask, since they’re both intimately involved with designing, building, and using some of the largest supercomputers ever to walk the earth.

The conversation, transcribed into a chat format in Science magazine, focused on the biggest challenge to supercomputing: power consumption. We can’t simply scale today’s petascale systems up into exascale territory – the electrical demands are just too much. The current top super, ORNL’s Titan, needs a little more than 8 megawatts to deliver almost 18 petaflops. If we scaled Titan’s tech to exascale (which means growing it by nearly 57 times), we’d see power consumption at a whopping 467 megawatts – which, if you could even get it into the building, would cost something like $450m per year at current rates.
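The scaling claim above is easy to sanity-check yourself. A minimal back-of-envelope sketch, using Titan's published Top500 figures and an assumed average electricity rate of roughly $0.11/kWh (the rate is my assumption, not a number from the interview):

```python
# Back-of-envelope check: scale Titan's tech to one exaflop.
# Titan's flops and power come from its Top500 listing; the
# electricity rate is an assumed ballpark, not a sourced figure.
TITAN_PFLOPS = 17.59       # Titan's Linpack Rmax, in petaflops
TITAN_MW = 8.2             # Titan's reported power draw, in megawatts
EXAFLOP_PFLOPS = 1000.0    # one exaflop = 1000 petaflops
RATE_USD_PER_KWH = 0.11    # assumed average industrial electricity rate

scale = EXAFLOP_PFLOPS / TITAN_PFLOPS            # ~57x more hardware
power_mw = TITAN_MW * scale                      # ~467 MW
hours_per_year = 24 * 365
annual_cost = power_mw * 1000 * hours_per_year * RATE_USD_PER_KWH

print(f"scale factor: {scale:.1f}x")
print(f"power draw:   {power_mw:.0f} MW")
print(f"annual bill:  ${annual_cost / 1e6:.0f}m")
```

Run it and the $450m-a-year figure falls straight out, which suggests the interview's numbers were computed the same way.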

We’ve often heard that exascale systems will need to come in at 20 megawatts or less in order to be somewhat affordable. While the evolutionary improvements in power consumption have been significant over the last several years, they won’t be nearly enough to get us into that 20MW power envelope. In the interview, Dongarra and Simon talk about how we’re going to need some revolutionary technology (they mentioned stacked memory and optical interconnects) to get us to a point where we can even talk about firing up an exascale system.
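It's worth putting a number on how far away that 20MW envelope is. A small sketch comparing Titan's delivered flops-per-watt with what a 20MW exaflop machine would require (Titan's figures from its Top500 listing; the comparison itself is mine, not from the interview):

```python
# How big is the efficiency gap between Titan and a 20MW exascale box?
TITAN_PFLOPS = 17.59   # Titan's Linpack Rmax, petaflops
TITAN_MW = 8.2         # Titan's power draw, megawatts
EXA_PFLOPS = 1000.0    # target: one exaflop
TARGET_MW = 20.0       # the oft-quoted exascale power envelope

# 1 petaflop = 1e6 gigaflops; 1 MW = 1e6 W, so the 1e6s cancel.
titan_gflops_per_w = TITAN_PFLOPS / TITAN_MW     # ~2.1 GFLOPS/W
target_gflops_per_w = EXA_PFLOPS / TARGET_MW     # 50 GFLOPS/W

print(f"Titan today: {titan_gflops_per_w:.1f} GFLOPS/W")
print(f"needed:      {target_gflops_per_w:.0f} GFLOPS/W "
      f"(~{target_gflops_per_w / titan_gflops_per_w:.0f}x improvement)")
```

A roughly 23-fold jump in energy efficiency is well beyond what incremental process shrinks deliver, which is why the pair are talking about revolutionary technology rather than evolution.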

In the words of NVIDIA chief executive Jen-Hsun Huang: "Power is now the limiter of every computing platform, from cellphones to PCs and even data centres." But power consumption is only the first, and highest-profile, problem. Dongarra and Simon say we’re going to need to see changes – breakthroughs, even – on several fronts, including operating systems, applications, and even algorithms, in order to bring exascale home. And breakthroughs aren’t free, or even cheap. Simon said that a “complete exascale program” could cost an additional $300m to $400m per year for 10 years – over and above what is being spent on HPC now.

Given the current economic climate, it isn’t surprising to learn that funding, at least from Western nations, isn’t hitting these levels. Which is why neither of these HPC authorities is betting on exascale by 2020.

There’s plenty more interesting discussion in the interview, including China’s changing role in HPC, the benefits of exascale and the way HPC technology trickles down into even consumer products. And with all that Really Big Data on the way, we could all soon be indirect beneficiaries of all the funds and research various companies and governments have invested in it. ®
