Power-mad HPC fans told: No exascale for you - for at least 8 years

And here's why...

I recently stumbled upon the transcript of a recent interview with HPC luminaries Jack Dongarra (University of Tennessee, Oak Ridge, Top500 list) and Horst Simon (deputy director at Lawrence Berkeley National Lab). The topic? Nothing less than the future of supercomputers. These are pretty good guys to ask, since they’re both intimately involved with designing, building, and using some of the largest supercomputers ever to walk the earth.

The conversation, transcribed into a chat format in Science magazine, focused on the biggest challenge facing supercomputing: power consumption. We can’t simply scale today’s petascale systems up into exascale territory – the electrical demands are just too much. The current top super, ORNL’s Titan, needs a little more than 8 megawatts to deliver almost 18 petaflops. An exaflops is 1,000 petaflops, so scaling Titan’s tech to exascale means growing it by roughly 56 times – which puts power consumption at a whopping 460 or so megawatts. If you could even get that into the building, it would cost something like $450m per year at current rates.
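If you fancy checking that envelope maths yourself, here’s a quick back-of-the-envelope sketch in Python. The Titan figures (17.6 petaflops at 8.2MW) are its published Top500 numbers; the $0.11/kWh electricity rate is our assumption, not a figure from the interview:

    # Back-of-the-envelope check on the exascale power bill.
    # Assumed inputs: Titan's published Top500 figures (~17.6
    # petaflops at ~8.2MW) and electricity at roughly $0.11/kWh.

    titan_pflops = 17.6          # Titan Linpack performance, petaflops
    titan_mw = 8.2               # Titan power draw, megawatts
    exascale_pflops = 1000.0     # one exaflops = 1,000 petaflops
    usd_per_kwh = 0.11           # assumed electricity rate

    scale = exascale_pflops / titan_pflops     # ~57x more machine
    power_mw = titan_mw * scale                # ~466 MW
    kwh_per_year = power_mw * 1000 * 24 * 365  # MW -> kWh over a year
    bill = kwh_per_year * usd_per_kwh

    print(f"{scale:.0f}x Titan, {power_mw:.0f} MW, ${bill / 1e6:.0f}m/year")
    # -> roughly 57x Titan, ~466 MW, ~$449m a year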

We’ve often heard that exascale systems will need to come in at 20 megawatts or less in order to be somewhat affordable. While the evolutionary improvements in power consumption have been significant over the last several years, they won’t be nearly enough to get us into that 20MW power envelope. In the interview, Dongarra and Simon talk about how we’re going to need some revolutionary technology (they mentioned stacked memory and optical interconnects) to get us to a point where we can even talk about firing up an exascale system.
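To put that 20MW envelope in perspective, here’s what it implies in flops-per-watt terms – again a rough sketch of our own, with Titan’s published Top500 figures plugged in as today’s baseline:

    # What does a 20MW exascale budget demand in efficiency terms?
    exa_flops = 1.0e18       # target: one exaflops
    budget_w = 20.0e6        # target envelope: 20 megawatts
    titan_flops = 17.6e15    # Titan Linpack flops (Top500 figure)
    titan_w = 8.2e6          # Titan power draw, watts

    required = exa_flops / budget_w   # flops per watt needed
    today = titan_flops / titan_w     # flops per watt Titan manages
    print(f"Need {required / 1e9:.0f} Gflops/W, have {today / 1e9:.1f} "
          f"Gflops/W -> a {required / today:.0f}x efficiency leap")
    # -> need 50 Gflops/W, have 2.1 Gflops/W: a 23x leap

Evolutionary gains won’t close a gap that wide, which is why the pair are looking to revolutionary tech instead.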

In the words of NVIDIA chief executive Jen-Hsun Huang: "Power is now the limiter of every computing platform, from cellphones to PCs and even data centres." But power consumption is only the first, and highest-profile, problem. The pair say we’re going to need to see changes – even breakthroughs – on several fronts, including operating systems, applications, and even algorithms, in order to bring exascale home. And breakthroughs aren’t free, nor even very cheap: Simon said that a “complete exascale program” could cost an additional $300m to $400m per year for 10 years – over and above what is being spent on HPC now.

Given the current economic climate, it isn’t surprising to learn that funding, at least from Western nations, isn’t hitting these levels. Which is why neither of these HPC authorities is betting on exascale by 2020.

There’s plenty more interesting discussion in the interview, including China’s changing role in HPC, the benefits of exascale, and the way HPC technology trickles down into even consumer products. And with all that Really Big Data on the way, we could all soon be indirect beneficiaries of the funds and research that companies and governments have poured into HPC. ®
