Top 500 supers - rise of the Linux quad-cores

Jaguar munches Roadrunner

I see your petaflops - and I raise you 10

Petaflops have become boring on the June 2009 list, and all eyes in the HPC community are on how to push up to 10 petaflops and beyond, and how to get the funding to build such monstrous machines. While only two machines on the list have broken through the petaflops barrier, everybody knows the rest can do it. It is just a matter of doing what others have done, or mixing it up a little.

Getting to 10 petaflops is no more trivial now than breaking 1 teraflops was in 1996 or 1 petaflops was in 2008. It takes a lot of changes in technology to make such big leaps. The teraflops barrier was broken with massive parallelism and fast interconnects, and the petaflops barrier was initially broken by a hybrid architecture pairing x64 processors and co-processors to boost their math performance.

The fact that the current top-end Jaguar machine does not use GPU or FPGA co-processors to get to over 2.3 petaflops of peak performance does not mean 10 petaflops will be attained with CPUs alone. Some HPC codes work well with CPU-only setups, and others will do better with the combination of CPU-GPU architectures. What HPC vendors need to do is get GPUs into the server nodes and more tightly connected to the CPUs they serve.

If you extend the projections (as the techies behind the Top 500 list have done), then sometime in late 2011 or early 2012 the fastest machine on the Top 500 list should hit 10 petaflops, and the aggregate performance of the list will be well above 100 petaflops. By sometime in 2015, a supercomputer will have to be rated at 1 petaflops or so just to make the list at all, if the trend stays as linear (on a log scale) as it has since 1993, when the Top 500 list started.
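The arithmetic behind that projection can be sketched from the milestones the article cites. The figures below round the 1996 teraflops and 2008 petaflops breakthroughs to clean numbers, so this is an illustration of the extrapolation, not the Top 500 team's actual curve fit:

```python
import math

# Milestones cited in the article (rounded for illustration):
#   ~1 teraflops broken in 1996, ~1 petaflops broken in 2008.
tflops_1996 = 1.0        # leader's performance in teraflops, 1996
tflops_2008 = 1000.0     # 1 petaflops expressed in teraflops, 2008

# Annual growth factor implied by a 1,000x gain over 12 years.
growth = (tflops_2008 / tflops_1996) ** (1.0 / (2008 - 1996))

# Years after 2008 until the leader reaches 10 petaflops (10,000 teraflops),
# assuming the same exponential pace holds.
years_to_10pf = math.log(10000.0 / tflops_2008) / math.log(growth)

print(round(growth, 2))                # ~1.78x per year
print(round(2008 + years_to_10pf, 1))  # ~2012, matching the projection
```

A 1,000x gain in 12 years is a factor of 10 every 4 years, which is why 10 petaflops lands roughly 4 years after the 2008 petaflops milestone.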

On the current list, it takes 20 teraflops just to rank at all, which shows how quickly Moore's Law and a lot of clever networking push HPC technology along. Provided supercomputing centers can shift their codes to hybrid architectures, the price/performance of multicore x64 processors and their related GPUs is probably the horse to bet on. Exotic machines may have seen their heyday already. ®
