Top 500 supers – The Dawning of the GPUs

Coiled for the 10 petaflops spring

The politics of the Top 500

The Top 500 list of supercomputers is put together twice a year to pit the fastest 500 supercomputers in the world against each other regardless of processor, interconnect technology, operating system, or whatever. Erich Strohmaier and Horst Simon, computer scientists at Lawrence Berkeley National Laboratory, Jack Dongarra of the University of Tennessee, and Hans Meuer of the University of Mannheim compile the list, which is based on the Linpack Fortran benchmark test created by Dongarra and colleagues Jim Bunch, Cleve Moler, and Pete Stewart back in the 1970s to gauge the relative number-crunching performance of computers. The official Top 500 list first came out in 1993, and this June 2010 compilation is the 35th edition.
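The Linpack benchmark, at its core, times the solution of a dense system of linear equations and converts the elapsed time into a floating point operation rate. A minimal sketch of the idea in Python, using NumPy's dense solver rather than the original Fortran code, might look like this (the matrix size `n` here is tiny by supercomputing standards and chosen only for illustration):

```python
import time
import numpy as np

# Linpack-style measurement: time the solve of a dense n x n system Ax = b,
# then convert the elapsed time into a floating point operation rate.
n = 2000
rng = np.random.default_rng(42)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

t0 = time.perf_counter()
x = np.linalg.solve(A, b)  # LU factorization plus triangular solves
elapsed = time.perf_counter() - t0

# Standard Linpack operation count for solving one n x n system
flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
gflops = flops / elapsed / 1e9
print(f"{gflops:.2f} gigaflops on a {n} x {n} system")
```

The machines on the Top 500 list run the distributed-memory HPL variant of this benchmark across thousands of nodes, but the arithmetic behind the flops rating is the same.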

The Top 500 is not a particularly good gauge of what is going on in the totality of the HPC market, but it most definitely is a good indicator of technology shifts among the cutting-edge buyers of supercomputers that will in many cases trickle their way into the mainstream of HPC computing.

Supercomputing is often about politics as much as it is about actually doing simulations, and it is hard to miss how fast China is becoming a force in the petascale era. That stands to reason, given the strength of the Chinese economy and its desire to excel in the sciences and bend science to industry, just as its peers in the more established economies have done for decades. There are now 24 Chinese systems on the Top 500 list, and the two mentioned above - Nebulae and Tianhe-1 - are not only Top 10 machines, they have enough Linpack performance to catapult China ahead of every country except the United States in terms of sustained performance installed. China has as many supers on the list as Germany, and it now ranks second in terms of aggregate computing power, with 9.2 per cent of the 32.4 petaflops of total power accounted for on the list.

The United States is still the biggest investor in Top 500-class machines, with 282 of the 500 machines (56.4 per cent) and just under 18 petaflops of floating point oomph in those boxes (55.4 per cent of the total math power on the list). The United Kingdom has 38 machines with a total computing power of 1.7 petaflops (5.2 per cent), giving it a bit more than half the share of flops that China has. France has 29 machines with a total of 1.76 petaflops (5.4 per cent of the total power), while Germany's 24 boxes have 2.25 petaflops (6.9 per cent).
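The national shares quoted here are straightforward arithmetic against the 32.4 petaflops aggregate. A quick sketch, using only the petaflops figures given in this story:

```python
# Aggregate Linpack petaflops per country, as quoted in the story
total_pflops = 32.4
country_pflops = {
    "United Kingdom": 1.70,
    "France": 1.76,
    "Germany": 2.25,
    "Japan": 1.25,
}

# Each country's share of the total installed performance on the list
shares = {c: 100.0 * pf / total_pflops for c, pf in country_pflops.items()}
for country, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {share:.1f} per cent")
```

Run it and the Germany and France figures come back as the 6.9 and 5.4 per cent cited above.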

Japan, once a high flyer in the HPC realm, has backed off from its formerly aggressive stance, mainly because it does not have the billions of dollars or the political will to sustain a supercomputing program that can compete with the US and now China. There are 18 Japanese supers on the June 2010 Top 500 list, which have a total of 1.25 petaflops of aggregate performance.

In terms of architecture, 74 of the machines on the June 2010 list are massively parallel boxes with some kind of sophisticated interconnect, while two are constellation configurations and 424 are more generic clusters using InfiniBand or Ethernet. There are 242 machines that use plain old Gigabit Ethernet, and only two using 10 Gigabit Ethernet. There are 205 machines that use one speed or another of InfiniBand, with the rest being a mix of custom and proprietary interconnects such as IBM's Federation, Cray's SeaStar, and SGI's NUMAlink. There's only one vector machine still on the list: the 122.4 teraflops parallel NEC SX-9 super known as the Earth Simulator, which was at the top of the list a decade ago and is ranked 37th today. All of the remaining machines use scalar processors, although more and more of them are being augmented with co-processors.

As is typical on the Top 500 list, old gear doesn't stay afloat for long. Of the 500 boxes on the list, 62 were installed in 2008, 229 in 2009, and 183 in 2010. There are only 26 machines older than that on the list. But the turnover on the list has slowed, thanks to the economic slowdown and despite plenty of stimulus money being shelled out by governments in the United States, Europe, and China.

To get on the list this time around, your machine had to demonstrate at least 52.8 teraflops of punch on the Linpack test, up from 47.7 teraflops only six months ago. The aggregate computing power on the list continues to swell, too, to 32.4 petaflops, up from 27.6 petaflops six months ago and 22.6 petaflops in the June 2009 list.
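Those entry-bar and aggregate figures translate into growth rates worth spelling out. A small sketch, again using only the numbers quoted in the story:

```python
# Entry threshold (teraflops) and aggregate performance (petaflops)
# from the last three Top 500 lists, as quoted in the story
entry_tflops = {"Nov 2009": 47.7, "Jun 2010": 52.8}
aggregate_pflops = {"Jun 2009": 22.6, "Nov 2009": 27.6, "Jun 2010": 32.4}

entry_growth = 100.0 * (entry_tflops["Jun 2010"] / entry_tflops["Nov 2009"] - 1.0)
agg_growth_6mo = 100.0 * (aggregate_pflops["Jun 2010"] / aggregate_pflops["Nov 2009"] - 1.0)
agg_growth_12mo = 100.0 * (aggregate_pflops["Jun 2010"] / aggregate_pflops["Jun 2009"] - 1.0)

print(f"entry bar up {entry_growth:.1f} per cent in six months")
print(f"aggregate up {agg_growth_6mo:.1f} per cent in six months, "
      f"{agg_growth_12mo:.1f} per cent year on year")
```

That works out to an entry bar rising about 10.7 per cent in six months, with the aggregate up roughly 17.4 per cent over the same period and around 43 per cent year on year.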

Intel's processors continue to dominate the list, with 408 machines using either Xeon (403 systems) or Itanium (five systems) processors. AMD's Opterons are used in 47 machines and IBM's various Power chips in 42 machines, with the remainder comprising two Sparc boxes and that one NEC Earth Simulator behemoth. Among the Intel-based supers, there are still 182 machines using the old "Harpertown" quad-core Xeon 5400s in their L, E, and X variants, as well as a bunch of older Xeon 5100 and 5300 processors in 30 other machines. None of these machines can last for very long on the list, given the energy efficiency of new servers based on six-core or twelve-core x64 processors.

The "Nehalem-EP" quad-core Xeon 5500 processors are in 184 machines, and there are already seven boxes using the new "Westmere-EP" six-core Xeon 5600s. There are also two boxes on the list using the high-end, eight-core "Nehalem-EX" Xeon 7500 processors, but they are relatively tiny. There are 31 boxes using quad-core Opterons, five using six-core Opterons, and five with the twelve-core Opterons. There are ten boxes using IBM's PowerPC chips in BlueGene boxes and another 18 machines using Power6 or Power6+ chips. There's a smattering of Power5, Sparc, and Itanium in there, too.

By manufacturer, IBM is once again at the top of the list in terms of system count and aggregate flops installed. IBM has 198 machines on the list (39.6 per cent of the total) and the IBM label is associated with 10.9 petaflops of performance (33.6 per cent of the total). Hewlett-Packard, which hasn't had a Top 10 system in a long time, still sells lots of clusters of modest size, and has 185 machines on the list (37 per cent of machines) for a total of 6.62 petaflops (20.4 per cent of the flops pie).

Cray has 21 systems on the list, with a total of 4.8 petaflops (4.2 per cent of machines, but 14.8 per cent of capacity), with Silicon Graphics having 17 boxes (3.4 per cent of machines and 6.6 per cent of capacity). Sun (now Oracle) has a dozen machines on the list, but it is hard to imagine that Oracle will be interested in pursuing HPC for the sake of being on the Top 500 list. If Oracle has plans for HPC beyond data analytics, it sure hasn't communicated this to the IT community. ®
