Top 500 supers - rise of the Linux quad-cores

Jaguar munches Roadrunner

The politics of petaflops

In terms of vendor share, the Top 500 is still dominated by server makers IBM and Hewlett-Packard. IBM has 185 systems on the list this time around, with a total of 2.14 million cores and 9.72 petaflops of aggregate performance, giving IBM's machines a 34.8 per cent share of the number crunching on the list.

HP may not have had a big box near the top of the list for a long time - its most powerful machine is the 132.8 teraflops Cluster Platform 3000, based on its BL460c blade servers and installed at Tata Sons in India, ranked 26 on the list - but the world's biggest volume server supplier has 210 machines on the November 2009 ranking, with more than 1 million cores and 6.64 petaflops of aggregate oomph across those boxes, a 22.8 per cent share of the combined performance of the Top 500.

Niche server players but HPC specialists, Cray and Silicon Graphics each have 19 machines on the November ranking. Cray's machines have 596,315 cores for a total of 4.4 petaflops of combined performance, while SGI has much skinnier machines - at least until it starts installing the much-anticipated "UltraViolet" shared memory systems based on Intel's future eight-core Xeon 7500 "Nehalem EX" processors. SGI's 19 machines, a mix of Itanium-based Altix 4700 boxes and Xeon-based Altix ICE blade clusters, have a total of 198,304 cores for a not-too-shabby aggregate of 1.83 petaflops of performance.

Sun Microsystems, which has wanted to be a more serious player in HPC for the past decade - and, given its server designs and switches, should be - has 11 machines on the current Top 500 list. The Sun boxes have 171,442 cores and 1.52 petaflops of aggregate performance. Dell has 16 machines on the list, with 616 teraflops and a mere 85,766 cores. (Boxes that Dell shares with IBM and Sun are not included in that Dell total, and Dell has partnered with ACS on another machine.) Bull has five machines on the list for 481 teraflops, and Appro International has six machines for 481 teraflops.
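For readers who want to fiddle with the numbers themselves, here's a minimal Python sketch (ours, not part of the Top 500 data) that tabulates the vendor figures quoted above and works out each vendor's share of the roughly 27.9 petaflops of aggregate performance implied by IBM's 34.8 per cent slice, plus average cores per system. The quoted shares won't match to the decimal because the list compilers compute them against the exact aggregate.

# Back-of-the-envelope tally of the vendor figures quoted in this article.
# The list aggregate is inferred from IBM's 9.72 PF / 34.8 per cent share,
# so it is an approximation, not the official Top 500 number.

VENDORS = {
    # vendor: (systems, total cores, aggregate petaflops)
    "IBM":  (185, 2_140_000, 9.72),
    "HP":   (210, 1_000_000, 6.64),   # "more than 1 million cores"
    "Cray": (19,  596_315,   4.40),
    "SGI":  (19,  198_304,   1.83),
    "Sun":  (11,  171_442,   1.52),
    "Dell": (16,  85_766,    0.616),  # 616 teraflops
}

LIST_AGGREGATE_PF = 9.72 / 0.348  # roughly 27.9 petaflops

for vendor, (systems, cores, pflops) in VENDORS.items():
    share = 100.0 * pflops / LIST_AGGREGATE_PF
    cores_per_system = cores / systems
    print(f"{vendor:>4}: {pflops:5.2f} PF ({share:4.1f}% of list), "
          f"{cores_per_system:8.0f} cores per system on average")

Run it and the cores-per-system column makes the "skinnier machines" point plain: Cray averages over 31,000 cores per box against SGI's roughly 10,400.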
