
Kluster Kamph results sliced, diced, pulverized

Who won – and, more importantly, why?


HPC Blog With the dust settling on the ISC'13 Student Cluster Challenge, it's a good time to look back, take stock, and see what we've learned.

While traveling recently, I found myself unable to type on my laptop keyboard (after the idiot in front of me reclined into my lap), but I found I could still use the handy track point "nub".

So with a digit on the nub, I fired up a spreadsheet to attempt some one-finger analysis. I was curious to see what role cluster configuration might have played in the final competition results. (I used a different finger to express my feelings about the guy in 24F.)

The ISC Student Cluster Challenge is a series of "sprints" requiring students to run HPCC (the HPC Challenge benchmark suite) and a variety of HPC applications over a two-day period. They work on one application at a time, tuning and optimizing it to maximize throughput. And they can't just throw more hardware at the problem – there's a hard power cap of 3,000 watts.
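Staying under that cap means watching the meter constantly. Here's a minimal sketch of the sort of thing a team might script to keep an eye on total draw; the hostnames and the ipmitool-over-SSH approach are my assumptions, not any team's actual tooling.

```python
#!/usr/bin/env python3
"""Toy power-cap watcher: sum per-node power readings against the 3,000 W limit.

Assumptions: each node exposes IPMI DCMI power readings and is reachable over
SSH. Hostnames below are hypothetical.
"""
import subprocess

NODES = ["node01", "node02", "node03", "node04"]  # hypothetical hostnames
POWER_CAP_WATTS = 3000                            # hard limit in the rules

def node_power(host: str) -> float:
    """Read one node's instantaneous draw via IPMI (output format varies by BMC)."""
    out = subprocess.check_output(
        ["ssh", host, "ipmitool", "dcmi", "power", "reading"], text=True)
    for line in out.splitlines():
        if "Instantaneous power reading" in line:
            # e.g. "    Instantaneous power reading:    612 Watts"
            return float(line.split(":")[1].split()[0])
    raise RuntimeError(f"no power reading parsed from {host}")

if __name__ == "__main__":
    total = sum(node_power(n) for n in NODES)
    print(f"cluster draw: {total:.0f} W of {POWER_CAP_WATTS} W")
    if total > POWER_CAP_WATTS:
        print("over the cap: dial back clocks or idle a node")
```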

With this in mind, let's take a look at how the various student clusters stacked up against each other and see if we can glean any insight. There was a lot of commonality between the teams, with all of them using Mellanox InfiniBand interconnects and switches, and most running RHEL. But there were some major differences, which I've summarized in the charts below…

ISC'13 Student Cluster Challenge: total nodes chart

There are some clear differences in node count (above) and core count (below) in the 2013 competition.

ISC'13 Student Cluster Challenge: total CPU cores chart

South Africa and Tsinghua clearly had both more nodes and more cores than their fellow competitors, which certainly seemed to give them an advantage in application performance.

We also see that there's quite a bit of difference when it comes to total memory: Tsinghua weighed in with a whopping 1TB.

ISC'13 Student Cluster Challenge: total cluster memory chart

Based on what I know about the scores for the individual applications, it looks like Tsinghua's massive memory gave them an advantage over the rest of the field, but not quite enough of an advantage to take the overall win.

ISC'13 Student Cluster Challenge: memory per CPU core chart

Tsinghua, Huazhong, and Chemnitz had double the RAM per core of the other competitors, which certainly had a positive effect on performance. South Africa, on the other hand, turned in solid app performance with less memory per core but a larger number of cores.
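For what it's worth, the ratio in that last chart is nothing more exotic than total memory divided by total cores. A toy calculation makes the trade-off plain; the configurations below are hypothetical examples, not the teams' actual specs.

```python
# Toy memory-per-core calculation; the configurations are hypothetical
# illustrations, not the real ISC'13 team specs.
def gb_per_core(total_memory_gb: float, total_cores: int) -> float:
    return total_memory_gb / total_cores

examples = {
    "big-memory box":  (1024, 128),   # 1 TB across 128 cores
    "lean-memory box": (512, 256),    # 512 GB across 256 cores
}

for name, (mem_gb, cores) in examples.items():
    print(f"{name}: {gb_per_core(mem_gb, cores):.1f} GB per core")
```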

This was the first competition in which every team used some kind of compute accelerator. Most went with Nvidia Kepler K20s, while Colorado and Purdue tried the new Intel Xeon Phi coprocessor on for size.

ISC'13 Student Cluster Challenge: accelerators (total in cluster) chart

One team, Germany's own Chemnitz, went wild on accelerators, packing eight Intel Phis and eight Nvidia K20s into their cluster. While this gave them a hell of a lot of potential compute power, it also consumed a hell of a lot of electricity.

Difficulties also arose when the team tried to get both accelerator types running on the same application, so they ended up running particular apps on the K20s and others on the Phis. The problem is that even when idle, these beasts consume significant power and generate heat. Chemnitz put together a monster cluster, but like most monsters, it couldn't be completely tamed.
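You can see that idle tax for yourself on any box with Nvidia silicon. The snippet below is a minimal sketch using nvidia-smi's power.draw query (assuming a driver recent enough to support it), not anything out of Chemnitz's toolbox.

```python
# Minimal sketch: total up what the GPUs are drawing right now via nvidia-smi.
# Run it while the cards are idle to see how much of the power budget they eat.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=index,power.draw",
     "--format=csv,noheader,nounits"],
    text=True)

total_watts = 0.0
for line in out.strip().splitlines():
    idx, watts = [field.strip() for field in line.split(",")]
    total_watts += float(watts)
    print(f"GPU {idx}: {watts} W")

print(f"total GPU draw: {total_watts:.0f} W")
```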

Lessons Learned

So what have we learned? I think it's safe to say that more nodes and cores are better than fewer – now there's a blinding glimpse of the obvious, eh? There is also a relationship between lots of memory and higher performance – again, an obvious conclusion.

However, we also see that more accelerators aren't necessarily better. The winning teams ran one accelerator per node, although Huazhong did land the top LINPACK score with two accelerators per node.

All of the top teams were running Nvidia K20s rather than Intel's Phi coprocessor, but we can't put too much weight on this first Nvidia v. Intel showdown. This was the first time students had a chance to use the Phi, and they didn't get all that much time to work on optimization prior to the competition.

The upcoming cluster showdown at SC'13 in November will probably give us a better view of comparative accelerator performance. ®
