Canucks buy 300 teraflops Blue iDataPlex super

Melting Arctic sea ice to, er, save Arctic sea ice

The University of Toronto's SciNet consortium, which provides supercomputing oomph for colleges, universities, and research hospitals across Canada, will today announce that it has selected IBM's iDataPlex servers using Intel's new "Nehalem EP" Xeon 5500 processors to create the most powerful supercomputer in Canada.

The Canuck super installed at SciNet comprises 3,800 iDataPlex compute nodes, which you can read about here, and it brings 30,400 processor cores to bear on a variety of simulations and calculations, for more than 300 teraflops of peak aggregate floating point power.
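As a back-of-envelope check on that figure, peak floating point throughput is just cores times clock times flops per cycle; a Nehalem EP core can retire four double-precision flops per cycle through its SSE units, and the cluster uses 2.53 GHz parts (more on that below). A minimal sketch of the arithmetic in Python - the flops-per-cycle assumption is ours, not SciNet's:

```python
# Back-of-envelope peak flops for the SciNet iDataPlex cluster.
# Assumption: each Nehalem EP core retires 4 double-precision flops
# per cycle (2 adds + 2 multiplies via SSE).
nodes = 3_800
cores_per_node = 8            # two quad-core Xeon 5500s per node
clock_hz = 2.53e9             # 2.53 GHz parts (see below)
flops_per_cycle = 4

cores = nodes * cores_per_node                     # 30,400 cores
peak_flops = cores * clock_hz * flops_per_cycle    # ~3.08e14
print(f"{cores:,} cores, {peak_flops / 1e12:.0f} teraflops peak")
# -> 30,400 cores, 308 teraflops peak
```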

The iDataPlex servers are IBM's dense compute offering, taking some of the best ideas from rack and blade computing and packaging them together in a more modular system than IBM's traditional racks offer. iDataPlex nodes can be configured as compute, I/O, or storage nodes, and two server nodes can be packed atop each other in a 2U chassis, each on its own tray.

Rather than go for the top-end four-core Nehalem EP parts, which dissipate 95 watts of heat, SciNet has opted to build its cluster (which inexplicably has not been nicknamed yet) using the fastest 80-watt part from Intel: the 2.53 GHz Xeon E5540.

Given that the X5570s running at 2.93 GHz only deliver 9.2 per cent more peak bandwidth on the QuickPath Interconnect and 15.8 per cent more clocks, but run 18.8 per cent hotter and cost 1.9 times as much, it is fair to expect that a lot of the supercomputers that will no doubt be announced next week at the International Supercomputing Conference (ISC '09) will use the E5540 processors, not the X5570s. The numbers simply don't make sense for the X5570s on parallel HPC workloads.
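To make that argument explicit: assuming, as most parallel HPC shops do, that delivered performance scales roughly with clock speed, the faster part buys about 16 per cent more throughput for 19 per cent more heat and nearly twice the money. A quick sketch of the arithmetic in Python, using only the figures quoted above (the clock-scaling assumption is ours):

```python
# Rough price/performance comparison of the two Nehalem EP parts,
# using the figures quoted above. Assumption: for throughput-bound
# parallel HPC codes, delivered performance scales roughly with clock.
x5570 = {"clock_ghz": 2.93, "tdp_w": 95, "relative_price": 1.9}
e5540 = {"clock_ghz": 2.53, "tdp_w": 80, "relative_price": 1.0}

clock_gain = x5570["clock_ghz"] / e5540["clock_ghz"] - 1
perf_per_watt = (x5570["clock_ghz"] / x5570["tdp_w"]) / (e5540["clock_ghz"] / e5540["tdp_w"])
perf_per_dollar = (x5570["clock_ghz"] / x5570["relative_price"]) / (e5540["clock_ghz"] / e5540["relative_price"])

print(f"X5570 clock advantage:  {clock_gain:+.1%}")       # +15.8%
print(f"X5570 perf per watt:    {perf_per_watt:.2f}x")    # 0.98x
print(f"X5570 perf per dollar:  {perf_per_dollar:.2f}x")  # 0.61x
```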

By the way, ISC '09 is being held in Hamburg, Germany, next week, and is not to be confused with ICS '09, which Big Blue hosted at its Watson Research Center in Yorktown Heights, New York, last week. (Both are supercomputing events.) ISC '09 will feature the semiannual Top 500 ranking of supercomputers, as usual. If this Canuck box at SciNet were stacked up against the current list, which came out last November, it would rank number eight.

The SciNet iDataPlex super will run Linux - which distribution is not clear - and a wide variety of aerospace, astrophysics, bioinformatics, chemical physics, climate change prediction, and medical imaging applications. (SciNet supports a lot of hospital medical research in Canada as well as the hard sciences.) The machine is also offering up cycles to CERN's ATLAS project, which needs flops to figure out the nature of the forces in the universe from the data coming out of the Large Hadron Collider.

Perhaps most significantly, the SciNet machine will be used to create high-resolution global climate models to predict the effects of the accelerating melt of Arctic sea ice. One of the first projects the iDataPlex system will tackle is a climate simulation for the province of Ontario and the surrounding Great Lakes region showing the effects of that ice melt.

Given all this greenery, IBM was quick to point out that the iDataPlex box is equipped with its "Cool Blue" rear door heat exchangers, which extract the heat directly out of the server racks and pump it into the air conditioning system instead of relying on room air cooling in the data centre. The servers also have dynamic provisioning for software stacks, which allows nodes to be turned off quickly when they are not in use and reactivated automatically as they are needed for calculations, thereby saving power.
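IBM did not say how that provisioning is implemented, but the general idea is straightforward: watch the scheduler's queue and power down nodes that have sat idle for a while, then power them back on as jobs arrive. A purely illustrative sketch in Python using standard out-of-band IPMI power control - the node names, credentials, and idle threshold are hypothetical, and this is not a description of IBM's actual tooling:

```python
import subprocess

# Illustrative only: power compute nodes off when they sit idle and back
# on when the job queue needs them, via standard IPMI power control.
# Node names, credentials, and the idle threshold are hypothetical.
IDLE_THRESHOLD_SECONDS = 15 * 60

def set_power(node: str, state: str) -> None:
    """Send a 'chassis power on/off' command to the node's management controller."""
    subprocess.run(
        ["ipmitool", "-H", f"{node}-bmc", "-U", "admin", "-P", "secret",
         "chassis", "power", state],
        check=True,
    )

def power_down_idle(idle_seconds: dict[str, float]) -> None:
    """Turn off any running node that has been idle past the threshold."""
    for node, idle in idle_seconds.items():
        if idle >= IDLE_THRESHOLD_SECONDS:
            set_power(node, "off")

def power_up_for_queue(powered_off: list[str], queued_jobs: int) -> None:
    """Wake just enough sleeping nodes to cover the waiting jobs."""
    for node in powered_off[:queued_jobs]:
        set_power(node, "on")
```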

The new SciNet iDataPlex cluster is part of a $47m (Canadian) funding effort by the Canadian government and its SciNet partners to build a 360 teraflops supercomputing centre. The other 60 teraflops comes from an existing Power 575 parallel supercomputer, with 104 server nodes linked together for a total of 3,328 Power6 cores running at 4.7 GHz. This box currently ranks at number 53 on the Top 500 supers list. ®
