UK's HECToR supercomputer in 27PB MEGA-storage boost

Disks, tapes and GPFS

HECToR, the Edinburgh-based supercomputer used by UK researchers to tackle science's thornier mathematical problems, is having petabytes of disk and tape added in a massive storage expansion.

HECToR stands for the High-End Computing Terascale Resource and has been built in several phases. It is a Cray-built 800-TFlop XE6 supercomputer with 704 compute blades based on AMD's "Bulldozer" multi-core processor architecture and 90TB of memory.
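
For a rough sense of how that memory is spread across the machine, a back-of-envelope sum works out to about 32GB per compute node. The four-nodes-per-blade figure below is an assumption based on the Cray XE6 blade design, not a number quoted in this piece.

```python
# Back-of-envelope split of HECToR's 90TB of memory across its 704 blades.
# NODES_PER_BLADE is assumed (standard Cray XE6 layout), not stated above.

TOTAL_MEMORY_TB = 90
BLADES = 704
NODES_PER_BLADE = 4          # assumed: four compute nodes per XE6 blade

nodes = BLADES * NODES_PER_BLADE
gb_per_node = (TOTAL_MEMORY_TB * 1000) / nodes

print(f"{nodes} compute nodes, roughly {gb_per_node:.0f}GB of memory each")
# -> 2816 compute nodes, roughly 32GB of memory each
```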

The system takes up 30 cabinets in an Edinburgh University data centre the size of two tennis courts.

HECToR has a petabyte of disk space in another 10 cabinets, under the control of the Lustre parallel file system. There is also a 70TB BlueArc (now HDS) Titan 2200 backup NAS system, with MAID (Massive Array of Idle Disks) storage on a Copan Revolution Virtual Tape Library array. Copan crashed and was bought by SGI in June 2010. The BlueArc system holds users' home directories. Files are first backed up to the Copan storage, using Symantec NetBackup, and then archived off to a Quantum Scalar i2000 tape library with four LTO-4 drives, 1,300 tapes and a 1.02PB capacity.
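
That backup chain is a classic disk-to-disk-to-tape arrangement: home directories on the NAS are staged onto the MAID-backed virtual tape library first, then migrated out to physical tape. The sketch below is purely illustrative Python, not anything running at Edinburgh; the file names and the two-stage migration rule are hypothetical and only meant to show the staging pattern described above.

```python
# Illustrative disk-to-disk-to-tape (D2D2T) model of the backup chain above:
# NAS home directories -> virtual tape library -> physical tape library.
# Tier contents and file paths are made up for the example.

from dataclasses import dataclass, field

@dataclass
class Tier:
    name: str
    contents: list[str] = field(default_factory=list)

nas = Tier("BlueArc NAS (home directories)")
vtl = Tier("Copan Revolution VTL (MAID disk)")
tape = Tier("Quantum Scalar i2000 (LTO-4 tape)")

def back_up(paths: list[str]) -> None:
    """Stage 1: copy home-directory files onto the virtual tape library."""
    vtl.contents.extend(paths)

def archive_to_tape() -> None:
    """Stage 2: migrate everything on the VTL out to physical tape."""
    tape.contents.extend(vtl.contents)
    vtl.contents.clear()

nas.contents = ["/home/alice/results.dat", "/home/bob/model.cfg"]  # hypothetical
back_up(nas.contents)
archive_to_tape()
print(f"{len(tape.contents)} files now on tape")   # -> 2 files now on tape
```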

This storage is being boosted with 7.8PB of DataDirect Networks storage arrays – accessed by GPFS, not Lustre – and 19.5PB of IBM archive tape, so the existing archive infrastructure is evidently judged inadequate for this expanded estate. The new capacity, designed and built by OCF, will be networked to HECToR rather than integrated into its infrastructure; it is a separate silo, designed for the long term and to be available to any HECToR successor machine.

DDN is supplying its SFA10K-X storage array. The tape facility is a high-end IBM TS3500 library, which can have 15 frames connected together under a single robot system and uses IBM's TS1140 tape drives.
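
Taken together, the new disk and tape tiers account for the 27PB headline figure, and the tape side implies a cartridge count in the low thousands. The per-cartridge figure in the sketch below assumes TS1140 drives writing 4TB JC media at native (uncompressed) capacity, which the article does not specify.

```python
# Rough capacity sums for the new storage, using the figures quoted above.
# CARTRIDGE_TB is an assumption (TS1140 on JC media, native capacity),
# not a number given in the article.

DISK_PB = 7.8        # DDN SFA10K-X arrays under GPFS
TAPE_PB = 19.5       # IBM TS3500 archive tier
CARTRIDGE_TB = 4.0   # assumed native capacity per cartridge

total_pb = DISK_PB + TAPE_PB
cartridges = (TAPE_PB * 1000) / CARTRIDGE_TB

print(f"New capacity: {total_pb:.1f}PB (the '27PB' of the headline)")
print(f"Tape tier needs roughly {cartridges:,.0f} cartridges at native capacity")
# -> New capacity: 27.3PB (the '27PB' of the headline)
# -> Tape tier needs roughly 4,875 cartridges at native capacity
```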

Supercomputing wiring hell: HECToR cabling.

Professor Arthur Trew of Edinburgh University said: “Data persists beyond any computer, including HECToR, so we’re prioritising data storage, management and analysis. Doing this enables us to upgrade HECToR and integrate its successor without fear of impacting access to research data. Our expectation is that any future computer must be able to integrate seamlessly with our storage.”

OCF's MD, Julian Fielden, thinks one problem with big data isn't storing it but finding the stuff, accessing it and using it: "By making storage independent of the machine that generated it, combined with good network access and IBM’s parallel file system GPFS, the data becomes easy to locate and use by any researcher irrespective of location,” meaning anywhere in the UK. ®
