
Livermore lab opens Catalyst super to industry users

7,776 cores available for your number-crunching needs


The Lawrence Livermore National Laboratory “Catalyst” supercomputer, which started sucking its first electrons in November 2013, is now open for industry workloads.

When Igor pulled the big red switch last year, the lab touted features like the 800 GB of flash attached to each of its 304 nodes via PCIe, in addition to the per-node 128 GB of DRAM.

This, it said, was a spec built for big data: the LLNL design maps the solid-state drives into application memory so they appear to software as standard DRAM. In big data analysis apps, as Vulture writer Jack Clark noted at the time, “fast memory – and lots of it – becomes a priority”.
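The userspace analogue of that trick is memory-mapping: once a file on the flash device is mapped into a process's address space, code addresses it with ordinary loads and stores rather than explicit I/O calls. A minimal sketch, with the path and size as stand-ins for the real per-node flash:

```python
import mmap
import os
import tempfile

# Hypothetical sketch: mmap()-ing a file that would live on the PCIe
# flash device, so its bytes are addressable like DRAM. The temp file
# and 1 MiB size are placeholders for the 800 GB per-node flash.
fd, path = tempfile.mkstemp()
size = 1 << 20
os.ftruncate(fd, size)

buf = mmap.mmap(fd, size)   # flash-backed bytes, usable like RAM
buf[0:5] = b"hello"         # plain slice assignment, no read()/write()
data = bytes(buf[0:5])

buf.close()
os.close(fd)
os.remove(path)
```

The payoff for big-data codes is that an 800 GB dataset can be traversed with pointer arithmetic while the OS pages it in and out behind the scenes.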

The boffins now seem satisfied with how the Cray-Intel big box is working, and are seeking partnerships in “bioinformatics, big data analysis, graph networks, machine learning and natural language processing, or for exploring new approaches to application checkpointing, in-situ visualisation, out-of-core algorithms and data analytics.”

According to the notice of opportunity here, the program will be offered to US companies through LLNL's HPC Innovation Centre.

Here's Catalyst by the numbers:

  • 304 dual-socket compute nodes with 2.4 GHz 12-core Xeon E5-2695v2 processors using the Intel TrueScale Fabric, for a cluster-wide total of 7,776 cores;
  • 128 GB DRAM and 800 GB flash per node;
  • Dual-rail Quad Data Rate (QDR-80) networking fabric;
  • 150 teraflops for the full Cray CS300 cluster.
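Those figures hang together on the back of an envelope, assuming these Ivy Bridge-era Xeons retire 8 double-precision flops per core per cycle with AVX (an assumption for illustration, not an LLNL-stated figure):

```python
# Rough peak-performance check from the quoted specs.
cores = 7776            # cluster-wide core count
clock_ghz = 2.4         # Xeon E5-2695v2 base clock
flops_per_cycle = 8     # assumed: AVX double-precision, Ivy Bridge

peak_tflops = cores * clock_ghz * flops_per_cycle / 1000
```

That works out to roughly 149 teraflops, in line with the quoted 150 teraflops for the full CS300 cluster.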

®
