Gordon the supercomputer is intense about data

300TB of flash = Big Data, fast

SC11 According to San Diego Supercomputer Center chief Mike Norman, his brainchild 'Gordon' is the world's first data-intensive supercomputer.

In the works for two years, Gordon was shipped from system house Appro to its new home in San Diego last week, during SC11 in Seattle. In the video, I catch up with Mike to discuss what he means by a 'data-intensive supercomputer' and how Gordon differs from what's come before.

One of the biggest differences, of course, is the whopping 300TB of Intel MLC flash storage feeding Gordon's 1,024 compute nodes. We also talk about the advantages and disadvantages of flash versus spinning disk, and kick around the cost question.
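
To make that flash-versus-disk trade-off concrete, here's a minimal back-of-envelope sketch in Python. Every figure in it (the read count and both IOPS ratings) is an assumption picked for illustration, not a spec from the interview; the point is that random-access rate, rather than sequential bandwidth, is where flash buries spinning disk on data-intensive work.

```python
# Back-of-envelope: a random-read-heavy workload on spinning disk vs MLC flash.
# Every figure here is an illustrative assumption, not a spec from the article.

RANDOM_READS = 10_000_000   # 4KB random reads issued by the workload (assumed)
DISK_IOPS = 150             # a 7,200rpm SATA disk (assumed, typical)
FLASH_IOPS = 30_000         # an early-2010s MLC flash drive (assumed)

def wall_hours(reads: int, iops: int) -> float:
    """Hours of wall-clock time to service `reads` random I/Os at `iops`."""
    return reads / iops / 3600

print(f"spinning disk: {wall_hours(RANDOM_READS, DISK_IOPS):7.1f} h")
print(f"flash:         {wall_hours(RANDOM_READS, FLASH_IOPS):7.1f} h")
```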

Does flash's performance advantage outweigh its significant cost premium? Steve Lyness, VP of HPC Solutions Engineering at Appro (the folks who built Gordon), explains how the flash in Gordon lowers the 'dollars per science' ratio by radically cutting the time CPUs sit idle while data is transferred to and from storage. ®
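
Here's a rough sketch of Lyness's argument, with purely hypothetical numbers (job compute time, data volume, node cost, and storage throughput are all assumed): the slower the storage, the more node-hours get billed to I/O stalls instead of science, so faster storage can lower cost per result even at a higher price per terabyte.

```python
# Sketch of the 'dollars per science' argument: cost one job by compute time
# plus the time its nodes stall waiting on storage. All numbers are
# hypothetical assumptions for illustration only.

COMPUTE_HOURS = 10.0        # pure CPU time the job needs (assumed)
DATA_TB = 5.0               # data staged to/from storage per job (assumed)
NODE_COST_PER_HOUR = 2.0    # amortised cost of one node-hour (assumed)

def cost_per_job(storage_gb_per_s: float) -> float:
    """Node cost of one job: compute hours plus I/O stall hours."""
    io_hours = DATA_TB * 1024 / storage_gb_per_s / 3600
    return (COMPUTE_HOURS + io_hours) * NODE_COST_PER_HOUR

for name, gb_per_s in [("spinning disk", 0.1), ("flash", 1.0)]:
    print(f"{name:13s}: ${cost_per_job(gb_per_s):.2f} per job")
```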

Watch Video
