IBM builds absolutely super computer

Swinging Blue Gene

IBM is teaming up with the Lawrence Livermore National Laboratory to develop Blue Gene/L, a supercomputer intended to be 15 times faster and 15 times more efficient than today's supercomputers, while occupying 50 times less space.

Part of IBM's Blue Gene research project, the machine will operate at 200 teraflops, or 200 trillion floating-point operations per second, when it arrives in 2005.

Many existing data-intensive applications are bottlenecked by the time taken to fetch data from their memory chips. Blue Gene/L speeds up this process significantly: it is populated with 65,000 data-chip cells optimised for data access. Each chip includes two processors, one handling computing and one handling communication, as well as its own on-board memory.
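The division of labour described above can be illustrated with a toy sketch. This is not IBM's design, just a hypothetical model of the idea: each cell keeps its working set in local on-board memory, one processor role does arithmetic on that local data only, and the other role ships results to a neighbouring cell's memory.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: a toy model of a dual-processor cell with
# local on-board memory. All names and numbers here are hypothetical,
# not taken from the Blue Gene/L architecture.

@dataclass
class Cell:
    memory: list = field(default_factory=list)  # on-board memory, local to the cell

    def compute(self, factor):
        # "compute processor": operates only on locally held data,
        # so it never waits on a remote memory fetch
        return sum(value * factor for value in self.memory)

    def communicate(self, neighbour, value):
        # "communication processor": moves results between cells,
        # leaving the compute side free to keep working
        neighbour.memory.append(value)

# Two cells exchanging a partial result: computation stays local,
# only the finished value crosses between on-board memories.
a = Cell(memory=[1, 2, 3])
b = Cell(memory=[10])
partial = a.compute(2)     # (1 + 2 + 3) * 2 = 12
a.communicate(b, partial)  # b.memory is now [10, 12]
print(b.compute(1))        # prints 22
```

The point of the split is overlap: because communication has its own processor, data movement does not steal cycles from computation, which is what makes the design attractive for data-intensive workloads.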

The machine will be used by researchers to simulate physical phenomena, such as the ageing of materials, fires and explosions.

The Lawrence Livermore National Laboratory - operated for the Department of Energy's (DoE) National Nuclear Security Administration (NNSA) - already hosts the world's top speedster, IBM's ASCI White RS/6000 SP supercomputer, which operates at a peak performance of 12 teraflops.

This kind of muscle comes very big: ASCI White consumes floor space equivalent to two basketball courts or about half a football field.

IBM says Blue Gene/L's predicted processing power will exceed that of the combined top 500 supercomputers in the world today. The Top 500 list, updated twice a year, has just 16 computers with processing power exceeding one teraflop, half of them built by IBM.

Compaq, Intel, Hitachi and SGI all have machines in the top ten. ®

Related Stories
Compaq on speed with 1GHz 64-bit Alpha boxen
World's fastest supercomputer goes down a bomb
AMD cluster sneaks in Supercomputer top 500 list
