Germany announces ITER fusion-reactor supercomputer

Teraflops in = terawatts out, hope boffins

German supercomputing chiefs are chuffed to announce today that the Forschungszentrum Jülich supercomputing centre will provide the main computer for the ITER fusion reactor, the international effort intended to solve the human race's energy problems.

"We are proud that the European Fusion Development Agreement has chosen to make use of Jülich’s know-how," says Professor Achim Bachem, boss of the German research centre. "Jülich will show what Europe can do as a supercomputing site for energy research."

Boffins working at or with the ITER reactor once it comes online - it is being constructed at Cadarache in France, and is expected to begin operations in 2018 - will have access to a dedicated 100 teraflop* machine at Jülich. If necessary they will also be able to call on other resources at the supercomputing centre, to the tune of 300 teraflops all up.

Full specs on the ITERputer, courtesy of the Jülich boffins:

It will consist of 1,080 computing nodes, each equipped with two Nehalem EP quad-core processors from Intel. The grand total of 8,640 processor cores will run at a clock rate of 2.93 GHz each, each node will have access to 24 gigabytes of main memory, and the system will be water-cooled.

InfiniBand ConnectX QDR from the Israeli company Mellanox will be used as the network. The administrative infrastructure is based on NovaScale R422-E2 servers from the French supercomputer manufacturer Bull, which will supply the system and integrate it at Jülich. The cluster operating system “ParaStation” will be supplied by the Munich software company ParTec.
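
For the arithmetically inclined, the quoted core count and the roughly-100-teraflop figure hang together. Here's a minimal back-of-the-envelope sketch in Python, assuming four double-precision flops per core per clock cycle for Nehalem EP - that figure is our assumption, not part of the Jülich announcement:

# Rough peak-performance check for the specs above.
# flops_per_cycle is an assumed value for Nehalem EP, not from the announcement.
nodes = 1080                # computing nodes
cpus_per_node = 2           # Nehalem EP quad-core processors per node
cores_per_cpu = 4
clock_hz = 2.93e9           # 2.93 GHz per core
flops_per_cycle = 4         # assumed double-precision flops per core per cycle

cores = nodes * cpus_per_node * cores_per_cpu          # 8,640 cores
peak_tflops = cores * clock_hz * flops_per_cycle / 1e12
print(f"{cores} cores, theoretical peak of roughly {peak_tflops:.0f} teraflops")

That works out to about 101 teraflops of theoretical peak, which squares with the "100 teraflop" headline figure.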

If the new machine were in operation right now, it would rank thirtieth in the TOP500 supercomputer league table, which lists all the known big hitters. (It's speculated, reasonably enough, that some government-agency machines don't appear on the TOP500 list despite being eminently worthy of a place on it.)

The ITER supercomputer will be used mainly to help physicists understand the behaviour of the plasma inside the Cadarache containment ring. It's hoped that the new reactor, more powerful than any yet built, may allow fusion brainboxes to learn the trick of getting more useful power out of such a machine than it requires to keep the reaction going and contained. This is plainly possible - the sun and all the other stars run on self-sustaining fusion reactions - but achieving it using human technology has had top boffins stumped for decades.
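
To put a number on that break-even goal, here's a minimal sketch using ITER's widely published design targets - roughly 500 megawatts of fusion power from about 50 megawatts of plasma heating - rather than figures from this article:

# Illustration of the break-even goal using ITER's published design targets
# (assumed here, not stated in this article): ~500 MW out from ~50 MW of heating.
heating_power_mw = 50.0     # external plasma heating input (assumed design target)
fusion_power_mw = 500.0     # fusion power output (assumed design target)

q = fusion_power_mw / heating_power_mw
print(f"Fusion gain Q = {q:.0f} (Q > 1 means the plasma gives back more than it takes)")

A gain of Q = 10 would comfortably clear the break-even threshold the boffins have been chasing all these decades.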

If it can be done, we can all relax somewhat about energy. Fuel for fusion is plentiful - in many cases easily extracted from seawater, and definitely easier to find than scarce fissionables like uranium and thorium. While running a fusion reactor involves radiation hazards, and produces some "nuclear wastes" in the sense of things nearby/inside it getting made radioactive, the process itself doesn't leave behind any troublesome residues as fission does - and of course there aren't any carbon emissions to fret about.

If fusion can be made to work, the human race will be able to have all the electricity it wants for centuries at least, probably millennia. The knotty question of how to power all the electric heating, electric industry, electric cars etc after the switch away from oil and gas (or how to produce all the hydrogen for the hydrogen cars, etc) will have been answered.

And if not, well, at least European physicists are getting a nice new computer out of it. ®

*Teraflop = a trillion floating point operations per second. The petaflop barrier has now been broken, but petaflop machines remain rare for the moment.
