Germany announces ITER fusion-reactor supercomputer

Teraflops in = terawatts out, hope boffins

German supercomputing chiefs are chuffed to announce today that the Forschungszentrum Jülich supercomputing centre will provide the main computer for the ITER fusion reactor, the international effort intended to solve the human race's energy problems.

"We are proud that the European Fusion Development Agreement has chosen to make use of Jülich’s know-how," says Professor Achim Bachem, boss of the German research centre. "Jülich will show what Europe can do as a supercomputing site for energy research."

Boffins working at or with the ITER reactor once it comes online - it is being constructed at Cadarache in France, and is expected to begin operations in 2018 - will have access to a dedicated 100-teraflop* machine at Jülich. If necessary they will also be able to call on other resources at the supercomputing centre, to the tune of 300 teraflops all up.

Full specs on the ITERputer, courtesy of the Jülich boffins:

It will consist of 1,080 computing nodes, each equipped with two quad-core Nehalem EP processors from Intel clocked at 2.93 GHz - 8,640 processor cores in all. Each node will have 24 gigabytes of main memory, and the machine will be water-cooled.
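The quoted 100-teraflop figure squares with those specs. A quick sanity check, assuming Nehalem's usual four double-precision floating-point operations per core per cycle (our assumption - the article gives only node count and clock rate):

```python
# Back-of-envelope peak performance for the Juelich ITER machine.
# Node count, sockets, cores and clock rate are from the article;
# the 4 flops/cycle/core figure (SSE double precision) is assumed.
nodes = 1_080
sockets_per_node = 2
cores_per_socket = 4          # Nehalem EP quad-core
clock_hz = 2.93e9
flops_per_cycle = 4           # assumed DP flops per core per cycle

cores = nodes * sockets_per_node * cores_per_socket
peak_teraflops = cores * clock_hz * flops_per_cycle / 1e12

print(f"{cores} cores, ~{peak_teraflops:.1f} teraflops peak")
# 8640 cores, ~101.3 teraflops peak - matching the quoted 100 TFLOPS
```

That is theoretical peak, of course; sustained Linpack numbers, as used for TOP500 rankings, come in lower.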

The interconnect will be ConnectX QDR InfiniBand from the Israeli company Mellanox. The administrative infrastructure is based on NovaScale R422-E2 servers from the French supercomputer manufacturer Bull, which will supply the system and integrate it at Jülich. The "ParaStation" cluster operating system will be supplied by the Munich software company ParTec.

If the new machine were in operation right now, it would rank thirtieth on the TOP500 supercomputer league table, which lists all the known big hitters. (It's speculated, reasonably enough, that some government-agency machines don't appear on the list despite being eminently worthy of a place on it.)

The ITER supercomputer will be used mainly to help physicists understand the behaviour of the plasma inside the Cadarache containment ring. It's hoped that the new reactor, more powerful than any yet built, may allow fusion brainboxes to learn the trick of getting more useful power out of such a machine than it requires to keep the reaction going and contained. This is plainly possible - the sun and all the other stars run on self-sustaining fusion reactions - but achieving it using human technology has had top boffins stumped for decades.
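The trick the boffins are after is usually expressed as the fusion gain factor Q - fusion power out divided by heating power in, with Q > 1 meaning scientific break-even. ITER's stated design goal is Q ≥ 10: 500 MW of fusion power from 50 MW of plasma heating (those figures are ITER's published targets, not from this article). A minimal sketch:

```python
# Fusion gain factor Q = fusion power produced / external heating power.
# Q > 1: the plasma releases more fusion power than is pumped in to
# heat it. The 500 MW / 50 MW numbers below are ITER's design targets.
def fusion_gain(p_fusion_mw: float, p_heating_mw: float) -> float:
    """Return the gain factor Q for a fusion experiment."""
    return p_fusion_mw / p_heating_mw

q_target = fusion_gain(500.0, 50.0)   # ITER design goal
print(f"ITER target Q = {q_target:.0f}")   # Q = 10
```

Note that Q counts only plasma heating power, not everything drawn from the grid to run magnets, cryogenics and so on - so a power station would need Q well above 1 to be worth building.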

If it can be done, we can all relax somewhat about energy. Fuel for fusion is plentiful - in many cases easily extracted from seawater, and definitely easier to find than scarce fissionables like uranium and thorium. While running a fusion reactor involves radiation hazards, and produces some "nuclear wastes" in the sense of things nearby/inside it getting made radioactive, the process itself doesn't leave behind any troublesome residues as fission does - and of course there aren't any carbon emissions to fret about.

If fusion can be made to work, the human race will be able to have all the electricity it wants for centuries at least, probably millennia. The knotty question of how to power all the electric heating, electric industry, electric cars etc after the switch away from oil and gas (or how to produce all the hydrogen for the hydrogen cars, etc) will have been answered.

And if not, well, at least European physicists are getting a nice new computer out of it. ®

*Teraflop = a trillion floating point operations per second. The petaflop barrier has now been broken, but petaflop machines remain rare for the moment.
