Germany announces ITER fusion-reactor supercomputer

Teraflops in = terawatts out, hope boffins

German supercomputing chiefs are chuffed to announce today that the Forschungszentrum Jülich supercomputing centre will provide the main computer for the ITER fusion reactor, the international effort intended to solve the human race's energy problems.

"We are proud that the European Fusion Development Agreement has chosen to make use of Jülich’s know-how," says Professor Achim Bachem, boss of the German research centre. "Jülich will show what Europe can do as a supercomputing site for energy research."

Boffins working at or with the ITER reactor once it comes online - it is being constructed at Cadarache in France, and is expected to begin operations in 2018 - will have access to a dedicated 100 teraflop* machine at Jülich. If necessary they will also be able to call on other resources at the supercomputing centre, to the tune of 300 teraflops all up.

Full specs on the ITERputer, courtesy of the Jülich boffins:

It will consist of 1,080 computing nodes, each equipped with two Nehalem EP quad-core processors from Intel - 2,160 chips in all, for a grand total of 8,640 cores clocked at 2.93 GHz each. Each node will have access to 24 gigabytes of main memory, and the whole affair will be water-cooled.

The interconnect will be InfiniBand ConnectX QDR from the Israeli company Mellanox. The administrative infrastructure is based on NovaScale R422-E2 servers from the French supercomputer manufacturer Bull, which will supply the system and integrate it at Jülich. The "ParaStation" cluster operating system will be supplied by the Munich software company ParTec.
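As a sanity check, those headline figures hang together arithmetically. Assuming Nehalem's usual four double-precision floating point operations per core per cycle (an assumption about the chip's SSE units, not part of the Jülich announcement), the core count and clock rate imply a theoretical peak of roughly 100 teraflops:

```python
# Back-of-the-envelope peak performance for the Jülich ITER machine.
nodes = 1080
sockets_per_node = 2
cores_per_socket = 4           # Nehalem EP quad-core
clock_hz = 2.93e9              # 2.93 GHz per core
flops_per_cycle = 4            # assumed: 2-wide SSE add + 2-wide SSE multiply

cores = nodes * sockets_per_node * cores_per_socket
peak_tflops = cores * clock_hz * flops_per_cycle / 1e12
print(f"{cores} cores, ~{peak_tflops:.0f} teraflops theoretical peak")
```

That lands at about 101 teraflops of theoretical peak - TOP500 rankings, mind, are based on measured LINPACK performance, which always comes in somewhat under peak.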

If the new machine were in operation right now, it would be ranked thirtieth in the TOP500 supercomputer league table, which lists all the known big hitters. (It's speculated, reasonably enough, that some government-agency machines don't appear on the TOP500 list despite being eminently worthy of a place on it.)

The ITER supercomputer will be used mainly to help physicists understand the behaviour of the plasma inside the Cadarache containment ring. It's hoped that the new reactor, more powerful than any yet built, may allow fusion brainboxes to learn the trick of getting more useful power out of such a machine than it requires to keep the reaction going and contained. This is plainly possible - the sun and all the other stars run on self-sustaining fusion reactions - but achieving it using human technology has had top boffins stumped for decades.
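The trick the boffins are after is usually expressed as the fusion gain factor Q - fusion power out divided by heating power in. Q above 1 means the plasma gives back more than is pumped in; ITER's stated design target is Q of at least 10, producing 500 megawatts of fusion power from 50 megawatts of heating. A minimal sketch of the sum:

```python
def fusion_gain(p_fusion_mw: float, p_heating_mw: float) -> float:
    """Fusion gain Q: ratio of fusion power produced by the plasma
    to the external heating power needed to sustain it."""
    return p_fusion_mw / p_heating_mw

# ITER's design target: 500 MW of fusion power from 50 MW of heating.
print(fusion_gain(500, 50))   # Q = 10
```

Note that Q measures the plasma's performance only - turning that thermal output into grid electricity, and powering the magnets and cryoplant, eats further into the balance, which is why reactor designers want Q well above break-even.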

If it can be done, we can all relax somewhat about energy. Fuel for fusion is plentiful - in many cases easily extracted from seawater, and definitely easier to find than scarce fissionables like uranium and thorium. While running a fusion reactor involves radiation hazards, and produces some "nuclear wastes" in the sense of things nearby/inside it getting made radioactive, the process itself doesn't leave behind any troublesome residues as fission does - and of course there aren't any carbon emissions to fret about.

If fusion can be made to work, the human race will be able to have all the electricity it wants for centuries at least, probably millennia. The knotty question of how to power all the electric heating, electric industry, electric cars etc after the switch away from oil and gas (or how to produce all the hydrogen for the hydrogen cars, etc) will have been answered.

And if not, well, at least European physicists are getting a nice new computer out of it. ®

*Teraflop = a trillion floating point operations per second. The petaflop barrier has now been broken, but petaflop machines remain rare for the moment.
