European Commission pays IDC to take a hard look at HPC

Acorn RISC United

The European Commission has commissioned IT market researchers and analysts at IDC to work with some of the top European supercomputing labs to rationalize and coordinate the efforts to push into petascale and exascale computing within Europe.

HPC experts from Teratec in France, the Daresbury Laboratory in the United Kingdom, and Leibniz-Rechenzentrum and Forschungszentrum Jülich in Germany (that's two different labs) will be working with the supercomputing experts at IDC to come up with a "strategic agenda" for HPC projects in Europe.

One of the goals of the study is to advance what the European Union calls "open science" within the member countries, which basically means collaboration among scientists and the computing facilities that are increasingly needed for them to perform basic research. European politicians are keen on keeping pace with the United States, Japan, and China in the flops arms race, and equally keen on investigating options for local sourcing of HPC systems.

The number of HPC suppliers (and I don't mean resellers) has dwindled in the past decade, and European players are pretty thin on the ground, unless you want to consider IBM and Fujitsu as locals. [IDC is, of course, based in England - New England, that is.]

"The link has been firmly established between HPC and scientific and economic advancement," explained Steve Conway, research vice president in IDC's technical computing group, in a statement announcing the seven-month consulting gig with the EU. "The investments needed for the next generation of HPC systems will be substantial. Deciding on the optimal areas of investment - systems, storage, software, and people skills - that are most valuable to European HPC users, and the wider economy, is critical to the EU's success in developing its HPC agenda. Many countries are installing multiple petascale supercomputers today and are laying the foundation for using exascale systems in a few years."

Under the contract, IDC and the experts at the above-mentioned HPC labs will conduct a comparative study covering supercomputing investments worldwide from 2010 through 2020, the various architectures HPC centers are expected to deploy, and the impact that HPC has on scientific and industrial research.

If the EU wanted to shake things up a bit, it would establish a company explicitly to develop massively parallel supercomputers based on the ARM RISC processor running SUSE or Ubuntu Linux. The ARM chips have some pretty serious price/performance advantages and have floating point and media processing units that could be put to good use. The chips could be fabbed at the GlobalFoundries wafer bakers in Dresden, Germany, to keep it local, and Fujitsu could be commissioned to build system boards and clusters.

This is, more or less, how IBM's BlueGene and Cray's Red Storm (XT4 and XT5) parallel supers sprang into being. Without hundreds of millions of dollars of support from Uncle Sam, these experimental systems would never have come to market, much less been commercialized and sold to non-government super labs. You could argue that the US government is underwriting a lot of the cost of developing the Power7 chips due next week, too.

The point is, supercomputing has always been much more involved with national pride than other forms of computing, and at this point, computer component design is largely done by US-based companies, even if a lot of the manufacturing and assembly is done overseas. It is foolish for the members of the European Union to lose the skills and jobs behind the engineering, building, and programming of powerful parallel computers. ®
