European Commission pays IDC to take a hard look at HPC
Acorn RISC United
The European Commission has commissioned IT market researchers and analysts at IDC to work with some of the top European supercomputing labs to rationalize and coordinate the efforts to push into petascale and exascale computing within Europe.
HPC experts from Teratec in France, Daresbury Laboratory in the United Kingdom, and Leibniz-Rechenzentrum and Forschungszentrum Jülich in Germany (that's two different labs) will be working with the supercomputing experts at IDC to come up with a "strategic agenda" for HPC projects in Europe.
One of the goals of the study is to advance what the European Union calls "open science" within the member countries, which basically means collaboration among scientists and the computing facilities that are increasingly needed for them to perform basic research. European politicians are also keen on keeping pace with the United States, Japan, and China in the flops arms race, and on investigating options for local sourcing of HPC systems.
The number of HPC suppliers (and I don't mean resellers) has dwindled in the past decade, and European players are pretty thin on the ground, unless you want to consider IBM and Fujitsu as locals. [IDC is of course based in England - New England that is]
"The link has been firmly established between HPC and scientific and economic advancement," explained Steve Conway, research vice president in IDC's technical computing group, in a statement announcing the seven-month consulting gig with the EU. "The investments needed for the next generation of HPC systems will be substantial. Deciding on the optimal areas of investment - systems, storage, software, and people skills - that are most valuable to European HPC users, and the wider economy, is critical to the EU's success in developing its HPC agenda. Many countries are installing multiple petascale supercomputers today and are laying the foundation for using exascale systems in a few years."
Under the contract, IDC and the experts at the above-mentioned HPC labs will be doing a comparative study of supercomputing investments on a global basis from 2010 through 2020, the various architectures HPC centers are expected to deploy, and the impact that HPC has on scientific and industrial research.
If the EU wanted to shake things up a bit, it would establish a company explicitly to develop massively parallel supercomputers based on the ARM RISC processor running SUSE or Ubuntu Linux. The ARM chips have some pretty serious price/performance advantages and have floating point and media processing units that could be put to good use. The chips could be fabbed at the GlobalFoundries wafer bakers in Dresden, Germany to keep it local, and Fujitsu could be commissioned to build system boards and clusters.
This is, more or less, how IBM's BlueGene and Cray's Red Storm (XT4 and XT5) parallel supers sprang into being. Without hundreds of millions of dollars of support from Uncle Sam, these experimental systems would never have come to market, much less been commercialized and sold to non-government super labs. You could argue that the US government is underwriting a lot of the cost of developing the Power7 chips due next week, too.
The point is, supercomputing has always been much more involved with national pride than other forms of computing, and at this point, computer component design is largely done by US-based companies, even if a lot of the manufacturing and assembly is done overseas. It is foolish for the members of the European Union to lose the skills and jobs behind the engineering, building, and programming of powerful parallel computers. ®