European Commission pays IDC to take a hard look at HPC

Acorn RISC United

The European Commission has commissioned IT market researchers and analysts at IDC to work with some of the top European supercomputing labs to rationalize and coordinate the efforts to push into petascale and exascale computing within Europe.

HPC experts from Teratec in France, Daresbury Laboratory in the United Kingdom, and Leibniz-Rechenzentrum and Forschungszentrum Jülich in Germany (that's two different labs) will be working with the supercomputing experts at IDC to come up with a "strategic agenda" for HPC projects in Europe.

One of the goals of the study is to advance what the European Union calls "open science" within the member countries, which basically means collaboration among scientists and the computing facilities that are increasingly needed for them to perform basic research. European politicians are also keen on keeping pace with the United States, Japan, and China in the flops arms race, and on investigating options for local sourcing of HPC systems.

The number of HPC suppliers (and I don't mean resellers) has dwindled in the past decade, and European players are pretty thin on the ground, unless you want to consider IBM and Fujitsu as locals. [IDC is, of course, based in England - New England, that is]

"The link has been firmly established between HPC and scientific and economic advancement," explained Steve Conway, research vice president in IDC's technical computing group, in a statement announcing the seven-month consulting gig with the EU. "The investments needed for the next generation of HPC systems will be substantial. Deciding on the optimal areas of investment - systems, storage, software, and people skills - that are most valuable to European HPC users, and the wider economy, is critical to the EU's success in developing its HPC agenda. Many countries are installing multiple petascale supercomputers today and are laying the foundation for using exascale systems in a few years."

Under the contract, IDC and the experts at the above-mentioned HPC labs will do a comparative study of global supercomputing investments from 2010 through 2020, of the various architectures HPC centers are expected to deploy, and of the impact that HPC has on scientific and industrial research.

If the EU wanted to shake things up a bit, it would establish a company explicitly to develop massively parallel supercomputers based on the ARM RISC processor running SUSE or Ubuntu Linux. The ARM chips have some pretty serious price/performance advantages and have floating point and media processing units that could be put to good use. The chips could be fabbed at the GlobalFoundries wafer bakers in Dresden, Germany, to keep it local, and Fujitsu could be commissioned to build system boards and clusters.
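The price/performance case lends itself to back-of-envelope arithmetic. Here is a minimal sketch of how theoretical peak performance scales out for such a cluster; every figure below (node count, core count, clock speed, FLOPs per cycle) is an illustrative assumption, not a spec from any real ARM part or system.

```python
# Back-of-envelope peak-performance estimate for a hypothetical ARM cluster.
# Peak FLOPS = nodes x cores per node x clock x FLOPs retired per cycle.

def peak_gflops(nodes, cores_per_node, clock_ghz, flops_per_cycle):
    """Theoretical peak throughput in GFLOPS (clock in GHz cancels the 1e9)."""
    return nodes * cores_per_node * clock_ghz * flops_per_cycle

# Illustrative assumptions: a 2,000-node cluster of quad-core ARM chips
# at 1.0 GHz, with a modest 2 FLOPs per cycle from the floating point unit.
arm_peak = peak_gflops(nodes=2000, cores_per_node=4,
                       clock_ghz=1.0, flops_per_cycle=2)
print(f"Peak: {arm_peak:,.0f} GFLOPS ({arm_peak / 1e6:.3f} PFLOPS)")
```

The same function makes the wattage trade-off easy to play with: double the node count at half the clock and the peak is unchanged, which is exactly the kind of knob a low-power ARM design lets a lab turn.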

This is, more or less, how IBM's BlueGene and Cray's Red Storm (XT4 and XT5) parallel supers sprang into being. Without hundreds of millions of dollars of support from Uncle Sam, these experimental systems would never have come to market, much less been commercialized and sold to non-government super labs. You could argue that the US government is underwriting a lot of the cost of developing the Power7 chips due next week, too.

The point is, supercomputing has always been much more involved with national pride than other forms of computing, and at this point, computer component design is largely done by US-based companies, even if a lot of the manufacturing and assembly is done overseas. It is foolish for the members of the European Union to lose the skills and jobs behind the engineering, building, and programming of powerful parallel computers. ®
