US compute labs in desperate need of Federal swill
Nuclear stockpiles not going to waste
Most of you probably have no idea that it's happening. You've been sleeping well. You've been enjoying moderate amounts of scotch, playing with the kids and scratching Fido behind the ears. You've been living your merry lives, and all the while, some of the finest minds have been suffering - suffering a fate worse than having their toenails peeled back with a screwdriver. You guessed it. We have computing researchers without the necessary government funding.
In a nation with so many dirty secrets, this one is easy not to notice. No one pays much mind to the frail brains hunkered behind a massive supercomputer in Los Alamos, New Mexico. No one hears their Diet Coke-fuelled cries for help. All of you, however, should start listening. These computer scientists have your nuclear stockpiles in their hands, and they need money - bad!
The call to subsidize US computer scientists became more pressing this week when a group of researchers turned in an extensive study to the Department of Energy. The study, conducted by computing experts from around the country, is calling for at least $140m in annual funds - up from close to $40m - to be doled out by the government for high performance computing projects. Without these funds and fundamental changes in the way the US approaches the highest-end computing problems, all will be lost or something to that effect.
Anyone who keeps even half an eye on the supercomputing scene is probably well puzzled by the idea that even more federal handouts need to go to the likes of Cray and IBM. Hardware vendors seem to send out a press release every couple of months, announcing a new DARPA-funded project or massive system at Lawrence Livermore National Laboratory. Hell, the argument could be made that Cray is already nothing more than an arm of the Feds in the guise of a public, money-hungry company.
On top of the existing funds, recent trends in supercomputing have drastically lowered the costs of high performance systems. Universities can cobble together a relatively cheap cluster of off-the-shelf PCs in a couple of weeks and land at least the No. 20 spot on the Top500 list of the fastest machines. Faster processors from Intel and AMD have made high performance clusters not only cheap but also commonplace.
The averageness, however, of high performance clusters is something of a curse to America's national labs, the new report argues. While impressive, clusters can't perform the highest-end tasks. The labs need systems with some proprietary parts such as custom memory systems and high performance networking technology. A number of large hardware makers used to focus on these types of goods, but the market for specialized gear has shrunk and made the vendors less interested in helping out the labs.
"The US industrial base must include suppliers on whom the government can rely to build custom systems to solve problems arising from the government's unique requirements," the report states. "Since only a few units of such systems are ever needed, there is no broad market for the systems and hence no commercial off-the-shelf suppliers."
So average it hurts
So what exactly is at stake if our labs don't get their specialized machines?
Well, the biggest, baddest supercomputers are used for a wide variety of tasks. The systems, for example, help researchers test nuclear explosions without actually exploding a bomb and then help determine how best to get rid of our nuclear stockpiles. They're also used to study climate change (not that it's actually changing, right), the behavior of molecules and, of course, national intelligence types of matters.
The US has a long history of beating the stuffing out of other countries in these areas due to our computing prowess, and the old lab hands would like to see this continue.
"Supercomputing has become a major contributor to the economic competitiveness of our automotive, aerospace, medical, and pharmaceutical industries," the report states. "The discovery of new substances and new techniques, as well as cost reduction through simulation rather than physical prototyping, will underpin progress in a number of economically important areas. The use of supercomputing in all of these areas is growing, and it is increasingly essential to continued progress."
"Current US investments in supercomputing and current plans are not sufficient to provide the supercomputing capabilities that our country will need. It needs supercomputers that satisfy critical requirements in areas such as cryptography and stockpile stewardship, as well as systems that will provide breakthroughs for the broad scientific and technological progress underlying a strong and robust US economy."
The report makes a strong case for more government funding. The labs need to encourage a wide variety of hardware makers to develop specialized gear in order to satisfy our national concerns. In addition, we need cash on hand to attract the best and brightest minds to the public sector, so they can solve the most difficult of problems instead of figuring out how to make Linux scale better.
The Average Joe benefits in myriad ways from the work the labs do. Their technology eventually makes its way to car companies that design safer automobiles with the help of fast computers. We all get better medicines, better oil and gas exploration, and fuller cans of Pringles. The US labs need more money to keep our edge on the world at large in so many areas.
Don't cry for me, Porky
While reading the report, however, one gets the sense that an awful lot of friendly advice is being tossed around. The report was created by researchers with vested interests in seeing more funds come their way. In addition, it was reviewed by staffers at IBM, Cray and other companies who would also like to see their research and development projects funded by the government.
Funding computing projects certainly seems to make more sense than paying farmers not to farm, but you have to wonder if the survival of the US is in as much peril as the researchers say.
The US computing community was spooked a couple years back when the Japanese helped NEC create a system that put our boxes to shame. This year, the US trounced the NEC Earth Simulator machine, and we probably could have trounced it before with a little more effort.
The supercomputing report addresses this issue.
"The committee believes that had the United States at that time made an investment similar to the Japanese investment in the Earth Simulator, it could have created a powerful and equally capable system. The committee's concern is that the United States has not been making the investments that would guarantee its ability to create such a system in the future."
Later in the report, however, the committee seems to assuage its own fears.
"It appears that custom high-bandwidth processors such as those used by the Earth Simulator are not viable products without significant government support. Two of the three Japanese companies that were manufacturing such processors do not do so anymore, and the third (NEC) may also bow to market realities in the not-too-distant future, since the Japanese government seems less willing now to subsidize the development of cutting-edge supercomputing technologies."
If NEC, Fujitsu and the Japanese government can't challenge the US, it's unlikely that any country can in the future. Almost all of the great remaining hardware makers - IBM, HP, Sun, Cray and SGI - are based in the US. And IBM, for example, is doing more than its fair share to help out the government. IBM has opened up its Power processor architecture in a way that allows labs to add customized bits to chips - say more floating point units - that they need for specialized tasks. Meanwhile, DARPA has sent more than $100m to Cray, Sun and IBM for the creation of a next-generation supercomputer.
The new report addresses many of these concerns at length. Overall, it seems to be a fair and balanced look at the needs of our labs.
Still, one can't help but wonder if the doom and gloom warnings have their roots in a mentality accustomed to government pork. The US has a vibrant and innovative computing industry. Natural progressions in commodity technology coupled with the specialized work being done by IBM and others seem to be getting the job done. The pioneering benefits that once trickled down from the labs to the rest of the computing public seem less pronounced now as the IT landscape has matured.
Beyond these issues, the US military spends more on technology than most of the countries in the world put together. The US government is by far the world's largest IT customer. Our universities are the richest on the planet, and our vendors the best taken care of. If all of this doesn't count as making the investments that would guarantee our success in the future, it's not clear what would.
Do the labs, which already spend hundreds of millions of dollars on high performance machines, need more government handouts? Maybe so, but we doubt the situation is as dire as the labs would make it seem. ®