Uncle Sam shells out $62m for 100GbE

Obama stimulates fed networks

It would be tough to call the development of 100 Gigabit Ethernet switches and adapters a shovel-ready project, but the Obama administration's stimulus package is picking up the $62m tab to help get faster networks to market.

The $787bn American Recovery and Reinvestment Act, which was signed into law in February, allocates money for traditional infrastructure - roads, bridges, and the like - as well as for its electronic infrastructure - broadband Internet, healthcare systems, and other goodies aimed at IT vendors.

The Department of Energy's vast supercomputing programs are getting a piece of the ARRA pie to build out a faster network linking the nation's behemoth massively parallel supercomputers. Specifically, Lawrence Berkeley National Laboratory, which runs the DOE's Energy Sciences Network (ESnet) connecting the HPC gear at the government labs (Sandia, Lawrence Livermore, Lawrence Berkeley, Oak Ridge, Los Alamos, Brookhaven, Argonne, Pacific Northwest, and Ames are the biggies), is getting the dough to hire more engineers at Berkeley Lab as well as to pick the hardware vendors who will help boost ESnet to 100 Gigabit Ethernet speeds.

DOE has its eyes on a much more ambitious network, however, one that will take many years and much more funding to realize.

"This network will serve as a pilot for a future network-wide deployment of 100 Gbps Ethernet in research and commercial networks and represents a major step toward DOE's vision of a 1-terabit - 1,000 times faster than 1 gigabit - network interconnecting DOE Office of Science supercomputer centers," explained Michael Strayer, the head of the DOE's Office of Advanced Scientific Computing Research in a statement announcing the $62m contract.

DOE says that its supercomputers are already running simulations with datasets on the terabyte scale, and that they will soon be chewing through datasets in the petabyte range. For instance, a climate model at Lawrence Livermore National Lab covering past, present, and future climate currently weighs in at 35 terabytes and is used by more than 2,500 researchers worldwide. An updated (and presumably finer-grained) climate model is expected to have a dataset of around 650 terabytes, and the distributed archive of datasets related to this model is expected to run somewhere between 6 and 10 petabytes. Moving datasets of that size around the ESnet network requires a lot more bandwidth, and better transfer protocols, than Gigabit Ethernet can offer.
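To put those numbers in context, here is a rough, back-of-the-envelope sketch in Python of how long datasets that size take to move at different Ethernet speeds. The dataset sizes are the DOE figures above; the 70 per cent efficiency factor and the choice of link speeds are illustrative assumptions, not ESnet measurements.

# Back-of-the-envelope transfer times for the dataset sizes quoted above.
# The link speeds and the 70 per cent efficiency figure are assumptions for
# illustration only, not ESnet measurements.

DATASETS_TB = {
    "current climate model": 35,           # terabytes, per DOE
    "next-generation climate model": 650,  # terabytes, per DOE
    "distributed archive (low end)": 6000  # roughly 6 petabytes
}

LINKS_GBPS = {"1 GbE": 1, "10 GbE": 10, "100 GbE": 100}
EFFICIENCY = 0.7  # assumed fraction of line rate achieved end to end


def transfer_hours(size_tb, link_gbps, efficiency=EFFICIENCY):
    """Hours needed to move size_tb terabytes over a link_gbps link."""
    bits = size_tb * 1e12 * 8                       # terabytes -> bits
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 3600


for name, size_tb in DATASETS_TB.items():
    row = ", ".join("%s: %.1f h" % (label, transfer_hours(size_tb, gbps))
                    for label, gbps in LINKS_GBPS.items())
    print("%s (%s TB) -> %s" % (name, size_tb, row))

Under those assumptions, the 650 terabyte model would take a couple of months to shift at 1 Gbps but less than a day at 100 Gbps, which is roughly the gap DOE is paying to close.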

To that end, about $3m is being spent on some more network and software engineers. Another $8m to $9m will be spent on a testbed for new network gear and services from telcos, and the remaining $50m or so will go to actually buying 100 Gigabit Ethernet switches and services for ESnet to link the more than 40 computational centers in the United States that do supercomputing in conjunction with the DOE.

Let the cat fighting among the Ethernet switch makers begin....

By the way, ARRA is a big deal for these labs. As you can see from this tally of 20 projects partially funded by ARRA at Berkeley Lab, ARRA is covering $173.7m of the total $241.5m being shelled out. Without ARRA, many of these projects would not have happened at all. ®
