Uncle Sam shells out $62m for 100GbE

Obama stimulates fed networks

It would be tough to call the development of 100 Gigabit Ethernet switches and adapters a shovel-ready project, but the Obama administration's stimulus package is picking up the $62m tab to help get faster networks to market.

The $787bn American Recovery and Reinvestment Act, which was signed into law in February, allocates money for traditional infrastructure - roads, bridges, and the like - as well as for its electronic infrastructure - broadband Internet, healthcare systems, and other goodies aimed at IT vendors.

The Department of Energy's vast supercomputing programs are getting a piece of the ARRA pie to build out a faster network linking the nation's behemoth massively parallel supercomputers. Specifically, Lawrence Berkeley National Laboratory, which runs the DOE's Energy Sciences Network (ESnet) linking the HPC gear at the government labs (Sandia, Lawrence Livermore, Lawrence Berkeley, Oak Ridge, Los Alamos, Brookhaven, Argonne, Pacific Northwest, and Ames are the biggies), is getting the dough to pay more engineers at Berkeley Lab as well as to pick the hardware vendors who will help boost ESnet to 100 Gigabit Ethernet speeds.

DOE has its eyes on a much more ambitious network, however, and one that will take many years and much more funding to fulfill.

"This network will serve as a pilot for a future network-wide deployment of 100 Gbps Ethernet in research and commercial networks and represents a major step toward DOE's vision of a 1-terabit - 1,000 times faster than 1 gigabit - network interconnecting DOE Office of Science supercomputer centers," explained Michael Strayer, the head of the DOE's Office of Advanced Scientific Computing Research, in a statement announcing the $62m contract.

DOE says that its supercomputers are already running simulations with datasets on the terabyte scale and that soon they will be chewing through datasets in the petabyte range. For instance, a climate model covering past, present, and future conditions at Lawrence Livermore National Lab currently spans 35 terabytes and is being used by more than 2,500 researchers worldwide. An updated (and presumably finer-grained) climate model is expected to have a dataset in the range of 650 terabytes, and the distributed archive of datasets related to this model is expected to land somewhere between 6 and 10 petabytes. Moving such datasets around the ESnet network requires a lot more bandwidth - and better protocols - than Gigabit Ethernet provides.
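To put those numbers in perspective, here is a back-of-the-envelope sketch of transfer times for the dataset sizes mentioned above. This assumes an ideal, fully saturated link with no protocol overhead - real long-haul transfers achieve far less without careful tuning - and it is our illustration, not anything from the DOE's plans:

```python
def transfer_time_hours(size_terabytes: float, link_gbps: float) -> float:
    """Hours to move size_terabytes over a link sustaining link_gbps."""
    size_bits = size_terabytes * 1e12 * 8      # decimal terabytes to bits
    seconds = size_bits / (link_gbps * 1e9)    # link rate in bits per second
    return seconds / 3600

# The 35 TB climate dataset over a single Gigabit Ethernet link:
print(f"35 TB  @ 1 Gbps:   {transfer_time_hours(35, 1):.1f} hours")
# The projected 650 TB dataset over 100 Gigabit Ethernet:
print(f"650 TB @ 100 Gbps: {transfer_time_hours(650, 100):.1f} hours")
```

Even under these best-case assumptions, today's 35 TB dataset ties up a Gigabit Ethernet link for more than three days, while the projected 650 TB dataset would clear a 100 Gbps pipe in well under a day - which is the whole argument for the upgrade in a nutshell.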

To that end, about $3m is being spent on some more network and software engineers. Another $8m to $9m will be spent on a testbed for new network gear and services from telcos, and the remaining $50m or so will go to actually buying 100 Gigabit Ethernet switches and services for ESnet to link the more than 40 computational centers in the United States that do supercomputing in conjunction with the DOE.

Let the cat fighting among the Ethernet switch makers begin....

By the way, ARRA is a big deal for these labs. As you can see from this tally of 20 projects that have been partially funded by ARRA at the Berkeley Lab, ARRA is covering $173.7m of the total $241.5m being shelled out. If there were no ARRA, many of these projects would not have been done at all. ®
