Obama gets the big data bug

Swings swags of simoleons at science

Six US federal government agencies are putting more than $US200 million toward trying to wrap their heads around the dizzying world of “big data”.

The big-ticket buzzword hunt is being led by the White House Office of Science and Technology Policy, along with the National Science Foundation, the National Institutes of Health, the Department of Defense, the Department of Energy, DARPA, and the US Geological Survey.

The NSF’s goal is to “extract and use knowledge from collections of large data sets to accelerate progress in science and engineering research”, and it will fund research into algorithms and statistical methods, along with analytics and “e-science collaboration environments”.

The NSF’s early announcements include $US10 million to UC Berkeley, whose Expeditions in Computing program is researching scalable machine learning algorithms, “datacenter-friendly programming environments” and computing infrastructure. It’s also backing a geosciences program called EarthCube, putting $US1.4 million into statistics research in biology, and spending $US2 million on university education in big data.

The NIH’s big data flagship is making its “1000 Genomes” project – described as the largest dataset on human genome variation – available on Amazon’s AWS.

The DoE’s initial contribution is to put $US5 million into the Scalable Data Management, Analysis and Visualization Institute, which it says will bring together experts from Berkeley, Argonne, Lawrence Livermore, Los Alamos, Oak Ridge, Sandia National Laboratories, a slew of universities, and data visualization company Kitware.

The DoD says that, in addition to its existing programs, it will launch “a series of open prize competitions” over the coming months, with another $US25 million to be spent annually by DARPA on developing new software tools.

The USGS is rolling grants distributed through its John Wesley Powell Center for Analysis and Synthesis into the White House big data initiative. ®

