Obama gets the big data bug

Swings swags of simoleons at science

Six US federal government agencies are putting more than $US200 million into trying to wrap their heads around the dizzying world of “big data”.

The big-ticket buzzword hunt is being led by the White House Office of Science and Technology Policy, and takes in the National Science Foundation, the National Institutes of Health, the Department of Defense, the Department of Energy, DARPA, and the US Geological Survey.

The NSF’s brief is to “extract and use knowledge from collections of large data sets to accelerate progress in science and engineering research”, and it will fund research into algorithms and statistical methods, along with analytics and “e-science collaboration environments”.

The NSF’s early announcements include $US10 million to UC Berkeley, whose Expeditions in Computing program is researching scalable machine learning algorithms, “datacenter-friendly programming environments” and computing infrastructure. The foundation is also backing a geosciences program called EarthCube, putting $US1.4 million into statistics research in biology, and spending $US2 million on university education in big data.

The NIH’s big data flagship is making its “1000 Genomes” project – described as the largest dataset on human genome variation – available on Amazon’s AWS.

The DoE’s initial contribution is to put $US5 million into the Scalable Data Management, Analysis and Visualization Institute, which it says will bring together experts from Berkeley, Argonne, Lawrence Livermore, Los Alamos, Oak Ridge, Sandia National Laboratories, a slew of universities, and data visualization company Kitware.

The DoD says that in addition to its existing programs, it will launch “a series of open prize competitions” over the coming months, with another $US25 million to be spent annually by DARPA on developing new software tools.

The USGS is rolling grants distributed through its John Wesley Powell Center for Analysis and Synthesis into the White House big data initiative. ®
