Obama gets the big data bug

Swings swags of simoleons at science

Six US federal government agencies are putting up more than $US200 million to try to wrap their heads around the dizzying world of “big data”.

The big-ticket buzzword hunt is being led by the White House Office of Science and Technology Policy, along with the National Science Foundation, the National Institutes of Health, the Department of Defense, the Department of Energy, DARPA, and the US Geological Survey.

The NSF’s goal is to “extract and use knowledge from collections of large data sets to accelerate progress in science and engineering research”, and it will fund research into algorithms and statistical methods, along with analytics and “e-science collaboration environments”.

The NSF’s early announcements include $US10 million for UC Berkeley, whose Expeditions in Computing program is researching scalable machine learning algorithms, “datacenter-friendly programming environments” and computing infrastructure. It’s also backing a geosciences program called EarthCube, putting $US1.4 million into statistics research in biology, and spending $US2 million on university education in big data.
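
For the uninitiated, “scalable machine learning” is less exotic than the grant-speak suggests. The toy sketch below – a generic mini-batch stochastic gradient descent loop in Python, and emphatically not the Berkeley program’s own code – shows the basic trick: each parameter update touches only a small random batch, so the cost of a step stays flat no matter how big the dataset gets.

```python
# Toy mini-batch SGD for a one-variable linear model: illustrative only,
# not the Berkeley Expeditions in Computing code.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "large" dataset: y = 3x + 1 plus a little noise.
X = rng.normal(size=100_000)
y = 3 * X + 1 + rng.normal(scale=0.1, size=100_000)

w, b = 0.0, 0.0        # model parameters to learn
lr, batch = 0.1, 256   # learning rate and mini-batch size

for step in range(500):
    idx = rng.integers(0, len(X), size=batch)  # sample a mini-batch
    xb, yb = X[idx], y[idx]
    err = (w * xb + b) - yb                    # prediction error on the batch
    w -= lr * (err * xb).mean()                # gradient step for w
    b -= lr * err.mean()                       # gradient step for b

print(f"learned w={w:.2f}, b={b:.2f}")         # converges near w=3, b=1
```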

The NIH’s big data flagship is making its “1000 Genomes” project – described as the largest dataset on human genome variation – available on Amazon’s AWS.
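
For anyone who’d rather poke at the data than the press release, here’s a minimal sketch of browsing the dataset from Python, assuming it is hosted as a public S3 bucket named “1000genomes” with anonymous access enabled (the bucket name and access setup are assumptions, not details from the announcement):

```python
# Minimal sketch: list a few objects from the public 1000 Genomes S3 bucket.
# The bucket name "1000genomes" is an assumption; check AWS's public
# dataset registry for the authoritative location.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# A public dataset needs no AWS credentials, so send unsigned requests.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# Fetch the first handful of object keys at the top of the bucket.
resp = s3.list_objects_v2(Bucket="1000genomes", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```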

The DoE’s initial contribution is to put $US5 million into the Scalable Data Management, Analysis and Visualization Institute, which it says will bring together experts from the Berkeley, Argonne, Lawrence Livermore, Los Alamos, Oak Ridge and Sandia national laboratories, a slew of universities, and data visualization company Kitware.

The DoD, for its part, says that in addition to its existing programs it will launch “a series of open prize competitions” over the coming months, with another $US25 million to be spent annually by DARPA on developing new software tools.

The USGS is rolling grants distributed through its John Wesley Powell Center for Analysis and Synthesis into the White House big data initiative. ®
