Obama gets the big data bug

Swings swags of simoleons at science

Six US federal government agencies are putting up more than $US200 million to try to wrap their heads around the dizzying world of “big data”.

The big-ticket buzzword hunt is being led by the White House Office of Science and Technology Policy, along with the National Science Foundation, the National Institutes of Health, the Department of Defense, the Department of Energy, DARPA, and the US Geological Survey.

The NSF’s aim is to “extract and use knowledge from collections of large data sets to accelerate progress in science and engineering research”, and it will fund research into algorithms and statistical methods, along with analytics and “e-science collaboration environments”.

The NSF’s early announcements include $US10 million to UC Berkeley, whose Expeditions in Computing program is researching scalable machine learning algorithms, “datacenter-friendly programming environments” and computing infrastructure. The agency is also backing a geosciences program called EarthCube, putting $US1.4 million into statistics research in biology, and spending $US2 million on university education in big data.

The NIH’s flagship big data contribution is to make its “1000 Genomes” project – described as the largest dataset on human genome variation – available on Amazon’s AWS.
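
As a rough illustration of what “available on AWS” means in practice, the sketch below anonymously lists a few objects from a public S3 bucket using Python and boto3. The bucket name “1000genomes” and the us-east-1 region are assumptions about how the dataset is published, not details from the announcement.

    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config

    # Anonymous (unsigned) client: public datasets don't require AWS credentials.
    s3 = boto3.client("s3", region_name="us-east-1",
                      config=Config(signature_version=UNSIGNED))

    # Bucket name "1000genomes" is an assumption about how the public dataset
    # is exposed; adjust it to whatever bucket the project actually uses.
    resp = s3.list_objects_v2(Bucket="1000genomes", MaxKeys=5)
    for obj in resp.get("Contents", []):
        print(obj["Key"], obj["Size"])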

The DoE’s initial contribution is to put $US5 million into the Scalable Data Management, Analysis and Visualization Institute, which it says will bring together experts from Berkeley, Argonne, Lawrence Livermore, Los Alamos, Oak Ridge, Sandia National Laboratories, a slew of universities, and data visualization company Kitware.

From the DoD comes word that, in addition to its existing programs, it will launch “a series of open prize competitions” over the coming months, with another $US25 million to be spent annually by DARPA on developing new software tools.

The USGS is rolling grants distributed through its John Wesley Powell Center for Analysis and Synthesis into the White House big data initiative. ®
