
9TB in 20 minutes? Sign me up!

IBM hails 'breakthrough' algorithm

In an experiment, IBM researchers used the fourth most powerful supercomputer in the world, a Blue Gene/P system at Forschungszentrum Jülich in Germany, to validate nine terabytes of data in less than 20 minutes without compromising accuracy. Ordinarily, the same task on the same system would take more than a day. Additionally, the process used just one percent of the energy that would typically be required.
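The article's numbers are worth a quick back-of-the-envelope check against the "two orders of magnitude" claim in the press release. Treating "more than a day" as a conservative lower bound of exactly one day:

```python
# Rough sanity check on the claimed improvement: 20 minutes versus
# "more than a day" on the same machine, using one percent of the energy.
baseline_minutes = 24 * 60      # one day, a conservative lower bound
new_minutes = 20
time_speedup = baseline_minutes / new_minutes   # at least 72x faster
energy_factor = 100             # "one percent of the energy" = 100x reduction
```

So the time saving alone is at least a factor of 72, and the energy saving a factor of 100, broadly consistent with the two-orders-of-magnitude figure quoted below.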

In a press release last week, IBM hailed a "breakthrough method based on a mathematical algorithm that reduces the computational complexity, costs, and energy usage for analyzing the quality of massive amounts of data by two orders of magnitude. This new method will greatly help enterprises extract and use the data more quickly and efficiently to develop more accurate and predictive models."

OK, I’m a sucker for anything that does something a lot faster – even if I don’t quite understand how it does it.

So I have to write at least a little bit about the paper that three IBM researchers submitted documenting their technique and results. I was faced with a blizzard of phrases such as “Inverse covariance matrices,” “Matrix factorizations,” “Cubic cost,” and this helpful explanation: “First, we turned to stochastic estimation of the diagonal.”

These terms, plus many others that I also don’t understand, are in just the abstract; the body of the paper seems considerably more technical and complex.
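For what it's worth, one of those phrases, "stochastic estimation of the diagonal," refers to a known numerical trick (commonly attributed to Bekas, Kokiopoulou and Saad): you can estimate the diagonal of a matrix you can't afford to form explicitly by averaging random ±1 probe vectors against matrix-vector products. The sketch below is my own minimal illustration of that general trick, not IBM's actual implementation; the function name and toy matrix are mine.

```python
import numpy as np

def estimate_diagonal(matvec, n, num_probes=100, seed=0):
    """Estimate diag(A) of an implicit n x n matrix A.

    Uses Rademacher probe vectors v (independent +/-1 entries):
    the expectation of v * (A @ v), entrywise, is the diagonal of A.
    Only matrix-vector products are needed, never A itself.
    """
    rng = np.random.default_rng(seed)
    numer = np.zeros(n)
    denom = np.zeros(n)
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)
        numer += v * matvec(v)   # accumulate v .* (A v)
        denom += v * v           # accumulate v .* v (all ones here)
    return numer / denom

# Toy check on a small symmetric matrix whose diagonal we know.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
est = estimate_diagonal(lambda v: A @ v, n=2, num_probes=2000)
# est is close to [4.0, 3.0]
```

In the inverse-covariance setting the paper is concerned with, the matrix-vector product would be replaced by a linear solve against the covariance matrix, which is where the clever numerics (and the supercomputer) come in.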

The one phrase that I fully understood was: “We stress that the techniques presented in this work are quite general and applicable to several other important applications.” And this is an important phrase, because it means that using this technique (and others that smart guys are working on right now), we’ll be able to see orders of magnitude improvement in other analytic tasks that use vast amounts of data. It’s always good to see progress.

In case you’re interested, here are some pictures of the guys who came up with it – mostly shots of them standing around looking smarter than any of us.


