Apache Foundation embraces real time big data cruncher 'Storm'

Does for real time processing what Hadoop did for batch processing

The Apache Foundation has voted to accept the “Storm” real-time data processing tool into its incubator program, the first step towards making it an official part of the Foundation's open source offerings.

Storm aims to do for real-time data processing what Hadoop did for batch processing: queue jobs and send them off to a cluster of computers, then pull everything back together into usable form. Nathan Marz, Storm's creator and the poster of its GitHub repository, believes “The lack of a 'Hadoop of real time' has become the biggest hole in the data processing ecosystem.”

Storm tries to fill that hole with software that “... exposes a set of primitives for doing real time computation. Like how MapReduce greatly eases the writing of parallel batch processing, Storm's primitives greatly ease the writing of parallel real time computation.”

Without Storm, Marz writes, one would have to “manually build a network of queues and workers to do real time processing.” Storm automates that stuff, which should mean better scaling: Marz already claims “one of Storm's initial applications processed 1,000,000 messages per second on a 10 node cluster, including hundreds of database calls per second as part of the topology.”
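To give a flavour of what Marz means, here is a minimal sketch (in Python, not Storm's actual Java/Clojure API) of the kind of hand-built queue-and-worker plumbing he says you'd otherwise have to assemble yourself: a queue feeds a pool of worker threads, and results are gathered back at the end. Everything below is illustrative; none of it comes from Storm itself.

```python
import queue
import threading

def run_pipeline(messages, n_workers=4):
    """Hand-built queue/worker pipeline of the sort Storm automates:
    a work queue feeds a pool of workers, results are collected back."""
    work = queue.Queue()
    results = queue.Queue()

    def worker():
        while True:
            msg = work.get()
            if msg is None:              # sentinel: shut this worker down
                break
            results.put(msg.upper())     # stand-in for real per-message work

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for m in messages:
        work.put(m)
    for _ in threads:                    # one shutdown sentinel per worker
        work.put(None)
    for t in threads:
        t.join()
    # workers finish in nondeterministic order; sort for a stable result
    return sorted(results.queue)

print(run_pipeline(["storm", "hadoop", "queue"]))
```

Multiply this by retries, fault tolerance, multi-stage topologies and ten nodes instead of four threads, and the appeal of having a framework handle it becomes obvious.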

All of which should get high performance computing folks excited.

The Apache Foundation's incubation process isn't technical. One goal is to ensure any software offered with its feathered logo conforms to its preferred license, which should not prove problematic as Storm is currently offered under the Eclipse Public License. The Foundation also likes to ensure proper communities nourish software it offers, and again that should not be a struggle given Storm already has enthusiastic users including Yahoo!, Twitter and business-to-business tat bazaar Alibaba.

Once the Foundation adds its imprimatur to the list of testimonials from current Storm users, that community will doubtless grow. It will also be joined by Big Data marketers who have run out of things to say about Hadoop, and who will doubtless soon assert that Storm delivers sizzling business insights in real time, with just as little justification for that assertion as for the oft-repeated proposition that Hadoop+data=highly profitable insights in your inbox every afternoon. ®
