Apache Foundation embraces real time big data cruncher 'Storm'

Does for real time processing what Hadoop did for batch processing

The Apache Foundation has voted to accept the “Storm” real-time data processing tool into its incubator program, the first step towards making it an official part of the Foundation's open source offerings.

Storm aims to do for real-time data processing what Hadoop did for batch processing: queue jobs and send them off to a cluster of computers, then pull everything back together into usable form. Nathan Marz, who posted Storm to GitHub, believes “The lack of a 'Hadoop of real time' has become the biggest hole in the data processing ecosystem.”

Storm tries to fill that hole with software that “... exposes a set of primitives for doing real time computation. Like how MapReduce greatly eases the writing of parallel batch processing, Storm's primitives greatly ease the writing of parallel real time computation.”
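Storm's core primitives are stream sources (“spouts”) and processing steps (“bolts”), wired together into a topology; the canonical demonstration is a streaming word count. As a rough illustration of the idea — not Storm's actual API, which is Java-based — the shape of such a topology can be sketched with plain Python generators (all names here are hypothetical):

```python
from collections import Counter

def sentence_spout():
    """Stream source (a 'spout' in Storm terms): emits sentences."""
    for line in ["the cow jumped over the moon",
                 "the man went to the store"]:
        yield line

def split_bolt(sentences):
    """Processing step (a 'bolt'): splits each sentence into words."""
    for sentence in sentences:
        for word in sentence.split():
            yield word

def count_bolt(words):
    """Terminal bolt: keeps a running count per word."""
    counts = Counter()
    for word in words:
        counts[word] += 1
    return counts

# Wire the 'topology': spout -> split bolt -> count bolt.
counts = count_bolt(split_bolt(sentence_spout()))
print(counts["the"])  # "the" appears 4 times across both sentences
```

In real Storm each spout and bolt runs as many parallel tasks across the cluster, with the framework routing tuples between them; the single-process pipeline above only shows the dataflow shape.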

Without Storm, Marz writes, one would have to “manually build a network of queues and workers to do real time processing.” Storm automates that stuff, which should mean better scaling: Marz already claims “one of Storm's initial applications processed 1,000,000 messages per second on a 10 node cluster, including hundreds of database calls per second as part of the topology.”
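The “network of queues and workers” Marz describes can of course be hand-built; a minimal single-machine sketch of that manual plumbing (threads pulling from one queue and pushing to the next) shows what Storm is automating — and what you would otherwise have to scale, supervise and make fault-tolerant yourself:

```python
import queue
import threading

def worker(in_q, out_q):
    """Pull messages off a queue, process them, push results downstream."""
    while True:
        msg = in_q.get()
        if msg is None:          # sentinel: shut this worker down
            in_q.task_done()
            break
        out_q.put(msg.upper())   # stand-in for real processing
        in_q.task_done()

in_q, out_q = queue.Queue(), queue.Queue()
threads = [threading.Thread(target=worker, args=(in_q, out_q))
           for _ in range(4)]
for t in threads:
    t.start()

for msg in ["storm", "hadoop", "realtime"]:
    in_q.put(msg)
for _ in threads:
    in_q.put(None)               # one sentinel per worker
for t in threads:
    t.join()

results = sorted(out_q.get() for _ in range(3))
print(results)  # ['HADOOP', 'REALTIME', 'STORM']
```

Chaining several such stages across machines, with retries, rebalancing and back-pressure, is exactly the boilerplate Storm's topology abstraction takes off your hands.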

All of which should get high performance computing folks excited.

The Apache Foundation's incubation process isn't technical. One goal is to ensure any software offered with its feathered logo conforms to its preferred license, which should not prove problematic as Storm is currently offered under the Eclipse Public License. The Foundation also likes to ensure proper communities nourish software it offers, and again that should not be a struggle given Storm already has enthusiastic users including Yahoo!, Twitter and business-to-business tat bazaar Alibaba.

Once the Foundation adds its imprimatur to the list of testimonials from current Storm users, that community will doubtless grow. It will be joined by Big Data marketers who have run out of things to say about Hadoop, and who'll doubtless soon assert Storm means sizzling business insights are magically available in real time, with just as little justification as the oft-repeated proposition that Hadoop+data=highly profitable insights in your inbox every afternoon. ®
