Apache Foundation embraces real-time big data cruncher 'Storm'

Does for real-time processing what Hadoop did for batch processing

The Apache Foundation has voted to accept the “Storm” real-time data processing tool into its incubator program, the first step towards making it an official part of the Foundation's open source offerings.

Storm aims to do for real-time data processing what Hadoop did for batch processing: queue jobs and send them off to a cluster of computers, then pull everything back together into usable form. Nathan Marz, who maintains the Storm GitHub repository, believes “The lack of a 'Hadoop of real time' has become the biggest hole in the data processing ecosystem.”

Storm tries to fill that hole with software that “... exposes a set of primitives for doing real time computation. Like how MapReduce greatly eases the writing of parallel batch processing, Storm's primitives greatly ease the writing of parallel real time computation.”

Without Storm, Marz writes, one would have to “manually build a network of queues and workers to do real time processing.” Storm automates that plumbing, which should mean better scaling: Marz claims “one of Storm's initial applications processed 1,000,000 messages per second on a 10 node cluster, including hundreds of database calls per second as part of the topology.”
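For a flavour of those primitives, here's a minimal sketch against the Java API Storm shipped at the time (the backtype.storm packages): a spout emits a stream of tuples, a bolt transforms them, and a TopologyBuilder wires the two into a topology. TestWordSpout is a test spout bundled with Storm; the ExclamationBolt is our own illustrative example rather than anything from the Storm repository.

import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.testing.TestWordSpout;
import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;

public class ExclamationTopology {
    // Illustrative bolt: appends "!!!" to each word it receives and re-emits it.
    public static class ExclamationBolt extends BaseBasicBolt {
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            collector.emit(new Values(tuple.getString(0) + "!!!"));
        }
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("word"));
        }
    }

    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        // Two spout tasks emitting a stream of random words.
        builder.setSpout("words", new TestWordSpout(), 2);
        // Four bolt tasks; shuffleGrouping spreads tuples randomly across them.
        builder.setBolt("exclaim", new ExclamationBolt(), 4).shuffleGrouping("words");
        // Run in-process; a real deployment submits via StormSubmitter instead.
        new LocalCluster().submitTopology("exclamation", new Config(), builder.createTopology());
    }
}

In production the spout would read from a real feed such as a message queue, and the parallelism hints (2 and 4 here) tell the cluster how many tasks to run for each component. That wiring is the network of queues and workers Marz says you'd otherwise have to build by hand.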

All of which should get high-performance computing folks excited.

The Apache Foundation's incubation process isn't purely technical. One goal is to ensure any software offered under its feathered logo conforms to its preferred license, which should not prove problematic as Storm is currently offered under the Eclipse Public License. The Foundation also likes to ensure a proper community nourishes the software it offers, and again that should not be a struggle given Storm already has enthusiastic users including Yahoo!, Twitter and business-to-business tat bazaar Alibaba.

Once the Foundation adds its imprimatur to the testimonials from current Storm users, that community will doubtless grow. It will also be joined by Big Data marketers who have run out of things to say about Hadoop and will soon assert that Storm makes sizzling business insights magically available in real time, with as little justification for that assertion as for the oft-repeated proposition that Hadoop + data = highly profitable insights in your inbox every afternoon. ®
