FlumeJava, MillWheel ... No, not NSA codenames: The tech in Google Cloud's data grokker

Ad slinger whips out tool to stick in stream pipes


Google I/O Google has built a new thing for developers who want to have their data-filled cake and eat it right now, or perhaps set it aside for later and chomp it down at leisure.

The Google Cloud Dataflow streaming data analysis tool (breathe) was announced by the ad giant on Wednesday at its I/O conference in San Francisco.

It combines batch and streaming data processing in a single tool, and lets developers perform complex analysis and manipulation tasks either as data streams into a system or after the data has been gathered.
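That "one pipeline, two modes" idea can be sketched in miniature. The class and method names below (Pipeline, then, run_batch, run_stream) are invented for illustration and are not Cloud Dataflow's actual API; the point is only that the same stages run unchanged over a gathered batch or over records as they arrive.

```python
# Illustrative only: a toy pipeline whose stages apply identically to a
# finite batch or to a stream of arriving records. All names are invented
# for this sketch; this is not the Cloud Dataflow API.

class Pipeline:
    def __init__(self):
        self.stages = []

    def then(self, fn):
        # Record a transformation stage; stages apply in order.
        self.stages.append(fn)
        return self

    def _apply(self, record):
        for fn in self.stages:
            record = fn(record)
        return record

    def run_batch(self, records):
        # Batch mode: process an already-gathered collection.
        return [self._apply(r) for r in records]

    def run_stream(self, record_iter):
        # Streaming mode: the very same stages, applied as records arrive.
        for r in record_iter:
            yield self._apply(r)

pipeline = Pipeline().then(str.strip).then(str.upper)
print(pipeline.run_batch([" goal ", " corner "]))   # ['GOAL', 'CORNER']
print(list(pipeline.run_stream(iter([" foul "]))))  # ['FOUL']
```

The pipeline definition is written once; only the execution mode differs, which is the pitch Google is making for Dataflow over separate batch and streaming stacks.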

The software is Google's latest attempt to press its advanced internal technologies into the fight against Amazon and Microsoft for dominance of the public cloud. Just as Bezos & Co made their cloud money by exploiting incumbent tech companies' unwillingness to lower their prices, Google is preying on Amazon's allergy to disclosing its internal tech, putting its super-secret systems into its public cloud in a bid to leap ahead.

"You can use Cloud Dataflow for use cases like [Extract, Transform, Load], batch data processing and streaming analytics, and it will automatically optimize, deploy and manage the code and resources required," Google explained in its blog post.

What sets Google Cloud Dataflow apart from other systems is its use of internal Google technologies that the company built after it ran into problems with MapReduce, a computing approach that Yahoo! later turned into the open-source tech Hadoop.

"We don't use MapReduce anymore," explained Google's cloud marketing bloke Brian Goldfarb in a chat with The Reg. "We wanted something new that handles exabyte scale [data]. If you want to use batch or real-time they're completely different programming models or concepts, we're trying to merge all those together."

One way to use the tech is real-time anomaly detection, such as analyzing World Cup data against historical data to spot when a game between two teams takes an unexpected turn. Other examples we can think up include isolating failure scenarios in large amounts of machine-generated data, like server logs, and spotting changes in shopper behavior on an e-commerce site after a design tweak.
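For a feel of what streaming anomaly detection means in practice, here is a minimal sketch: flag any value that sits far from the trailing-window mean. This is a generic z-score check written for this article, not Google's method, and the function name and thresholds are our own invention.

```python
# Hedged sketch of streaming anomaly detection: flag values that deviate
# sharply from the recent past. Not Google's algorithm; purely illustrative.
import statistics
from collections import deque

def flag_anomalies(stream, window=5, z=3.0):
    """Yield (value, is_anomaly) pairs, comparing each value to the
    mean and spread of the preceding `window` values."""
    recent = deque(maxlen=window)
    for v in stream:
        if len(recent) >= 2:
            mean = statistics.mean(recent)
            spread = statistics.pstdev(recent) or 1e-9  # avoid divide-by-zero
            yield v, abs(v - mean) / spread > z
        else:
            yield v, False  # not enough history yet
        recent.append(v)

readings = [1, 1, 1, 1, 100]  # e.g. goals-per-minute suddenly spiking
print(list(flag_anomalies(readings)))
```

A real deployment would run a check like this over each arriving record in the pipeline, rather than over a list in memory.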

Cloud Dataflow depends on two internal Google technologies – FlumeJava and MillWheel – to make it "a fully managed service for creating data pipelines that ingest, transform and analyze data in both batch and streaming modes," according to Google's blog post.

The tech will compete with Amazon Web Services' Kinesis product, launched last year at the company's Re:Invent conference in Las Vegas. Kinesis specializes in streaming data, and users have to turn to other AWS services, such as Elastic MapReduce, for batch jobs.

Google is able to bung streaming and batch analysis together in one platform through its use of FlumeJava and MillWheel.

Stick that in your pipeline and smoke it

"FlumeJava is a pure Java library that provides a few simple abstractions for programming data-parallel computations," Google explains in an academic paper describing the tech [PDF]. "These abstractions are higher-level than those provided by MapReduce, and provide better support for pipelines."
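One of FlumeJava's key tricks, per that paper, is deferred evaluation: operations build a plan that an optimizer can fuse before anything runs. The toy below mimics the idea in Python. PCollection and parallelDo are real names from the paper, but this class and its parallel_do method are our simplified stand-ins, not FlumeJava's implementation.

```python
# Toy sketch of FlumeJava-style deferred evaluation. PCollection/parallelDo
# are names from Google's paper; this Python analogue is invented for
# illustration and fuses a chain of per-element operations into one pass.

class PCollection:
    def __init__(self, data, plan=None):
        self.data = data
        self.plan = plan or []

    def parallel_do(self, fn):
        # Record the operation in the plan instead of running it now.
        return PCollection(self.data, self.plan + [fn])

    def run(self):
        # "Optimize": compose the recorded operations so the data is
        # traversed only once, rather than once per stage.
        def fused(x):
            for fn in self.plan:
                x = fn(x)
            return x
        return [fused(x) for x in self.data]

result = (PCollection([1, 2, 3])
          .parallel_do(lambda x: x + 1)
          .parallel_do(lambda x: x * 10)
          .run())
print(result)  # [20, 30, 40]
```

The real system's optimizer does far more (fusing MapReduce-style stages, choosing execution strategies), but the build-a-plan-then-run shape is the essence.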

MillWheel, meanwhile, is "a framework for building low-latency data-processing applications that is widely used at Google," according to a paper describing it [PDF]. "Users specify a directed computation graph and application code for individual nodes, and the system manages persistent state and the continuous flow of records, all within the envelope of the framework's fault-tolerance guarantees."
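In other words, the developer writes the per-node logic and MillWheel handles the plumbing. Here is a minimal sketch of what "application code for an individual node" with framework-managed state might look like; the class and method names are invented for illustration, not MillWheel's interface.

```python
# Toy sketch of a MillWheel-style computation node (names invented):
# the user supplies per-record logic; in the real system the framework
# persists the node's state and guarantees fault-tolerant delivery.

class CountingNode:
    """Keeps a running count of records per key, emitting updates downstream."""

    def __init__(self):
        # In MillWheel this state would be checkpointed by the framework,
        # not held in plain process memory as it is in this sketch.
        self.state = {}

    def process(self, key, record):
        # Per-record user logic: bump the count and emit (key, count).
        self.state[key] = self.state.get(key, 0) + 1
        return (key, self.state[key])

node = CountingNode()
print(node.process("brazil", {"event": "goal"}))  # ('brazil', 1)
print(node.process("brazil", {"event": "goal"}))  # ('brazil', 2)
```

Wire many such nodes into a directed graph, and the framework streams records between them while keeping each node's state durable across failures.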

"Imagine a world where you're connecting complex open source packages like Kafka and Hadoop together... by having [them accessible through] one API as a managed service," Goldfarb explained.

Alongside the tool, Google announced new monitoring utilities for its cloud to help "developers understand, diagnose and improve systems in production."

These include Google Cloud Monitoring, which uses software from recent Google acquisition Stackdriver to give developers metrics, dashboards and alerts for Google's own technology along with Apache, Nginx, MongoDB, MySQL, Tomcat, IIS, Redis, Elasticsearch and others.

"You can use Cloud Monitoring to identify and troubleshoot cases where users are experiencing increased error rates connecting from an App Engine module or slow query times from a Cassandra database with minimal configuration," the company said. To help developers isolate specific faults, it has also launched "Cloud Trace", which visualizes the time an app spends processing specific requests.

Finally, Google has released Cloud Debugger, which promises "to help you debug your application in production with effectively no performance overhead," by giving devs "a full stack trace and snapshots of all local variables for any watchpoint that you set in your code while your application continues to run undisturbed in production."

Though Google did not disclose what technologies have enabled these services, it's highly likely that the company's advanced low-overhead "CPI2" monitoring brain is feeding data up into the aforementioned cloud systems. ®
