Amazon pumps sky-high Big Data cruncher

NetBeans IDE hugs Hadoop

Hadoop World Amazon has juiced its Big-Data-crunching Elastic MapReduce service, announcing support for Hive, the SQL-like query engine for the open-source Hadoop platform that underpins the service.

Hadoop mimics the GFS and MapReduce platforms that drive Google's back-end infrastructure, and with Elastic MapReduce, Amazon sits the open-source animal atop its sky-high compute resource services: EC2 (Elastic Compute Cloud) and S3 (Simple Storage Service).
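
For the uninitiated, a MapReduce job boils down to a map step that turns raw input into key-value pairs and a reduce step that aggregates them once Hadoop has sorted and grouped the pairs by key. The canonical word-count sketch below uses Hadoop's Streaming interface, which lets both steps be ordinary scripts reading standard input; it is purely illustrative, and the jar name and file paths in the comment are assumptions rather than anything Amazon ships.

#!/usr/bin/env python
# Canonical word-count sketch for Hadoop Streaming: one script acts as the
# mapper or the reducer depending on its first argument. The jar path and
# input/output directories in this example invocation are illustrative:
#
#   hadoop jar hadoop-streaming.jar \
#       -mapper "wordcount.py map" -reducer "wordcount.py reduce" \
#       -input /logs/raw -output /logs/counts
import sys

def mapper():
    # Emit one "word<TAB>1" pair per token; the framework sorts these by
    # key before they reach the reducer.
    for line in sys.stdin:
        for word in line.split():
            print("%s\t1" % word.lower())

def reducer():
    # Input arrives grouped by key, so a running total per word is enough.
    current, count = None, 0
    for line in sys.stdin:
        word, _, value = line.rstrip("\n").partition("\t")
        if word != current and current is not None:
            print("%s\t%d" % (current, count))
            count = 0
        current = word
        count += int(value)
    if current is not None:
        print("%s\t%d" % (current, count))

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()

On Elastic MapReduce, the same pair of scripts can in principle be pointed at S3 paths rather than HDFS ones, which is the whole pitch.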

In August, Amazon introduced support for Pig, a Yahoo!-developed programming language for Hadoop, and this morning at the Hadoop World developer conference in midtown Manhattan, the company announced its embrace of Hive, a higher-level language developed at Facebook.

Amazon's Peter Sirota said that Elastic MapReduce will support Hive version 0.4, with the query engine integrated into the service's web console. "We're also introducing new features for Hive that make it a little bit easier to use in the cloud," he said. You can load partitions automatically from Amazon S3, and you can set up a data store shared by several Hadoop clusters.
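
To make that concrete, the statements involved might look something like the following, driven here through the stock hive command-line client. The table, bucket, and partition names are invented, and the announcement doesn't spell out the syntax of Amazon's automatic partition-loading extension, so a plain ADD PARTITION statement stands in for it.

#!/usr/bin/env python
# Rough sketch of registering data that already sits in S3 as partitions of
# an external Hive table, driven through the standard "hive -e" client.
# The table name, bucket, and partition value are hypothetical, and the
# ADD PARTITION statement stands in for whatever automatic loading Amazon
# has layered on top.
import subprocess

HQL = """
CREATE EXTERNAL TABLE IF NOT EXISTS weblogs (ip STRING, url STRING)
PARTITIONED BY (dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t'
LOCATION 's3://example-bucket/weblogs/';

-- Register one day's worth of data already sitting under the bucket.
ALTER TABLE weblogs ADD PARTITION (dt='2009-10-02')
LOCATION 's3://example-bucket/weblogs/dt=2009-10-02/';

SELECT dt, COUNT(*) FROM weblogs GROUP BY dt;
"""

if __name__ == "__main__":
    # Assumes the Hive CLI is on the PATH of the cluster's master node.
    subprocess.check_call(["hive", "-e", HQL])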

He also said that Hive performance had been optimized on the Amazon services. "We're optimizing data writes. So if you run inside the Amazon Elastic MapReduce, your jobs will run faster and more reliably."

Sirota then introduced a private beta release of an Elastic MapReduce incarnation based on the Hadoop distro from Cloudera, the star-studded startup that has commercialized the open-source platform in Red Hat-like fashion. In other words, Elastic MapReduce users can now turn to Cloudera for a support contract.

Separately, Cloudera offers a version of its distro that you can run on EC2 and S3 on your own. It has also introduced similar distros for VMware's imminent vCloud as well as so-called cloud services from Rackspace and SoftLayer.

Meanwhile, Amazon joined Karmasphere in announcing that Elastic MapReduce can now be used in tandem with Karmasphere Studio for Hadoop, an integrated development environment based on the popular NetBeans IDE. With the free studio, you can prototype Hadoop jobs on the PC desktop and then deploy, debug, and monitor them on an in-house Hadoop cluster or, yes, Amazon's Elastic MapReduce.

This morning, Cloudera released its own desktop tool for Hadoop, a GUI designed to run a number of applications for interacting with clusters. At the moment, it works only with the Cloudera distro. ®
