Pivotal: So who fancies skinny-dipping in our 'Business Data Lake'? PS: It'll cost you

Tries to tempt more into its Hadoop pond with Pivotal HD 2

EMC and VMware's spin-off Pivotal is trying to make money from a bevy of open-source big-data-crunching technologies by making their most complicated aspects disappear.

With the launch of Hadoop distribution Pivotal HD 2.0 and analytics engine Pivotal GemFire XD on Monday, the company has created a clutch of technologies that let organizations make sense of large amounts of data without having to deal with the hard-to-master traditional Hadoop interfaces.

The two bits of software together give companies an in-memory SQL data store that sits upon data stored in the Hadoop Distributed File System (HDFS), and an engine called HAWQ that can query that data via SQL.
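HAWQ is descended from Greenplum, which in turn is descended from PostgreSQL, so in principle a stock Postgres client can lob ordinary SQL at tables whose bytes ultimately live in HDFS. Here's a minimal sketch using Python's psycopg2 driver; the host, database, and table names are hypothetical stand-ins rather than anything Pivotal ships:

    # Minimal sketch: querying HDFS-resident data through HAWQ's SQL interface.
    # HAWQ speaks the PostgreSQL wire protocol, so a standard Postgres driver
    # such as psycopg2 should do; the connection details and the 'sales' table
    # below are hypothetical.
    import psycopg2

    conn = psycopg2.connect(
        host="hawq-master.example.com",   # hypothetical HAWQ master node
        port=5432,
        dbname="analytics",
        user="gpadmin",
    )

    with conn, conn.cursor() as cur:
        # Plain SQL, even though the underlying files sit in HDFS
        cur.execute("""
            SELECT customer_id, SUM(amount) AS total_spend
            FROM sales
            GROUP BY customer_id
            ORDER BY total_spend DESC
            LIMIT 10;
        """)
        for customer_id, total_spend in cur.fetchall():
            print(customer_id, total_spend)

    conn.close()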

In addition, HAWQ has gained integration with GraphLab, OpenMPI, and MADlib – software that gives it an integrated set of algorithms for analyzing both relational data and graphs. It has also added compatibility for queries written in R, Python, and Java.
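MADlib exposes its algorithms as SQL functions, so a model can, in theory, be trained right where the data sits. The sketch below calls MADlib's linear-regression trainer (madlib.linregr_train is a real MADlib routine); the table, columns, and connection details are again hypothetical:

    # Minimal sketch: running a MADlib algorithm through HAWQ's SQL interface.
    # madlib.linregr_train is a genuine MADlib function; the 'houses' table,
    # its columns, and the connection details are hypothetical.
    import psycopg2

    conn = psycopg2.connect(host="hawq-master.example.com",
                            dbname="analytics", user="gpadmin")

    with conn, conn.cursor() as cur:
        # Fit price ~ intercept + size + bedrooms, writing the model to a table
        cur.execute("""
            SELECT madlib.linregr_train(
                'houses',                    -- source table in HAWQ
                'houses_linregr',            -- output table for the fitted model
                'price',                     -- dependent variable
                'ARRAY[1, size, bedrooms]'   -- independent variables (1 = intercept)
            );
        """)
        cur.execute("SELECT coef, r2 FROM houses_linregr;")
        coef, r2 = cur.fetchone()
        print("coefficients:", coef, "r-squared:", r2)

    conn.close()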

Pivotal said in a press release that the launch of these products would "constitute the foundation for the Business Data Lake architecture" – language that caused our brains to melt and leak out of our noses in the form of a sad beige paste.

Marketing aside, the technology is Pivotal's response to the proliferation of structured and unstructured data types within organizations. Pivotal hopes that companies will spend big for the pleasure of being able to ingest, store, analyze, and query data within a single stack of integrated software modules. (Pivotal doesn't disclose prices and encourages people to get in touch, so it's using a "how much can you afford?" approach to enterprise pricing.)

Pivotal's gooey blend of open source and proprietary tech

It has done this by bolting numerous additions on top of the stock Hadoop distribution, and it reckons that its modular architecture will help it update the software over time without transforming the application's codebase into a gigantic ball of spaghetti.

Future areas of development for the software include multi-tenancy, further Hadoop enhancements, and folding in other open-source projects such as Apache Spark.

"Multi-tenancy is a major theme we want to focus on next [by] allowing different groups and lines of users, and different types of workload requirements, to work with our data fabric," explained Anant Chintamaneni, Pivotal's director of product management for the Pivotal HD stack, in a chat with The Register.

Though the main changes to Pivotal HD involve putting proprietary, easy-to-use data querying and analysis interfaces on top of Hadoop, Chintamaneni acknowledged that initial Hadoop installations are still troublesome for some organizations.

"When a customer deploys it, our guidance always is to plan," he said. "You do have to plan exactly how you're going to lay it out. If you want to run HBase and HAWQ on same cluster you need to apportion the cluster accordingly."

The company may even expand its open-source contributions, he confirmed, saying Pivotal is thinking of contributing some of its technology to Hadoop's advanced YARN job scheduler.

Hadoop may be a darling of sophisticated engineering-heavy companies, but Pivotal is betting that by making it easier to use, it may finally be able to make some cash out of the open-source project. ®
