
Intel adds Lustre support to Hadoop

Chipzilla hopes HPC owners get busy with map reduce in their spare time


World+dog agrees that Hadoop is a very fine tool with which to tackle map reduce chores, but the software has a couple of constraints, especially its reliance on the Hadoop Distributed File System (HDFS).

There's nothing wrong with HDFS, but its integration with Hadoop means the software needs a dedicated cluster of computers on which to run.

That's not a bad thing, for many reasons. But folks who run high performance computing clusters for other purposes often don't run HDFS, which leaves them with a bunch of computing power, tasks that could almost certainly benefit from a bit of map reduce, and no way to put that power to work running Hadoop.

Intel's noticed this and, in version 2.5 of its Hadoop distribution that it quietly released last week, has added support for Lustre.

Girish Juneja, Intel's general manager for big data and software services, thinks Chipzilla's HPC customers are going to love that, and that the rest of us won't mind either, given Intel's following all the open source rules with this contribution.

“Many customers do not want to deploy an entirely separate physical cluster just because we couldn't figure out how to run Hadoop on their file system,” Juneja told The Reg at Intel's Big Data and Cloud Summit in Ho Chi Minh City*. “HPC was a prime target for that. In the HPC segment a large part of the market runs on GPFS or Lustre and we had the fortune of having Lustre in my division.”

“We abstracted out an HDFS layer but underneath that it is actually talking to Lustre.

“So if you look at Los Alamos lab and these research labs with humongous clusters that run HPC jobs 90 per cent of the time, but for ten per cent of the time they want to run a Hadoop job they can run it in exactly the same environment without moving data.”
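Intel hasn't published the fine detail of that abstraction layer here, but the general shape is familiar Hadoop territory: swap the filesystem underneath Hadoop's FileSystem API and leave the MapReduce code alone. Here's a rough sketch of the idea, with a made-up lustrefs:// scheme and implementation class standing in for whatever Intel's adapter actually registers:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch only: the "lustrefs" scheme and the implementation class name below
// are hypothetical stand-ins, not Intel's actual adapter.
public class LustreBackedListing {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Point Hadoop's default filesystem at the Lustre-backed adapter instead of HDFS.
        conf.set("fs.defaultFS", "lustrefs:///");
        // Register which FileSystem subclass handles the "lustrefs" scheme.
        conf.set("fs.lustrefs.impl", "com.example.hadoop.fs.LustreFileSystem");

        // From here on it is plain Hadoop: the same calls a job would make against HDFS.
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/datasets"))) {
            System.out.println(status.getPath());
        }
    }
}
```

The point is that jobs keep calling the ordinary FileSystem API and only the configuration changes, so data already sitting on Lustre never has to be copied into a dedicated HDFS cluster first.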

Given such labs deal in vast quantities of data, the chance to leave it in place will be welcome.

Chipzilla's also turned its attention to encryption and access control lists for HBase.

“In this NoSQL environment the challenge becomes how to designate who has access to what data,” Juneja said. “We have added capabilities to allow for access control lists” that let administrators set policies for who can access what data in HBase.
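Juneja didn't go into implementation detail, and Intel pairs the feature with its own management console, but for a flavour of what per-table access control looks like in stock Apache HBase (via its AccessController coprocessor), here's a sketch. The table, column family and user are made up, and whether Intel's distribution exposes exactly this API is an assumption:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.security.access.AccessControlClient;
import org.apache.hadoop.hbase.security.access.Permission;

// Sketch only: uses stock Apache HBase ACLs, not Intel's own tooling, and the
// names below ("transactions", "pii", "alice") are illustrative.
public class HBaseAclSketch {
    public static void main(String[] args) throws Throwable {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf)) {
            // Let analyst "alice" read the "pii" column family of "transactions";
            // write and admin rights stay with the cluster operators.
            AccessControlClient.grant(
                    connection,
                    TableName.valueOf("transactions"),
                    "alice",
                    "pii".getBytes(),    // column family
                    null,                // any qualifier in that family
                    Permission.Action.READ);
        }
    }
}
```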

Juneja feels that addition, together with the introduction of encryption and data anonymisation, will mean financial services providers and users with heavy compliance requirements can now consider Hadoop. In the past, Juneja said, the absence of those security-oriented features meant Hadoop represented unacceptable risk.

Intel also sells its own management software to drive the access control lists, an arrangement Juneja feels customers won't mind too much.

Version 3.0 of Chipzilla's Hadoop distribution is also close to release, with September targeted for its emergence. Juneja said users can expect Intel's distribution to very closely resemble the efforts of the wider Hadoop community. ®

*The author attended the summit as a guest of Intel, which paid for flights and accommodation.
