Fujitsu saddles up its own Hadoop distro

Forget HDFS

Japanese IT conglomerate Fujitsu is throwing its own elephant into the ring, mashing up its own software with components of the Apache Hadoop big data muncher into a distribution that it says is better than running the open-source code all by its lonesome.

Like many Hadoop users, customers using Fujitsu's mainframe, Sparc, and x86 iron complain about the crankiness and limitations of the Hadoop Distributed File System (HDFS), and so the company has grabbed the Apache Hadoop 1.0 stack that was announced in January and given HDFS the boot. Or rather... not.

The problem, according to Fujitsu, is that enterprise systems and Hadoop systems store their data in different formats and in separate silos, so you end up uploading data from those enterprise systems to a Hadoop cluster, chewing on it there, and then downloading the reduced data back into the enterprise systems.
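
To make that complaint concrete, here is a minimal sketch of the conventional shuffle using the stock Hadoop 1.0 FileSystem API; the paths are hypothetical and the MapReduce job itself is elided:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ShuffleData {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem hdfs = FileSystem.get(conf);

        // Upload the export from the enterprise system into HDFS
        hdfs.copyFromLocalFile(new Path("/exports/sales.csv"),          // hypothetical local export
                               new Path("/user/etl/input/sales.csv"));  // hypothetical HDFS path

        // ... run the MapReduce job against the uploaded copy ...

        // Pull the reduced output back down for the enterprise systems
        hdfs.copyToLocalFile(new Path("/user/etl/output/part-00000"),
                             new Path("/exports/summary.csv"));
    }
}
```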

[Diagram: Fujitsu Hadoop stack. Plain Hadoop on top, juiced Fujitsu Hadoop on bottom]

The Interstage Big Data Parallel Processing Server V1.0 takes the Hadoop MapReduce framework and marries it to a proprietary distributed file system cooked up by Fujitsu that the enterprise systems and the Hadoop cluster can use as peers. This file system runs on the hosts, makes use of Fujitsu's Eternus disk arrays, and presents a standard Linux interface to those enterprise systems alongside an HDFS-compatible interface for the Hadoop cluster.
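
Fujitsu has not published the programming details, but an HDFS-compatible interface generally means a MapReduce job can be pointed at the shared file system simply by using its URI scheme in the job configuration, while enterprise applications read the same data through an ordinary Linux mount. A hedged sketch against the Hadoop 1.0 mapred API; the fjfs:// scheme and the /mnt/bigdata mount point below are purely illustrative:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;

public class SharedFsJob {
    public static void main(String[] args) throws Exception {
        JobConf job = new JobConf(SharedFsJob.class);

        // Hypothetical URI scheme for Fujitsu's distributed file system; the
        // same files would be visible to enterprise apps at a plain Linux
        // mount point such as /mnt/bigdata (also illustrative).
        FileInputFormat.setInputPaths(job, new Path("fjfs://shared/input/sales"));
        FileOutputFormat.setOutputPath(job, new Path("fjfs://shared/output/summary"));

        // ... mapper, reducer, and the rest of the job setup go here ...
    }
}
```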

Fujitsu does not name this proprietary distributed file system, but it could be a variant of the Fujitsu Exabyte File System, announced last year and targeted at the company's supercomputer customers. (FEFS is itself a variant of the open-source Lustre file system.)

The other innovation Fujitsu is tossing into its Interstage Hadoop distro is the ability to cluster the Hadoop master node – the controller that tells which server nodes to chew on which data, and both a single point of failure and a performance bottleneck in stock Hadoop clusters – for high availability.

The big plus is that both Hadoop and enterprise systems can chew on the data residing on the Eternus arrays, and this speeds up Hadoop jobs considerably because you are not waiting for enterprise data to be uploaded into the Hadoop cluster. This presumes that you don't have other external data that you also want to chuck into the MapReduce pot, and that is not necessarily a valid assumption for a lot of companies that are doing big data these days.

Interstage Big Data Parallel Processing Server V1.0 will begin shipping at the end of April. It will cost ¥600,000 ($7,465) per server processor for a license. Fujitsu says that prices outside of Japan may vary. ®
