Microsoft brings Hadoop option to SQL Server

Open-source crunchware SQOOPs to conquer

Microsoft customers running SQL Server are getting a taste of really big data processing through an injection of Hadoop.

The company has released early code that will let its customers plug the open-source Java framework created by Doug Cutting into SQL Server 2008 R2, into SQL Server Parallel Data Warehouse for very large data warehouses, and into the next version of Microsoft's database, codenamed Denali.

Hadoop was built by Cutting, who was inspired by Google's MapReduce. Thanks partly to its open code, it is becoming something of an industry standard for processing huge amounts of data on clustered servers, and it has been adopted by top-tier web properties including Amazon, Facebook and Twitter.
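
For the uninitiated, the programming model Hadoop borrows from MapReduce boils down to two functions: a map step that turns raw input into key/value pairs, and a reduce step that aggregates the values for each key across the cluster. The canonical word-count example, written against Hadoop's standard Java API, gives the flavour. This is a minimal sketch with the job-submission wiring omitted, not anything Microsoft ships:

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

// Map step: emit (word, 1) for every whitespace-separated token in a line.
class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        for (String token : line.toString().split("\\s+")) {
            if (token.isEmpty()) continue;
            word.set(token);
            context.write(word, ONE);
        }
    }
}

// Reduce step: sum the counts gathered for each distinct word.
class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text word, Iterable<IntWritable> counts, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable count : counts) sum += count.get();
        context.write(word, new IntWritable(sum));
    }
}

Hadoop's runtime handles the rest: splitting the input across the cluster, shuffling each word's counts to a single reducer, and writing the totals back to its distributed file system.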

The industry thinking is that Hadoop can trickle down from the rarefied circles of serious number-crunchers, where it is used to understand the changing minutiae of millions of users' likes and status updates and to tune services in response, and find its feet in more mainstream IT.

Microsoft's Research unit has been working on something that sounds remarkably similar to Hadoop, called Dryad, since about 2006. Earlier this year the plan was to "productise" Dryad through integration with SQL Server and its Windows Azure cloud. There have been no updates from Microsoft, but it seems Dryad must now compete for the affections of big-data lovers on SQL Server.

The connectors, named Hadoop Connector for SQL Server Parallel Data Warehouse and Hadoop Connector for SQL Server, are available as Community Technology Previews (CTPs).

The connectors are two-way, letting you move data backwards and forwards between Hadoop and Microsoft's database servers.

Microsoft said the connectors would let its customers work on unstructured data in Hadoop and then pull the results back into their SQL Server environments for analysis.

Both connectors use SQL-to-Hadoop (Sqoop) to transfer data "efficiently" between the Hadoop Distributed File System (HDFS) and Microsoft's relational databases. The Parallel Data Warehouse connector also uses the PDW Bulk Load/Extract tool for fast import and export of data.
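
To give a flavour of the plumbing, the sketch below drives a stock Sqoop 1.x import from a SQL Server table into HDFS through Sqoop's Java entry point. This is an illustration of the underlying mechanism, not Microsoft's CTP code: the server, database, credentials, table and HDFS path are all made up, the JDBC URL follows the format used by Microsoft's SQL Server JDBC driver, and the entry-point class name varies between Sqoop releases (com.cloudera.sqoop.Sqoop in the Cloudera-era packages, org.apache.sqoop.Sqoop after the move to Apache):

import com.cloudera.sqoop.Sqoop;

// Hypothetical: import rows from a SQL Server table into an HDFS directory.
// All connection details below are placeholders, not real values.
public class SqlServerToHdfs {
    public static void main(String[] args) {
        String[] sqoopArgs = {
            "import",
            "--connect", "jdbc:sqlserver://dbhost:1433;databaseName=Sales",
            "--username", "etl_user",
            "--password", "secret",
            "--table", "Orders",
            "--target-dir", "/data/sales/orders"  // destination directory in HDFS
        };
        System.exit(Sqoop.runTool(sqoopArgs));    // returns 0 on success
    }
}

Moving crunched results back the other way is the mirror image: an "export" job pointed at an --export-dir in HDFS and a target table in SQL Server, which is what makes the connectors two-way.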

SQL Server PDW customers can get the Hadoop connector from Microsoft, while users of the regular SQL Server 2008 R2 can download the code for Hadoop Connector for SQL Server separately. ®
