
Microsoft brings Hadoop option to SQL Server

Open-source crunchware SQOOPs to conquer


Microsoft customers running SQL Server are getting a taste of really big data processing through an injection of Hadoop.

The company has released early code that will let Microsoft customers plug Doug Cutting's open-source Java framework into SQL Server 2008 R2, SQL Server Parallel Data Warehouse for huge data warehouses, and the next version of Microsoft's database, codenamed Denali.

Hadoop was built by Cutting, who was inspired by Google's MapReduce. It is becoming something of an industry standard for processing huge amounts of data on clustered servers thanks to the fact that its code is open. Hadoop has also been adopted by top-tier web properties including Amazon, Facebook and Twitter.

The industry thinking is that Hadoop can trickle down to customers outside the rarefied circles of serious number-crunchers, where it is used to understand the changing minutiae of millions of users' likes and status updates in order to change services in response. The aim is for Hadoop to find its feet in more mainstream IT.

Microsoft's Research unit has been working on something that sounds remarkably similar to Hadoop, called Dryad, since about 2006. Earlier this year the plan was to "productise" Dryad through integration with SQL Server and its Windows Azure cloud. There have been no updates from Microsoft, but it seems Dryad must now compete for the affections of big-data lovers on SQL Server.

The Microsoft connectors, called the Hadoop Connector for SQL Server Parallel Data Warehouse and the Hadoop Connector for SQL Server, are available as Community Technology Previews (CTPs).

The connectors are two-way, letting you move data backwards and forwards between Hadoop and Microsoft's database servers.

Microsoft said the connectors would let its customers process unstructured data in Hadoop and then pull the results back into SQL Server environments for analysis.

Both connectors use Sqoop (SQL-to-Hadoop) to transfer the data "efficiently" between the Hadoop Distributed File System (HDFS) and Microsoft's relational databases. The Parallel Data Warehouse connector uses the PDW Bulk Load/Extract tool for fast import and export of data.
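By way of illustration, Sqoop transfers of this kind are driven from the command line. The sketch below assumes a hypothetical SQL Server host (dbhost), database (sales), tables (Orders, OrderSummary) and HDFS paths; Microsoft's CTP connectors may layer their own options on top of stock Sqoop syntax.

    # Pull a SQL Server table into HDFS for Hadoop processing
    sqoop import --connect "jdbc:sqlserver://dbhost:1433;databaseName=sales" \
        --username hadoopuser -P --table Orders --target-dir /data/orders

    # Push Hadoop results back into a SQL Server table
    sqoop export --connect "jdbc:sqlserver://dbhost:1433;databaseName=sales" \
        --username hadoopuser -P --table OrderSummary --export-dir /data/order_summary

In both directions Sqoop splits the job into parallel map tasks that read or write the database over JDBC, which is where the claimed efficiency comes from.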

SQL Server PDW customers can get the Hadoop connector from Microsoft, while users of regular SQL Server 2008 R2 can get the code for the Hadoop Connector for SQL Server here. ®


