
Microsoft's SQL Server 2014 early code: First look

Welcome to the Hekaton


An in-memory database engine, improved integration with Windows Azure, and new indexing technology for high-performance data warehousing applications - there's plenty to like in SQL Server 2014, released to manufacturing on Tuesday.

But while Microsoft has been busy and done some heavy lifting, the code that will become generally available on 1 April has some glaring rough edges.

Let's tackle the best bit first. I looked at CTP2 (Community Technology Preview 2) and later builds, and tried them in a hands-on lab.

The in-memory database engine codenamed Hekaton is the most eye-catching feature, thanks to the dramatic performance increase it can offer – up to 30 times, according to Microsoft. The feature has been in development for five years, program manager lead Kevin Liu told journalists at the SQL Server 2014 workshop I attended.

The database engine is new code which accesses data directly in memory, uses a high level of concurrency, and compiles stored procedures to native code for further optimisation. A copy of the data is streamed to disk for persistence, though you can disable this for maximum performance if you do not care about losing data.
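To make that concrete, here is a minimal sketch of how a memory-optimized table is declared - database, filegroup, table and column names are all invented for illustration. DURABILITY = SCHEMA_AND_DATA streams a copy to disk; SCHEMA_ONLY trades persistence for speed:

    -- Sketch only: names are hypothetical. A memory-optimized filegroup
    -- must exist before any in-memory table can be created.
    ALTER DATABASE SalesDemo
        ADD FILEGROUP SalesDemo_mod CONTAINS MEMORY_OPTIMIZED_DATA;
    ALTER DATABASE SalesDemo
        ADD FILE (NAME = 'SalesDemo_mod', FILENAME = 'C:\Data\SalesDemo_mod')
        TO FILEGROUP SalesDemo_mod;

    CREATE TABLE dbo.Orders (
        OrderID INT NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
        Placed  DATETIME2 NOT NULL,
        Amount  MONEY NOT NULL
    ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);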

The performance benefit is real. Even on a modest virtual machine running on Azure (four cores, 7GB memory) I saw the time for inserting 100,000 orders, each in its own transaction, decline from two minutes and 54 seconds to 36 seconds after switching to in-memory tables.
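For context, my test was of this rough shape - a simple loop against the hypothetical dbo.Orders table above, with each insert running in its own autocommit transaction; it is not Microsoft's benchmark script:

    DECLARE @i INT = 1, @t0 DATETIME2 = SYSUTCDATETIME();
    WHILE @i <= 100000
    BEGIN
        -- Autocommit: each INSERT is its own transaction
        INSERT dbo.Orders (OrderID, Placed, Amount)
        VALUES (@i, SYSUTCDATETIME(), 9.99);
        SET @i += 1;
    END;
    SELECT DATEDIFF(SECOND, @t0, SYSUTCDATETIME()) AS ElapsedSeconds;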

Another plus is integration: you can mix in-memory and disk-based tables in the same database, though querying across both is inefficient.

There are limitations, though. The most severe is a long list of T-SQL features that are not supported for in-memory tables, including IDENTITY, UNIQUE constraints, OUTER JOIN, IN, LIKE and DISTINCT, as well as triggers and BLOB fields. Workarounds are suggested, but this does mean a porting effort to take advantage.
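As an example of the sort of workaround on offer: IDENTITY columns are out, but a SEQUENCE object (a hypothetical one shown here) can supply key values instead:

    -- IDENTITY is not supported on in-memory tables in this release;
    -- a SEQUENCE object is one suggested substitute.
    CREATE SEQUENCE dbo.OrderSeq AS INT START WITH 1;

    INSERT dbo.Orders (OrderID, Placed, Amount)
    VALUES (NEXT VALUE FOR dbo.OrderSeq, SYSUTCDATETIME(), 9.99);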

SQL Server 2014: all the features, but can they be used?

There are some other limitations in this first release. One is a recommendation that in-memory data should not exceed 256GB.

“Rest assured, that is something we will bump up drastically in the next release,” Liu said.

The other is that “the recommendation for hardware is two sockets”, to avoid NUMA (Non-Uniform Memory Access) issues that affect performance.

The best fit for in-memory tables is where business logic lives in stored procedures and client-server communication is not too chatty. Applications that implement business logic in external code benefit less.
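That is because the biggest gains come from natively compiled stored procedures. A minimal sketch, again using the hypothetical dbo.Orders table:

    -- Compiled to native code at creation; may touch only
    -- memory-optimized tables, hence SCHEMABINDING is required.
    CREATE PROCEDURE dbo.InsertOrder @OrderID INT, @Amount MONEY
    WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
    AS
    BEGIN ATOMIC WITH
        (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
        INSERT dbo.Orders (OrderID, Placed, Amount)
        VALUES (@OrderID, SYSUTCDATETIME(), @Amount);
    END;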

Microsoft is also making a big deal of new Azure integration, and there are several possible scenarios. You can mount database files held in Azure storage; latency makes this unsuitable in many cases, though SQL Server caches the most active data, and it can be useful for archiving.
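The plumbing looks something like this - a credential named after the blob container lets SQL Server address the files directly; the account, container and SAS token here are placeholders:

    CREATE CREDENTIAL [https://myaccount.blob.core.windows.net/data]
        WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
        SECRET = '<shared-access-signature-token>';

    CREATE DATABASE ArchiveDB
    ON (NAME = ArchiveDB_dat,
        FILENAME = 'https://myaccount.blob.core.windows.net/data/ArchiveDB.mdf')
    LOG ON (NAME = ArchiveDB_log,
        FILENAME = 'https://myaccount.blob.core.windows.net/data/ArchiveDB.ldf');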

Of wider use is the ability to back up to Azure storage, which is now built in. In Management Studio you can select URL as a backup destination, which prompts for Azure credentials. There is also a new Managed Backup tool, aimed at smaller organisations, which will automatically back up databases to Azure storage. You need only configure the credentials and the data retention period.
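Under the covers this is ordinary T-SQL; a sketch, with the storage account name and access key as placeholders:

    CREATE CREDENTIAL AzureBackupCred
        WITH IDENTITY = 'mystorageaccount',   -- storage account name
        SECRET = '<storage-account-access-key>';

    BACKUP DATABASE SalesDemo
    TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/SalesDemo.bak'
    WITH CREDENTIAL = 'AzureBackupCred', COMPRESSION;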

Another Azure feature is replication to SQL Server databases running on Azure VMs. An Add Azure Replica wizard sets up an AlwaysOn availability replica in the cloud.

Resource Governor lets you limit the resources available to specific users


