
Server workloads to go '70% virtual' by 2014

Uh, that's it?


By 2014, more than 70 per cent of all server workloads installed that year will be plunked down not on a bare metal piece of iron, but on a virtual or logical machine of some kind, according to the (virtual) box counters at IDC.

But don't get the wrong idea. There are going to be plenty of old-fashioned physical boxes still being sold over the next four years and beyond, as far as IDC can tell. The company is projecting that a little more than 23 per cent of all physical servers shipped will be "actively supporting virtual machine technology," representing $19.3bn in underlying server spending and 36 per cent of the overall $53.6bn in spending IDC is projecting for server iron in 2014.
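
For the back-of-the-envelope crowd, the unit and revenue shares reconcile easily enough. The Python sketch below is our own derivation from IDC's rounded figures, not anything the box counters themselves published:

    # Back-of-the-envelope check of IDC's 2014 revenue split (our derivation, not IDC's)
    virtualized_spend_bn = 19.3      # server spend underpinning VM-capable boxes
    total_spend_bn = 53.6            # total projected server spend in 2014
    revenue_share = virtualized_spend_bn / total_spend_bn
    print(f"revenue share: {revenue_share:.0%}")       # ~36%, matching IDC's figure

    # 23% of units carrying 36% of revenue implies the virtualized boxes command a
    # higher average price than the fleet as a whole -- roughly 1.6x, if IDC's
    # share figures are taken at face value.
    print(f"relative average price: {0.36 / 0.23:.2f}x")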

Revenue from the sale of virtualized server iron is growing at a 14 per cent compound annual growth rate between 2009 and 2014 (inclusive), which IDC says is more than twice the growth rate of the server market overall.
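
If you're rusty on compounding, a 14 per cent CAGR over that five-year stretch works out to nearly a doubling. The quick sketch below assumes simple year-on-year compounding from the 2009 base; IDC's absolute revenue endpoints aren't given, so only the multiples are shown:

    # Compound annual growth: five compounding steps from 2009 to 2014
    cagr = 0.14
    years = 2014 - 2009
    multiple = (1 + cagr) ** years
    print(f"virty server revenue multiple over {years} years: {multiple:.2f}x")  # ~1.93x, nearly doubling

    # "More than twice the rate" of the overall market puts the market's CAGR under 7%,
    # which compounds to less than a ~1.4x rise over the same period.
    print(f"overall market at 7% CAGR: {1.07 ** years:.2f}x")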

IDC further speculates that companies will consume 9.6 million physical servers in 2014, and that only 2.2 million of them will be virtualized. But here's the interesting twist. While that leaves 7.4 million non-virty servers sold in 2014, the 2.2 million virty boxes will carry an average of 8.5 virtual machines or logical partitions apiece, meaning the boxes sold in 2014 alone will host some 18.4 million virtual/logical servers.
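
The arithmetic roughly hangs together: the rounded unit count times the rounded VM density lands a shade under 18.7 million, so IDC's 18.4 million figure presumably rests on unrounded inputs. Here's the sketch, using only the rounded numbers above:

    # Reconstructing IDC's 2014 shipment split from the rounded figures in the report
    total_servers_m = 9.6            # physical servers shipped in 2014 (millions)
    virtualized_servers_m = 2.2      # of which actively supporting VMs (millions)
    non_virtualized_m = total_servers_m - virtualized_servers_m
    print(f"non-virtualized boxes: {non_virtualized_m:.1f}M")   # 7.4M, as stated

    vms_per_box = 8.5                # average VM/logical partition count per virty box
    logical_servers_m = virtualized_servers_m * vms_per_box
    print(f"logical servers on 2014 shipments: {logical_servers_m:.1f}M")  # ~18.7M vs IDC's 18.4M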

"Server virtualization is the 'killer app' for the datacenter and has forever changed IT operations," explains Michelle Bailey, research vice president for enterprise platforms and datacenter trends, who put together the virty server report. "Most datacenters have had a 'virtual first' approach to server deployment for the last three years and this has meant that the majority of application instances now reside inside a virtual machine.

"IDC expects that 2010 will be the first year when more than half of all installed application instances will run inside a virtual machine. This has profound implications for not just maintenance and management of the datacenter but also adjacent infrastructure such as storage and networking. As the needs of the enterprise begin to turn to management constraints, large virtualization customers will have to begin to more seriously consider investments in automation tools and converged hardware as a means to lower time to deployment and simplify an increasingly complex datacenter infrastructure."

To make its projections, IDC took out its server models and then augmented them with detailed information on server plans that it gathered from more than 400 IT shops worldwide. And based on what these companies told the IT prognosticator, IDC says that as server virtualization matures, it will radiate out from the data center to branch offices and emerging markets where the idea of having a server was relatively new even a few years ago.

It would have been interesting to see more fine-grained installed base data for virtual versus physical servers - as well as private versus hosted servers of both the virty and real kind - instead of just getting the data for the 2014 end point. But IDC had not responded to calls for more information at press time.

There is bound to be lots of grousing about the vagueness of these auguries among El Reg readers. For one thing, it is a tricky business talking about installed hypervisors and VM counts. Just as IDC can't really know what operating system is installed on a server once it leaves the Hewlett-Packard, Dell, IBM, or Oracle factory, it can't really know whether a machine that doesn't start out virtualized ends up with a hypervisor on it, either the moment it comes off the pallet and enters the data center or later, after it is repurposed.

Moreover, with so many extra cores coming down the pike in future processors, it is hard to imagine that using a virtualization hypervisor of some kind won't become the norm almost regardless of workload, simply because of the ease of managing and repurposing machines that hypervisors and their related control consoles offer. Supercomputers have remained un-hypervisored because clusters have had, for years, image and workload schedulers that can provision machines on the fly. But it may turn out to be faster still to provision a virtualized HPC stack than to use cluster management tools. Whatever is faster, cheaper, and easier will inevitably become the normal practice.

It is hard to imagine that small and medium businesses won't be eager to have virtualized server images running on hypervisors, particularly if server makers and their cloudy partners (or internal services divisions) offer remote failover and recovery of virtualized servers running in the data closets of the world. If anything could possibly kill tape backup once and for all, virtual server failover just might be it, provided it is not too costly and doesn't require a lot of bandwidth once the initial images are offloaded to a backup site.

If anything, IDC's numbers for physical-only server shipments might be low, depending on the services that companies come up with in the next four years. ®
