Is server virtualization ready for production?

Beyond the low hanging fruit

Lab The adoption of server virtualization technology follows several trajectories. We could consider the breadth of its penetration, in terms of the number of organizations using it today, or we could consider the depth of its use within individual organizations, in terms of what they actually do with it.

The latter is of more relevance to IT departments that have already taken their first steps down the server virtualization path. For those organizations, if we think beyond the workloads already running in a virtualized environment, is there a ‘next’? And if there is, what is it?

Perhaps this is simplistic. When you think about virtualization, do you think in terms of the proportion of the x86 server estate in your datacenter that ‘could’ be virtualized, or do you think about the different types of workloads that need executing? Vendors with a vested interest in shifting virtualization technology tend to presume that anything that could be a candidate for virtualization automatically will be.

However, we know from Reg reader research that decision-making is typically focused on a simpler criterion: can it save money right now? As a result, virtualization tends to be employed for the more straightforward workloads that can be easily consolidated.

Admittedly, the notion of ‘straightforward’ is relative, although there are some commonly accepted candidates such as print servers, web servers and the like. Whether these are chosen because they are seen as cost-saving, low-risk, ‘non-core’ or ‘non-critical’ areas, it’s where most organizations cut their teeth. So where do we go from here? The answer has to be into areas of higher potential risk and less evident cost-benefit. So then: what is the rationale for making decisions?

Work up the list

To reiterate, the factors at play are: cost savings; virtualization benefit; business importance; and migration risk. Does IT simply ‘work up the list’ from least risk and importance? Or are those with prior experience now applying virtualization to areas that would benefit specifically from it, regardless of their importance to the business?

Factors around migration risk bring into question whether enough experience and confidence exist in the technology itself and in the systems on its periphery (availability, resilience, and backup and recovery), as well as in the skills of the IT department, to consider higher-risk workloads as virtualization candidates.

One must also take into consideration the socio-political aspects of IT ownership. A line-of-business leader might have concerns about ‘his’ application running in a virtualized environment, even if he’s perfectly happy with the service he gets from ‘lower value’ services. But if the technology is proven elsewhere, what’s the fuss?

Part of the answer could lie in how big the first step down the virtualization route was. Did the IT department have to fight to make it happen, or did someone in the business make a request for it directly or indirectly – e.g., a demand that could only be fulfilled by employing technology in this way?

One argument suggests that had it not been for the economic crisis in 2008, many organizations would not have felt it necessary to virtualise any server infrastructure.

So, if you have moved virtualization beyond the pilot, how did you decide? Was it via the same process you employed the first time you decided to take advantage of this technology? Did it involve a complex risk management exercise or was it more about gut feel and trust in your collective abilities and the technology itself?

If you’ve already taken the ‘next step’, or are thinking about it, we’d like to hear about the decision-making processes you or your department have been working through, and your experiences so far of migrating ‘next level’ workloads into the virtual environment.


