Where is virtualisation taking you?

Is ‘perfect’ possible – or indeed necessary?


Lab Anyone who’s been in this industry for longer than a decade will know that some of what IT vendors say needs to be taken with a pinch of salt. Virtualisation holds great promise, so we are told – but then, so did blade servers, grid architectures, enterprise management solutions, application service providers... the list goes on. But even if we cut through the we’re-already-there marchitectures of the more zealous product pushers, virtualisation does appear to offer a path towards a better way of doing IT.

It’s not as if what we have is completely broken – far from it. In general, and as shown repeatedly in our research, few organisations feel completely under the cosh when it comes to their IT. I can think of a couple of places I’ve worked that really were facing the technological equivalent of a failed marriage, but in general, IT and the business do tend to rub along.

All the same, many IT managers across the globe do reach that point in their careers where they think to themselves, “There has to be a better way of doing things than this.” And no doubt there is merit in exploring certain options, be they in software architectures, systems management, data centre design, backup policy... you name it.

One recurring ‘better way’ is that of running IT in a semi-automated manner – or indeed as automated as possible. I’ve said in the past that I don’t believe IT will become a utility in the short-to-medium term: it’s just too darned complex, and the level of technical competence required to deploy and manage efficient IT services is too high. However, making IT a bit more dynamic would be a good start, and virtualisation has been said by many to hold the key to such a transition.
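To make that concrete, here is a minimal sketch – hypothetical, and emphatically not a recommendation – of what ‘a bit more dynamic’ might look like today, using the Python bindings for libvirt to start and stop pre-defined guests as host load rises and falls. The guest names, thresholds and polling interval are all illustrative assumptions:

    # Sketch: 'semi-automated' capacity management via the libvirt
    # Python bindings. Assumes a running libvirt daemon (Xen or KVM)
    # and a pool of pre-defined but inactive standby guests.
    import os
    import time

    import libvirt

    STANDBY_GUESTS = ['web-2', 'web-3']   # hypothetical guest names
    HIGH_LOAD, LOW_LOAD = 4.0, 1.0        # illustrative thresholds

    def running(conn, name):
        """Return True if the named guest is currently active."""
        try:
            dom = conn.lookupByName(name)
        except libvirt.libvirtError:
            return False
        return dom.info()[0] == libvirt.VIR_DOMAIN_RUNNING

    def rebalance(conn):
        """Start or stop one standby guest based on host load average."""
        load, _, _ = os.getloadavg()
        if load > HIGH_LOAD:
            for name in STANDBY_GUESTS:
                if not running(conn, name):
                    conn.lookupByName(name).create()    # boot the guest
                    return
        elif load < LOW_LOAD:
            for name in reversed(STANDBY_GUESTS):
                if running(conn, name):
                    conn.lookupByName(name).shutdown()  # graceful stop
                    return

    if __name__ == '__main__':
        conn = libvirt.open('qemu:///system')  # or 'xen:///' on a Xen host
        while True:
            rebalance(conn)
            time.sleep(60)

Even something this crude shifts a routine capacity decision from an operator to a policy – which is precisely the semi-automated middle ground, well short of full utility computing.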

But here’s the rub: what’s the real gain to be had from such a virtualised environment? Imagine pristine rows of servers, each running multiple virtual machines, delivering dynamically scaled services to users as efficiently as possible. While this might sound jolly good in principle, a number of counters exist.

First, the cost of hardware – or at least, the relative cost per unit of processing – continues to drop, which erodes the savings from squeezing more out of each box. Second, any ideal environment needs to be sustainable: there is no point in defining one if, in three years’ time, you’ll need to do it all again. Business changes as fast as IT, and all it takes is a single merger for all that hard work defining the ideal environment to be thrown out of the window. Third, there is a big question over people costs. The basic principle is that “people are expensive, automation is good,” but I’m sure you have your own anecdotes about systems that were supposed to simplify things and ended up requiring double the operations staff to run.

Many of these questions remain unanswered, and the ultimate cost-benefit of virtualisation has still to be proven in the mainstream context. Perhaps this is a good thing when we consider that some pieces of the IT puzzle are still catching up with the potential of virtualisation – Intel and AMD’s latest chipsets will help, to be sure, and pan-industry vendor partnerships will take things forward in terms of interoperability. In the meantime, we have management best practice and its associated tooling, neither of which could be said to bake in virtualisation right now.

While there’s still work to be done, these are of course early days: most organisations are still in what we could consider a ‘pilot’ stage when it comes to virtualisation, and are only starting to consider what comes next. On Wednesday this week, we considered where virtualisation goes next after the pilot – and there is plenty that can be done with it without needing to take it to its ultra-dynamic conclusion.

So it’s certainly not about being downhearted, more a recognition that for virtualisation, perhaps the best is yet to come. ®
