Where is virtualisation taking you?

Is ‘perfect’ possible – or indeed necessary?


Lab Anyone who's been in this industry for longer than a decade will know that some of what IT vendors say needs to be taken with a pinch of salt. Virtualisation holds great promise, so we are told – but then so did blade servers, grid architectures, enterprise management solutions, application service providers... the list goes on. Even if we cut through the we're-already-there marchitecture of the more zealous product pushers, however, virtualisation does appear to offer a path towards a better way of doing IT.

It’s not as if what we have is completely broken – far from it. In general, and as shown repeatedly in our research, few organisations feel completely under the cosh when it comes to their IT. I can think of a couple of places I’ve worked that really were facing the technological equivalent of a failed marriage, but in general, IT and the business do tend to rub along.

All the same, many IT managers across the globe do reach that point in their careers where they think to themselves, “There has to be a better way of doing things than this.” And no doubt there is merit in exploring certain options, be they in software architectures, systems management, data centre design, backup policy... you name it.

One recurring ‘better way’ is that of running IT in a semi-automated manner – or indeed as automated as possible. I’ve said in the past that I don’t believe IT will become a utility in the short-to-medium term. It’s just too darned complex, and the level of technical competence required to deploy and manage efficient IT services is just too high. However, just making IT a bit more dynamic would be a good start, and virtualisation has been said by many to hold the key to such a transition.

But here’s the rub: what’s the real gain to be had from such a virtualised environment? Imagine pristine rows of servers, each running multiple virtual machines, delivering dynamically scaled services to users as efficiently as possible. While this might sound jolly good in principle, a number of counters exist.

First, the cost of hardware – or at least the relative cost per unit of processing – continues to drop, which erodes the consolidation savings over time. Second, any ideal environment needs to be sustainable: there's no point defining one if, in three years' time, you'll need to do it all again. Business changes as fast as IT, and it takes only a single merger for all that hard work defining the ideal environment to be thrown out of the window. Third, there is a big question over people costs. The basic principle is that "people are expensive, automation is good", but I'm sure you have your own anecdotes about systems that were supposed to simplify things yet ended up requiring double the operations staff to run.

Many of these questions remain unanswered, and the ultimate cost-benefit of virtualisation has still to be proven in the mainstream context. Perhaps this is a good thing when we consider that some pieces of the IT puzzle are still catching up with the potential of virtualisation – Intel and AMD’s latest chipsets will help, to be sure, and pan-industry vendor partnerships will take things forward in terms of interoperability. In the meantime, we have management best practice and its associated tooling, neither of which could be said to bake in virtualisation right now.

While there's still work to be done, these are of course early days: most organisations are still at what we could consider a 'pilot' stage when it comes to virtualisation, and are only starting to consider what comes next. On Wednesday this week, we considered where virtualisation goes after the pilot – and there is plenty that can be done with it without taking it to its ultra-dynamic conclusion.

So it’s certainly not about being downhearted, more a recognition that for virtualisation, perhaps the best is yet to come. ®
