Where is virtualisation taking you?

Is ‘perfect’ possible – or indeed necessary?

Lab Anyone who’s been in this industry for longer than a decade will know that some of what IT vendors say needs to be taken with a pinch of salt. Virtualisation holds great promise, so we are told – but then, so did blade servers, grid architectures, enterprise management solutions, application service providers... the list goes on. Even if we cut through the we’re-already-there marchitecture of the more zealous product pushers, however, virtualisation does appear to offer a path towards a better way of doing IT.

It’s not as if what we have is completely broken – far from it. As shown repeatedly in our research, few organisations feel genuinely under the cosh when it comes to their IT. I can think of a couple of places I’ve worked that really were facing the technological equivalent of a failed marriage, but in general, IT and the business tend to rub along.

All the same, many IT managers across the globe do reach that point in their careers where they think to themselves, “There has to be a better way of doing things than this.” And no doubt there is merit in exploring certain options, be they in software architectures, systems management, data centre design, backup policy... you name it.

One recurring ‘better way’ is to run IT in a semi-automated manner – or indeed, in as automated a manner as possible. I’ve said in the past that I don’t believe IT will become a utility in the short to medium term: it’s just too darned complex, and the level of technical competence required to deploy and manage efficient IT services is too high. However, making IT even a little more dynamic would be a good start, and many have said that virtualisation holds the key to such a transition.

But here’s the rub: what’s the real gain to be had from such a virtualised environment? Imagine pristine rows of servers, each running multiple virtual machines, delivering dynamically scaled services to users as efficiently as possible. While this might sound jolly good in principle, a number of counter-arguments exist.

First, the cost of hardware – or at least, the relative cost per unit of processing – continues to drop, which weakens the economic case for squeezing more out of each box. Second, any ‘ideal’ environment needs to be sustainable – there’s no point in defining one if, in three years’ time, you’ll need to do it all again, and business changes as fast as IT: all it takes is a single merger for all that hard work to be thrown out of the window. Third, there is a big question over people costs. The basic principle is that “people are expensive, automation is good”, but I’m sure you have your own anecdotes about systems that were supposed to simplify things and ended up requiring double the operations staff to run.

Many of these questions remain unanswered, and the ultimate cost-benefit of virtualisation has still to be proven in the mainstream. Perhaps this is a good thing when we consider that some pieces of the IT puzzle are still catching up with virtualisation’s potential – Intel and AMD’s latest chipsets will help, to be sure, and pan-industry vendor partnerships will take things forward on interoperability. In the meantime, we have management best practice and its associated tooling, neither of which could be said to have virtualisation baked in right now.

While there’s still work to be done, these are of course early days: most organisations are still in what we could consider a ‘pilot’ stage when it comes to virtualisation, and are only starting to consider what comes next. On Wednesday this week, we considered where virtualisation goes next after the pilot – and there is plenty that can be done with it without needing to take it to its ultra-dynamic conclusion.

So it’s certainly not about being downhearted, more a recognition that for virtualisation, perhaps the best is yet to come. ®
