Why do we think virtualization is new?

Define your terms

Opinion There are a great many different communities involved in IT, ranging from those at the sharp end on the user side to the many more engaged on the vendor and channel side.

Plus, of course, there are several other groups, including analysts like myself, that sit somewhere between the two. As with life everywhere else, each of these groups has its own goals and drivers.

One of the many problems for these numerous interrelated communities is that the communications between them often happen at so many different levels, and at such vastly different rates, that misunderstanding and confusion are almost guaranteed.

It is my firm belief that the term "virtualization", along with "green IT", is now suffering from garbled communications initiated by certain elements of the vendor community.

Those selling solutions, or more accurately those marketing IT solutions, often choose to make products sound new and exciting. Quite why they do so has often puzzled me, since as a former IT manager I have always been highly skeptical of anything genuinely new, because it usually means trouble.

However, when it comes to the crazy, slightly strange world of virtualization, things appear to be wild and certainly subject to more than a little hype. What is certain is that virtualization really can deliver benefits to both IT and the wider business. What is far less certain is why everyone appears to think virtualization is new.

It is fair to say virtualization has become the shorthand used to describe a number of logically similar technologies. In the server space the idea is nothing new: the mainframe has long boasted the ability to run "virtual" machines, each of which operates as if it were the only system running on the hardware, when the reality is quite the opposite.

Large Unix servers started to offer a range of similar capabilities several years ago. It is really only the so-called "industry standard" servers, based around Intel and AMD processors, that are relatively recent entrants into the server virtualization game.
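For readers who know only the x86 side of the story, here is a minimal sketch of how one might check whether such a server even offers hardware-assisted virtualization. It assumes a Linux host that exposes /proc/cpuinfo and simply looks for the Intel VT-x (vmx) or AMD-V (svm) CPU flags; the function name and script are illustrative rather than drawn from any particular vendor's toolset.

# Minimal sketch: detect hardware-assisted virtualization support on a
# Linux x86 host by looking for the Intel VT-x (vmx) or AMD-V (svm) flags.
# Assumes /proc/cpuinfo is readable (i.e. a Linux host).

def hardware_virt_support(cpuinfo_path="/proc/cpuinfo"):
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
                if "vmx" in flags:
                    return "Intel VT-x"
                if "svm" in flags:
                    return "AMD-V"
    return None

if __name__ == "__main__":
    support = hardware_virt_support()
    if support:
        print("Hardware virtualization extensions present:", support)
    else:
        print("No vmx/svm flags found; only software virtualization is possible.")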

However, there are huge numbers of these platforms, many of which are managed by administrators who might never have encountered either the mainframe or Unix variants of virtualization, so perhaps it would be unfair to be too harsh on those vendors who market server virtualization as if it is something brand new.

In IT, as in most walks of life, very few developments are truly revolutionary and absolutely new. Most ideas grow up over time before really springing to the front of mind with a vengeance.

Server virtualization has grown up over the years, especially on the mainframe, and those systems that deliver virtual functionality to Intel/AMD systems are delivering value. But, and it is a big but, the efforts of those marketing virtualization are not always working effectively.

In my conversations with IT professionals it is becoming clear that the hype, the at times exaggerated claims made by marketing folk, and the very inexact use of language employed around virtualization are succeeding not only in raising levels of awareness, but also in creating confusion in the minds of some IT administrators. The same could be said of storage virtualization marketing.

It would be good if the industry could take a little time to define a number of terms more accurately to help remove the smog that is building around virtualization systems. Gaining widespread attention is good, but creating confusion and perhaps building unachievable expectations could slow adoption of these often valuable solutions.

Copyright © 2007, Freeform Dynamics

Tony Lock is programme director with The Register's research partner Freeform Dynamics. An industry analyst since 2000, Tony’s held technical and management roles with the University of London, BP and BT, and has also spent time working in the vendor arena.
