The long and winding road to server virtualisation

It’s all about Tetris and the price of fish

Reader feedback

"Move along, nothing to see here," said Reg reader Joshua 1, a self-confessed "old timer on the virtualisation front," in response to the question of whether server virtualisation was ready for prime time.

Indeed it is difficult to read anything about virtualisation without getting the impression that it is inevitable. But while some Reg readers might already be some way down the virtual track, feedback garnered in the course of this virtualisation lab has made it pretty clear that others are still on the starting blocks. "We've taken a few steps," says another respondent. "We just migrated one of our production servers from a physical environment to a virtual environment. It's running on 'dedicated' hardware, but it's still a step in this direction."

While the level of adoption may vary, the verdict would appear to be that virtualisation technology is ready to do the job it was designed for. "Hell yeah it's ready," says Nate Amsden. "I still think it requires some intelligence on the part of the people deploying it, you can get really poor results if you do the wrong things (which are by no means obvious). But the same is true for pretty much any complex piece of software."

So, where to start? Virtualisation comes with a suck-it-and-see mode, in that there is little to stop anybody running up a virtual machine and seeing what gives. From this point, as we have seen in previous research, the logical first step is in server consolidation, which has a reasonably straightforward business case.

From there, the question is how to scale things up, or indeed out, such that virtualisation becomes more the norm than the exception. Let's be clear: at the moment, virtualisation is not going to be the answer to absolutely every workload – not according to your experiences, anyway. In the case of databases, for example, feedback suggests that a top-end database workload needs all the resources it can get, in which case virtualisation would be wasted on it. "If you don't care about performance then you can use VMs, but what would be the point?" said one reader.

But on the plus side, the flexibility offered by virtualisation is seen as a major advantage, databases or no databases. Consider:

If I have a fully populated blade centre (7-14 servers depending on who we're talking about), all in a single VM pool, and I then fill 33-50% capacity with Virtual machines dedicated to database server work, I can further fill out the other 50-66% VM pool with Application/web/file server VM's, probably squeezing upward of 200% capacity out of the given rack space...
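The reader's arithmetic can be sketched in a few lines. Note that every figure below is hypothetical, chosen only to illustrate the shape of the calculation: a 14-blade chassis, roughly 40 per cent of it given over to database VMs that each want a blade to themselves, with lighter app/web/file VMs packing three to a blade.

```python
# Back-of-envelope sketch of the blade-centre capacity arithmetic.
# All figures are hypothetical, for illustration only.
blades = 14                 # fully populated blade centre (quote says 7-14)
db_blades = 6               # roughly 40% of the pool for database VMs
app_blades = blades - db_blades

db_vms_per_blade = 1        # a heavy database VM gets a blade to itself
app_vms_per_blade = 3       # lighter app/web/file VMs pack more densely

total_vms = db_blades * db_vms_per_blade + app_blades * app_vms_per_blade
capacity_pct = 100 * total_vms / blades   # vs one workload per blade

print(f"{total_vms} workloads in {blades} blades = {capacity_pct:.0f}% capacity")
```

With these (made-up) packing ratios you land at roughly 214 per cent of the capacity the same rack space would deliver at one workload per physical box – in the same ballpark as the "upward of 200%" the reader describes.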

If you want to know more about making the transition to production-scale virtualisation, tune in here.

The role of management

Meanwhile, once you've made the transition, you're going to have to manage what you've deployed. Virtualisation does appear (from your responses) to make things easier to move around, plan, keep available and so on. But it takes no prisoners – you're still going to need the technical smarts and suitable processes to deal with the virtualised environment. While Trevor Pott might have confessed to insufficient caffeine when he talked about Tetris, he did make a valid point about thinking architecturally:

It really all boils down to VM Tetris. As much as virtualisation enables ease of administration and management, you still have to understand the workload of all your VMs. You have to understand the capabilities of your hardware. You pack your VMs in with other VMs in such a way that they won't impinge on one another, and you can do remarkable things. Some days you get a box, or an L, some days you get a squiggly jaggy thing you have no idea where to put.
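The Tetris analogy maps neatly onto a classic bin-packing heuristic: sort the awkward pieces first, then drop each one into the first host with room. The sketch below is a minimal first-fit-decreasing illustration of the idea, under the simplifying assumption that each VM's load is a single number; real placement would weigh CPU, RAM, I/O and affinity separately.

```python
# A minimal sketch of "VM Tetris" as first-fit-decreasing bin packing.
# VM loads are hypothetical fractions of one host's capacity.

def pack_vms(vm_loads, host_capacity):
    """Place VMs (largest first) into the first host that has room."""
    hosts = []  # each host is a list of the VM loads placed on it
    for load in sorted(vm_loads, reverse=True):
        for host in hosts:
            if sum(host) + load <= host_capacity:
                host.append(load)
                break
        else:
            hosts.append([load])  # nothing fits: power up a new host
    return hosts

# Ten VMs of mixed size onto hosts of capacity 1.0 (100% of a box)
hosts = pack_vms([0.5, 0.7, 0.2, 0.4, 0.1, 0.6, 0.3, 0.2, 0.5, 0.3], 1.0)
print(len(hosts), "hosts:", hosts)
```

Placing the big pieces first is what keeps you from ending up with "a squiggly jaggy thing you have no idea where to put" – a large VM arriving late, with only fragments of capacity left.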

By the way, if you think that metaphor's a stretch, it's worth sharing Jimmy Pop's Fishmonger analogy:

Each fishmonger is a virtual machine with a hypervisor controlling access to resources. If we allocate them efficiently, we can all get to the pub sooner as the work is done with less bottlenecks! And we can save money, instead of hiring 50 fishmongers when there is really only work for 10.. (even if each fishmonger works on a specific type of fish and can't handle the others, in this case, each would be a virtual fish-monger running inside 10 real fishmongers operating 24/7 with no sleep..)
