
The long and winding road to server virtualisation

It’s all about Tetris and the price of fish



Reader feedback

"Move along, nothing to see here," said Reg reader Joshua 1, a self-confessed "old timer on the virtualisation front," in response to the question of whether server virtualisation was ready for prime time.

Indeed it is difficult to read anything about virtualisation without getting the impression that it is inevitable. But while some Reg readers might already be some way down the virtual track, feedback garnered in the course of this virtualisation lab has made it pretty clear that others are still on the starting blocks. "We've taken a few steps," says another respondent. "We just migrated one of our production servers from a physical environment to a virtual environment. It's running on 'dedicated' hardware, but it's still a step in this direction."

While the level of adoption may vary, the verdict would appear to be that virtualisation technology is ready to do the job it was designed for. "Hell yeah it's ready," says Nate Amsden. "I still think it requires some intelligence on the part of the people deploying it; you can get really poor results if you do the wrong things (which are by no means obvious). But the same is true for pretty much any complex piece of software."

So, where to start? Virtualisation comes with a suck-it-and-see mode, in that there is little to stop anybody running up a virtual machine and seeing what gives. From that point, as we have seen in previous research, the logical first step is server consolidation, which has a reasonably straightforward business case.

From there, the question is how to scale things up, or indeed out, so that virtualisation becomes the norm rather than the exception. Let's be clear: for the moment at least, virtualisation is not going to be the answer to absolutely every workload, not according to your experiences anyway. In the case of databases, for example, feedback suggests that a top-end database workload needs all the resources it can get, in which case virtualisation would be wasted on it. "If you don't care about performance then you can use VMs, but what would be the point?" said one reader.

But on the plus side the flexibility offered by virtualisation is seen as a major advantage, databases or no databases. Consider:

If I have a fully populated blade centre (7-14 servers, depending on who we're talking about), all in a single VM pool, and I then fill 33-50% of capacity with virtual machines dedicated to database server work, I can further fill out the other 50-66% of the VM pool with application/web/file server VMs, probably squeezing upwards of 200% capacity out of the given rack space...
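To put some purely illustrative numbers behind that reasoning, here is a back-of-the-envelope sketch in Python. The blade count, per-blade specs and per-VM reservations below are assumptions made up for the example, not figures supplied by the reader.

    # Back-of-the-envelope consolidation arithmetic for a blade centre.
    # All figures are hypothetical; substitute your own hardware and workloads.
    BLADES = 14                # fully populated chassis (the reader quoted 7-14)
    CORES_PER_BLADE = 16
    RAM_GB_PER_BLADE = 128

    # Hypothetical per-VM reservations
    DB_VM = {"cores": 4, "ram_gb": 32}   # heavier database VMs
    APP_VM = {"cores": 2, "ram_gb": 8}   # lighter app/web/file VMs

    total_cores = BLADES * CORES_PER_BLADE
    total_ram = BLADES * RAM_GB_PER_BLADE

    # Give roughly half the pool to database VMs...
    db_vms = min((total_cores // 2) // DB_VM["cores"],
                 (total_ram // 2) // DB_VM["ram_gb"])

    # ...and fill the remainder with application VMs.
    rem_cores = total_cores - db_vms * DB_VM["cores"]
    rem_ram = total_ram - db_vms * DB_VM["ram_gb"]
    app_vms = min(rem_cores // APP_VM["cores"], rem_ram // APP_VM["ram_gb"])

    print(f"{db_vms} database VMs + {app_vms} app/web/file VMs on {BLADES} blades "
          f"({(db_vms + app_vms) / BLADES:.1f} workloads per blade)")

Change the reservations and the ratio shifts dramatically, which is presumably why the reader hedges the 200% figure with "probably".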

If you want to know more about making the transition to production-scale virtualisation, tune in here.

The role of management

Meanwhile, once you've made the transition, you're going to have to manage what you've deployed. Virtualisation does appear (from your responses) to make things easier to move around, plan, keep available and so on. But it doesn't take prisoners – you're still going to need the technical smarts and suitable processes to deal with the virtualised environment. While Trevor Pott might have confessed to insufficient caffeine when he talked about Tetris, he did make a valid point about thinking architecturally:

It really all boils down to VM Tetris. As much as virtualisation enables ease of administration and management, you still have to understand the workload of all your VMs. You have to understand the capabilities of your hardware. You pack your VMs in with other VMs in such a way that they won't impinge on one another, and you can do remarkable things. Some days you get a box, or an L, some days you get a squiggly jaggy thing you have no idea where to put.
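For those who prefer their Tetris in code, the analogy maps fairly neatly onto a bin-packing problem. What follows is a minimal first-fit, largest-first sketch with made-up host sizes and VM demands; it is an illustration of the idea, not anybody's production scheduler, and a real placement engine would also watch runtime contention, which is exactly Pott's point about understanding the workload.

    # Minimal first-fit-decreasing sketch of "VM Tetris": pack VMs onto hosts
    # without over-committing CPU or memory. Figures are purely illustrative.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Host:
        name: str
        cores: int        # remaining cores
        ram_gb: int       # remaining memory
        vms: List[str] = field(default_factory=list)

        def fits(self, cores, ram_gb):
            return cores <= self.cores and ram_gb <= self.ram_gb

        def place(self, name, cores, ram_gb):
            self.cores -= cores
            self.ram_gb -= ram_gb
            self.vms.append(name)

    # Hypothetical inventory and workload list: (name, cores, ram_gb)
    hosts = [Host("blade-1", 16, 128), Host("blade-2", 16, 128)]
    vms = [("db-1", 8, 64), ("db-2", 8, 64), ("web-1", 2, 8),
           ("web-2", 2, 8), ("file-1", 4, 16)]

    # Sort largest-first, then drop each VM into the first host it fits on.
    for name, cores, ram in sorted(vms, key=lambda v: (v[1], v[2]), reverse=True):
        host = next((h for h in hosts if h.fits(cores, ram)), None)
        if host is None:
            print(f"{name}: the awkward squiggly piece, nowhere to put it")
        else:
            host.place(name, cores, ram)

    for h in hosts:
        print(f"{h.name}: {h.vms} (spare: {h.cores} cores, {h.ram_gb} GB)")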

By the way, if you think that metaphor's a stretch, it's worth sharing Jimmy Pop's fishmonger analogy:

Each fishmonger is a virtual machine with a hypervisor controlling access to resources. If we allocate them efficiently, we can all get to the pub sooner as the work is done with fewer bottlenecks! And we can save money, instead of hiring 50 fishmongers when there is really only work for 10... (even if each fishmonger works on a specific type of fish and can't handle the others; in this case, each would be a virtual fishmonger running inside 10 real fishmongers operating 24/7 with no sleep...)
