
The long and winding road to server virtualisation

It’s all about Tetris and the price of fish


Reader feedback

"Move along, nothing to see here," said Reg reader Joshua 1, a self-confessed "old timer on the virtualisation front," in response to the question of whether server virtualisation was ready for prime time.

Indeed it is difficult to read anything about virtualisation without getting the impression that it is inevitable. But while some Reg readers might already be some way down the virtual track, feedback garnered in the course of this virtualisation lab has made it pretty clear that others are still on the starting blocks. "We've taken a few steps," says another respondent. "We just migrated one of our production servers from a physical environment to a virtual environment. It's running on 'dedicated' hardware, but it's still a step in this direction."

While the level of adoption may vary, the verdict would appear to be that virtualisation technology is ready to do the job it was designed for. "Hell yeah it's ready," says Nate Amsden. "I still think it requires some intelligence on the part of the people deploying it; you can get really poor results if you do the wrong things (which are by no means obvious). But the same is true for pretty much any complex piece of software."

So, where to start? Virtualisation comes with a suck-it-and-see mode, in that there is little to stop anybody running up a virtual machine and seeing what gives. From this point, as we have seen in previous research, the logical first step is server consolidation, which has a reasonably straightforward business case.

From there, the question is how to scale things up, or indeed out, so that virtualisation becomes more the norm than the exception. Let's be clear: at the moment, virtualisation is not going to be the answer to absolutely every workload – not according to your experiences, anyway. In the case of databases, for example, feedback suggests that a top-end database workload needs all the resources it can get, in which case virtualisation would be wasted on it. "If you don't care about performance then you can use VMs, but what would be the point?" said one reader.

But on the plus side the flexibility offered by virtualisation is seen as a major advantage, databases or no databases. Consider:

If I have a fully populated blade centre (7-14 servers depending on who we're talking about), all in a single VM pool, and I then fill 33-50% of capacity with virtual machines dedicated to database server work, I can further fill out the other 50-66% of the VM pool with application/web/file server VMs, probably squeezing upward of 200% capacity out of the given rack space...
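
As a rough back-of-the-envelope sketch of that arithmetic – the blade count, split percentages and VMs-per-blade figures below are illustrative assumptions, not numbers from the reader – the sums come out something like this:

# Back-of-the-envelope consolidation sums for a fully populated blade centre.
# All inputs are illustrative assumptions, not measured figures.

blades = 14                 # physical blades in the chassis (reader quotes 7-14)
db_share = 0.40             # fraction of the pool given to database VMs (33-50%)
other_share = 1 - db_share  # remainder for application/web/file server VMs

db_vms_per_blade = 2        # heavier, resource-hungry guests
other_vms_per_blade = 6     # lighter guests pack more densely

db_vms = blades * db_share * db_vms_per_blade
other_vms = blades * other_share * other_vms_per_blade

total_vms = db_vms + other_vms
effective_capacity = total_vms / blades   # workloads per physical blade

print(f"{db_vms:.0f} database VMs + {other_vms:.0f} other VMs "
      f"on {blades} blades = {effective_capacity:.1f}x the one-workload-per-box baseline")

Whether the real figure is 200 per cent or 400 per cent depends entirely on how hungry the guests turn out to be – which is exactly the reader's point about mixing heavy and light workloads in the same pool.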

If you want to know more about making the transition to production-scale virtualisation, tune in here.

The role of management

Meanwhile, once you've made the transition, you're going to have to manage what you've deployed. Virtualisation does appear (from your responses) to make things easier to move around, plan, keep available and so on. But it doesn't take prisoners – you're still going to need the technical smarts and suitable processes to deal with the virtualised environment. While Trevor Pott might have confessed to insufficient caffeine when he talked about Tetris, he did make a valid point about thinking architecturally:

It really all boils down to VM Tetris. As much as virtualisation enables ease of administration and management, you still have to understand the workload of all your VMs. You have to understand the capabilities of your hardware. Pack your VMs in with other VMs in such a way that they won't impinge on one another, and you can do remarkable things. Some days you get a box, or an L; some days you get a squiggly jaggy thing you have no idea where to put.
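
For the architecturally minded, the Tetris metaphor is essentially bin packing. Here is a minimal sketch of the idea – the host size, VM names and memory figures are made up for illustration – packing VMs onto hosts biggest-first so the awkward shapes go in early:

# First-fit-decreasing packing of VMs onto hosts: the "VM Tetris" idea in miniature.
# Host capacity, VM names and sizes are illustrative assumptions.

HOST_RAM_GB = 64

vms = {"db01": 32, "web01": 8, "web02": 8, "file01": 16, "app01": 24, "app02": 12}

hosts = []  # each host is a list of (vm, ram) pairs with total <= HOST_RAM_GB

# Place the biggest pieces first; awkward shapes are hardest to fit late in the game.
for name, ram in sorted(vms.items(), key=lambda kv: kv[1], reverse=True):
    for host in hosts:
        if sum(r for _, r in host) + ram <= HOST_RAM_GB:
            host.append((name, ram))
            break
    else:
        hosts.append([(name, ram)])  # no gap big enough: start a new host

for i, host in enumerate(hosts, 1):
    used = sum(r for _, r in host)
    print(f"host{i}: {used}/{HOST_RAM_GB} GB -> {[n for n, _ in host]}")

Real placement engines juggle CPU, I/O and anti-affinity rules as well as memory, but the shape of the problem – and why a badly planned squiggly jaggy thing ends up with a host all to itself – is much the same.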

By the way, if you think that metaphor's a stretch, it's worth sharing Jimmy Pop's Fishmonger analogy:

Each fishmonger is a virtual machine with a hypervisor controlling access to resources. If we allocate them efficiently, we can all get to the pub sooner as the work is done with fewer bottlenecks! And we can save money, instead of hiring 50 fishmongers when there is really only work for 10... (even if each fishmonger works on a specific type of fish and can't handle the others – in this case, each would be a virtual fishmonger running inside 10 real fishmongers operating 24/7 with no sleep...)
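
To put numbers on the fish counter – the utilisation figure below is an assumption for illustration, not from the reader – 50 virtual workers on 10 real ones is a 5:1 consolidation ratio, and it only works while their combined demand stays within what the 10 can actually deliver:

# The fishmonger sums: 50 virtual workers multiplexed onto 10 real ones.
# The average-utilisation figure is an illustrative assumption.

virtual_workers = 50
physical_workers = 10
avg_utilisation = 0.15   # each virtual worker is busy roughly 15% of the time

consolidation_ratio = virtual_workers / physical_workers
demand = virtual_workers * avg_utilisation   # aggregate demand in whole-worker units
headroom = physical_workers - demand

print(f"consolidation ratio {consolidation_ratio:.0f}:1, "
      f"aggregate demand {demand:.1f} of {physical_workers} workers, "
      f"headroom {headroom:.1f}")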
