Sizing a server to support desktop virtualisation

Savvy scripting and baselining

Virtualising the desktop can bring benefits at the endpoint. It makes desktops more manageable, can reduce power load throughout the building, and can make systems more secure.

But IT departments shouldn't underestimate the additional investments required at the back end. Running more endpoint logic centrally can have huge ramifications on server configurations, and can require significant adjustments to the budget. How can IT departments size their server infrastructures effectively?

Before the technology discussion comes the political and cultural one, says David Chalmers, CTO for the enterprise servers and storage business at HP. IT departments should look at user requirements before deciding how to virtualise the desktop - and therefore, what kind of load will be placed on the server. VDI (virtual desktop infrastructure), for example, in which a separate desktop runs for each user at the back end, will place a far greater load on the server than, say, application streaming.

Daunting

VDI, should companies choose that route, is a daunting strategic decision. The costs are incurred at the server, among many other places. "A lot of people who choose that approach end up doing maybe 25 per cent of their desktops that way," says Ross Bentley, head of professional services at virtualisation consulting firm Assist Ltd.

"There will be some people that want the full footprint, although we tend to find that they are normally companies that don't worry about finance so much - like banks and offshore oil rigs and so forth. They have the capital."

Assessing the user requirements includes understanding what they're doing with their PCs, says Simon Gay, CTO at independent IT managed services provider Adapt.

The computational profile of a thousand call-centre users is very different from that of half a dozen CAD/CAM users. He argues that analysing user activities can create opportunities for rationalising the application base, helping to reduce server and storage loads. "The desktop has been largely unmanaged, and now IT is starting to discover what's out there," he says. "Typically, it spawns bigger questions, such as: 'Should we be running this stuff?'."

These considerations are important precursors to the benchmarking process. Deployment planners will want to choose applications that the users frequently access as the basis for application scripts that can be run against a load testing application on the server. Running these scripts will give planners an idea of how much the server can support.

Make sure that when estimating a server load, you plan for the worst - and then some, Bentley says. "If you take the medium or the middle ground, then if the worst ever happens you'll feel the load on the system. Get a baseline on what the worst possible scenario is." Adapt's Gay uses client-side agents to profile CPU and memory usage.

He advocates running them for up to a month, if possible, to assess CPU and memory uses as business conditions change. For example, CPU usage in some departmental desktops might spike at month's end.
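The point of running the agents for a month is that the sizing baseline should come from the peak, not the average. A minimal sketch of that analysis, with invented sample data standing in for a month of agent readings (a real deployment would feed in the figures the profiling agents collect):

```python
# Hypothetical sketch: derive a sizing baseline from profiled usage samples.
# The sample data is invented; substitute real agent readings.

def worst_case_baseline(samples, percentile=95):
    """Nearest-rank percentile of a list of usage samples (values in % CPU)."""
    ordered = sorted(samples)
    index = max(0, int(round(percentile / 100 * len(ordered))) - 1)
    return ordered[index]

# One sample per hour over a working month; the 88% readings model a
# month-end spike of the kind Gay describes.
cpu_samples = [12] * 600 + [35] * 100 + [88] * 20

print("Mean usage:       %.1f%%" % (sum(cpu_samples) / len(cpu_samples)))
print("95th percentile:  %d%%" % worst_case_baseline(cpu_samples))
print("Worst case (max): %d%%" % max(cpu_samples))
```

Sizing to the mean here would miss the month-end spike entirely; sizing to the maximum, as Bentley advises, is what keeps the system responsive when it matters.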

Server sizing

The other approach involves server sizing by running scripted client sessions. Microsoft's approach for Terminal Services involves running client scripts against a centrally hosted server sizing application (Roboserver). For VDI sizing, it also provides the Remote Desktop Load Simulation Toolset, which emulates client sessions operating on the server.

The server sizing process will give you a sense of the back-end resources needed to support your desktop sessions. Take that, and, if budget allows, add more memory and processor cores to provide more overhead for growth. However, consider I/O in this mix. The more powerful the server and the more sessions that are run on it, the more potential I/O traffic you'll find on its ports.
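The arithmetic behind this step is simple enough to sketch. All the figures below are invented placeholders - per-session memory, server RAM, and the 25 per cent growth headroom should come from your own benchmarking and budget, and a real sizing exercise would repeat the calculation for CPU and I/O as well as memory:

```python
# Hypothetical back-of-envelope VDI sizing sketch (memory-bound case).
# All numbers are invented placeholders for illustration.

def servers_needed(users, mem_per_session_gb, server_ram_gb, headroom=0.25):
    """Servers required for a desktop pool, reserving headroom for growth."""
    usable_ram = server_ram_gb * (1 - headroom)   # hold back capacity for spikes/growth
    sessions_per_server = int(usable_ram // mem_per_session_gb)
    # Round up: a partial server is still a whole server.
    return -(-users // sessions_per_server)

# e.g. 500 desktops at 2 GB each on 256 GB servers, 25% held in reserve
print(servers_needed(users=500, mem_per_session_gb=2, server_ram_gb=256))
```

Whichever resource (memory, CPU, or I/O) yields the largest server count is the one that actually sizes the estate.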

Much depends on other configuration decisions, such as where the storage is, but the savvy system designer will take this into account. Do you want a single server with high-capacity host bus adaptors, or a collection of smaller servers, with the desktop load spread across them? "It comes down to capital resources and floor space," says Bentley. "You also have power and air conditioning to consider".

Even after all of this, there are other issues facing IT departments trying to provide their users with a suitable virtual desktop experience. For example, bootstorms - which bog down servers and storage when lots of virtual machines start up at once - are a challenge (and will be addressed in a subsequent Register article).

Understanding the importance of server sizing is an important part of the desktop virtualisation process, but don't do it in isolation. Storage and network capacity must be planned alongside it; these are addressed elsewhere in this series. ®
