Server virtualisation is not enough

What needs to change to build a private cloud?

The server computing architecture that lies at the heart of what we now call private cloud has been revisited several times over the years. The game-changer – the capability that makes private cloud possible today – is virtualisation, which enables logical workloads to be fully abstracted from the physical machines beneath them.

The famous Moore’s Law has helped, in that today’s processors are more than capable of running multiple instances of an operating-system-plus-application combo.

Do the maths

Add clever software that enables virtual instances to roll on and off a server like cars rolling on and off the Eurotunnel shuttle, and you have the starting point for the private cloud.

It would be a mistake, however, to think that server virtualisation can do it all by itself. It’s a question of maths: if you’re looking to achieve consolidation ratios of five, ten, 20 virtual servers to one physical machine, the figures need to be supported by the entire IT architecture within and outside the server, not just the CPU.

Take RAM, for example. While traditional workloads may use only a fraction of the available processor cycles, they can quite happily consume all of the physical memory on offer.

This is not only because operating systems tend to work the memory as hard as they can, but also because poorly written applications can insist on loading unnecessary functionality, or fail to de-allocate memory when it is no longer required.

Failing memory

You may not need ten times as much memory to run ten virtual machines, but you do need to think about how much you should be putting in each server. Some, but not all, virtualisation solutions allow for over-provisioning of memory – that is, the pre-allocation of a maximum quantity of memory that may or may not be required in practice.

You still need to size your physical RAM in advance, however, particularly for private cloud environments where, in theory, you don’t know in advance what you will want to run where.
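To put some rough numbers on it, here is a back-of-the-envelope sizing sketch in Python. The figures, the overcommit ratio and the overhead allowance are all hypothetical illustrations, not recommendations:

    # Rough RAM-sizing sketch for a virtualisation host (illustrative figures only)
    def physical_ram_needed(vm_count, ram_per_vm_gb, overcommit_ratio=1.0,
                            hypervisor_overhead_gb=4):
        # overcommit_ratio > 1.0 assumes the hypervisor can reclaim or share
        # memory (ballooning, page sharing); 1.0 assumes it cannot
        allocated = vm_count * ram_per_vm_gb
        return allocated / overcommit_ratio + hypervisor_overhead_gb

    # Example: 20 guests at 8GB each, assuming a modest 1.25x overcommit
    print(physical_ram_needed(20, 8, overcommit_ratio=1.25))  # about 132GB

Even with generous assumptions about overcommitment, the total lands well above what a host sized for a single workload would carry.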

Get the processor and RAM right, and the next thing you need to think about is server I/O. Again, the calculation is simple: if you have ten computers, say, all running on the same box, what happens when they all want to transmit data or access storage at the same time?
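A quick, entirely hypothetical sum makes the point: add up the per-guest peak demands and compare them with whatever shared path they all have to squeeze through.

    # Illustrative I/O contention check: do the guests' peaks fit the pipe?
    # All figures are made-up examples, not measurements
    vm_peak_mbps = [40, 80, 25, 120, 60, 30, 90, 45, 70, 55]  # per-VM peak storage traffic, MB/s
    shared_link_mbps = 400                                     # assumed shared path to storage

    aggregate_peak = sum(vm_peak_mbps)
    print(f"Aggregate peak demand: {aggregate_peak} MB/s")                   # 615 MB/s
    print(f"Oversubscription: {aggregate_peak / shared_link_mbps:.2f}x")     # 1.54x

    # Oversubscription above 1.0 only hurts if the peaks coincide, which,
    # left unmanaged, is exactly what virtual workloads tend to do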

Server technologies have been advancing to meet the parallelism needs of multiple virtual machines: Intel's 7500 PCI Express chipset (codenamed Boxboro), for example, was designed with virtualisation in mind.

Message in a bottleneck

The server’s internal bus architecture is just the start of a sequence of electronics that leads to disk-based storage, all of which needs to take into account potentially increased throughput. Any of the links in the chain can become the bottleneck.

So scale the servers right, then look at networking requirements, then storage. While network engineers might say it’s simply a case of deploying bigger boxes that can support gigabit Ethernet, few would deny the issues that emerge in the storage layer.

We will discuss storage in more detail in another article, but now let's look at backups as a simple, yet blindingly obvious, example of the challenges to be faced.

Most applications need to be backed up, as do operating systems, and indeed entire virtual machines. Aside from the fact that it would be a huge mistake to back up all virtual instances on a single server at the same time, you might quite easily end up backing up the same information twice, putting further pressure on I/O.

Intelligent design

In further articles we will also discuss management, policy, business alignment and all that jazz. For now the question is: given the complexity of today’s IT environments, the variations in size and scale of the applications we want to deploy, uncertainty about usage levels and so on, is it possible to define a generic private cloud that will cope with anything the organisation might throw at it?

The answer is in principle yes, but only if careful consideration, planning and design have been applied to all the links in the chain.

It is not just about having super-fast kit. Some pretty simple decisions can be made, such as locating data storage as close as possible to associated virtual machine storage, or defining a staggered backup policy that won’t bring a server down.
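By way of illustration, staggering can be as unglamorous as offsetting each guest's start time across the nightly window. The sketch below does exactly that, with made-up machine names, window and gap:

    # Minimal staggered backup schedule: spread job start times across a
    # nightly window so the guests never all hammer the disks at once
    # (names, window and gap are hypothetical)
    from datetime import datetime, timedelta

    vms = [f"vm{n:02d}" for n in range(1, 11)]   # ten guests on one host
    window_start = datetime(2014, 1, 1, 1, 0)    # 01:00; the date is irrelevant
    stagger = timedelta(minutes=30)              # assumed gap between job starts

    for i, vm in enumerate(vms):
        start = window_start + i * stagger
        print(f"{vm}: backup starts at {start:%H:%M}")

In practice the gap would be driven by how long each job actually takes, but the principle, that no two jobs start on top of each other, is the same.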

However dynamic IT becomes, the private cloud can never be a magic bullet that overcomes poor architectural decision-making. ®
