Server virtualisation is not enough

What needs to change to build a private cloud?

The server computing architecture that lies at the heart of what we now refer to as private cloud has been considered several times over. The game-changer – the capability that makes private cloud possible today – is virtualisation, which enables logical workloads to be fully abstracted from machines.

The famous Moore’s Law has helped, in that today’s processors are more than capable of running multiple instances of an operating-system-plus-application combo.

Do the math

Add clever software that enables virtual instances to roll on and off a server like cars rolling on and off a Channel Tunnel shuttle, and you have the starting point for the private cloud.

It would be a mistake, however, to think that server virtualisation can do it all by itself. It’s a question of maths: if you’re looking to achieve consolidation ratios of five, ten, 20 virtual servers to one physical machine, the figures need to be supported by the entire IT architecture within and outside the server, not just the CPU.

Take RAM, for example. While traditional workloads may use only a fraction of the available processor cycles, they can quite happily consume all of the physical memory available to them.

This is not only because operating systems tend to work the memory as hard as they can, but also because poorly written applications can insist on loading unnecessary functionality, or fail to de-allocate memory when it is no longer required.

Failing memory

You may not need ten times as much memory to run ten virtual machines, but you do need to think about how much you should be putting in each server. Some, but not all, virtualisation solutions allow memory to be over-provisioned – that is, guests are promised more memory in total than the host physically has, on the assumption that they won't all demand their full allocation at once.

You still need to size your physical RAM up front, however, particularly for private cloud environments where, in theory, you don't know in advance what you will want to run where.
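
To put some illustrative numbers on that, here is a back-of-envelope sizing sketch in Python. The consolidation ratio, per-VM allocation, overcommit factor and hypervisor overhead are assumptions picked purely for the example, not recommendations for any particular hypervisor.

    # Back-of-envelope RAM sizing for a virtualisation host.
    # All figures are illustrative assumptions, not recommendations.

    vms_per_host = 20             # target consolidation ratio (20:1)
    ram_per_vm_gb = 8             # memory promised to each guest
    overcommit_factor = 1.25      # how far we are prepared to overcommit physical RAM
    hypervisor_overhead_gb = 4    # memory reserved for the hypervisor itself

    # Worst case: every guest touches everything it has been allocated.
    worst_case_gb = vms_per_host * ram_per_vm_gb + hypervisor_overhead_gb

    # Planned case: rely on overcommit, accepting the risk that guests
    # peaking together will force the hypervisor to reclaim or swap.
    planned_gb = (vms_per_host * ram_per_vm_gb) / overcommit_factor + hypervisor_overhead_gb

    print(f"No overcommit: {worst_case_gb:.0f} GB of physical RAM")
    print(f"With {overcommit_factor}x overcommit: {planned_gb:.0f} GB of physical RAM")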

Get the processor and RAM right, and the next thing you need to think about is server I/O. Again, the calculation is simple: if you have ten computers, say, all running on the same box, what happens when they all want to transmit data or access storage at the same time?
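
A crude way to see the problem is to compare aggregate demand against what the host can actually push. The per-VM throughput and host capacity figures below are invented for illustration; substitute your own measurements.

    # Crude estimate of I/O contention when every guest peaks at once.
    # Figures are illustrative assumptions, not measurements.

    vms_per_host = 10
    peak_io_per_vm_mbps = 400    # assumed per-VM peak throughput (storage plus network)
    host_capacity_mbps = 2000    # assumed usable bandwidth out of the host

    aggregate_demand_mbps = vms_per_host * peak_io_per_vm_mbps

    if aggregate_demand_mbps > host_capacity_mbps:
        oversubscription = aggregate_demand_mbps / host_capacity_mbps
        fair_share = host_capacity_mbps / vms_per_host
        print(f"Oversubscribed {oversubscription:.1f}x: each VM gets roughly "
              f"{fair_share:.0f} Mbps of the {peak_io_per_vm_mbps} Mbps it wants.")
    else:
        print("The host can absorb the combined peak.")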

Server technologies have been advancing to respond to the parallelism needs of multiple virtual machines, such as Intel's 7500 PCI Express chipset (codenamed Boxboro), which was designed with virtualisation in mind.

Message in a bottleneck

The server’s internal bus architecture is just the start of a sequence of electronics that leads to disk-based storage, all of which needs to take into account potentially increased throughput. Any of the links in the chain can become the bottleneck.

So scale the servers right, then look at networking requirements, then storage. While network engineers might say it’s simply a case of deploying bigger boxes that can support gigabit Ethernet, few would deny the issues that emerge in the storage layer.

We will discuss storage in more detail in another article, but now let's look at backups as a simple, yet blindingly obvious, example of the challenges to be faced.

Most applications need to be backed up, as do operating systems, and indeed entire virtual machines. Aside from the fact that it would be a huge mistake to back up all virtual instances on a single server at the same time, you might quite easily end up backing up the same information twice, putting further pressure on I/O.
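
By way of a sketch, staggering can be as simple as spreading the jobs across time slots so only a couple run at once. The VM names, start time, slot length and concurrency limit below are made up for the example; they are not a recommended policy.

    # Minimal sketch of a staggered backup schedule: spread jobs across
    # slots so they never all hammer the host's I/O path at the same time.
    # The VM list, start time, slot length and concurrency are invented.

    from datetime import datetime, timedelta

    vms = [f"vm-{n:02d}" for n in range(1, 11)]   # ten guests on one host
    window_start = datetime(2014, 10, 20, 22, 0)  # backups begin at 22:00
    slot = timedelta(minutes=30)                  # each wave gets half an hour
    concurrent_jobs = 2                           # at most two backups at a time

    for i, vm in enumerate(vms):
        start = window_start + slot * (i // concurrent_jobs)
        print(f"{vm}: backup starts at {start:%H:%M}")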

Intelligent design

In further articles we will also discuss management, policy, business alignment and all that jazz. For now the question is: given the complexity of today’s IT environments, the variations in size and scale of the applications we want to deploy, uncertainty about usage levels and so on, is it possible to define a generic private cloud that will cope with anything the organisation might throw at it?

The answer is in principle yes, but only if careful consideration, planning and design have been applied to all the links in the chain.

It is not just about having super-fast kit. Some pretty simple decisions can be made, such as locating a workload's data as close as possible to the virtual machine that uses it, or defining a staggered backup policy that won't bring a server to its knees.

However dynamic IT becomes, the private cloud can never be a magic bullet that overcomes poor architectural decision-making. ®