Simple ways to tune your private cloud infrastructure

Put the symphony on hold

“Dynamic workload management” and “private cloud” are just two of the terms currently in vogue to describe new approaches to the management of IT systems.

Dynamic IT is likely to be marketed as a fully automated environment where everything is controlled in an orchestrated manner.

We know from a recent survey that a growing number of Reg readers regard dynamic workload management and private cloud initiatives as important. But while interest is quite high, only a few indicate they are well advanced in their implementation.

The idea of a do-everything-at-once orchestration project is almost certain to be considered overkill, beyond any hope of implementation without ripping out all existing systems and starting again from scratch.

Back to basics

Few companies would be prepared to contemplate such a big-bang approach. Most will be looking for step-by-step improvements, beginning with the fundamentals of infrastructure management.

So what can organisations do to put in place the building blocks for private cloud without breaking the bank? Where do they start?

The first item on the list should be getting key elements of the underlying infrastructure – primarily servers, storage, and networking – working together effectively. The aim is to limit complexity when laying the foundations of an orchestrated environment.

But we know from a number of studies that few organisations feel they have the systems management tools they need to undertake routine daily administration, never mind operating dynamic private clouds.

Even fewer consider the tools they have to be well integrated with each other. In many organisations, systems administration is performed in silos, typically with discrete tools.

This often provides a fragmented view of the world, with no coherent picture of systems components and how they work together.

Knowing me, knowing you

It is therefore important for IT administrators to have an accurate and up-to-date handle on just what systems are deployed in the company, what applications they support and how these are related to business services.

Essentially, this amounts to performing some basic inventory discovery, coupled with an elementary staff survey to find out the importance of each service and the number of people using it.
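
To make that concrete, here is a minimal Python sketch of the idea: check that a handful of hosts actually resolve and record the results alongside survey scores for each service. The host names, service names and figures are invented for illustration; a real discovery exercise would pull this from whatever management tools are already in place.

```python
# A minimal sketch of basic inventory discovery plus an elementary staff survey.
# Host names, services and survey scores below are hypothetical examples.
import socket

hosts = ["app01.example.local", "db01.example.local"]  # assumed host list

# Survey results: service name -> (importance on a 1-5 scale, number of users)
survey = {
    "order-entry": (5, 120),
    "reporting": (3, 25),
}

inventory = []
for host in hosts:
    try:
        address = socket.gethostbyname(host)   # confirm the host resolves
        reachable = True
    except socket.gaierror:
        address, reachable = None, False
    inventory.append({"host": host, "address": address, "reachable": reachable})

for entry in inventory:
    print(entry)
for service, (importance, users) in survey.items():
    print(f"{service}: importance {importance}/5, {users} users")
```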

Equally, it is a good idea to gather some information on the quality of service being delivered before making any alterations to the underlying infrastructure, to make sure that flexing resources will not result in degraded services and unhappy users.

This is a key area, yet we know from many studies that most organisations have little service quality monitoring in place.
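
A rough sketch helps show what even minimal service quality monitoring might look like before any infrastructure is touched. The example below times a few requests to an assumed HTTP health endpoint and reports availability and mean response time; the URL and sample count are placeholders, and a real monitoring setup would obviously track far more than this.

```python
# A minimal sketch of baseline service-quality measurement: time a handful of
# requests to a service before altering the infrastructure beneath it.
# The URL, timeout and sample count are hypothetical placeholders.
import time
import urllib.request

SERVICE_URL = "http://intranet.example.local/health"  # assumed health endpoint
SAMPLES = 10

timings = []
for _ in range(SAMPLES):
    start = time.monotonic()
    try:
        urllib.request.urlopen(SERVICE_URL, timeout=5)
        timings.append(time.monotonic() - start)
    except OSError:
        timings.append(None)  # record the failure rather than a latency figure

successes = [t for t in timings if t is not None]
availability = len(successes) / SAMPLES * 100
avg_latency = sum(successes) / len(successes) if successes else float("nan")
print(f"availability: {availability:.0f}%, mean response: {avg_latency * 1000:.0f} ms")
```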

Other management tools may become important as flexibility increases. These include asset management and change management systems, perhaps ultimately leading to a service catalogue.

Many of these may already be in place, at least partially. But as part of a joined-up approach to any form of private cloud and dynamic IT service delivery, it is critical that the tools used to manage servers, both physical and virtual, are well integrated with those employed to administer storage systems and networks.
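
One way to picture that integration is as a simple join across each team's data. The sketch below merges invented server, storage and network exports into a single view keyed by hostname; in practice the feeds would come from whatever tools each silo already uses, but the principle of one coherent record per system is the same.

```python
# A minimal sketch of joining per-silo exports (server, storage and network
# teams) into one view keyed by hostname. The exports are invented examples.
server_export = {"app01": {"cpu_cores": 8, "virtual": True}}
storage_export = {"app01": {"allocated_gb": 500, "array": "san-a"}}
network_export = {"app01": {"vlan": 120, "switch_port": "sw3/14"}}

combined = {}
for source in (server_export, storage_export, network_export):
    for host, attributes in source.items():
        combined.setdefault(host, {}).update(attributes)

print(combined["app01"])
# One record covering compute, storage and network attributes for the host
```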

All part of the plan

In terms of prioritising what to include in the new dynamic management environment, it is usually better to start with relatively simple applications or services.

IT staff can then establish the operational procedures to manage the service as a whole, even if this means using multiple tools, without getting bogged down in application-level complexity.

When the processes are in place, any technology updates or changes can be considered case by case, as long as any tools acquired fit into an overall plan.

As things develop, it is likely that issues such as chargeback, service accounting and reporting will become important. Process automation systems may also become relevant as the scope of the dynamic infrastructure expands, although they should not be an inhibiting factor at the start of the journey.
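
Chargeback need not be elaborate at the outset. The sketch below simply multiplies metered usage by a per-resource rate for a couple of hypothetical departments; both the rates and the usage figures are made up for illustration, and a production scheme would draw them from metering and accounting systems.

```python
# A minimal sketch of simple chargeback reporting: multiply metered usage by a
# rate per resource. Rates and usage figures below are hypothetical.
rates = {"cpu_hours": 0.05, "storage_gb_month": 0.10}   # currency units per unit

usage_by_department = {
    "sales":   {"cpu_hours": 1200, "storage_gb_month": 800},
    "finance": {"cpu_hours": 300,  "storage_gb_month": 1500},
}

for department, usage in usage_by_department.items():
    charge = sum(usage[resource] * rates[resource] for resource in usage)
    print(f"{department}: {charge:.2f}")
```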

The aim is not to try to take on everything in one go and totally transform the whole of your IT delivery.

Start simple, gain confidence and grow from there. Boiling the ocean is rarely effective except in science fiction movies. ®

Tony Lock is programme director at UK analyst Freeform Dynamics
