
Simple ways to tune your private cloud infrastructure

Put the symphony on hold


“Dynamic workload management” and “private cloud” are just two of the terms currently in vogue to describe new approaches to the management of IT systems.

Dynamic IT is likely to be marketed as a fully automated environment where everything is controlled in an orchestrated manner.

We know from a recent survey that a growing number of Reg readers regard dynamic workload management and private cloud initiatives as important. But while interest is quite high, only a few indicate they are well advanced in their implementation.

The idea of a do-everything-at-once orchestration project is almost certain to be considered overkill, beyond any hope of implementation without ripping out all systems and starting again from scratch.

Back to basics

Few companies would be prepared to contemplate such a big-bang approach. Most will be looking for step-by-step improvements, beginning with the fundamentals of infrastructure management.

So what can organisations do to put in place the building blocks for private cloud without breaking the bank? Where do they start?

The first item on the list should be getting key elements of the underlying infrastructure – primarily servers, storage, and networking – working together effectively. The aim is to limit complexity when laying the foundations of an orchestrated environment.

But we know from a number of studies that few organisations feel they have the systems management tools they need to undertake routine daily administration, never mind operating dynamic private clouds.

Even fewer consider the tools they have to be well integrated with each other. In many organisations systems administration is performed in silos, usually using discrete tools.

This often provides a fragmented view of the world, with no coherent picture of systems components and how they work together.

Knowing me, knowing you

It is therefore important for IT administrators to have an accurate and up-to-date handle on just what systems are deployed in the company, what applications they support and how these are related to business services.

Essentially, this amounts to performing some basic inventory discovery, coupled with an elementary staff survey to find out the importance of each service and the numbers of people using it.
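The result of that discovery-plus-survey exercise can be pictured as a simple join between two data sets. The sketch below is purely illustrative: the host names, applications, business services and user counts are invented examples, not output from any real discovery tool.

```python
# Illustrative sketch only: a toy inventory model joining discovered
# systems to the applications and business services they support.
# All host names, services and user counts below are invented.

from collections import defaultdict

# What basic inventory discovery might return: host -> applications
discovered = {
    "srv-db-01": ["orders-db"],
    "srv-app-01": ["order-entry", "reporting"],
    "srv-web-01": ["intranet"],
}

# What an elementary staff survey might add: application ->
# (business service, number of users) — a rough measure of importance
survey = {
    "orders-db": ("Order Processing", 120),
    "order-entry": ("Order Processing", 120),
    "reporting": ("Management Reporting", 15),
    "intranet": ("Staff Intranet", 400),
}

def service_view(discovered, survey):
    """Group hosts under the business services they ultimately support."""
    view = defaultdict(set)
    for host, apps in discovered.items():
        for app in apps:
            service, _users = survey[app]
            view[service].add(host)
    return dict(view)

if __name__ == "__main__":
    for service, hosts in sorted(service_view(discovered, survey).items()):
        print(f"{service}: {', '.join(sorted(hosts))}")
```

Even a crude mapping like this gives administrators the coherent picture of systems and services that siloed tools rarely provide.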

Equally, it is a good idea to gather some information on the quality of service being delivered before making any alterations to the underlying infrastructure, to make sure that flexing resources will not result in degraded services and unhappy users.

This is a key area, yet we know from many studies that most organisations have little service quality monitoring in place.
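Establishing even a rudimentary baseline before touching the infrastructure makes later degradation visible. The snippet below is a minimal sketch under assumed conditions: the probe is simulated with a list of recorded latencies, and the 25 per cent tolerance threshold is an arbitrary choice for illustration, not a recommendation.

```python
# Illustrative sketch only: record a response-time baseline for a
# service before changing the infrastructure beneath it. The "probe"
# is simulated with pre-recorded sample latencies; in practice you
# would time real requests against the service.

import statistics

def baseline(samples_ms):
    """Summarise observed response times so later measurements can be
    compared against a known-good starting point."""
    ordered = sorted(samples_ms)
    p95_index = max(0, int(round(0.95 * len(ordered))) - 1)
    return {
        "median_ms": statistics.median(ordered),
        "p95_ms": ordered[p95_index],
    }

def degraded(current_ms, base, tolerance=1.25):
    """Flag a measurement noticeably worse than the baseline median.
    The 1.25 tolerance factor is an invented example threshold."""
    return current_ms > base["median_ms"] * tolerance

# Example: latencies (ms) gathered before flexing any resources
before = [110, 120, 115, 130, 125, 118, 140, 122, 119, 127]
base = baseline(before)
# A later, noticeably slower reading would trip the check:
print(degraded(180, base))
```

The point is not the specific statistics but having any agreed yardstick in place before resources start moving around.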

Other management tools may become important as flexibility increases. These include asset management and change management systems, perhaps ultimately leading to a service catalogue.
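A service catalogue need not start life as a heavyweight product; at its simplest it is a structured record linking each business service to its owner, its supporting assets and its change history. The sketch below shows one possible shape for such an entry. All field names, services and change references are invented for illustration.

```python
# Illustrative sketch only: the shape a minimal service catalogue entry
# might take, linking a business service to the assets behind it and a
# reference into change management. All names here are invented.

from dataclasses import dataclass, field

@dataclass
class CatalogueEntry:
    service: str           # business service name
    owner: str             # team that answers for it
    assets: list = field(default_factory=list)  # supporting systems
    last_change: str = ""  # reference into the change management system

catalogue = [
    CatalogueEntry("Order Processing", "ops-team",
                   ["srv-db-01", "srv-app-01"], "CHG-0042"),
    CatalogueEntry("Staff Intranet", "web-team", ["srv-web-01"]),
]

def assets_for(catalogue, service):
    """Look up which systems a named service depends on."""
    for entry in catalogue:
        if entry.service == service:
            return entry.assets
    return []
```

Keeping asset and change references in one place like this is what lets the separate server, storage and network tools be tied back to the services users actually care about.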

Many of these may already be in place, at least partially. But as part of a joined-up approach to any form of private cloud and dynamic IT service delivery, it is critical that the tools used to manage servers, both physical and virtual, are well integrated with those employed to administer storage systems and networks.

All part of the plan

In terms of prioritising what to include in the new dynamic management environment, it is usually better to start with relatively simple applications or services.

IT staff can then establish the operational procedures to manage the service as a whole, even if this means using multiple tools, without getting bogged down in application-level complexity.

When the processes are in place, any technology updates or changes can be considered case by case, as long as any tools acquired fit into an overall plan.

As things develop, it is likely that issues such as chargeback, service accounting and reporting will become important. Process automation systems may also become relevant as the scope of the dynamic infrastructure expands, although they should not be an inhibiting factor at the start of the journey.

The aim is not to try to take on everything in one go and totally transform the whole of your IT delivery.

Start simple, gain confidence and grow from there. Boiling the ocean is rarely effective except in science fiction movies. ®

Tony Lock is programme director at UK analyst Freeform Dynamics
