The evolution of the data center

We need management software that spans the common platforms

The data center has its origins in mainframe computing - an era that had its virtues as well as its limitations. The limitations are well known: computer power was rationed to IT users, who were regimented in how they used it. They had little say in how business applications evolved, and any change to an application took a long time to implement.

Mainframe Computing

Most of the virtues of mainframe computing were benefits for the IT Department. It had complete control of the computing resource, so it could manage that resource efficiently, predict its growth reasonably accurately and plan IT investment accordingly. It had to kiss all that goodbye when minicomputers, and then PCs and servers, proliferated. The data centers remained in place, along with the glass houses that kept the expensive central computing resources walled off from the world. But outside this domain computer power proliferated, and user departments and IT users assumed greater control of computing than they had ever dreamed of. We are now enjoying the benefits and, unfortunately, suffering the consequences.

The major problem that dominates corporate computing is complexity. It is a matter of numbers: huge populations of PCs, large populations of servers, large numbers of applications, and all of it needing support, co-ordination and management. The situation is made more difficult by the complexity of the cost equations. While the cost of technology falls steadily - by about 20 per cent per year - the cost of IT does not. Even in lean times most companies raise the IT budget a little and, in most circumstances, they have little choice in the matter.

There are also unwelcome shocks to deal with. IT security issues have escalated year on year, forcing unavoidable investment in security products. Nobody predicted that spam email would turn into a corporate cost, but the level of spam traffic is now so high (about 80 per cent of all email) that almost all organizations have been forced to implement some form of spam-blocking technology. On top of this, the amount of data that needs to be stored grows inexorably (by up to 50 per cent per year). And technology continues to evolve, making it necessary in time to rip and replace, or upgrade: hardware, networks, software, licenses - just about everything. Costs abound.

TCO & ROI?

The magic acronyms TCO (Total Cost of Ownership) and ROI (Return On Investment) are bandied about by vendors and analysts in an attempt to rationalize what to try and what to buy, but for most large corporations it isn't that simple. IT Departments have to implement whatever is bought, and they have limited human resources with which to do it. The products that look so impressive in PowerPoint don't always integrate well. The cost of labor and the availability of skills are part of the problem.

We can summarize the situation by saying that although the cost of processing power, storage and network bandwidth continues to fall, corporate computing is not reaping many of the benefits. The heart of the problem lies in the IT infrastructure and the data center. For most organizations the IT operation is, by any reasonable measure, inefficient. System utilization rates are famously low - often between 15 and 20 per cent for servers. The number of systems a single administrator can handle is also low - somewhere between 15 and 30. New applications usually take an inordinate time to deploy - days or even weeks. (Compare that with installing an application on a home PC.) For most corporations, 70 to 80 per cent of the IT budget goes on operational costs and maintenance.
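To see what those figures imply, consider a back-of-the-envelope consolidation sketch in Python. The inputs are illustrative assumptions built on the ranges quoted above - a hypothetical 200-server estate and an assumed 60 per cent utilization target - not measurements:

    # Back-of-the-envelope consolidation arithmetic using the
    # article's illustrative figures. All inputs are assumptions.

    servers = 200        # hypothetical estate size
    current_util = 0.15  # ~15 per cent average utilization (low end above)
    target_util = 0.60   # assumed consolidation target

    # Useful work stays constant, so the estate shrinks in proportion
    # to the rise in utilization.
    consolidated = round(servers * current_util / target_util)

    print(f"Servers after consolidation: {consolidated}")      # 50
    print(f"Admins needed at 15-30 systems each: "
          f"{consolidated / 30:.1f}-{consolidated / 15:.1f}")  # 1.7-3.3
    # The original 200 servers need 6.7-13.3 administrators on the
    # same 15-30 systems-per-admin range.

Even on these rough assumptions, raising utilization from 15 to 60 per cent shrinks the estate fourfold, and the administration burden with it.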

It's a legacy problem that IT sites are well aware of, and it is now driving the evolution of the data center. The goal is a reasonable compromise between the flexibility now available to IT users and the disciplines that used to govern mainframe computing.

Plug and Play

There needs to be a constant drive towards consolidating the IT infrastructure and the management of the IT resource. Remember when adding devices to PCs was a problem? The PC industry solved it by standardizing on an initiative that went by the name of "plug and play". The same kind of initiative needs to happen for corporate networks.

This idea is - of course - a major theme of most vendor initiatives: IBM's On Demand, HP's Adaptive Enterprise, Oracle's Grid, Sun's N1 and everything else that falls under the banner of utility computing - "plug and play" for the corporate network. These visions are fine - even welcome - but the reality in the data centers is sobering. New technology may be able to fulfill some of these enthusiastic visions, but it is the old technology that causes the problem, and systems do not migrate easily from one to the other.

Corporate data centers need to find an intelligent way forward between what these initiatives make possible and where their own problems lie - and that varies from site to site. At the hardware level, it needs to be possible to fully utilize the computing power that is already there and to upgrade capacity without increasing the complexity of managing the resource.

The Management Challenge

The major challenge is in the infrastructure management layer, and it won't be met without investment in, and consolidation of, the system management and operational software that the layer consists of. There are, however, promising returns to be made from the right investment. A great deal of manual activity in the data center can be automated with the right choice of management products - from the help desk through security, password management, patch management, provisioning, asset management, license management, log management and data archiving. This layer needs to be properly integrated, not left as a random collection of point solutions that lack central co-ordination.
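As a toy illustration of what such automation looks like at its simplest, here is a minimal patch-status sweep in Python. Everything in it - the inventory, the baseline, the check_patch_level helper - is hypothetical; a real management product wraps this kind of loop in scheduling, agents, auditing and central co-ordination:

    # Minimal sketch of an automated patch-status sweep. The inventory,
    # the baseline and check_patch_level are hypothetical illustrations,
    # not a real product's API.

    # In practice the inventory would come from an asset-management
    # database rather than a hard-coded list.
    INVENTORY = [
        {"host": "web01", "patch_level": "2005.03"},
        {"host": "web02", "patch_level": "2004.11"},
        {"host": "db01",  "patch_level": "2005.01"},
    ]

    REQUIRED_LEVEL = "2005.03"  # assumed current baseline

    def check_patch_level(server):
        # A real tool would query the machine (or an agent); here we
        # compare recorded levels, which sort correctly under the
        # assumed YYYY.MM naming scheme.
        return server["patch_level"] >= REQUIRED_LEVEL

    def sweep(inventory):
        # Return the hosts that have fallen below the baseline.
        return [s["host"] for s in inventory if not check_patch_level(s)]

    if __name__ == "__main__":
        stale = sweep(INVENTORY)
        if stale:
            print("Hosts below baseline:", ", ".join(stale))
        else:
            print("All hosts at or above", REQUIRED_LEVEL)

The point of the sketch is its shape, not its detail: an authoritative inventory, a policy check and a report, run without manual intervention - the pattern repeats across provisioning, license management and the rest.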

This is how the data center must evolve in the coming years. We need management software that spans the common platforms and automates the whole infrastructure. With that in place, some beguiling technology possibilities become easier to exploit: On Demand operation, computing as a service, web services, wireless networks, VoIP, mobile technology, embedded systems and so on.

Technically we are gradually converging on utility computing, but to exploit the opportunities it presents, the corporate infrastructure has to be up to the job. Infrastructure software is the key to the door. It would be good for IT if corporate executives understood that, because it won't come for free.

Copyright © 2005, IT-Analysis.com
