Autonomic Computing – the IBM blueprint

IBM has been talking about autonomic computing for well over a year. This month it issued a 40-page blueprint (PDF). So what is it, why do we need it, how does it work, is it important, and has IBM got it right, asks Peter Abrahams of Bloor Research.

Autonomic computing is IBM's term for the ability of systems to be more self-managing. The term "autonomic" comes from the autonomic nervous system, which controls many organs and muscles in the human body. Usually, we are unaware of its workings because it functions in an involuntary, reflexive manner.

So what does it mean for an IT environment to be autonomic? It means that systems are:

  • Self-configuring, to increase IT responsiveness/agility
  • Self-healing, to improve business resiliency
  • Self-optimizing, to improve operational efficiency
  • Self-protecting, to help secure information and resources
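
To make the four disciplines a little more concrete, here is a toy sketch of how a management layer might catalogue them, with an example of the kind of action each one covers. Every name in this snippet is invented for the illustration; nothing here comes from the blueprint itself:

    // Hypothetical grouping of autonomic behaviours under the blueprint's
    // four disciplines; the categories and examples are illustrative only.
    public enum AutonomicDiscipline {
        SELF_CONFIGURING("add a new server to the cluster without manual setup"),
        SELF_HEALING("restart or route around a failed component"),
        SELF_OPTIMIZING("retune buffer sizes as the workload shifts"),
        SELF_PROTECTING("throttle a client that looks like a denial of service");

        private final String example;

        AutonomicDiscipline(String example) {
            this.example = example;
        }

        public String example() {
            return example;
        }
    }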

Why do we need it? The computing systems we have developed over the last ten years have become complex meshes of inter-related applications and servers. Keeping these running smoothly is a time- and people-intensive activity that does not always succeed. But you ain't seen nothing yet! Loosely coupled web services, outsourcing of parts of the environment, and the implementation of larger and more complex applications all point to manual management of these systems becoming impossible. The answer is to hand this large problem to the computer to fix.

How does it work? The solution covers elements at all levels, from the base hardware platforms, through the various software layers, to the business processes. Each element must include sensors that collect information about state and transitions, and effectors that can alter the element's configuration.
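
As a rough sketch, an instrumented element might expose its sensors and effectors through an interface along these lines. The interface and method names are hypothetical, invented for illustration rather than taken from the blueprint:

    import java.util.Map;

    // Hypothetical contract for a managed element in an autonomic system:
    // sensors expose state and transitions, effectors accept configuration
    // changes from the management layer.
    public interface ManagedElement {
        // Sensor: report the element's current state (load, queue depth, etc.).
        Map<String, Object> readSensors();

        // Effector: apply a configuration change chosen by a manager.
        void applyEffector(String setting, Object value);
    }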

The autonomic management layer has four major components: the monitor, which collects information from the sensors; the analyser, which correlates and models the state of the systems and predicts and reports on issues; the planning function, which takes these issues and develops solutions; and the execution element, which puts the plan into effect through the effectors. The management layer is itself an element of the total system that must also be managed, so it has its own sensors and effectors. This structure enables hierarchies of management as well as peer-to-peer management functions.

Finally, the management layer contains a knowledge base that includes system topology, calendars, activity logs and policy information.
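
Put together, the control loop the blueprint describes might look something like the following sketch. Again, every class and method name is invented for this example, the policy check is deliberately simplistic, and the snippet redeclares the element interface so it stands alone:

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch of the monitor-analyse-plan-execute loop around a
    // knowledge base; all names and the toy policy are invented here.
    public class AutonomicManager {

        // Nested copy of the element interface sketched earlier, so this
        // snippet compiles on its own.
        public interface ManagedElement {
            Map<String, Object> readSensors();                 // sensors
            void applyEffector(String setting, Object value);  // effectors
        }

        // Knowledge base: topology, calendars, activity logs and policies,
        // reduced here to a simple map of policy values.
        private final Map<String, Object> knowledge = new HashMap<>();

        public AutonomicManager() {
            knowledge.put("maxLoad", 0.8); // policy: keep load under 80 per cent
        }

        public void runOnce(ManagedElement element) {
            // Monitor: collect information from the element's sensors.
            Map<String, Object> state = element.readSensors();

            // Analyse: correlate observed state against policy and flag issues.
            double load = (Double) state.getOrDefault("load", 0.0);
            boolean overloaded = load > (Double) knowledge.get("maxLoad");

            // Plan: develop a solution for each flagged issue.
            if (overloaded) {
                // Execute: put the plan into effect through the effectors.
                element.applyEffector("workerCount", 4);
                knowledge.put("lastAction", "raised workerCount to 4");
            }
        }
    }

Because the blueprint treats the manager itself as a managed element with its own sensors and effectors, loops like this one can be stacked into hierarchies or wired together peer-to-peer.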

The blueprint also makes it clear that autonomic computing is a journey and defines basic, managed, predictive, adaptive and autonomic as the steps along the way.

Is it important? The problem is real, so a solution is important. IBM is a major player, and solving this problem is essential to its e-business on-demand strategy. E-business on-demand, we are told, is pivotal to IBM's future direction, so the blueprint is important: it will have an impact on the market. IBM is working closely on related standards, such as DMTF CIM, IETF SNMP, OASIS WS-S and WSDM, SNIA Bluefin, GGF OGSA and the Open Group's ARM, and will use the blueprint to guide its input to those standards.

Has IBM got it right? IBM has a very wide view of the issues, as it provides hardware, software, services and, increasingly, outsourced e-business on-demand utility computing. The blueprint is similarly broad, and this is its strength as well as its weakness.

It is an excellent overall vision and gives a structure for understanding all the initiatives in this space from standards bodies and other vendors; but it is more difficult to grasp than the more targeted solutions and messages from 'niche' players like CA, HP and Microsoft.

© IT-Analysis.com

IBM's autonomic computing site
