Autonomic Computing – the IBM blueprint

IBM has been talking about autonomic computing for well over a year, and this month it issued a 40-page blueprint (pdf). So what is it, why do we need it, how does it work, is it important, and has IBM got it right? Peter Abrahams of Bloor Research investigates.

Autonomic computing is IBM's term for the ability of systems to be more self-managing. The term "autonomic" comes from the autonomic nervous system, which controls many organs and muscles in the human body. Usually, we are unaware of its workings because it functions in an involuntary, reflexive manner.

So what does it mean for an IT environment to be autonomic? It means that systems are:

  • Self-configuring, to increase IT responsiveness/agility
  • Self-healing, to improve business resiliency
  • Self-optimizing, to improve operational efficiency
  • Self-protecting, to help secure information and resources

Why do we need it? The computing systems we have developed over the last ten years have become complex meshes of inter-related applications and servers. Keeping these running smoothly is a time- and people-intensive activity that does not always succeed. But you ain't seen nothing yet! Loosely coupled web services, outsourcing of parts of the environment and the implementation of larger and more complex applications all point to manual management of these systems becoming impossible. The answer is to hand this large problem to the computer to fix.

How does it work? The solution covers elements at all levels, from the base hardware platforms, through the various software layers, up to the business processes. Each element must include sensors, which collect information about its state and transitions, and effectors, which can alter the element's configuration.
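
The blueprint describes this contract in prose rather than code, but it is easy to sketch. The Java fragment below is a minimal illustration, assuming hypothetical interface names of our own; it is not IBM's API:

    import java.util.Map;

    // Illustrative sketch only: these interfaces are hypothetical, not IBM's API.

    // A sensor collects information about an element's state and transitions.
    interface Sensor {
        Map<String, Object> readState();           // e.g. CPU load, queue depth
    }

    // An effector lets a manager alter the element's configuration.
    interface Effector {
        void apply(Map<String, Object> newConfig); // e.g. pool size, routing rules
    }

    // Every managed element, from hardware platform to business process,
    // exposes both a sensor and an effector.
    interface ManagedElement {
        Sensor sensor();
        Effector effector();
    }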

The autonomic management layer has four major components: the monitor, which collects information from the sensors; the analyser, which correlates and models the state of the systems and predicts and reports on issues; the planning function, which takes these issues and develops solutions; and the execution element, which puts the plan into effect through the effectors. The management layer is itself an element of the total system that must also be managed, so it has its own sensors and effectors. This structure enables hierarchies of management as well as peer-to-peer management functions.

Finally, the management layer includes a knowledge base holding system topology, calendars, activity logs and policy information.
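
Put together, the four components plus the knowledge base form a control loop. The sketch below, reusing the ManagedElement interface from the earlier fragment, is again a hypothetical Java rendering of that structure rather than anything IBM ships; the class and method names are ours:

    import java.util.Map;

    // Hypothetical sketch of the monitor/analyse/plan/execute loop; the
    // structure follows the blueprint, the code does not come from IBM.
    class Issue {
        final String description;
        Issue(String description) { this.description = description; }
    }

    interface KnowledgeBase {
        void log(Map<String, Object> state);       // activity logs
        Issue analyse(Map<String, Object> state);  // correlate, model, predict; null if healthy
        Map<String, Object> plan(Issue issue);     // policy-driven remedy
    }

    class AutonomicManager {
        private final ManagedElement element;      // from the earlier sketch
        private final KnowledgeBase knowledge;     // topology, calendars, logs, policies

        AutonomicManager(ManagedElement element, KnowledgeBase knowledge) {
            this.element = element;
            this.knowledge = knowledge;
        }

        void runOnce() {
            // Monitor: collect information from the element's sensors.
            Map<String, Object> state = element.sensor().readState();
            knowledge.log(state);

            // Analyse: correlate the state and predict or report issues.
            Issue issue = knowledge.analyse(state);
            if (issue == null) return;             // healthy: nothing to do this cycle

            // Plan: develop a solution for the reported issue.
            Map<String, Object> remedy = knowledge.plan(issue);

            // Execute: put the plan into effect through the effectors.
            element.effector().apply(remedy);
        }
    }

Because the manager is itself a managed element, it would in turn expose its own sensor and effector, which is what makes the hierarchical and peer-to-peer management the blueprint describes possible.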

The blueprint also makes it clear that autonomic computing is a journey and defines basic, managed, predictive, adaptive and autonomic as the steps along the way.
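
As a rough illustration of that progression, the five levels could be written down as a simple enumeration; the level names are the blueprint's, the comments are our informal gloss:

    // The blueprint's five maturity levels; comments are an informal gloss.
    enum AutonomicMaturity {
        BASIC,       // manual management: staff read consoles and react
        MANAGED,     // tooling consolidates monitoring data for operators
        PREDICTIVE,  // the system recognises patterns and suggests actions
        ADAPTIVE,    // the system takes those actions itself within policy
        AUTONOMIC    // business policy alone drives self-management
    }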

Is it important? The problem is real, so a solution is important. IBM is a major player, and solving this problem is essential to its e-business on-demand strategy. E-business on-demand, we are told, is pivotal to IBM's future direction, so the blueprint matters because it will have an impact on the market. IBM is working very closely on related standards such as DMTF-CIM, IETF-SNMP, OASIS-WS-S and WS-DM, SNIA-BlueFin, GGF-OGSA and Open Group-ARM, and will be using the blueprint to guide its input to those standards.

Has IBM got it right? IBM has a very wide view of the issues, as it provides hardware, software, services and, increasingly, outsourced e-business on-demand utility computing. The blueprint is similarly broad, and this is its strength as well as its weakness.

It is an excellent overall vision and gives a structure for understanding all the initiatives in this space from standards bodies and other vendors; but it is more difficult to grasp than the more targeted solutions and messages from 'niche' players like CA, HP and Microsoft.

© IT-Analysis.com

IBM's autonomic computing site
