Original URL: https://www.theregister.com/2003/10/26/is_grid_computing_finally/

Is grid computing finally a reality?

Old premise, new promise

By Datamonitor

Posted in Systems, 26th October 2003 21:16 GMT

For decades, there has been talk of technology that can optimize processes and even heal itself, but 'artificial intelligence' and 'artificial life' never quite managed to live up to their hype. Now, eyes are turning to 'grid computing' to meet these goals - but can the latest technology finally turn fiction into reality?

Einstein famously said: "Not everything that counts can be counted; not everything that can be counted counts." The logic behind this argument can be applied rather neatly to the technology industry. Since the dawn of the first computers, which filled an entire room and yet offered less processing power than today's mobile phones, the race has been on to make computers do more than simply count.

But as Oracle, one of the driving forces behind today's technology infrastructure, prepares to launch 'grid-enabled' versions of its database and application server, the question is whether we are any closer to a genuine computing grid, one that delivers the self-diagnostic and self-healing capabilities that computer companies and their customers have sought for so many years.

An old promise

The basic premise behind grid computing is far from new. In the early 1980s a similar amount of hype was heaped on so-called artificial intelligence. At the time, AI was seen as the great hope for technology and the technology industry. Computers that could 'think' for themselves, solve the most complex problems and, most importantly, heal themselves were said to be just around the corner. They would solve the mysteries of the universe, help us reverse global warming, even put an end to world hunger thanks to breakthroughs in science and medicine. Nearly every computer company worth its salt was conducting research into AI, if not launching products said to feature such functionality.

But by the end of the decade it had become clear that AI was not going to live up to its hype. Remaining obstinately complex and really only useful in helping to solve scientific and technical problems, AI never reached the mainstream.

Though AI hardly made it beyond limited use in academic circles, by 1990 the talk had turned to Artificial Life, a seemingly even more hare-brained idea that would see computer software actually evolve just like an organism.

A computer would run thousands of programs simultaneously, and a specifically tailored master program would select those most efficient for each problem, the computerized equivalent of survival of the fittest. The more powerful programs would then be merged to create a new generation that would be even better at accomplishing the task. The only problem was that each of those thousands of initial programs needed to be written first, which rather defeated the object.
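To make the scheme concrete, here is a toy sketch in Python of the selection-and-recombination idea described above. It is purely illustrative and not drawn from any actual Artificial Life system: the candidate 'programs' are just lists of digits, and the fitness function, target and population sizes are invented for the example.

```python
import random

# Toy illustration of the "survival of the fittest" scheme described above.
# Candidate "programs" are just lists of digits and the task is to match a
# target vector; both are stand-ins invented for this sketch.

TARGET = [3, 1, 4, 1, 5, 9, 2, 6]

def fitness(candidate):
    """Higher is better: negative distance from the target."""
    return -sum(abs(c - t) for c, t in zip(candidate, TARGET))

def crossover(a, b):
    """Merge two parents by taking a prefix of one and a suffix of the other."""
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def mutate(candidate, rate=0.1):
    return [random.randint(0, 9) if random.random() < rate else gene
            for gene in candidate]

# Thousands of initial candidates still have to be generated up front -
# the chicken-and-egg problem noted above.
population = [[random.randint(0, 9) for _ in TARGET] for _ in range(1000)]

for generation in range(50):
    # The "master program": keep the fittest quarter of the population...
    population.sort(key=fitness, reverse=True)
    survivors = population[:250]
    # ...and merge survivors to breed the next, hopefully better, generation.
    population = [mutate(crossover(random.choice(survivors),
                                   random.choice(survivors)))
                  for _ in range(1000)]

print(max(population, key=fitness))
```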

Too much imprecision

One of the problems with AI was that it relied on what is known as predicate logic, which is very intolerant of imprecision. In life, few things are precise: grass is not always green, and trees are not always leafy. AI software failed to offer anything close to genuine intelligence, because it could only handle 100% truths. There was no room for a gray area.
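The contrast can be illustrated with a small sketch. The first function below is the crisp, predicate-logic view described above, where a statement is simply true or false; the second is a graded alternative shown purely for contrast (the article does not name any such technique), with the greenness scale and thresholds invented for illustration.

```python
# The crisp view: a predicate is either true or false, with no middle ground.
def is_green_crisp(greenness: float) -> bool:
    return greenness >= 0.5

# A graded alternative, shown only for contrast: truth rises smoothly from
# 0.0 to 1.0, leaving room for the gray area that predicate logic lacks.
def is_green_graded(greenness: float) -> float:
    if greenness <= 0.2:
        return 0.0
    if greenness >= 0.8:
        return 1.0
    return (greenness - 0.2) / 0.6

for sample in (0.9, 0.5, 0.25):
    print(f"greenness={sample}: crisp={is_green_crisp(sample)}, "
          f"graded={is_green_graded(sample):.2f}")
```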

About the only computer company that still talks about AI today is Computer Associates, which says that its neural network agent ('Neugent') technology is able to monitor and manage distributed servers using AI-like techniques.

A legacy of rules

But what artificial intelligence did show was the value of rules-based technology. While earlier generations of software required far more extensive programming, much of the thinking behind AI eventually leached its way into mainstream software programming and made software programs more versatile.

Today, technologies like Ilog's Rules are among the best examples of this in commercial software. The product's rules engine can trawl through thousands of possible outcomes per minute and choose the most favorable based on rules established by the developer. It has proved invaluable in all sorts of disciplines, from calculating the optimum route in logistics to calculating the optimum network in mobile telephony.
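As a rough illustration of how such rules-based selection works, here is a generic sketch in Python. It is not Ilog's actual API; the Rule class, the logistics routes and the scoring weights are all invented, but the pattern is the same: score candidate outcomes against developer-written rules and pick the most favorable.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# A generic, illustrative rules engine - not Ilog's actual API. Candidate
# outcomes (here, delivery routes) are scored against developer-written
# rules and the most favorable one wins.

@dataclass
class Rule:
    description: str
    score: Callable[[Dict], float]  # higher score = more favorable

rules: List[Rule] = [
    Rule("prefer shorter routes", lambda r: -r["distance_km"]),
    Rule("penalize toll roads", lambda r: -50 if r["tolls"] else 0),
    Rule("prefer routes under the time limit", lambda r: 100 if r["hours"] <= 8 else -100),
]

candidates = [
    {"name": "motorway", "distance_km": 420, "tolls": True, "hours": 5},
    {"name": "back roads", "distance_km": 390, "tolls": False, "hours": 9},
    {"name": "mixed", "distance_km": 405, "tolls": False, "hours": 7},
]

def best_outcome(candidates, rules):
    """Return the candidate with the highest total score across all rules."""
    return max(candidates, key=lambda c: sum(rule.score(c) for rule in rules))

print(best_outcome(candidates, rules)["name"])  # -> "mixed"
```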

But what has all this got to do with what technology vendors are now describing as 'grid computing'? Well, while a common definition of grid computing has yet to emerge, it essentially means that rather than having standalone hardware, middleware and software, the computing infrastructure is linked together more tightly into a grid. The advantage is said to be that the individual elements of the grid, from storage arrays to application servers, are aware of one another. If they are aware, then they can start to optimize their own behavior based on what the other elements are telling them. They can start to become self-optimizing, as well as self-healing.
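A minimal sketch of that 'awareness' idea, under assumed interfaces: each grid element publishes simple health metrics, and an application server consults them before deciding where to send work. Every class, metric and value here is hypothetical.

```python
# Hypothetical sketch of grid elements sharing state so they can adapt to
# one another; every name and metric here is invented for illustration.

class GridElement:
    def __init__(self, name):
        self.name = name
        self.metrics = {"load": 0.0, "healthy": True}

    def report(self):
        """Publish a snapshot of this element's current state."""
        return dict(self.metrics)

class AppServer(GridElement):
    def choose_database(self, databases):
        # Awareness in action: pick the database reporting the lowest load
        # among those still reporting themselves as healthy.
        healthy = [db for db in databases if db.report()["healthy"]]
        return min(healthy, key=lambda db: db.report()["load"])

db_a, db_b = GridElement("db_a"), GridElement("db_b")
db_a.metrics.update(load=0.9)
db_b.metrics.update(load=0.2)

app = AppServer("app_1")
print(app.choose_database([db_a, db_b]).name)  # -> "db_b"
```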

Grid computing today

While IBM has talked about the concept of grid computing for some time, and even claims that a number of customers are running on grid-like systems, this usually means that the customer has outsourced its computing to one of IBM's datacenters. There is nothing inherently wrong with this, but the ability of enterprises to turn their existing IT infrastructure into a grid would be just as valuable as being able to outsource computing tasks to IBM's own grid of compute resources.

This is where Oracle is about to make a splash as it gears up to launch Application Server 10g in November, and Database 10g in December. Using concepts first embodied in artificial intelligence, each of these products will be able to manage itself, to a degree, based on the behavior of the other. The Database 10g Optimizer, for instance, will gather statistics not only on the queries being asked of its own rows and tables, but also from Application Server 10g.

Likewise, Application Server 10g will be able to spot deteriorating performance from one database or database partition and reroute application calls to another. In this way, when working in tandem, the products begin to exhibit a degree of self-healing and self-tuning. Even in isolation, each product has had self-optimization capabilities added; Database 10g, for instance, gains automatic shared memory management with automatic memory reallocation.
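For illustration only, and not Oracle's implementation: the sketch below shows the general shape of such rerouting, with an application-side router tracking recent response times per database partition and steering calls away from one whose performance deteriorates. The class name, window size and latency threshold are all invented.

```python
import time
from collections import deque

# Illustrative only - not Oracle's implementation. An application-side
# router tracks recent response times per database partition and shifts
# calls away from a partition whose performance is deteriorating.

class PartitionRouter:
    def __init__(self, partitions, window=20, latency_limit=0.5):
        self.partitions = partitions            # name -> callable that runs a query
        self.latency_limit = latency_limit      # seconds; invented threshold
        self.history = {name: deque(maxlen=window) for name in partitions}

    def _avg_latency(self, name):
        samples = self.history[name]
        return sum(samples) / len(samples) if samples else 0.0

    def call(self, query):
        # Prefer the partition with the best recent average latency,
        # skipping any partition already over the limit.
        ranked = sorted(self.partitions, key=self._avg_latency)
        target = next((n for n in ranked
                       if self._avg_latency(n) < self.latency_limit), ranked[0])
        start = time.monotonic()
        result = self.partitions[target](query)
        self.history[target].append(time.monotonic() - start)
        return result

# Minimal usage: two hypothetical partitions standing in for real databases.
router = PartitionRouter({
    "partition_a": lambda q: f"a:{q}",
    "partition_b": lambda q: f"b:{q}",
})
print(router.call("SELECT 1"))
```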

We are still a long way from a true grid of technology, which would see every element of the IT infrastructure inextricably linked and working together in unison to self-tune and self-heal one another. For one thing, Oracle has only grid-enabled its database and application server so far. Versions of its E-Business Suite of enterprise applications will not get similar treatment until sometime next year. But even then, most companies have IT assets other than Oracle software, so a homogeneous grid is still a long way off.

Nonetheless, Oracle has the right idea in attempting to simplify the management and optimization of the technology infrastructure. IT staff need to spend less time on mundane management tasks if they are to concentrate on providing the more flexible IT infrastructure that enterprises need today.

The latest hype

As Oracle develops further grid technologies, IBM further establishes its own take on grids (autonomic computing), and HP and Sun flesh out their utility computing strategies, customers are likely to hear more and more about self-healing and self-tuning.

This is no bad thing, but it is also worth remembering that self-healing has been promised since the early 1980s. The same problems encountered by AI apply here: there is little room for a gray area, and without rules even the most intelligent software does not know what to do when a given event occurs. That means that grids can only become self-healing once a human has told them how to heal themselves, and that means they are never going to be complete solutions straight out of the box.

All in all, grid computing is a sensible goal to chase. But are truly self-healing, heterogeneous grids just around the corner? Don't believe the hype.

Source: Computerwire/Datamonitor