IBM rises to the optimisation challenge
The right tools for the job
In computing, it sometimes pays to specialise. Generic systems can handle most computational tasks, but they rarely excel at any of them.
For larger companies, honing systems to handle specialised tasks involving large amounts of data could help to make data centres more efficient. This is what workload optimisation is for.
Transaction processing is not the same as business analytics, for example. Transactions may be processed independently of each other and pushed through in a queue.
Conversely, some analytics work requires results from one set of analytics to be used by another. And when that starts happening in real time, the computational demands become very specific.
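The contrast can be sketched in a few lines of Python. This is a hypothetical toy example, not IBM code: independent transactions can be drained from a queue in any order, while the second analytics stage cannot start until the first has produced its result.

```python
from queue import Queue

# Transaction-style work: each item is independent, so items can be
# pushed through a queue and processed in any order.
transactions = Queue()
for amount in [100, 250, 75]:
    transactions.put(amount)

processed = []
while not transactions.empty():
    processed.append(transactions.get() * 1.02)  # e.g. apply a 2% fee

# Analytics-style work: stage two consumes stage one's output, so the
# stages must run in sequence -- a dependency the queue above never has.
def stage_one(data):
    return sum(data) / len(data)            # aggregate: compute the mean

def stage_two(mean, data):
    return [x - mean for x in data]         # needs stage one's result

mean = stage_one(processed)
deviations = stage_two(mean, processed)
```

When such dependent stages must complete within a transaction's response window, the hardware and data layout start to matter, which is the point Brown goes on to make.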
That is why Watson, the supercomputer that beat two human opponents in a game of Jeopardy earlier this year, was a workload-optimised system, says Doug Brown, vice-president of global marketing at IBM.
“There may be analytics that require real-time calculations in the middle of a transaction flow, in which case having an aggregated database close to the processing, like IBM’s System z provides, might be best. So it varies even within analytics,” he says.
One of three pillars supporting IBM's Smarter Computing strategy, workload optimisation involves tweaking hardware and software stacks to suit a particular task. IBM claims that operational costs can fall by as much as 55 per cent when using optimised computing solutions.
The second pillar of Smarter Computing focuses on using every aspect of increasingly large data volumes for analytic purposes. In a world where large amounts of data come from an increasingly wide array of sources, this becomes ever more important.
Finally, the strategy’s cloud component emphasises the role of cloud-based technologies in helping to optimise IT service delivery.
The workload optimisation component of Smarter Computing focuses on three sub-elements: hardware, software and domain knowledge. In many cases, the hardware and software components can be largely generic.
It is the domain knowledge, for example, that gave Watson the smarts to process language in real time and query a huge database of information to come up with the answers that beat its human opponents.
Workload optimisation can be implemented in a variety of ways. Customers can buy and configure their own systems, which requires a significant amount of IT expertise, time and effort.
They can purchase an appliance, customised for a specific task. Or they can buy a system that has been pre-integrated by IBM to support specific computing tasks.
The third approach offers the best of both worlds: it spares customers much of the integration effort while preserving their design flexibility. The pre-integration involves collaboration between IBM's hardware and software teams.
For example, IBM's DB2 team might optimise the database for use on a clustered set of Power 7 servers. This might enable the customer to get more performance from WebSphere and DB2 using a specific hardware footprint.
Hardware plays an important role in workload optimisation, and Watson was built on clusters of Power7-based servers. The Power7 is an eight-core processor running 32 simultaneous threads, designed to be optimised for a variety of workloads through the addition of specific software.
“Watson used extreme analytics, and massive threading capabilities from the Power 750 clusters,” Brown says.
“That capability with the clusters created a lot of simultaneous calculations. We combined those hardware capabilities with the software and the domain knowledge.”
A potential downside of vendor-integrated analytics systems is the price premium. They tend to ship with rich sets of features, some of which customers may not use.
However, Brown argues that reduced complexity and configuration effort cut operational expenses, offsetting part of the capital outlay.
IBM also offers a service component to the sale, maintaining and tweaking the systems as customers’ needs evolve.
Made to measure
In any case, there are different systems to fit different needs. You won’t see a single page outlining IBM’s workload-optimised offerings because workload optimisation permeates much of what the company does.
For example, its Smart Analytics systems are optimised to provide businesses with analytics capabilities. Its CloudBurst series of cloud computing servers, available on both its System x and Power platforms, is another optimised offering.
The vendor also offers systems such as the Netezza analytics appliances, designed to give companies an affordable entry point into business analytics.
As the world moves towards more efficient computing infrastructures, workload optimisation will continue to gain traction among larger companies. Smaller businesses, with lighter performance requirements and fewer space and power constraints, are likely to prefer more generic systems. ®