IBM rises to the optimisation challenge

The right tools for the job

In computing, it sometimes pays to specialise. Generic systems will handle most computational needs, but they may not excel at them.

For larger companies, honing systems to handle specialised tasks involving large amounts of data could help to make data centres more efficient. This is what workload optimisation is for.

Transaction processing is not the same as business analytics, for example. Transactions may be processed independently of each other and pushed through in a queue.

Conversely, some analytics work requires results from one set of analytics to be used by another. And when that starts happening in real time, the computational demands become very specific.
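The contrast can be sketched in code: independent transactions can drain a queue in any order, while dependent analytics stages must consume each other's results before proceeding. This is a hypothetical illustration of the two workload shapes, not IBM code:

```python
from queue import Queue

# Independent transactions: each item can be processed in isolation,
# so a simple worker draining a queue is enough.
transactions = Queue()
for amount in (10, 25, 40):
    transactions.put(amount)

processed = []
while not transactions.empty():
    processed.append(transactions.get() * 2)  # stand-in for real work

# Dependent analytics: stage two needs the complete output of stage
# one, so the stages cannot simply be queued independently.
def stage_one(data):
    return [x * 2 for x in data]

def stage_two(stage_one_results):
    return sum(stage_one_results) / len(stage_one_results)

result = stage_two(stage_one([10, 25, 40]))  # 50.0
```

When that second pattern has to deliver results mid-transaction, the hardware and data layout underneath it start to matter a great deal.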

Superhuman powers

That is why Watson, the supercomputer that beat two human opponents in a game of Jeopardy! earlier this year, was a workload-optimised system, says Doug Brown, vice-president of global marketing at IBM.

“There may be analytics that require real-time calculations in the middle of a transaction flow, in which case having an aggregated database close to the processing, like IBM’s System z provides, might be best. So it varies even within analytics,” he says.

One of three pillars supporting IBM's Smarter Computing strategy, workload optimisation involves tweaking hardware and software stacks to suit a particular task. IBM claims that operational costs can fall by as much as 55 per cent when using optimised computing solutions.

The second pillar of Smarter Computing focuses on using every aspect of increasingly large data volumes for analytic purposes. In a world where large amounts of data come from an increasingly wide array of sources, this becomes ever more important.

Finally, the strategy’s cloud component emphasises the role of cloud-based technologies in helping to optimise IT service delivery.

Stay flexible

The workload optimisation component of Smarter Computing focuses on three sub-elements: hardware, software and domain knowledge. In many cases, the hardware and software components can be largely generic.

It is the domain knowledge, for example, that gave Watson the smarts to process language in real time and interrogate a huge database of information to come up with the answers that beat its human opponents.

Workload optimisation can be implemented in a variety of ways. Customers can buy and configure their own systems, which requires a significant amount of IT expertise, time and effort.

They can purchase an appliance, customised for a specific task. Or they can buy a system that has been pre-integrated by IBM to support specific computing tasks.

The third approach offers the best of both worlds: customers get the convenience of pre-built integration while retaining design flexibility. The pre-integration involves collaboration between IBM's hardware and software teams.

For example, IBM's DB2 team might optimise the database for use on a clustered set of Power7 servers. This might enable the customer to get more performance from WebSphere and DB2 using a specific hardware footprint.

Hardware plays an important role in workload optimisation, and Watson was based on a collection of Power7-based clusters. This eight-core, 32-thread processor is designed to be optimised for a variety of different workloads with the addition of specific software solutions.

“Watson used extreme analytics, and massive threading capabilities from the Power 750 clusters,” Brown says.

“That capability with the clusters created a lot of simultaneous calculations. We combined those hardware capabilities with the software and the domain knowledge.”
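The effect Brown describes, many calculations running simultaneously across heavily threaded hardware, can be approximated in miniature with a thread pool. This is a hypothetical sketch of fanning independent scoring work across workers, not Watson's actual code:

```python
from concurrent.futures import ThreadPoolExecutor

def score_candidate(answer_id):
    # Stand-in for one of the many independent scoring calculations
    # a system like Watson runs against candidate answers.
    return answer_id * answer_id % 97

# Fan the independent calculations out across worker threads,
# analogous to spreading work over the cores and hardware threads
# of a cluster.
with ThreadPoolExecutor(max_workers=32) as pool:
    scores = list(pool.map(score_candidate, range(100)))

best = max(range(100), key=lambda i: scores[i])
```

The pattern only pays off because each calculation is independent; the domain knowledge is what decides which calculations are worth running at all.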

A potential downside of vendor-integrated analytics systems is the price premium. They tend to ship with rich sets of features, some of which customers may not use.

However, Brown argues that the cost savings in reduced complexity and configuration cut some of the operational expenses, offsetting part of the capital expenditure.
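Whether the premium pays off is simple arithmetic. With hypothetical figures (an assumed 20 per cent capital premium set against the up-to-55-per-cent operational saving IBM claims), the break-even point falls out directly:

```python
# Hypothetical figures for illustration only: the 55% opex saving is
# IBM's claimed ceiling, and the 20% capex premium is an assumption.
generic_capex = 100_000.0
premium_capex = 120_000.0                       # assumed 20% premium
annual_opex_generic = 60_000.0
annual_opex_optimised = annual_opex_generic * (1 - 0.55)

annual_saving = annual_opex_generic - annual_opex_optimised   # 33,000
extra_capex = premium_capex - generic_capex                   # 20,000
breakeven_years = extra_capex / annual_saving
print(round(breakeven_years, 2))  # 0.61 -- premium recovered within a year
```

With less dramatic opex savings the payback period stretches accordingly, which is why the calculation is worth doing per workload rather than taken on trust.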

IBM also offers a service component to the sale, maintaining and tweaking the systems as customers’ needs evolve.

Made to measure

In any case, there are different systems to fit different needs. You won’t see a single page outlining IBM’s workload optimised offerings because workload optimisation permeates a lot of what it does.

For example, its Smart Analytics systems are optimised to provide businesses with analytics capabilities. Its CloudBurst series of cloud computing servers, available on both its System x and Power platforms, is another optimised offering.

The vendor also offers systems such as the Netezza analytics appliances, designed to lower the barrier to entry for business analytics.

As the world moves towards more efficient computing infrastructures, workload optimisation will continue to gain traction among larger companies. Smaller businesses with lower performance requirements and fewer space and power constraints are likely to prefer more generic systems. ®
