IBM's Watson-as-a-cloud: Is it a bird? Is it a plane? No, it's another mainframe

Forget point'n'click computing, think long-term data crunching and training

Analysis IBM's attempt to spin its supercomputer-cum-TV star Watson into a $1bn business unit may eventually boost Big Blue's bottom line – but going from beating Jeopardy! to defeating cancer is going to be harder than expected.

The system's decision engine and advanced natural-language processing technology were launched as a new moneymaking machine at a glossy event in New York on Thursday. So far, it seems, the clever mainframe has proved less a panacea and more a pain in the neck for IBM.

As of 2013, about seven years after it was conceived, Watson has beaten the aforementioned gameshow but generated less than $100m in revenue. Some of its projects are "in a ditch", according to the Wall Street Journal, yet IBM CEO Virginia "Ginni" Rometty hopes the tech will be pulling in $10bn a year within a decade.

Watson, though, is not cooperating. It takes a long time to set up and train for new tasks, and when it starts to provide predictions, they're not always much use to begin with, the WSJ claimed. A look at the supercomputer's underlying architecture tells us why.

Software under the hood

Watson gets its magic from a DeepQA analysis engine, which blends together Hadoop, Apache UIMA, and other tools to achieve machine learning: this allows the machine to ingest a large amount of structured and unstructured information, analyse links between facts, come up with likely answers in response to questions, and ultimately rank them in terms of confidence.

"The overarching principles in DeepQA are massive parallelism, many experts, pervasive confidence estimation, and integration of shallow and deep knowledge," IBM wrote in an AI Paper discussing the tech in 2010.

DeepQA required IBM to do fundamental research into diverse areas, such as question processing, relation extraction, linguistic frame extraction, passage matching ensembles, and refining knowledge out of extracted data. All extremely tricky problems, and areas where Big Blue made great strides.

These advances mean that Watson can, given sufficient training and good enough data, offer effective predictions about the best course of action. However, the quality of the insights it can generate is bound by the amount of data fed into it.

For this reason each Watson on-premises project requires lots of fresh data, retraining, and a lengthy time commitment to get the humming silicon brain tackling questions correctly. IBM admits as much in an academic paper [PDF] that discusses the challenges in shifting Watson from Jeopardy! winner to medicine:

Applying DeepQA to any new domain requires adaptation in three areas:

  • Content adaptation involves organizing the domain content for hypothesis and evidence generation, and modeling the context in which questions will be generated;
  • Training adaptation involves adding data in the form of sample training questions and correct answers from the target domain, so that the system can learn appropriate weights for its components when estimating answer confidence;
  • Functional adaptation involves adding new domain-specific question analysis, candidate generation, hypothesis scoring, and other components.
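
The training-adaptation step is why fresh domains demand so many sample questions: the system has to re-learn how much to trust each of its scoring components. A toy stand-in for that weight-learning process, using a simple perceptron-style update (our illustration, not IBM's actual machinery, and with made-up component names), might look like this:

```python
def learn_component_weights(examples, scorer_names, epochs=50, lr=0.1):
    """Toy perceptron-style learner for confidence-estimation weights.

    examples: list of (scores, is_correct) pairs, where scores maps a
    scoring-component name to its evidence score for one candidate answer.
    Returns a weight per component.
    """
    weights = {name: 0.0 for name in scorer_names}
    for _ in range(epochs):
        for scores, is_correct in examples:
            margin = sum(weights[n] * scores.get(n, 0.0) for n in scorer_names)
            if (margin > 0) != is_correct:
                # Nudge weights towards (or away from) the components
                # that fired on this candidate.
                sign = 1.0 if is_correct else -1.0
                for n in scorer_names:
                    weights[n] += lr * sign * scores.get(n, 0.0)
    return weights
```

Even this caricature makes the business problem obvious: without a sizeable stock of domain-specific question-and-answer pairs, there is nothing to learn the weights from, and confidence estimates are junk.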

Think of a mainframe. Watson seems a lot like one of those: it favours long-term relationships, an undisclosed financial outlay, and lock-in by default, since the technology is only fielded by IBM.

That's not a terribly bad thing, mind, as for some organisations a tool like this could be useful. But it does mean you are right to be sceptical when IBM starts portraying Watson as a cloud product that's easy to get started with.

All of which makes Big Blue's plan to offer Watson's impressive brain power as a cloud-based service beguiling, and heavy with the black syrup of marketing.

Whose cache line is it, anyway?

Though Watson would undoubtedly benefit from having more data in a central repository, IBM will have to organise data into relevant domains to maximise confidence in the system, and as far as we can tell DeepQA is not built in such a way that it can be easily segmented into individual domains.

Rather, for IBM's cloud-based service to generate insights greater than the sum of an individual developer's contributed data, it seems Big Blue will have to add in a hierarchical system that can select which information repositories are relevant for solving a particular problem. Hardware-wise, this is all doable (it needed 2,880 Power7 cores, plus Wikipedia and other texts held in 15TB of RAM, to win Jeopardy! in 2011), but it's unclear whether the software is there.
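
If IBM does build that selection layer, its core would be something like a relevance-scored router over per-domain corpora. Purely by way of illustration (nothing below reflects any actual IBM design, and all the names are made up):

```python
def route_question(question, domain_repos, relevance, threshold=0.5):
    """Pick which domain corpora are worth consulting for one question.

    domain_repos: mapping of domain name -> corpus handle
    relevance: function (question, domain_name) -> score in [0, 1]
    Returns domain names ordered most-to-least relevant, above threshold.
    """
    scored = sorted(((relevance(question, name), name) for name in domain_repos),
                    reverse=True)
    return [name for score, name in scored if score >= threshold]
```

The hard part, of course, is not the routing loop but the relevance function itself, which is another model that would need its own training data.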

One thing is for sure – in its current state, Watson-powered projects require heavy development by both IBM and the prospective customer, and though IBM is forming a lab to help work with Silicon Valley firms to create Watson apps, it looks to be a hard road. ®
