IBM's Watson-as-a-cloud: Is it a bird? Is it a plane? No, it's another mainframe

Forget point'n'click computing, think long-term data crunching and training

Analysis IBM's attempt to spin its supercomputer-cum-TV star Watson into a $1bn business unit may eventually boost Big Blue's bottom line – but going from beating Jeopardy! to defeating cancer is going to be harder than expected.

The system's decision engine and advanced natural-language processing technology were launched as a new moneymaking machine at a glossy event in New York on Thursday. So far, it seems, the clever mainframe has proved less a panacea and more a pain in the neck for IBM.

As of 2013, about seven years after it was conceived, Watson has beaten the aforementioned gameshow but generated less than $100m in revenue. Some of its projects are "in a ditch", according to the Wall Street Journal, yet IBM's CEO Virginia Rometty hopes the tech will generate $10bn per year in 10 years.

Watson, though, is not cooperating. It takes a long time to set up and train for new tasks, and when it starts to provide predictions, they're not always much use to begin with, the WSJ claimed. A look at the supercomputer's underlying architecture tells us why.

Software under the hood

Watson gets its magic from a DeepQA analysis engine, which blends together Hadoop, Apache UIMA, and other tools to achieve machine learning: this allows the machine to ingest a large amount of structured and unstructured information, analyse links between facts, come up with likely answers in response to questions, and ultimately rank them in terms of confidence.
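In very rough terms – this is our own illustrative sketch, not IBM's code, and every function name and weight below is made up – that pipeline of "many experts" plus "pervasive confidence estimation" looks something like:

```python
# Toy sketch of a DeepQA-style pipeline: an "expert" proposes candidate
# answers, several scorers estimate confidence in each candidate, and a
# weighted merge ranks the final list. All names and weights are invented.

def keyword_expert(question, corpus):
    # Propose any corpus entry sharing at least one word with the question.
    words = set(question.lower().split())
    return [title for title in corpus if words & set(title.lower().split())]

def length_scorer(question, candidate):
    # Crude evidence signal: shorter answers are (weakly) preferred.
    return 1.0 / (1 + len(candidate.split()))

def overlap_scorer(question, candidate):
    # Fraction of candidate words that also appear in the question.
    q = set(question.lower().split())
    c = set(candidate.lower().split())
    return len(q & c) / len(c) if c else 0.0

def answer(question, corpus, scorers, weights):
    candidates = set(keyword_expert(question, corpus))
    # Merge each scorer's confidence estimate into one weighted score.
    return sorted(
        candidates,
        key=lambda cand: sum(w * s(question, cand)
                             for s, w in zip(scorers, weights)),
        reverse=True,  # highest-confidence answer first
    )

corpus = ["Toronto", "Chicago", "What is Toronto known for"]
print(answer("Is Toronto in Canada", corpus,
             [length_scorer, overlap_scorer], [0.3, 0.7]))
```

The real system runs thousands of such generators and scorers in parallel and learns the weights from training data – which is exactly why, as we'll see, retraining for a new domain takes so long.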

"The overarching principles in DeepQA are massive parallelism, many experts, pervasive confidence estimation, and integration of shallow and deep knowledge," IBM wrote in an AI Paper discussing the tech in 2010.

DeepQA required IBM to do fundamental research into diverse areas, such as question processing, relation extraction, linguistic frame extraction, passage matching ensembles, and refining knowledge out of extracted data. All extremely tricky problems, and areas where Big Blue made great strides.

These advances mean that Watson can, given sufficient training and good enough data, offer effective predictions about best courses of action. However, the quality of the insights it can generate is bound by the amount of data fed into it.

For this reason each Watson on-premises project requires lots of fresh data, retraining, and a lengthy time commitment to get the humming silicon brain tackling questions correctly. IBM admits as much in an academic paper [PDF] that discusses the challenges in shifting Watson from Jeopardy! winner to medicine:

Applying DeepQA to any new domain requires adaptation in three areas:

  • Content adaptation involves organizing the domain content for hypothesis and evidence generation, modeling the context in which questions will be generated.
  • Training adaptation involves adding data in the form of sample training questions and correct answers from the target domain so that the system can learn appropriate weights for its components when estimating answer confidence.
  • Functional adaptation involves adding new domain-specific question analysis, candidate generation, hypothesis scoring and other components.
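Those three areas can be pictured, very loosely, as one configuration exercise per deployment. The sketch below is entirely our own illustration – IBM publishes no such API – but it shows why each new domain means fresh data, fresh training pairs and fresh code:

```python
# Hypothetical model of DeepQA's three adaptation areas as a per-domain
# configuration object. Everything here is illustrative, not IBM's API.
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class DomainAdaptation:
    # Content adaptation: the corpus for hypothesis/evidence generation.
    corpus: List[str]
    # Training adaptation: sample Q&A pairs used to learn scorer weights.
    training_pairs: List[Tuple[str, str]]
    # Functional adaptation: domain-specific generators and scorers.
    generators: List[Callable] = field(default_factory=list)
    scorers: List[Callable] = field(default_factory=list)

# Moving from quiz show to clinic means rebuilding all three parts.
medicine = DomainAdaptation(
    corpus=["oncology texts", "clinical guidelines"],
    training_pairs=[("Which drug treats condition X?", "drug Y")],
)
print(len(medicine.corpus), len(medicine.training_pairs))
```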

Think of a mainframe. Watson seems a lot like one of those: it favours long-term relationships, an undisclosed financial outlay, and lock-in by default, since this technology is fielded only by IBM.

That's not a terribly bad thing, mind, as for some organisations a tool like this could be useful. But it does mean you are right to be sceptical when IBM starts portraying Watson as a cloud product that's easy to get started with.

All of which makes Big Blue's plan to offer Watson's impressive brain power as a cloud-based service beguiling and heavy with the black syrup of marketing.

Whose cache line is it, anyway?

Though Watson would undoubtedly benefit from having more data in a central repository, IBM will have to organise data into relevant domains to maximise confidence in the system, and as far as we can tell DeepQA is not built in such a way that it can be easily segmented into individual domains.

Rather, for IBM's cloud-based service to generate insights greater than the sum of an individual developer's contributed data, it seems Big Blue will have to add in a hierarchical system that can select which information repositories are relevant for solving a particular problem. Hardware-wise, this is all doable (it needed 2,880 Power7 cores, plus Wikipedia and other texts held in 15TB of RAM, to win Jeopardy! in 2011), but it's unclear whether the software is there.
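What might that missing hierarchical layer look like? Here's a deliberately crude speculation of our own – the repository names and the keyword-overlap heuristic are invented for illustration, nothing more – in which questions are routed to whichever domain repositories look relevant before the expensive pipeline runs:

```python
# Speculative sketch of a routing layer that picks which domain
# repositories are relevant to a question. The repositories and the
# overlap heuristic are our own invention, not anything IBM has described.

def route(question, repositories):
    # repositories: dict mapping domain name -> set of topic keywords
    q = set(question.lower().split())
    scored = {name: len(q & topics) for name, topics in repositories.items()}
    best = max(scored.values())
    # Return every repository tied for the best (non-zero) keyword overlap.
    return sorted(name for name, s in scored.items() if s == best and s > 0)

repos = {
    "medicine": {"drug", "cancer", "dose", "patient"},
    "finance": {"bond", "yield", "equity", "risk"},
}
print(route("what dose treats this cancer patient", repos))  # ['medicine']
```

A real version would need to cope with questions that straddle domains, and with confidence estimates that mean different things in different repositories – precisely the software problem we suspect isn't solved yet.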

One thing is for sure – in its current state, Watson-powered projects require heavy development by both IBM and the prospective customer, and though IBM is forming a lab to help work with Silicon Valley firms to create Watson apps, it looks to be a hard road. ®
