Watson? Commercial – not super – computer

Off-the-shelf gear ‘gets’ humans

Now that IBM’s Watson has pounded the best human Jeopardy competitors into a fine slurry, let’s take stock. Our human proxies took their ass-kicking in good spirits, with Ken Jennings writing on his ‘Final Jeopardy’ card, “I for one welcome our new computer overlords.” (For the sake of adding a bit more inane trivia, the correct Jeopardy-style response to his phrase would be “Who was Kent Brockman on The Simpsons?”)

And I believe I’ve found the best Register reader comment so far: In response to this story’s subtitle, “Robots will keep us as pets,” came this clever bit: “There already are humans kept as pets by machines - they're called ‘iPhone owners!’” LinkOfHyrule, we’re not worthy.

It’s gratifying to see so much coverage of a tech story in the non-tech media, though some of it is frustrating as well. A good many of our fellow carbon-based life forms refer to Watson as a “supercomputer” and laud its ability to do lightning-fast “searches”. Neither of these describes what Watson is or what Watson does.

First of all, it’s not a supercomputer. It’s a commercial system – or rather, a bunch of commercial systems lashed together for parallel processing purposes. The hardware is readily available POWER-based gear that can run either IBM’s AIX Unix operating system or Linux.

It’s the same box that’s running commercial apps like SAP and Oracle in thousands of companies. Watson is made up of 90 4-socket IBM Power 750 systems with 360 8-core POWER7 processors running at 3.55GHz, with 16 GB of memory per server. The systems are connected via 10 GbE networking.
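For the arithmetically inclined, the figures quoted above can be tallied as a quick sanity check. The per-server numbers come straight from the paragraph above, so the totals are only as good as those figures:

```python
# Aggregate the Watson cluster specs quoted above.
servers = 90                 # 4-socket IBM Power 750 boxes
sockets_per_server = 4       # one POWER7 processor per socket
cores_per_processor = 8
mem_gb_per_server = 16       # as quoted in the article

processors = servers * sockets_per_server
cores = processors * cores_per_processor
total_mem_gb = servers * mem_gb_per_server

print(processors)     # 360 processors, matching the figure above
print(cores)          # 2880 cores across the cluster
print(total_mem_gb)   # 1440 GB of RAM in total
```

So a "mere" 2,880 commercial POWER7 cores, all working in parallel on the same clue.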

There is also a misconception about how Watson comes up with answers – it’s not ‘searching’ for them as we typically think of the search process. You can’t do that with many Jeopardy questions due to their indirect nature.

After Watson is asked a question, it analyzes the question and topic, and pulls hundreds of possible “candidate” answers from hundreds of sources, and then begins hypothesis generation. Thousands of pieces of “evidence” are sorted to weigh the validity of the candidate answers.

These candidate answers and their “proofs” are scored and synthesized using deep analysis algorithms to create answer “models” from which the final answer choices – and Watson’s confidence in each – are derived. Watson, of course, goes with the “highest-confidence” response. On Jeopardy, this process took place before the human contestants who knew the answers could hit their buzzers.
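IBM hasn’t published DeepQA as a code listing, but the generate-candidates, weigh-evidence, pick-the-highest-confidence loop described above can be caricatured in a few lines of Python. To be clear: this is a toy sketch, not Watson’s actual algorithm; the candidate answers, evidence scores, and the naive averaging used here are all invented for illustration:

```python
# Toy sketch of the candidate-scoring process described above.
# NOT Watson's DeepQA: candidates and evidence scores are made up,
# and simple averaging stands in for IBM's deep analysis algorithms.

def best_answer(candidates):
    """Return the candidate with the highest aggregate confidence.

    `candidates` maps each candidate answer to a list of evidence
    scores in [0, 1], one score per piece of weighed "evidence".
    """
    scored = {
        answer: sum(evidence) / len(evidence)
        for answer, evidence in candidates.items()
    }
    winner = max(scored, key=scored.get)
    return winner, scored[winner]

# Hypothetical candidates for a clue, each with toy evidence scores:
candidates = {
    "Who is Kent Brockman?": [0.9, 0.8, 0.95],
    "Who is Homer Simpson?": [0.4, 0.3],
    "What is Springfield?":  [0.2],
}
answer, confidence = best_answer(candidates)
print(answer, round(confidence, 2))  # highest-confidence response wins
```

The real system does this across hundreds of candidates and thousands of evidence passes, in parallel, fast enough to beat a human thumb to the buzzer.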

No supercomputer required

What is coming across in the media, fortunately, is why all of this matters: real-time answers that are accurate despite the human frailty of the information we provide and the questions we ask. We human types are ambiguous. We have nearly endless ways to say the same thing. Our statements and questions are unstructured, and must be interpreted through the context in which they’re made.

Computers want things to be black or white. They have to go to great lengths (as we can see above) to be able to figure out the meanings in human statements or questions. Humans are great at processing ambiguous, unstructured data; our brains are wired to see patterns and put together theories as to why those patterns occur. Computers are great at doing the grunt work of sifting through masses of evidence to either support or disprove our theories.

This Jeopardy exercise isn’t about computers besting humans. It’s really about how collections of computing hardware and software can be optimized to understand humans better, and to understand what we’re trying to get them to do. A lot of time, effort, and money is expended in getting real-world data into a form where it can be understood and processed by digital devices like computers. Watson is the best recent example of a machine crossing over the divide between human and machine-style thinking.

This means that in the future, we’ll be able to spend more time on actual human work and less time on generating digital-compatible data to feed the machines. This will pay concrete dividends even in the near term. Information from thousands of patients’ vital signs and millions of clinical reports and doctors’ notes could be synthesized to provide diagnoses that aren’t guesswork.

Businesses can make sense of staggering amounts of data that have been “noise” until now. Who knows – maybe our consumer information and requests and incoherent rants could be analyzed in such a way that we get actual help from a help desk. No supercomputer required. ®
