Mac daddy predicts all-knowing, all-seeing UI

'I have seen the future, and it's in your ear'

Macworld In the future, you'll use a speech-based interface to access all the world's knowledge – including your own personal memories – stored in the cloud, according to a legendary engineer who was a member of the team that designed Apple's original Macintosh user interface.

"More and more people have smartphones in their pockets ... and the best way to interact with them is going to be a conversational user interface – as long as we can get natural language understanding to work," said Bill Atkinson, a member of the original Macintosh team, the principal designer of the Mac's pioneering user interface, and the author of QuickDraw, MacPaint, and HyperCard.

"Without that, forget it," Atkinson told his audience at Wednesday's Macworld Expo Industry Forum.

Explaining natural-language conversations, Atkinson said: "This is not speech recognition, isolated utterance of something, but the actual understanding of a deep body of knowledge of how the world works, understanding the flow of the conversation – what's being talked about – and being able to interact intelligently that way. And so that's going to require a lot more than just speech recognition."

A conversational interface is needed, Atkinson said, because the standard keyboard-and-mouse, point-and-click – or even point-and-tap – metaphor is unsuited to mobile computing. In a mobile world, "The user interface is going to change. ... It'll be a conversational interface. You'll talk to your virtual personal assistant, who gets to know you.

"So people will have a smartphone in their pocket or their purse, but they'll probably have an earpiece," he said. "Right now there are Bluetooths with just a speaker and a microphone, but they will eventually have a speaker, a microphone, and a video camera.

The central component of the conversational interface will be what Atkinson called a personal virtual assistant. "The way you interact with things in a conversational way is going to be through a personal assistant. That assistant will hear, and see, and speak, and – if you wish – record everything that you hear and see."

The two "key apps" that he envisions will be the answering of question and the storage of personal memories: "A 'memory prosthesis', if you will," he said.

"You're going to actually ask questions and get answers. You'll point to a building and say 'What's that building?', and the video camera on your earpiece is going to see where you're pointing and say, 'Oh, that's the Bank of America building'."

His desire for a memory prosthesis, it appears, is personal. "I'm about to turn 60," he said, "and I've got most of my memories stored in my wife's brain – but what happens when she starts forgetting?"

He envisions a day when "every conversation you've had, and every place you've visited, and every memory that you want to keep could be stored on the cloud. It's recorded by your little earpiece that's got a video camera.

"I want to be able to go back and say: 'What was that restuarant we went to? And what was the food that we thought was good there?' And when I see somebody: 'Okay, I know his name is Bill, but what was his last name? Where does he live? What does he do?', and I'm spacing out on that stuff, [but] my assistant is watching, and understands the situation, and ... is filling me in."

The heavy lifting required for the personal virtual assistant, Atkinson said, will take place in the cloud. "All your local device has to do is a little bit of feature extraction and data compression so that what you send to the network is compact. But the actual intelligence, the body of all human knowledge and the recognizing of your speech is really actually going to happen in the network."
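Atkinson didn't put code to that claim, but the shape of the split is easy enough to sketch. In the toy Python below (the frame sizes, the feature choice, and the imagined cloud speech service are purely illustrative assumptions, not anything he specified), the device boils a second of microphone audio down to coarse spectral features and compresses them, leaving the actual recognition to whatever sits at the far end of the network.

import zlib
import numpy as np

def extract_features(audio, frame_len=400, hop=160):
    # Chop the waveform into overlapping frames and keep a coarse
    # log-magnitude spectrum per frame - a crude stand-in for the kind
    # of front-end features a real recogniser would want.
    frames = [audio[i:i + frame_len] for i in range(0, len(audio) - frame_len, hop)]
    windowed = np.stack(frames) * np.hanning(frame_len)
    spectra = np.abs(np.fft.rfft(windowed, axis=1))
    return np.log1p(spectra[:, :40]).astype(np.float16)  # low bins only, half precision

# One second of fake 16kHz audio stands in for the earpiece capture.
audio = np.random.randn(16000).astype(np.float32)

features = extract_features(audio)
payload = zlib.compress(features.tobytes())

print("raw audio:  %d bytes" % audio.nbytes)
print("features:   %d bytes" % features.nbytes)
print("compressed: %d bytes (all that would cross the network)" % len(payload))

# In a real system the compressed payload would be shipped to a cloud
# speech service, which holds the language models and does the recognising.

The point of the exercise is the byte counts: the compressed feature payload is a small fraction of the raw audio, which is exactly the "compact" upload Atkinson is describing.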

It's a Back to the Future moment for Atkinson: "We're going to move back in the direction of the dumb terminal and the telephone, that there's less intelligence on you and more in the cloud."

The natural language ideal is closer than we might think, Atkinson said, citing IBM's Jeopardy-playing Watson QA super, which is scheduled to appear on that long-running game show this Valentine's Day, and which has already bested two top Jeopardy players in a test run at IBM's TJ Watson Research Center this January:

Watson listens to the question posed by the game-show host, rummages through its database of around 200 million pages of "natural language content", then buzzes in and speaks what it believes to be the correct answer.

To Atkinson, when the viewing public sees Watson at work, it will be an epochal event, "much like the first pictures from Apollo 11 of the earth floating against the void, telling people for the first time, down in a gut-level feeling, that we're all on this planet together and we all gotta make it work. We knew that the earth was round, but we didn't KNOW that the earth was round.

"When we see a computer intelligently interacting by a natural-language interface, people are going to want it and the technology will be driven toward that."

For Atkinson, the development of a natural-language, cloud-based conversational interface for mobile computing isn't a matter of "if" – it's a matter of "when".

"So, when's it going to happen? I would be surprised if it isn't happening in 10 years from now, and I would be surprised if it's happening in two years from now."

Citing computing visionary Alan Kay's description of personal computers as 'slaves without guilt', Atkinson is convinced of the inevitability of the day when we'll all converse with our own personal virtual slaves, er, assistants.

"I know it's going to happen," he said. ®

Bootnote

Atkinson, in addition to spending most of his time these days as a professional nature photographer, is also an iOS developer with an eponymous app, Bill Atkinson PhotoCard, available for the iPhone, iPad, and iPod touch.

As such, and after noting that his app has only fifty thousand users, he has some advice for his former employer: "The App Store is a haystack of over 300,000 apps, and even a shiny little needle gets lost in that. If you're not one of the top 50 apps that have been there for a long time, [you'll have] a hard time getting discovered.

"Apple has some work to do in making the App Store better at matching customers with software they'd like to use if they knew it existed."

Perhaps by providing them with all-knowing, all-seeing personal virtual assistants?
