Lenovo eyes peeper-pursuing PC controller
From mouse balls to eyeballs
Do I spy with my little eye, something beginning with IT? Lenovo has rethought how we interact with the personal computer, debuting the world's first laptop with integrated eye control.
Yes, users can point, select and scroll, as well as zoom-in and browse webpages, simply by moving their eyeballs, apparently.
The company has built 20 fully-functional laptops using eye-tracking tech from Swedish manufacturer Tobii Technology.
Two infra-red sources bounce invisible light off the user's retinas. The reflected light is picked up by two cameras, and the data is used to calculate exactly where the user is looking.
Track the position over time and you can map out eye movements as gestures, just as you'd track a user's finger moving over a trackpad.
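The gesture idea above can be illustrated with a minimal sketch of dwell-based selection, the simplest gaze "click": if the gaze point stays within a small radius for long enough, treat it as a selection. The sample format, thresholds, and function names here are hypothetical, not Tobii's API.

```python
# Minimal sketch of dwell-based gaze selection. Assumes a tracker that
# delivers (timestamp_seconds, x_px, y_px) samples; all names and
# thresholds here are illustrative assumptions, not Tobii's actual API.
import math

DWELL_RADIUS_PX = 40   # tolerance for natural gaze jitter
DWELL_TIME_S = 0.8     # how long the eye must rest to count as a "click"

def detect_dwells(samples):
    """Return (x, y, t) points where gaze rested long enough to select."""
    dwells = []
    anchor = None  # (t, x, y) where the current fixation started
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) > DWELL_RADIUS_PX:
            anchor = (t, x, y)          # gaze moved away: restart fixation
        elif t - t0 >= DWELL_TIME_S:
            dwells.append((x0, y0, t))  # held still long enough: select
            anchor = None               # reset so one dwell fires only once
    return dwells
```

A real system would run this over a live sample stream and feed richer movement patterns (saccades between regions, for instance) into a gesture recogniser, much as a trackpad driver turns finger traces into swipes.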
The prototypes are on display at this year's CeBIT trade show in Hannover, Germany, and while the technology is still in the early stages of its development, Lenovo said it is working hard to bring Tobii's eye-tracking tech to consumers. Tobii reckons a commercially viable system will arrive in about two years' time.
Such technology may be ideal for those with disabilities - Gareth the videogamer, who uses his chin to play games, springs to mind - but could eye-controlled devices put extra unnecessary strain on our delicate blinkers? What are your views? ®
You need BOTH hands?
Stop boasting :-p~
When I'm working with a ton of windows open and frequently swapping between them, and my attention is distracted, I already sometimes catch myself semi-consciously attempting to switch focus to a window by glancing at it and blinking - obvious displacement activities for moving the cursor and clicking. So I think this would be a great addition to keyboard- and mouse-based UI navigation. (Presumably the mouse would still be needed for fine detail work; I can't see it having the resolution for, e.g., accurately selecting blocks of text in a document.)
Maybe I'm lazy
But my eyes feel tired just thinking about this!
Couple of things pop to mind
Firstly, touch typing - you might not be looking at the screen at all, let alone the active window, while you type.
Second, multi-monitor setups - does each screen need its own sensor? And what happens for those of us who read output on one screen while keeping window focus on another to enter the commands that create said output?
It'll be fantastic for the disabled users, but less so for the folk who have either twitchy or lazy eye(s)!!
I suppose I'd have to try it first, but...
I'm generally not too interested in fancy features. There are a few essentials that I really, really want or I feel disconnected, but those mostly revolve around the keyboard and wanting a trackpoint, because it saves moving over to the mouse and back again. Which reminds me, I really should teach this wm to take keyboard shortcuts again.
I sincerely hope I'll never have to do without the ability to touch-type (knock on wood) but I imagine that in that case this sort of thing suddenly becomes a lot more interesting. So I certainly don't mind that this technology exists, even if I hopefully won't need to deploy it.