Boffins turn to Wii tech for speech-loss therapy
Communicate again, using gestures
Researchers at London's City University are to try out motion control gaming hardware - Nintendo's Wii Remote, Microsoft's Kinect and Sony's PlayStation Move - to see if the technology can help stroke victims cheaply and easily regain the power to communicate.
One of the likely outcomes of a stroke is aphasia, effectively the loss of, or a severe reduction in, the ability to use spoken or written language. Aphasia can have other causes too, and treatments include teaching sufferers a rudimentary form of sign language - gestures, basically - to allow them to communicate non-verbally.
For stroke sufferers, many of whom also suffer from a degree of paralysis, this isn't easy, requiring one-on-one therapy, which is expensive.
However, the City University team reckon a computer program linked to a motion control system might make for a cheaper alternative - and one that aphasia sufferers could use in their own home.
"Gesture tracking and recognition technologies are becoming a ubiquitous part of new computing and gaming environments," said Stephanie Wilson, Senior Lecturer in Human-Computer Interaction Design (HCID) at City University London. "We will evaluate the suitability of such technologies in aphasia rehabilitation.”
Jane Marshall, Professor of Aphasiology at City University London, said: "Computer-based treatments have been shown to improve verbal language skills in previous studies, but this is the first time that gestures will be addressed."
The team said they will develop a prototype rig that will allow users to practise gesturing and receive instant feedback.
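The article doesn't say how the prototype would judge a gesture, but one plausible approach to instant feedback is template matching: resample the user's hand trajectory, compare it against a therapist-recorded template, and report whether it's close enough. The sketch below is purely illustrative - the function names, the 2-D point format, and the threshold value are all assumptions, not details from the City University project.

```python
import math

def resample(points, n=32):
    """Linearly resample a 2-D trajectory to n evenly spaced points."""
    # Cumulative arc length along the captured trajectory.
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out = []
    for i in range(n):
        target = total * i / (n - 1)
        # Walk to the segment containing the target arc length.
        j = 1
        while j < len(dists) - 1 and dists[j] < target:
            j += 1
        seg = (dists[j] - dists[j - 1]) or 1.0
        t = (target - dists[j - 1]) / seg
        (x0, y0), (x1, y1) = points[j - 1], points[j]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def gesture_score(attempt, template, n=32):
    """Mean point-to-point distance after resampling; lower means closer."""
    a, b = resample(attempt, n), resample(template, n)
    return sum(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in zip(a, b)) / n

def feedback(attempt, template, threshold=0.15):
    """Instant pass/fail feedback against an assumed similarity threshold."""
    return "good match" if gesture_score(attempt, template) < threshold else "try again"
```

In practice a system like this would consume 3-D joint positions from the sensor's skeleton stream rather than 2-D points, and something more forgiving of timing differences (dynamic time warping, say) would likely replace the naive point-to-point comparison.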
According to the Stroke Association, around 150,000 Britons suffer a stroke each year, while some 45,000 new cases of aphasia are diagnosed. ®
Oh no I can see clippy making a return :(
"Then, again, can you imagine a Microsoft patient care app?"
"Windows for stiffs"
I'll skip remarks about "Blue screens of death."
No, I will not provide a title.
Shame MS downgraded Kinect's camera at the last minute, killing the chance for sign language reading. It might have been useful for this type of application. Better than making subjects grip a dildo^H^H^H^H^H^H controller.
Then, again, can you imagine a Microsoft patient care app?
'It looks like you're trying to summon a nurse, do you want help with that? Are you sure?'
'You appear to be turning blue, please adjust the camera so I can see you properly'