Original URL: https://www.theregister.com/2014/03/29/train_your_brain_by_playing_video_games_inside_it/

Watch your brain LIVE in 3D, then train your mind from inside it

Computer gaming revitalizes thinking – comprendo, old farts?

By Rik Myslewski

Posted in Science, 29th March 2014 01:19 GMT

GTC As we age, our brains' cognitive abilities decrease. No surprise there. But one team of researchers has experimental proof of a therapeutic technique that can reverse that decline: video games.

What's more, they've devised a way to directly stimulate your brain to improve its cognitive abilities, and created a 3D digital model, GlassBrain, through which a neurologist – and eventually you – can travel, observing and controlling the brain's inner workings in real time.

Not just any video game will improve cognitive abilities, mind you. The team has created video games specifically designed to challenge users by presenting them with multiple streams of information in a distracting environment, a particularly difficult scenario for older adults.

The research was led by Adam Gazzaley, founding director of the Neuroscience Imaging Center at the University of California at San Francisco. Speaking at the GPU Technology Conference (GTC) in San José this week, Gazzaley said that he approached engineers at LucasArts five years ago to enlist their help in developing such a game.

"They told me they'd be delighted to work with us after spending their entire careers teaching teenagers how to kill aliens," Gazzaley said. "And I found that by supplying some sushi and beer you can get triple-A game developers, artists, and video-game programmers into your lab to work with your post-docs and students and actually create something really new."

GlassBrain video

By modern standards, the team's first effort, NeuroRacer, was a rather simple game. The player used a joystick to steer a cartoon car along a winding road while reacting as rapidly as possible to the signs the car drove under – but only to signs bearing a particular shape or color.

In addition, the game was adaptive, Gazzaley said – it adjusted the challenge level as the player improved, "keeping you right in the sweet spot, which our game developers called the 'flow state'," and the reward levels were set so that both the driving and sign-identifying skills needed to improve in tandem before the player could level up.
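
The article doesn't spell out how NeuroRacer's adaptivity works under the hood, but difficulty "staircases" of this sort usually come down to a few lines of code: raise the challenge when the player beats a target accuracy, ease off when they don't. The sketch below is purely illustrative – the class name, step size, and 80 per cent target are assumptions, not the lab's actual parameters.

```python
# Minimal sketch of an adaptive "staircase" difficulty controller, assuming a
# simple up/down rule; NeuroRacer's actual algorithm is not described in the
# article, so the names and thresholds here are illustrative only.

class StaircaseDifficulty:
    def __init__(self, level=1.0, step=0.05, target_accuracy=0.8):
        self.level = level              # current challenge level (arbitrary units)
        self.step = step                # how much to adjust per block of trials
        self.target = target_accuracy   # accuracy that defines the "sweet spot"

    def update(self, hits, trials):
        """Raise difficulty when the player beats the target, lower it otherwise."""
        accuracy = hits / trials
        if accuracy > self.target:
            self.level += self.step                            # comfortable: make it harder
        else:
            self.level = max(0.0, self.level - self.step)      # struggling: ease off
        return self.level


# Example: a block of 20 sign-identification trials with 18 correct responses
controller = StaircaseDifficulty()
print(controller.update(hits=18, trials=20))  # 0.9 > 0.8, so difficulty rises to 1.05
```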

The researchers then developed a version of the game that could be played while the user was in an MRI scanner or while wearing an electroencephalography cap, so that their brain's activity could be monitored while the game was in progress.

Being proper brain boffins, the team performed a battery of cognitive-assessment tests on each research subject – aged 68 to 80 – before their involvement in the research project. The test subjects were then given a laptop with the game installed, and asked to play it for a month, one hour a day, three days a week.

"Then they come back into the lab," Gazzaley said, "and we see what has changed in their ability to play the game, what's changed in their brain, and what's changed in terms of other cognitive abilities."

The team had baseline data which showed that the cognitive "loss" induced by the distracting signs in the driving game gradually and nearly linearly increased from 27 per cent in 20-year-olds to 63 per cent in 70-year-olds. "Are there any 23-year-olds in the audience today?" Gazzaley asked the keynote crowd. "If you are, this is the top of your cognitive pyramid – you should really go out and treasure today. It's as good as it's going to get."

That may be true, but the experimental data produced by Gazzaley's team showed that the cognitive decline experienced during aging is reversible. Data from a 64-channel EEG cap worn by the test subjects showed that after the month of game-playing, the test subjects' theta brainwaves – associated with activities involving attention, cognitive control, and working memory – improved to a level exceeding that of the 20-year-olds' baseline results.
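
For readers curious what "measuring theta brainwaves" involves in practice, the snippet below shows one conventional way to estimate theta-band (roughly 4–8 Hz) power from a single EEG channel using Welch's method. The sampling rate, band edges, and use of NumPy/SciPy are assumptions for illustration; the team's own analysis pipeline isn't described in this article.

```python
# Illustrative estimate of theta-band power from one EEG channel; the sampling
# rate and band edges below are common defaults, not the lab's values.
import numpy as np
from scipy.signal import welch

def theta_power(samples, fs=256.0, band=(4.0, 8.0)):
    """Mean spectral power in the theta band for a single EEG channel."""
    freqs, psd = welch(samples, fs=fs, nperseg=int(fs * 2))   # 2-second windows
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Example with synthetic data: one minute of a 6 Hz rhythm buried in noise
fs = 256.0
t = np.arange(0, 60, 1 / fs)
channel = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.randn(t.size)
print(theta_power(channel, fs))
```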

"We also found that skills that were not trained in the game – we're not part of the game at all – also improved," Gazzaley said. "So their working memory ability, the ability to hold a face in mind for a short period of time, or their vigilance to a very boring target that appears rarely also improved."

Fly inside your own brain, play with its electrical system

These findings were published in an article in prestigious boffinary rag Nature last September, and Gazzaley and some of the LucasArts developers have formed a new company, Akili Interactive Labs, to create and market such therapeutic video games.

Their first offering, Project: EVO, is built on the Unity 3D mobile gaming engine, and is a tablet-based game with more-modern graphics and improved gameplay.

Screenshot from Project: EVO by Akili Interactive Labs

Project: EVO can save you from fading cognitive abilities, possibly help ADHD

But don't expect to buy Project: EVO on the iTunes Store or Google Play – at least not yet. "The goal is to see if it can go through an FDA pathway and essentially become the world's first prescribed video game," Gazzaley said.

That therapeutic possibility has caught the eyes and wallets of the pharmaceutical industry: Big Pharma giants Pfizer – which makes Aricept for Alzheimer's – and Shire Pharmaceuticals – which makes Adderall for attention-deficit hyperactivity disorder (ADHD) – are early investors in Akili Interactive Labs.

In addition to their investments, Pfizer is funding an investigation to determine whether Project: EVO is effective as a "biomarker or cognitive endpoint for individuals at risk of developing Alzheimer's disease," and Shire is conducting a clinical study of the game's effectiveness at the other end of the age spectrum: children aged 8 to 12 with ADHD.

Gazzaley's lab is also working on new games and examining their effects on other cognitive disorders, including depression, dementia, autism, post-traumatic stress disorder, and schizophrenia. They're also investigating different platforms, including Microsoft Kinect and Oculus Rift.

But to not only determine the games' effect on cognitive functioning but also boost that effectiveness and develop feedback loops between the game and the gamer's brain, a combination of neurofeedback and neuromodulation mechanisms is needed, Gazzaley said.

To bring neural signals into a closed feedback loop, however, sensing and stimulation must operate in real time, he said – and that's why his talk wasn't being given at a meeting of the American Neurological Association, but at the GPU Technology Conference.

"This is really where CUDA-enabled processing and the GPU really comes in," Gazzaley told his audience of CUDA coders and GPU integrators. "It's a large problem: there's lots of independent sources in the brain that we're trying to localize, and in addition we're trying to do it fast – the closest we can get to zero latency."

The lower the latency, the more valuable the tool – not only for understanding what the brain is doing in real time, but also for modifying the game and stimulating specific areas of the brain at the moment the brain-activity data arrives.
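
In outline, the closed loop Gazzaley describes is: sense the EEG, estimate the brain state, then feed that back by adapting the game (neurofeedback) and, potentially, stimulating the brain (neuromodulation) – all within a tight latency budget. The sketch below is schematic only; every function name and the 20 ms budget are placeholders, since the lab's actual interfaces aren't public in this article.

```python
# Schematic of the sense-and-respond loop described above. Every function here
# is a placeholder -- the lab's real EEG, game, and stimulation APIs are not
# shown in the article -- so this only illustrates the shape of the loop and
# why per-iteration latency matters.
import time

LATENCY_BUDGET_S = 0.020   # illustrative target: react within ~20 ms of new data

def closed_loop(read_eeg_chunk, estimate_brain_state, adjust_game, stimulate):
    while True:
        start = time.perf_counter()

        chunk = read_eeg_chunk()             # 1. sense: latest EEG samples
        state = estimate_brain_state(chunk)  # 2. localize sources / score attention
        adjust_game(state)                   # 3. neurofeedback: adapt the game
        stimulate(state)                     # 4. neuromodulation: targeted stimulation

        elapsed = time.perf_counter() - start
        if elapsed > LATENCY_BUDGET_S:
            # Falling behind real time defeats the purpose of the closed loop.
            print(f"warning: iteration took {elapsed * 1000:.1f} ms")
```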

GlassBrain being demoed inside UCSF's Neuroscape Lab

Inside the comfy confines of UCSF's Neuroscape Lab

To be able to visually observe brain activity in real time, the Neuroscape Lab of the University of California, San Francisco's Neuroscience Imaging Center (NIC) has developed what it calls GlassBrain, a Unity3D-based real-time model of activity in the brain of a subject wearing an EEG cap.

GlassBrain can display the real-time brain animation either on a flat-panel 3D display or through a virtual-reality setup such as the Oculus Rift. It maps the real-time data from the EEG cap onto a generic brain model, constructed from a 3D MRI scan, that contains both the brain's structure and its primary connection-fiber pathways. Thanks to the Unity3D game engine, you can fly through the live brain at will, piloting your way using a joystick or the sensors in a virtual-reality headset.

"One of the key features is this real-time EEG data that's really taking use of really sophisticated CUDA processing of the GPU on the Tesla card on our machine here," Gazzaley said. In real time, GlassBrain's CUDA code performs artifact rejection, environmental electrical interference rejection, and calculates predictions of activity on the brain's electrical pathways in response to the EEG signals, with different frequencies mapped to different colored lights traveling along the brain's fiber tracks.

In the show-stopper finale to his keynote, Gazzaley brought onstage Tim Mullen, chief scientist of the GlassBrain project, and Mickey Hart, percussionist and former drummer of the Grateful Dead. Hart donned an EEG cap, and both strapped on Oculus Rift virtual-reality headsets.

Hart then played a new game that Gazzaley and his team are developing, NeuroDrummer, while Mullen navigated through Hart's brain in real time during his game play, which involved a broad range of MIDI-triggered rhythmic inputs; what they saw through their headsets was projected on a large screen behind them.

The effect was impressive, indeed – even to this rather jaded Reg hack – with Mullen being able to look up and down inside Hart's brain, zoom in and out of it, all while colored electrical impulses flowed about the brain's animated fibers.

However, even more impressive was the promise of future such setups in which the brain that was being observed would be the brain of the observer. "What would it be like to enter your own brain," Gazzaley proposed, "and play a video game in the [brain's] structure, in that area, having your challenge be to guide the neural processing that you're seeing, so that you could learn how to control how your brain processes information?"

What would it be like? Personally, I look forward to finding out – after all, your humble Reg reporter is hurtling headlong towards that older adult's 63 per cent "loss" rate.

How about you? If you're over 23, remember, you're already fading. ®