Control your home with thought alone

TWO friends meet in a bar in the online environment Second Life to chat about their latest tweets and favourite TV shows. Nothing unusual in that – except that both of them have Lou Gehrig’s disease, otherwise known as amyotrophic lateral sclerosis (ALS), and it has left them so severely paralysed that they can only move their eyes.

These Second Lifers are just two of more than 50 severely disabled people who have been trying out a sophisticated new brain-computer interface (BCI). Second Life has been controlled using BCIs before, but only to a very rudimentary level. The new interface, developed by medical engineering company G.Tec of Schiedlberg, Austria, lets users freely explore Second Life’s virtual world and control their avatar within it.

It can be used to give people control over their real-world environment too: opening and closing doors, controlling the TV, lights, thermostat and intercom, answering the phone, or even publishing Twitter posts.

The system was developed as part of a pan-European project called Smart Homes for All, and is the first time the latest BCI technology has been combined with smart-home technology and online gaming. It uses electroencephalograph (EEG) caps to pick up brain signals, which it translates into commands that are either relayed to controllers in the building or used to navigate and communicate within Second Life and Twitter.
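The relay step described above can be pictured as a simple dispatch layer: a decoded command either goes to a smart-home controller or to an online service. The sketch below is purely illustrative; the command names and routing are assumptions, not G.Tec's actual API.

```python
# Hypothetical sketch of the relay step: a decoded BCI command is
# dispatched either to the building's smart-home controllers or to
# an online service. All names are illustrative assumptions.

SMART_HOME = {"lights_on", "lights_off", "door_open", "tv_on"}
ONLINE = {"tweet", "sl_move_forward", "sl_chat"}

def dispatch(command: str) -> str:
    """Route a decoded command to the appropriate back end."""
    if command in SMART_HOME:
        return f"home-controller <- {command}"
    if command in ONLINE:
        return f"online-service <- {command}"
    return "ignored: unknown command"

print(dispatch("lights_on"))  # home-controller <- lights_on
print(dispatch("tweet"))      # online-service <- tweet
```

The point of the split is that the same decoding front end can drive very different back ends, which is what lets one interface control both a thermostat and a Second Life avatar.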


In the past, one of the problems with BCIs has been their reliability, and they have tended to be limited in the number of functions that can be controlled at once, says John Gan of the BCI group at the University of Essex, UK. Like most BCI systems, G.Tec’s interface exploits an involuntary increase in a brain signal called P300 that occurs in response to an unexpected event.

To activate a command, the user focuses their attention on the corresponding icon on a screen, such as “Lights On”, while the EEG cap records their P300. The icons are flashed randomly, one at a time, and it is possible to tell which icon the user is focusing on by correlating a spike in the P300 with the timing of that icon’s flashes, says Guenter Edlinger, G.Tec’s CEO. He will be presenting the system at the Human-Computer Interaction International conference in Orlando, Florida, this month.
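The detection scheme can be sketched in a few lines: average the EEG epochs that follow each icon's flashes, and pick the icon whose average shows the strongest response in the P300 window. The simulation below is a minimal sketch with synthetic data, assuming an idealised P300 bump and arbitrary icon counts and epoch lengths; it is not G.Tec's actual decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

N_ICONS = 6           # icons on screen (illustrative; the real system shows 40+)
FLASHES_PER_ICON = 20
EPOCH_LEN = 50        # samples of EEG recorded after each flash

# Simulate post-flash EEG epochs: every flash yields noise, but flashes
# of the attended icon also evoke a P300-like positive deflection.
target_icon = 3
p300 = np.zeros(EPOCH_LEN)
p300[25:35] = 2.0     # idealised P300 bump in the response window

epochs = {}
for icon in range(N_ICONS):
    noise = rng.normal(0.0, 1.0, (FLASHES_PER_ICON, EPOCH_LEN))
    epochs[icon] = noise + (p300 if icon == target_icon else 0.0)

def classify(epochs):
    """Average each icon's epochs and return the icon with the
    strongest mean response in the P300 window."""
    scores = {icon: ep.mean(axis=0)[25:35].mean()
              for icon, ep in epochs.items()}
    return max(scores, key=scores.get)

print(classify(epochs))  # prints 3, the attended icon
```

Averaging over repeated flashes is what makes this robust: the noise cancels out while the time-locked P300 does not, so the attended icon stands out even in a noisy signal.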

Unusually, G.Tec’s system works better the more functions are added. That is because when there are more icons on the screen, it comes as a bigger surprise when the target icon flashes, creating a stronger P300 response. More than 40 icons can be displayed at once, and submenus make it possible to add even more options.
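The arithmetic behind that surprise effect is straightforward: with n icons flashed one at a time in random order, the attended icon flashes on only 1/n of trials, so larger menus make the target flash a rarer event. A quick illustration (the icon counts are arbitrary examples):

```python
# With n icons flashed one at a time, the attended icon flashes
# on only 1/n of trials; rarer target flashes mean a more
# surprising event, and hence a stronger P300 response.
for n in (10, 20, 40):
    print(f"{n} icons -> target flash probability {1 / n:.3f}")
```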

G.Tec’s system has been tested at the Santa Lucia Foundation Hospital in Rome, Italy. “BCIs are definitely beginning to make the transition out of the lab,” says Ricardo Chavarriaga, a BCI researcher at the Swiss Federal Institute of Technology in Lausanne.

G.Tec says it is working on adding wheelchair control as a function, to help give users more mobility. “The point is that they can start making their own decisions,” says Edlinger.

Source | NewScientist
