Controlling your smart home using your brain
Could it be possible to switch off a light or to flick between TV channels through brain activity alone? At their platform for research into connected homes, Orange Labs are starting to explore the potential of neural interfaces. Orange's R&D network recently tasked the Inria Rennes - Bretagne Atlantique centre with delivering a proof of concept encapsulating the very latest scientific developments in the field.
Science fiction? Not any more. At least, not entirely. When your brain makes a decision, its neurons produce electrical impulses. An EEG headset can be used to capture these signals on the scalp. Although the technology remains rudimentary, scientists are now able to interpret the signal produced, before then converting it into a command. This is what is known as a brain-computer interface (BCI). For the past fifteen years or so, the Hybrid research team in Rennes has been designing algorithms aimed at improving computer processing of these brain signals. This research paved the way for OpenViBE, a software platform that is now used worldwide to develop BCI applications, and it is this expertise that Orange Labs are now interested in.
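To make the pipeline concrete, here is a toy sketch of the kind of processing involved: turning a window of EEG samples into a discrete command. This is purely illustrative (it is not OpenViBE code, and the flicker frequencies and commands are invented for the example); real systems add filtering, artifact rejection and trained classifiers.

```python
import numpy as np

np.random.seed(0)  # make the synthetic example reproducible

FS = 250  # sampling rate in Hz, typical for consumer EEG headsets

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

def decode_command(eeg_window, fs=FS):
    """Map an EEG window to a command by comparing two frequency bands.

    In an SSVEP-style interface, each on-screen icon flickers at its own
    frequency; attending to an icon boosts that frequency in the EEG.
    """
    p_light = band_power(eeg_window, fs, 9, 11)   # hypothetical 10 Hz icon
    p_tv = band_power(eeg_window, fs, 14, 16)     # hypothetical 15 Hz icon
    return "toggle_light" if p_light > p_tv else "toggle_tv"

# Synthetic one-second window dominated by a 10 Hz component plus noise,
# standing in for a user attending to the "light" icon.
t = np.arange(FS) / FS
window = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(FS)
print(decode_command(window))  # → toggle_light
```

The comparison of band powers is one of the simplest decoding strategies; production BCIs typically rely on trained classifiers with per-user calibration.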
"We are dealing very much in possibilities here," explains Jean-Philippe Javaudin, the head of the research programme at the Orange Labs centre in Rennes. "It will be some time before we're able to install this technology for our clients, but our aim is to improve our expertise in BCI in order to better assess how we might one day be able to incorporate neural control into our products."
Which products? "Connected homes," is the answer given by Sylvain Marrec, head of the research project at the Orange Labs centre in Lannion. "More specifically, we focus on what we call multimodal interaction with appliances found in the home. Users are currently able to control these appliances using text, gestures, voice, etc. In the not too distant future, brain control might be added to this list. With this in mind, our aim is to investigate how this might be implemented and to test it in order to assess the results. Of course, BCI is not one of our core business areas. In order to acclimatise ourselves to this subject, we decided to get in touch with Inria, giving us direct access to state-of-the-art technology."
This partnership will take the form of an external research contract, running for a period of seven months. "The goal is to design a proof of concept capable of demonstrating how a home might be controlled through the use of a headset capturing brain impulses and an augmented reality interface," explains Foued Bouchnak, project leader in charge of the research contract at Orange Labs in Rennes. "In this instance, the technology used is a pair of Microsoft HoloLens glasses, which display icons in the wearer's field of vision. Users select one of these icons by focusing their attention on it, and it is then up to us to identify their choice by analysing the brain signal. For this case study, we employed two scenarios: switching a light on and off, and then switching a television on and off or moving to the next/previous channel."
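The two demo scenarios above can be sketched as a simple dispatcher. This is a hypothetical illustration, not the actual Orange/Inria integration: the icon names, command set and confidence threshold are invented, but it shows the design point that an uncertain BCI selection is better ignored than mis-executed.

```python
# Hypothetical sketch of mapping decoded icon selections to home commands.

class SmartHome:
    """Toy stand-in for a connected-home backend."""

    def __init__(self):
        self.light_on = False
        self.tv_on = False
        self.channel = 1

    def toggle_light(self):
        self.light_on = not self.light_on

    def toggle_tv(self):
        self.tv_on = not self.tv_on

    def next_channel(self):
        if self.tv_on:
            self.channel += 1

    def prev_channel(self):
        if self.tv_on:
            self.channel = max(1, self.channel - 1)

def dispatch(home, icon, confidence, threshold=0.8):
    """Run the command behind `icon` only if the decoder is confident enough."""
    if confidence < threshold:
        return False  # ignore uncertain selections rather than mis-trigger
    actions = {
        "light": home.toggle_light,
        "tv": home.toggle_tv,
        "channel_up": home.next_channel,
        "channel_down": home.prev_channel,
    }
    actions[icon]()
    return True

home = SmartHome()
dispatch(home, "light", confidence=0.92)        # light switches on
dispatch(home, "tv", confidence=0.95)           # TV switches on
dispatch(home, "channel_up", confidence=0.60)   # too uncertain: ignored
print(home.light_on, home.tv_on, home.channel)  # → True True 1
```

Gating on decoder confidence is one common way to cope with the imperfect accuracy of today's BCIs: a dropped command merely delays the user, whereas a wrong one actively misbehaves.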
This research involves not only scientists from the Hybrid team but Inria research engineers as well. "In our scenario, these research engineers are responsible for bridging the gap between OpenViBE and Home'In, Orange Labs' integrative research platform for smart homes," adds Sylvain Marrec. "As things currently stand, brain control is not yet ready to be marketed," notes Jean-Philippe Javaudin.
"First and foremost, there is the issue of maturity in terms of expected performance levels. If a BCI command is able to reach a success rate of 80%, that might be interesting, but it wouldn't be sufficient if we were looking to put this technology on the market. Then you have the ergonomic constraints to take into account: users have to wear an EEG headset and HoloLens glasses. This is an acceptable constraint in certain situations, for people with disabilities for example, but before we can take this to the wider public, there are certain technological barriers that will need to be overcome. It's really a question of time."
But how much time? Five years? Ten years? "Things can move very quickly," says Foued Bouchnak. "I firmly believe in the potential of BCI. There is an obvious logic to our company's decision to adopt this technology. Comparisons can definitely be made with the voice recognition systems that first appeared in the 2000s, which were initially limited to persons with reduced mobility. The technology was very twitchy and far from perfect: words had to be enunciated one by one, and so on. As time went on, voice assistants developed very quickly, and they are now extremely fluid. Everyone uses them - take Djingo, for example, Orange's voice assistant. I believe that BCIs are on the verge of becoming commonplace relatively soon."