New man-machine interface: mini-robots for display and interaction
At the UIST 2016 (User Interface Software and Technology) conference, held in Tokyo from 16 to 19 October 2016, Mathieu Le Goc, a PhD student with the AVIZ team at the Inria Saclay - Île-de-France centre, presented a new man-machine interface mode comprising several mini-robots that handle both display and interaction. On this occasion, he received the Best Paper Award.
The collaboration grew out of a meeting between Mathieu Le Goc and Prof. Sean Follmer (Stanford University). Both had worked on similar projects, and their work was complementary. Supervised by Pierre Dragicevic and Jean-Daniel Fekete, Mathieu spent three months in California working with Prof. Follmer and his students. Among other things, together with Lawrence Kim, a PhD student at Stanford University, he designed and built all of the robots' hardware components. "The platform is made up of mini-robots, a wireless communication card and a high-speed structured-light projector for optical tracking. Each of the 100 robots built to date measures slightly over two centimetres in diameter and is equipped with small wheels and two types of sensors: light sensors for localisation, and touch and motion sensors for interaction," Mathieu explains.
From the computer to the robots
The structured light emitted by the projector makes it possible to determine each robot's position and orientation. The robots are not autonomous; the computer coordinates every robot and their interactions. Mathieu Le Goc explains that "this new form of man-machine interaction is based on the principle of swarms", i.e. the collective movement and interaction phenomena observed in certain communities of social insects, such as ants and bees, or in moving animals such as migrating birds. Based on this model, the computer sends each robot its movement commands and its grouping mode. Structured light is the channel through which the computer sends this information to the robots.
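To make this centralised coordination concrete, here is a minimal sketch of one control tick: the computer, not the robots, computes a wheel command steering each robot towards its assigned target. The function names and the simple proportional differential-drive controller are illustrative assumptions, not the actual Zooids implementation.

```python
import math

def step_towards(position, heading, target, speed=1.0):
    """Compute wheel commands steering one robot towards its target."""
    dx = target[0] - position[0]
    dy = target[1] - position[1]
    desired = math.atan2(dy, dx)
    # Smallest signed angle between current heading and desired heading.
    turn = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    # Differential drive: advance less, and turn more, when misaligned.
    forward = speed * max(0.0, math.cos(turn))
    return (forward - 0.5 * turn, forward + 0.5 * turn)

def control_swarm(robots, targets):
    """One control tick: the computer issues a command to every robot."""
    return {rid: step_towards(*robots[rid], targets[rid])
            for rid in robots}

# Each robot is (position, heading); the computer assigns each a target.
robots = {0: ((0.0, 0.0), 0.0), 1: ((2.0, 1.0), math.pi / 2)}
targets = {0: (1.0, 1.0), 1: (0.0, 0.0)}
commands = control_swarm(robots, targets)
```

In this sketch the loop runs entirely on the computer, matching the article's description: the robots only execute the wheel speeds they receive.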
From the robots to the computer
The transmission of information is reciprocal. The localisation sensors on the robots enable the computer to track their positions and movements on screen. "The localisation systems were developed by our Stanford partners," Mathieu specifies. Physical events, for their part, are detected using the touch sensors.
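One common way such optical localisation works, sketched below under assumption (the article does not detail the encoding), is Gray-coded structured light: the projector flashes a sequence of patterns, each robot's light sensor records one bit per frame, and decoding the bit sequence identifies the projected cell the robot sits in. The 10-bit resolution and function names are hypothetical.

```python
def gray_encode(n):
    """Gray code of n: adjacent cells differ by a single bit."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Recover the cell index from its Gray code."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def bits_for_cell(cell, n_bits=10):
    """Bits a light sensor in this cell sees over n_bits frames (MSB first)."""
    g = gray_encode(cell)
    return [(g >> (n_bits - 1 - i)) & 1 for i in range(n_bits)]

def cell_from_bits(bits):
    """What the computer recovers from one robot's reported bit sequence."""
    g = 0
    for b in bits:
        g = (g << 1) | b
    return gray_decode(g)

assert cell_from_bits(bits_for_cell(427)) == 427
```

Gray codes are typically preferred over plain binary here because a sensor straddling two cells misreads at most one bit, so its decoded position is off by at most one cell.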
Unprecedented perspective and mobility
This is a new man-machine interaction mode first of all because the screen is no longer the intermediary. The platform offers another perspective: thanks to the touch sensors, it becomes possible to act by touch and thereby interact with the physical world.
Another innovation lies in how the robots are localised and in the mobility of the device. "Other platforms that operate using magnets have already been developed. Magnets, like the wheels in our project, make it possible to move the robots. Operating with structured light has the advantage of localising the robots, and this technology also makes the platform more mobile," Pierre Dragicevic explains.
"It is a real step forward in display and visualisation techniques," explains Jean-Daniel Fekete, head of the AVIZ team. We still do not know what type of application will popularise this technology, which links the real world to the virtual world, but "in terms of display, we can use the robots like pixels, for a physical representation," Mathieu Le Goc concludes.