Leap Motion: the ideal mode of interaction in the operating room
With its retail release planned for 22 July, the eponymous technology created by Leap Motion is set to make waves as a video game interface and computer controller. The accurate detection of each finger means 3D virtual objects can be manipulated remotely. This technology could be highly beneficial in specific environments. The Shacra team researchers have been experimenting with the device for several months in order to integrate it into their medical simulation platform (SOFA) and use it in their interventional radiology applications.
A gadget? Not only that
The Leap Motion controller is a small object, similar to a USB flash drive, which enables touchless human-computer interaction. Equipped with VGA cameras and sensors, the small device detects finger or pen movements which in turn control the computer: selecting keys, zooming, changing views, drawing or playing video games. Is it a new gadget? Of course, but the technology could be valuable for applications requiring interaction with computers in specific environments. In operating rooms, for example, it is out of the question for a surgeon to tap away at a keyboard, both to maintain sterility and because space is limited. In interventional radiology, in particular when surgery is performed under MRI, it is also very difficult to introduce electronic devices because of the magnetic field generated by the scanner. This is why the clinician gives verbal instructions to assistants standing outside the room in order to consult the pre-surgical images needed to plan the procedure. This is a laborious working method likely to lead to misunderstandings, frustration and wasted time. Opting for motion-based interaction would therefore make it possible to carry out these manipulations in situ, saving precious time.
Developing a specific movement vocabulary...
Stéphane Cotin, head of the Shacra team, develops assistive technologies for surgical procedures and describes the context: "Our team took advantage of the fact that Leap Motion was in beta testing to explore the device's potential for the design of interfaces suited to operating room conditions." The researchers were attracted by the hands-free computer interaction, combined with the ability to accurately detect the position of the hands in the room. "Although Leap Motion has difficulty identifying fingers in certain hand positions and the robustness of the tracking must be improved, its accuracy is still very high," said Stéphane Cotin. This accuracy makes it possible to develop a simple, specific and intuitive movement vocabulary suited to operating conditions, in particular one requiring only one hand, since the other hand is often holding an instrument. These movements can also be more natural, allowing the clinician greater freedom of movement. Furthermore, visual feedback can be added to guide users in their movements by indicating which action will be triggered; users can thereby focus on the effect of a movement rather than on the movement itself. "Being able to perform the necessary commands with simple movements that have a semantic interpretation specific to the application is a real challenge for which there is currently no commercial solution," specifies Myriam Lekkal, who has been working on this application since April.
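A one-handed movement vocabulary of this kind can be sketched as a small classifier that maps a hand pose to a command, returning nothing while the hand is still so the interface can preview the action as visual feedback. This is a minimal illustration only: the tracker interface, finger counts, command names and thresholds are assumptions for the example, not the Shacra team's actual design.

```python
# Sketch of a one-handed gesture vocabulary, assuming a hypothetical tracker
# that reports, per frame, the number of extended fingers and the palm
# velocity vector (vx, vy, vz) in mm/s. All names and thresholds are
# illustrative.

def classify_gesture(extended_fingers, palm_velocity, speed_threshold=80.0):
    """Map a single-hand pose to an application command.

    Returns the command name, or None when no gesture is recognised,
    so the UI can display visual feedback *before* any action fires.
    """
    vx, vy, vz = palm_velocity
    speed = (vx ** 2 + vy ** 2 + vz ** 2) ** 0.5
    if speed < speed_threshold:
        return None  # hand roughly still: show feedback only, trigger nothing
    if extended_fingers == 1:
        return "select"  # one finger: pointing / selection
    if extended_fingers == 2:
        # two fingers: view control, axis of motion picks the command
        return "zoom" if abs(vz) > abs(vx) else "pan"
    if extended_fingers == 5:
        return "change_view"  # open-hand swipe: next image
    return None  # unmapped pose

# At each frame the UI can preview the command that *would* fire:
print(classify_gesture(2, (10.0, 5.0, 120.0)))  # zoom
print(classify_gesture(5, (150.0, 0.0, 0.0)))   # change_view
print(classify_gesture(1, (2.0, 1.0, 0.0)))     # None (below threshold)
```

Keeping the vocabulary this small is what makes it learnable in an operating room: each command is tied to one pose, and the `None` state is what drives the on-screen feedback described above.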
... And modes for manipulating 3D objects
The team is also exploring another application of Leap Motion technology. Rémi Bessard Duparc is working on a gesture interface that will enable surgeons to manipulate 3D simulations or images, grabbing virtual objects and moving them around. Thanks to this interface, practitioners can change the position of the electrodes intended to destroy a tumour by applying deep cold (cryoablation) directly on the 3D model of the patient's organ. They can check the impact of this change on the tumour's destruction through simulation and adapt their operating strategy if necessary (pre-operative planning). To carry out these complex movements on 3D representations, the Shacra researchers have combined the recognition of a grabbing movement, performed to virtually seize the electrode, with the tracking of the hand movement that determines the overall motion of the needle. "To overcome the difficulty of identifying each finger in certain hand positions, we have opted for movements using two fingers only, and we use the very accurate and robust detection of the palm position to determine the needle's position," emphasizes Rémi Bessard Duparc.
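The combination described in the quote, a two-finger grab to seize the electrode plus palm tracking to move it, can be sketched as follows. The tracker interface (fingertip and palm positions as millimetre coordinates) and the grab threshold are assumptions made for this illustration.

```python
# Sketch of the grab-and-move idea: a pinch of two fingertips "grabs" the
# virtual needle, which then follows the palm's displacement until release.
# Positions are hypothetical (x, y, z) tuples in millimetres.

def distance(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

class VirtualNeedle:
    """A virtual electrode whose position follows the palm while grabbed."""

    def __init__(self, position, grab_distance=25.0):
        self.position = position          # needle tip, world coordinates (mm)
        self.grab_distance = grab_distance
        self._anchor = None               # palm position when grab started

    def update(self, thumb_tip, index_tip, palm):
        pinching = distance(thumb_tip, index_tip) < self.grab_distance
        if pinching and self._anchor is None:
            self._anchor = palm           # grab: remember where the palm was
        elif not pinching:
            self._anchor = None           # release: needle stays put
        if self._anchor is not None:
            # Move the needle by the palm displacement since the last frame.
            delta = tuple(p - a for p, a in zip(palm, self._anchor))
            self.position = tuple(n + d for n, d in zip(self.position, delta))
            self._anchor = palm           # re-anchor for incremental motion

needle = VirtualNeedle(position=(0.0, 0.0, 0.0))
needle.update((0, 0, 0), (10, 0, 0), palm=(100, 50, 0))  # pinch starts
needle.update((0, 0, 0), (10, 0, 0), palm=(110, 60, 0))  # palm moves +10, +10
print(needle.position)  # (10.0, 10.0, 0.0)
```

Relying on the pinch only as an on/off trigger, while the robustly tracked palm carries the motion, mirrors the design choice in the quote: it sidesteps the device's weakness at identifying individual fingers in awkward hand positions.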
From August, the researchers will carry out the essential work of studying the practical benefit of this interface for clinicians and radiographers, and of ensuring that it achieves a satisfactory compromise between robustness and precision. "The most important factor is to be able to improve the speed and accuracy of the current technique," concludes Stéphane Cotin, "and we are fairly confident about that point!"
Improving user interfaces, and maybe more!
"Each device is better suited to certain tasks than others. They are not competing, but complementary," states Thomas Pietrzak, a researcher in the Mint project team who specialises in gestural interaction. "The strength of Leap Motion lies in accurately detecting 3D movements in a restricted space. It is perfectly suited to interacting in specific environments, such as operating rooms, but probably not to typing text while seated at a desk." As such, the researcher is primarily interested in the benefits to be gained from combining existing technologies. For example, touchscreen users can only touch (activate) a function or not touch it. Combining tactile interaction with Leap Motion makes it possible to add a hovering state, i.e. to take into account movements slightly above the screen; this is referred to as 2.5D. Hovering over a feature would display the corresponding tooltip, indicating its purpose to the user. This would greatly improve the ergonomics of tablets and mobile phones.
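The 2.5D idea boils down to splitting the space above the screen into bands: contact activates, a near hover shows the tooltip, anything higher is ignored. The height thresholds and state names below are assumptions chosen for the example.

```python
# Illustrative sketch of "2.5D" interaction: a touchscreen augmented with
# above-screen tracking so an element can react to hovering, not just touch.
# Thresholds are hypothetical values in millimetres above the screen.

TOUCH_HEIGHT = 2.0    # at or below this: treat as a touch
HOVER_HEIGHT = 40.0   # at or below this: close enough to show a tooltip

def interaction_state(finger_height_mm):
    """Classify a fingertip's height above the screen."""
    if finger_height_mm <= TOUCH_HEIGHT:
        return "touch"   # activate the control, as on a plain touchscreen
    if finger_height_mm <= HOVER_HEIGHT:
        return "hover"   # new intermediate state: display the tooltip
    return "idle"        # too far away: no feedback

print(interaction_state(1.0))    # touch
print(interaction_state(15.0))   # hover
print(interaction_state(120.0))  # idle
```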
Thomas Pietrzak is also considering combining keyboards with Leap Motion to improve the ergonomics of keyboard shortcuts in text-processing software. Keyboard shortcuts are used to enter commands (copy/paste, formatting, etc.) without a mouse, by combining letter keys with other keys, in particular the Control key, which switches the keyboard into a command mode. These shortcuts are often neglected by users, since they are hard to remember and sometimes difficult to perform, given the number of keys that must be pressed at the same time. "In collaboration with colleagues from the Max Planck Institute and the University of Toronto, we are studying the possibility of simplifying these shortcuts, for example by changing the mode when the user touches the mouse." To this end, the first step is to study, using Leap Motion, how users currently handle the keyboard and the mouse, and then to rely on the information collected to improve the interaction.
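The mode change mentioned in the quote can be sketched very simply: when hand tracking reports that one hand rests on the mouse, plain letter keys act as commands, as if Control were held, so the user no longer has to chord several keys. The key bindings and function names here are hypothetical, not the study's actual design.

```python
# Sketch of shortcut simplification via an implicit mode switch: a tracked
# hand resting on the mouse turns letter keys into commands. Bindings are
# illustrative only.

COMMANDS = {"c": "copy", "v": "paste", "b": "bold"}

def interpret_key(key, hand_on_mouse):
    """Return a command when the other hand is on the mouse,
    otherwise treat the key as ordinary text input."""
    if hand_on_mouse and key in COMMANDS:
        return ("command", COMMANDS[key])
    return ("text", key)

print(interpret_key("c", hand_on_mouse=True))   # ('command', 'copy')
print(interpret_key("c", hand_on_mouse=False))  # ('text', 'c')
```

The tracker's role in the study would come first, as the text notes: observing where the hands actually are before deciding which implicit mode switches are worth building.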
Could other fields also benefit from this new technology? "Just like Kinect, which owes its success mainly to its alternative uses, I think Leap Motion will be most useful for its unplanned functions," emphasizes the researcher. "I look forward to discovering such uses!"
Medicine is clearly a field of application that could benefit from the Leap Motion device. It should also enable the development of psychomotor rehabilitation tools to help people who have suffered brain trauma re-learn basic movements. Such applications are developed with Kinect for movements engaging the whole body; Leap Motion would be better suited to the rehabilitation of fine motor skills: dexterity, prehension and the manipulation of objects.
Find out more
"Medical Image control using LeapMotion"