
Jean-Michel Prima - 25/06/2013

Bringing Visual Servoing to Medical Robotics

First introduced 20 years ago, medical robots have ushered in a new era of minimally invasive surgery. Inria researcher Alexandre Krupa studies how visual servoing could yield even more accurate control of these machines. His latest work leverages the dense information contained in ultrasound images to control a robotized ultrasound probe, enabling automatic probe positioning and real-time intraoperative compensation of moving soft tissues.

Non-invasiveness and pinpoint accuracy are the twin benefits of robotic surgery. But when it comes to control, most of these robots still rely on a telemanipulator: a device that essentially duplicates the surgeon's gestures and allows him to wield the real surgical instrument within the patient's body. Might visual servoing enhance the capabilities of these machines and ultimately improve the surgeon's dexterity? Pretty much so, believes Alexandre Krupa, a French researcher credited with a fair share of spadework in the field.

After first resorting to an optical modality for controlling a surgical instrument [see box], the scientist turned to live ultrasound imaging. “This modality is less invasive” and fairly inexpensive. In addition, “contrary to MRI, it allows real-time viewing. And it does not expose the patient to radiation as CT scanners do. For all those reasons, ultrasound-based visual servoing could be exploited in a host of clinical applications” such as biopsy or cryoablation. “Inserting a needle right in the middle of a tumor is very complex to achieve, since surgeons usually don't see what lies behind the observation plane provided by a conventional 2D ultrasound probe.”

Tele-echography

Another promising field of application is tele-echography. Inria partook in Prosit, a research project whose aim was to build a portable robot enabling radiologists to carry out ultrasound exams on patients located at a distance. The machine is placed by a non-specialist on the patient's body and then controlled by the radiologist through a haptic dummy probe. Thanks to visual servoing, “the robotized probe can be moved automatically to maintain an optimal view of moving soft tissues.” A mixed vision/teleoperation control mode can be activated on demand to automatically keep the organ of interest in the ultrasound plane while the expert performs his exploration. The radiologist may also record anatomical sections of interest and, later on, ask the machine to automatically reposition the probe so as to retrieve the same section.

Speckle Is Not a Noise

Having said that, on the minus side of the ledger, ultrasound delivers a rather poor-quality image whose grayish, impressionistic style calls for smart detection algorithms. “My first approaches were based on geometrical visual information.” In essence, these methods boil down to detecting visual points, segmenting contours and computing geometrical features that are used as input to the robot's visual controller. “It works fine except when the observed organ section features complex shapes with boundaries that are hard to detect. Then it's difficult to robustly extract the pertinent geometrical information needed to control the robot. During a sabbatical at Johns Hopkins University, in Baltimore, in 2006, I initiated a new class of approaches that exploit tissue speckle information.” Contrary to popular belief, “speckle is not noise. It results from the reflection of ultrasound by very small cells contained in soft tissue. Therefore it is spatially coherent.” This property can thus be exploited to compute the relative pose between the probe and the target region of the organ.
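The pipeline described here, visual features in, robot velocity out, corresponds to the classic image-based visual servoing law v = -λ L⁺(s - s*). The sketch below illustrates that textbook law only, not Krupa's actual controller; the feature values and the interaction matrix are hypothetical placeholders.

```python
import numpy as np

def ibvs_velocity(s, s_star, L, lam=0.5):
    """Classic image-based visual servoing law: v = -lam * pinv(L) @ (s - s*).
    s, s_star : current and desired visual feature vectors
    L         : interaction matrix (image Jacobian) linking feature
                velocities to the 6-DOF probe/instrument velocity."""
    error = s - s_star
    return -lam * np.linalg.pinv(L) @ error

# Hypothetical example: 4 scalar features controlling a 6-DOF probe
s = np.array([0.12, -0.03, 0.40, 0.08])   # measured features
s_star = np.zeros(4)                      # desired features
L = np.eye(4, 6)                          # placeholder interaction matrix
v = ibvs_velocity(s, s_star, L)           # 6-DOF velocity command
```

The gain λ trades convergence speed against stability; when the features reach their desired values, the commanded velocity vanishes.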
Such a speckle-based approach is good at revealing otherwise hard-to-detect structures. It was pursued in USComp, a Krupa-led research project whose aim was to allow real-time compensation of soft-tissue motion during ultrasound imaging. “We have also extended this method to target tracking in a sequence of 3D echocardiographic volumes, in collaboration with Children's Hospital Boston and Harvard Medical School. We provided a solution based on dense ultrasound visual servoing to detect the tip of a novel tool that was invented there to enable the surgical removal of tissue from inside the beating heart. This instrument is integrated with a steerable curved concentric-tube robot that can enter the heart through the vasculature.” This combination of cutting-edge technologies has been validated through in vivo porcine experiments.
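Speckle's spatial coherence is exactly what makes such intensity-based tracking possible: a speckle patch can be relocated from one frame to the next by maximizing a similarity score such as normalized cross-correlation. A toy sketch under that assumption, using synthetic speckle and exhaustive search (the project's actual dense servoing method is more elaborate):

```python
import numpy as np

def ncc(patch, candidate):
    """Zero-normalized cross-correlation between two image patches."""
    a = patch - patch.mean()
    b = candidate - candidate.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_patch(frame, template, center, search=5):
    """Relocate `template` by exhaustive NCC search in a
    (2*search+1)^2 window around its previous top-left corner."""
    h, w = template.shape
    cy, cx = center
    best, best_pos = -np.inf, center
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = cy + dy, cx + dx
            cand = frame[y:y + h, x:x + w]
            if cand.shape != template.shape:
                continue  # window fell outside the image
            score = ncc(template, cand)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best

# Synthetic speckle image; the "tissue" then shifts by (+3, -2) pixels
rng = np.random.default_rng(1)
speckle = rng.random((64, 64))
template = speckle[20:36, 20:36].copy()          # patch to track
frame = np.roll(speckle, (3, -2), axis=(0, 1))   # displaced frame
pos, score = track_patch(frame, template, (20, 20))
```

Because the speckle pattern is spatially coherent rather than random frame to frame, the correlation peak recovers the patch's new position, here (23, 18).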

Multimodal Image Registration

Another task at which density-based approaches excel is image registration. This opens the door to yet another host of medical applications, namely the overlay of stored preoperative data onto intraoperative images. The surgeon could retrieve an area of interest that he had identified previously and superimpose it on the current view. Going even further, the scene could be enriched with information stemming from other imaging modalities. First experiments with such multimodal registration are already underway.
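Multimodal registration cannot simply compare pixel intensities, since the same tissue appears with different brightness in each modality; a standard workaround is to maximize a statistical similarity measure such as mutual information. A minimal sketch of that general idea (exhaustive integer-translation search over synthetic data; not the specific method used in these experiments):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two images, estimated from their
    joint intensity histogram. Unlike squared differences, it does
    not assume the modalities share an intensity mapping."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()                     # joint distribution
    px = pxy.sum(axis=1, keepdims=True)         # marginal of a
    py = pxy.sum(axis=0, keepdims=True)         # marginal of b
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def register_translation(fixed, moving, search=4):
    """Find the integer translation of `moving` that maximizes
    mutual information with `fixed` (brute-force search)."""
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(moving, (dy, dx), axis=(0, 1))
            mi = mutual_information(fixed, shifted)
            if mi > best:
                best, best_shift = mi, (dy, dx)
    return best_shift, best

# "Modality B" simulated as a nonlinear remap of "modality A",
# displaced by (2, -3) pixels
rng = np.random.default_rng(2)
fixed = rng.random((48, 48))
moving = np.roll(np.exp(-fixed), (2, -3), axis=(0, 1))
shift, _ = register_translation(fixed, moving)   # recovers (-2, 3)
```

Even though the two images share no common intensity scale, the mutual-information peak recovers the displacement, which is what allows a preoperative MRI or CT overlay to be aligned with a live ultrasound view.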

Instrument Positioning

Circa 2000, the genesis of Krupa's approach was a vision-based system that could automatically position a surgical instrument during laparoscopic surgery. “At the beginning of such an operation, when the instrument starts entering the abdomen through a trocar at an incision point, it does not appear in the endoscope's field of view. It might also accidentally go off screen later on, during a surgical gesture. At these critical junctures, there is a risk of undesirable contact with an organ. So we mounted a laser-pointing device on the tip of the instrument. It projects a light dot onto the surface of the organ. This optical marker is visible in the endoscopic video feed even if the instrument is still off screen. It allows the instrument to be localized with respect to the scene. The robot is then controlled by visual servoing to bring the instrument into the center of the endoscopic field of view.”
A desired distance to the surface of the targeted organ can also be specified. From there germinated the next idea: “compensate for the patient's motions. For instance, automatically synchronize not only the instrument but also the camera with the heartbeat during cardiac surgery.” Providing such a stabilized mode would enable the surgeon to perform gestures with better accuracy.
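The centering behaviour described above can be caricatured as a simple proportional law: detect the projected dot in the endoscopic image, measure its pixel offset from the image center, and command a velocity proportional to that error. The sketch below is a simplified stand-in (brightest-pixel detection, hypothetical gain), not the original system:

```python
import numpy as np

def laser_dot_centering_velocity(image, gain=0.3):
    """Locate the brightest pixel (taken here as the laser dot) and
    return a proportional 2-DOF velocity command driving it toward
    the image center."""
    dot = np.unravel_index(np.argmax(image), image.shape)   # (row, col)
    center = (image.shape[0] / 2.0, image.shape[1] / 2.0)
    error = np.array([dot[0] - center[0], dot[1] - center[1]])  # pixels
    return -gain * error   # lateral velocity command (arbitrary units)

# Toy endoscopic frame: dark image with one bright dot at (10, 50)
img = np.zeros((64, 64))
img[10, 50] = 1.0
v = laser_dot_centering_velocity(img)
```

The command points from the dot toward the image center and shrinks to zero as the dot reaches it; a real system would add robust dot detection and the depth regulation mentioned in the text.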
