Fantastic voyage

26 October 2018



Tiny robots travelling around inside the human body with a degree of autonomy, capturing images and helping with diagnosis; like so much of modern medicine, what was fantastical science fiction only a few decades ago is now a fast-approaching reality. Tim Gunn talks to mechanical engineer Dr Mark Rentschler, from the University of Colorado Boulder, about his work on robotic endoscopies.


For all the hype around self-driving vehicles, engineers haven’t quite worked out how they should deal with humans. When it comes to identifying other road users and accounting for the unpredictable, illogical things they might do, people are still much better than AI. The road stays in one place, but humans complicate matters. Oddly, it may be travelling through the human body that gives autonomous vehicles their chance.

Dr Mark Rentschler of the University of Colorado Boulder (CUB) is a mechanical engineer who develops capsule robots for robotic endoscopies, known as robotic endoscope platforms (REPs).

“Driving a robot inside the body is nothing like driving a car on the street,” he explains. “You’re in a deformable environment, so everything is sort of squishy, and at any instant, what was there a minute ago could be somewhere else. You’re driving up and down steep hills; if you’re in the gastrointestinal tract, you’re driving in a cave; there are no highway markers, no GPS, no other vehicles – no points of reference.

“So we’ve realised over the past few years that we need to give autonomy, at least in some capacity, to the robot, so physicians can focus on what they’re good at: diagnosis and treatment, instead of just trying to get the robot to where it is needed.”

Rentschler and his CUB computer science collaborator Christoffer Heckman’s ultimate aim is to introduce a reliable mobility system and real-time navigation to capsule endoscopes, thus transforming them “from observational devices into active surgical tools, offering biopsy and therapeutic capabilities, and even autonomous navigation in a single, minimally invasive device”.

This requires far more than precision motion control, as the tissue with which endoscopes interact is not only delicate, but viscoelastic, adhesive, liquid-coated and deformable. Moreover, this tissue creates a challenging, sparse and dynamic environment that changes subtly with each breath the patient takes and radically between different inspections.

Sensor fusion

To confront the challenges of navigating inside the body, the REP prototype combines inputs from the traditional endoscopic camera with an inertial measurement unit, location-tracking systems and a magnetometer, among other sensors, to gain an accurate working understanding of its surroundings. In this way, the platform as a whole is not befuddled by the way its camera readings change as it traverses the sharp rises and dips of the haustral folds. Meanwhile, the encoder on the motor allows the device to monitor its progress against the peristaltic motion of the gastrointestinal tract, which changes the environment to such an extent that it might invalidate past visual data.


Rentschler can’t avoid using the relevant buzzword when it comes to this technology: it’s “sensor fusion”. For him, bringing together all of the REP’s inputs with specific code is the key to its precise closed-loop motion control – its ability to correct for changes in its environment. He explains the approach as “using visual data primarily, with additional sensor-data points for ‘checks’ along the way to correct position estimates from only visual feedback”.
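The idea of correcting a drifting visual estimate with occasional absolute “checks” can be illustrated with a very simple blend. This is a minimal sketch of the general fusion principle, not the team’s code: the function name, the one-dimensional state and the blend weight are all illustrative assumptions.

```python
# Sketch of the fusion idea described above: dead-reckon from visual
# odometry between checks, then pull the estimate toward an absolute
# tracker reading (e.g. magnetic tracking) whenever one arrives.
# All names and the 'trust' weight are illustrative, not the REP's API.

def fuse(position, visual_delta, tracker_reading=None, trust=0.8):
    """Advance a 1-D position estimate by the visual-odometry delta,
    then blend in an occasional absolute tracker reading."""
    position += visual_delta          # dead-reckon from camera motion
    if tracker_reading is not None:   # a 'check' point along the way
        # weighted blend: a high 'trust' pulls us toward the tracker
        position = (1 - trust) * position + trust * tracker_reading
    return position

# Visual estimates drift; a tracker check at the third step corrects them
pos = 0.0
for delta, check in [(1.0, None), (1.1, None), (0.9, 3.0)]:
    pos = fuse(pos, delta, check)
```

A production system would use a proper estimator (a Kalman filter, for instance) rather than a fixed blend weight, but the closed-loop shape is the same: predict from relative motion, correct against ground truth.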

So far, the team has had success using magnetic tracking as the ground truth the REP can reference back to, but, in order to shrink the device as much as possible, they are working on an approach that allows all processing functions to be done over a Wi-Fi connection.

“Not all the computational power needs to be on the robot,” notes Rentschler. “We can leverage the power of phones to get the data and feed it right back to a physician.”

With no break in his enthusiasm, Rentschler continues, “We’re also beginning to incorporate off-the-shelf elements and self-driving-car technology to not only tell the robot where to go, but also create a map of where it has already been, so if we want to go back to a certain spot, we can. The difficulty is that we have to be able to track those positions as they deform in time.” That challenge granted, Rentschler is particularly excited by the fact that his research has potential beyond the REP. Applied to more traditional endoscopes, for instance, this technology “would allow doctors to focus on looking for polyps and diseased tissue and actually treating it, rather than trying to keep track of where they are and where the last lesion was”.
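The “map of where it has already been” amounts to logging estimated poses so a site of interest can be revisited. The sketch below shows only that bookkeeping idea; tracking how logged positions deform over time is the hard part Rentschler names, and is not addressed here. The class and method names are assumptions for illustration.

```python
# Illustrative sketch of logging visited positions with timestamps so
# a labelled site (e.g. a lesion) can be navigated back to later.
# Deformation over time is deliberately ignored; each entry is simply
# tagged with when it was recorded. Not the REP's actual interface.

class VisitedMap:
    def __init__(self):
        self.waypoints = []  # list of (time, position, label) tuples

    def log(self, t, position, label=None):
        """Record an estimated pose, optionally labelled."""
        self.waypoints.append((t, position, label))

    def find(self, label):
        """Return the most recently logged position for a labelled site."""
        for t, pos, lab in reversed(self.waypoints):
            if lab == label:
                return pos
        return None

journey = VisitedMap()
journey.log(0.0, (12.0, 4.0))
journey.log(5.2, (30.5, 9.1), label="polyp")
target = journey.find("polyp")  # position to steer back towards
```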

But it’s not merely that location tracking can help surgeons focus. In bench-top tests using their modular endoscopy simulation apparatus (MESA), Rentschler’s team have found that human operators are consistently outperformed by “fairly rudimentary tracking systems”. The on-board microcontroller can make decisions faster than a remote human operator, who is unlikely to be able to consider all of the feedback simultaneously. This was a key realisation for Rentschler in the way he thinks about his work as a device developer and mechanical engineer, but in hindsight, it’s almost obvious – even fighter pilots rely on computers to do their jobs.

Intestinal fortitude

Moving a long tube through the colon is an inexact science. Unsurprisingly, this is the main reason for patient discomfort during colonoscopies. The most common side effect, responsible for as much as 90% of pain, is looping: when the tip of the endoscope stops advancing and the sides of the tube press against the walls of the colon, forcing it out of shape. This image alone is enough to show why only 56% of Americans over 50 have colonoscopies as regularly as they should. Swallowable capsule endoscopes were designed to address the problems of discomfort and avoidance, but their motion is entirely passive, meaning they are less accurate and cannot interact with the tissue like a traditional endoscope.

In this context, Rentschler and Heckman’s most recent paper makes a stark case for the potential of capsule robots to save lives: “Although colorectal cancer is treatable with a relative five-year survival rate of 90% if detected in the early and localised stage, it remains the third leading cause of cancer deaths in the US for men and women.”

Although it is currently too large to be swallowed, the team’s REP fits neatly between the extremes of possibly painful manual control and possibly inaccurate passive motion. Moving on microtextured tank tracks, the latest prototype looks something like a homemade lunar module that can keep moving while it’s upside down. It also has all the capabilities of a conventional endoscope: suction, irrigation, biopsy, a camera and lighting. Remarkably, it would be impossible to feel an REP moving through the body, as the GI tract only transmits pain signals when it becomes distended (by looping, for instance).

Screening versus diagnosis

As he’s married to an endocrinologist, Rentschler quickly learned how to pitch his work to doctors. He’s not Geoffrey Hinton, a prominent psychologist and computer scientist at Google, who told radiologists that AI image analysis has made them “the coyote that’s already over the edge of the cliff”. Nor does his device placate doctors by focusing on remote control and, as Wired put it, “making them feel like Iron Man”. Instead, Rentschler says he’s “trying to add value so doctors can improve what they do with their patients”. It’s his belief that “a lot of physicians aren’t interested in screening patients so much as figuring out the treatment. There’s a lot of repetitive grunt work before you get to that point, and that’s what robotics is good at.”

Over the next year, Rentschler and Heckman plan to start a company and begin raising the capital necessary to refine and miniaturise their REP. The latest prototype is already a tiny 105×66×66mm, and versions have been tested on animals, as well as in the team’s MESA.

As Rentschler puts it, “There are plenty of software things we need to optimise, but that’s more of a time issue than a money issue. I think shrinking the hardware is simply a money issue. If we were to make custom motors we could probably shrink the device by at least 50%, not quite at the point you could swallow it, but close.”


As anyone with a colonoscopy ahead will attest, the REP is an attractive prospect. Robotic capsule endoscopes can minimise patient discomfort while ensuring that physicians are working as efficiently and effectively as possible. The way they cut out “grunt work” reflects some of Rentschler’s own concerns as an academic and a developer of medical devices. “I can write papers and patents and train students – and I like all of that – but I really want to get to the patient. I want to try to make a positive impact for the patient.”

With self-driving car technology stalling, his commitment raises an interesting question: will we have autonomous vehicles inside us before we start climbing into them?

An REP capable of travelling through the body unnoticed, with all the capabilities of a conventional endoscope.

