How I Became a Robot in London—From 5,000 Miles Away
I am but a babe, exploring the world for the first time. Wearing a computerized glove, I reach forward in pursuit of a little toy basketball. A robotic arm and hand do the same, mimicking my every move. Slowly I grasp the object, lift it, swing my arm over, and let go, dropping the ball—ploink!—into a plastic cup.
I am very, very proud of myself. Applause erupts from the computer in front of me. But this is no American applause here in San Francisco; this is British applause. The robotic hand and ball are actually in London—I’ve just bossed around some hardware clear across the Atlantic.
My tool in this feat was a Shadow Hand, perhaps the most complex robotic hand on Earth. On each of its fingertips is a sensor that allows the robot to feel, a sensation that’s piped across the world into my haptic glove. If I merely brush the Shadow Hand against a ball, I get a subtle sensation. When I grip the ball, the sensation grows more intense. Amazingly, there’s very little latency between my movement and the robot’s, even though the system is running through a 4G phone sitting on the table beside me.
With the glove on my hand, I am both there in London and not there. I can feel the ball, but also not feel it, because what I’m getting is a reproduction of sensation. The gentle proddings are kind of like having a bunch of pixies dancing on each fingertip.
Welcome to the eerie and improbable frontier of “telerobotics”: the act of piloting machines from afar. Surgery and bomb-disposal robots already have simple haptics for operators—mostly to telegraph collisions—but they pale in comparison to this rich, elaborate sense of robotic touch.
This new system combines components from three groups, each with its own research specialty: HaptX designed the haptic glove, Shadow Robot Company in England built the robotic hand, and SynTouch supplied the sensing fingertips. The project is funded by ANA Holdings, the parent company of All Nippon Airways. (They’re in the business of connecting people, after all. But this is certainly a nontraditional approach.)
First, that Shadow Hand. It looks a bit like the Terminator’s hand when he rips off his skin, only less metallic. It’s meant to replicate the major movements of the human hand, and it does so with hypnotic accuracy. “But there are some subtle details—how the palm curls, the way the base of the thumb moves around, the way skin covers joints—that we haven’t yet managed to get into the robot design,” says Rich Walker, managing director of Shadow Robot Company. “One of the really interesting benefits of this sort of project is we can see what is necessary to be able to do things, and what is just nice to have.”
On each of the hand’s fingertips is a dome dotted with 24 electrodes, topped with a skin of silicone. SynTouch injects saline to create a kind of sea between the skin and the electrodes. Put pressure on the fingertip and the electrodes detect the change in resistance in the saline, giving the hand the power to sense touch in fine detail.
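The principle is simple enough to sketch in a few lines: deforming the saline layer shifts the resistance each electrode sees, and the deviation from a resting baseline stands in for contact pressure. This is only an illustrative toy, not SynTouch’s actual firmware; the electrode count matches the article, but the baseline values and the calibration constant are invented for the example.

```python
# Toy model of a resistive fingertip sensor: 24 electrodes under a
# saline-filled silicone skin. All numeric constants are assumptions.

BASELINE_OHMS = [1000.0] * 24   # resistance of each electrode at rest (assumed)
OHMS_PER_UNIT_PRESSURE = 0.5    # assumed linear calibration factor

def estimate_pressure(readings):
    """Map 24 electrode resistance readings (ohms) to a rough pressure value.

    Pressing the fingertip deforms the saline layer, changing the resistance
    at each electrode; averaging the deviations from baseline gives a crude
    overall contact-pressure estimate.
    """
    deviations = [abs(r - b) for r, b in zip(readings, BASELINE_OHMS)]
    return sum(deviations) / OHMS_PER_UNIT_PRESSURE / len(deviations)

# No contact: every reading sits at baseline, so estimated pressure is zero.
at_rest = estimate_pressure([1000.0] * 24)

# A light touch near one electrode shifts only a few readings.
touch = [1000.0] * 24
touch[3] = 1012.0
light = estimate_pressure(touch)
```

Because the electrodes are spread across the dome, a real sensor can also localize where on the fingertip the contact happened; this sketch collapses everything to a single scalar.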
When I pull on the glove, I’m thrust into a perspective that’s disorienting at first, made up of two camera feeds on side-by-side screens. One of them is zoomed out and pointed at the arm; the other is sitting on the table and looking closely at the objects I’m manipulating. That’s not how you or I see the world—we’re used to looking directly down at our hands. But you get used to watching the zoomed-out camera to maneuver the hand right up to an object, then switching to the table-level camera when you’ve almost made contact.
Once you get the hang of the perspective, it truly feels like you’ve reached your arm across the Atlantic, go go gadget style. “A light brush will trigger a partial inflation of our actuators, lightly displacing the skin on the user’s fingertips,” says Michael Eichermueller, director of R&D and the lead on HaptX’s telerobotics project. “A full squeeze of a ball will trigger a full inflation and activate our force-feedback exoskeleton, simultaneously pressing the skin and restricting finger motion around the edges of the ball.” That restriction replicates the feeling of holding a solid object, when in reality there’s nothing in my hand.
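The mapping Eichermueller describes can be sketched as a simple per-fingertip control rule: sensed force scales actuator inflation continuously, and above some threshold the force-feedback exoskeleton kicks in to resist finger motion. This is a hypothetical sketch of that idea, not HaptX’s control code; the function name, force range, and threshold are all assumptions.

```python
# Hypothetical per-fingertip control rule for a haptic glove:
# sensed force -> (actuator inflation, exoskeleton engagement).
# The 10 N full-squeeze force and 4 N threshold are invented values.

MAX_FORCE_N = 10.0       # assumed force of a full squeeze
EXO_THRESHOLD_N = 4.0    # assumed force above which the exoskeleton engages

def glove_command(sensed_force_n):
    """Return (inflation_fraction, exoskeleton_engaged) for one fingertip."""
    # A light brush yields partial inflation; a full squeeze saturates at 1.0.
    inflation = min(sensed_force_n / MAX_FORCE_N, 1.0)
    # A firm grip also engages the exoskeleton, restricting finger motion.
    engage_exo = sensed_force_n >= EXO_THRESHOLD_N
    return inflation, engage_exo
```

A brush at 1 N would give 10 percent inflation with the exoskeleton off; a 10 N squeeze gives full inflation with the exoskeleton engaged, which is what makes empty air feel like a solid ball.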
This is not haptics as you and I are accustomed to. Haptic vibration is all well and good for phones and game controllers, but those devices aren’t trying to reproduce how an object feels. They’re just communicating text messages or, in the case of video games, nearby explosions.
Replicating a sense of touch demands more subtlety: You drag your fingers across a surface to divine its texture, or squeeze an object to determine its softness. “Humans subconsciously use many subtle cues, such as the pressure or force an object applies to the skin, to manipulate objects and perform dexterous tasks,” says Jake Rubin, founder and CEO of HaptX.
The technology is in its early days, but it dovetails with one of the greatest promises of advanced robotics: keeping humans out of dangerous situations. While the Shadow Hand isn’t a perfect analog for the miracle that is the human hand, it’s pretty impressive, and robotic hands will only develop defter manipulation skills from here. So there might come a day where we can send highly dexterous robots into sticky situations by remotely piloting them as physical avatars of ourselves, instead of trusting them to find their own way.
“Robots without touch are forced to either work in environments where everything is in a known position with known properties, or are forced to move very slowly so problems can be detected before they are too severe,” says Jeremy Fishel, co-founder and CTO of SynTouch. “The sense of touch solves this.”
Here’s where things get even stranger: Should a robot telegraph pain? Pain, after all, keeps us from doing stupid things with our bodies. If you’re operating a very expensive robot, you’ll probably also want it to tell you if you’re pushing it to the point of injury. Researchers are already exploring this with prostheses: first figuring out how a robot, a non-biological entity incapable of actually feeling, can register “pain,” and then how to communicate that signal to an amputee.
Let’s get stranger still: We may well tumble into the uncanny valley of robot touch some day, caught in a scenario where the simulated touch feels super realistic, but not quite realistic enough. “Especially touch between people or touch between a person and another living creature, I believe is where the uncanny valley is going to come in,” says roboticist Heather Culbertson, who studies haptics at the University of Southern California, but who wasn’t involved in this new research. “Where you’re touching something that doesn’t quite feel alive, it doesn’t quite feel real, but it feels not mechanical either.”
So what begins as a very strange sensation—feeling objects as a robot would—gets ever stranger. We are but babes, exploring the world anew.