
Meet the guy with four arms, two of which someone else controls in VR

These robotic limbs could someday help people work together when they’re far apart.

Yamen Saraiji has four arms, and two of them are giving him a hug.

The limbs embracing Saraiji are long, lanky, and robotic, and they’re connected to a backpack he’s wearing. The arms are controlled remotely by another person, who wears an Oculus Rift VR headset to see the world from Saraiji’s perspective (cameras mounted on the backpack provide the view) and uses handheld controllers to direct the robotic arms and their attached hands.

After the hug, the robotic arms release Saraiji. Then the right hand gives him a high five, and Saraiji smiles.

Saraiji, an assistant professor at Tokyo-based Keio University’s Graduate School of Media Design, led the development of this robotic-arms-on-a-backpack project, called Fusion, to explore how people may be able to work together to control (or augment) one person’s body. Though some of the actions Saraiji shows me via video chat from his lab in Japan are silly, he thinks the device could be useful for things like physical therapy and instructing people from afar.

Besides hugging and high-fiving, the operator of the robotic arms and hands can pick things up or move the backpack-wearer’s own arms and hands. The mechanical hands can be removed and replaced with straps that go around the wearer’s wrists if you want to truly remote-control their arms. The device, which Saraiji created with colleagues at Keio University and the University of Tokyo, will be shown off at the Siggraph computer graphics and tech interaction conference in Vancouver in August.

The robotic arms can be strapped to the wearer's wrists, giving the remote operator more control over the wearer's arm movements.

There have been plenty of other efforts to create extra wearable limbs, and this isn’t even Saraiji’s first time building robotic limbs meant to attach to a human: he and most of the other Fusion researchers previously built MetaLimbs, a wearable set of arms and hands that the wearer controlled with their feet.

Having the limbs controlled by someone else—someone who can be in another room or another country, and in VR to boot—is a little different, however. Saraiji says he wanted to see what would happen if someone else could, in a sense, dive into your body and take control.

The backpack includes a PC that streams data wirelessly between the robotic arm-wearer and the person controlling the limbs in VR. The PC also connects to a microcontroller, letting it know how to position the robotic arms and hands and how much torque to apply to the joints.
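To make that data path concrete, here is a minimal sketch of how a backpack PC might forward joint targets and a torque cap to a microcontroller over a serial link. Fusion’s actual protocol isn’t described here, so the packet layout, port name, and baud rate below are assumptions for illustration only.

```python
# Hypothetical sketch of the PC-to-microcontroller command path.
# Packet layout, serial port, and baud rate are assumed, not taken from Fusion.
import struct

import serial  # pyserial

NUM_JOINTS = 7  # each arm has seven joints, per the article


def build_arm_command(joint_angles_rad, torque_limit_nm):
    """Pack target joint angles and a torque limit into a fixed-size packet.

    Assumed layout: 1-byte header, 7 float32 joint angles, 1 float32 torque limit.
    """
    assert len(joint_angles_rad) == NUM_JOINTS
    return struct.pack("<B7ff", 0xA5, *joint_angles_rad, torque_limit_nm)


def send_command(port, packet):
    """Write one command packet to the microcontroller over the serial link."""
    port.write(packet)


if __name__ == "__main__":
    # Example: stream a neutral pose with a modest torque cap.
    link = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.01)  # assumed port/baud
    neutral = [0.0] * NUM_JOINTS
    send_command(link, build_arm_command(neutral, torque_limit_nm=2.0))
```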

The robotic arms, each with seven joints, jut out of the backpack, along with a connected head, of sorts. The head has two cameras that show the remote operator, in VR, a live feed of everything the backpack-wearer is seeing. When the operator moves their head in VR, sensors track that motion and cause the robotic head to move in response (it can turn left or right, tilt up and down, and pivot from side to side, Saraiji says).
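The head-tracking step can be pictured as a simple mapping from the operator’s headset orientation onto the robot head’s three rotation axes, clamped to whatever the mechanism can reach. The joint limits in the sketch below are illustrative assumptions; the real hardware’s ranges aren’t given.

```python
# A minimal sketch of the head-tracking mapping: headset yaw/pitch/roll is
# mirrored onto the robot head's three axes and clamped to assumed limits.
from dataclasses import dataclass
import math


@dataclass
class HeadPose:
    yaw: float    # turn left/right, radians
    pitch: float  # tilt up/down, radians
    roll: float   # pivot side to side, radians


# Assumed mechanical limits for the robotic head, in radians.
LIMITS = {
    "yaw": math.radians(90),
    "pitch": math.radians(45),
    "roll": math.radians(30),
}


def clamp(value, limit):
    return max(-limit, min(limit, value))


def headset_to_robot_head(headset: HeadPose) -> HeadPose:
    """Mirror the operator's head motion onto the robot head, within limits."""
    return HeadPose(
        yaw=clamp(headset.yaw, LIMITS["yaw"]),
        pitch=clamp(headset.pitch, LIMITS["pitch"]),
        roll=clamp(headset.roll, LIMITS["roll"]),
    )
```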

The wearable system is powered by a battery that lasts about an hour and a half. It’s pretty heavy, weighing in at nearly 21 pounds.

“Of course, it’s still a prototype,” Saraiji points out.

While I’m talking to him, Saraiji puts on the backpack and enlists a graduate student to wear the VR headset and help demonstrate how it works. I call out a few commands, such as asking the robot-limb operator to pick something up. At first he fumbles with a squeaky yellow toy with cartoon eyes, then manages to grab it and hand it to Saraiji; one of the robot hands then takes the toy back and hands it over again. At one point, Saraiji walks behind the graduate student operating the arms in VR, so the operator can tap himself on the shoulder with one of the robot’s fingers and give himself an abbreviated neck rub.

Different buttons on the Oculus Rift controllers enable different finger functions: the operator can move the pinky, ring, and middle finger of each robotic hand simultaneously with a single button, while the thumb and index finger each have their own controls.
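That control scheme can be sketched as a small mapping function: one grip input drives the pinky, ring, and middle fingers together, while the index finger and thumb get their own inputs. The input names below (and treating the index control as an analog trigger) are assumptions, not the actual Oculus SDK fields.

```python
# A rough sketch of the controller-to-finger mapping described above.
# Input names are placeholders; they do not mirror the Oculus SDK.
def controller_to_fingers(grip_pressed: bool,
                          index_trigger: float,
                          thumb_pressed: bool) -> dict:
    """Return per-finger flexion targets in [0.0, 1.0] (0 = open, 1 = closed)."""
    group_flex = 1.0 if grip_pressed else 0.0  # pinky/ring/middle move as one
    return {
        "pinky": group_flex,
        "ring": group_flex,
        "middle": group_flex,
        "index": max(0.0, min(1.0, index_trigger)),  # independent control
        "thumb": 1.0 if thumb_pressed else 0.0,      # independent control
    }


# Example: grip held, index trigger half-pulled, thumb button released.
print(controller_to_fingers(True, 0.5, False))
```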

Hermano Igo Krebs, a principal research scientist at MIT who has spent decades studying rehabilitation robotics, doesn’t think the project would be practical for rehab. But he can imagine it being helpful in a lot of different situations—to assist an astronaut in outer space, for instance, or a paramedic with an unfamiliar medical procedure.

Saraiji says that he’d like to turn the project into an actual product, and he and his collaborators are in the process of pitching it to a Tokyo-based startup accelerator.