New wrist-mounted device augments the human hand with two robotic fingers.
Twisting a screwdriver, removing a bottle cap, and peeling a banana are just a few simple tasks that are tricky to pull off single-handedly. Now a new wrist-mounted robot can provide a helping hand — or rather, fingers.
Researchers at MIT have developed a robot that enhances the grasping motion of the human hand. The device, worn around one’s wrist, works essentially like two extra fingers adjacent to the pinky and thumb. A novel control algorithm enables it to move in sync with the wearer’s fingers to grasp objects of various shapes and sizes. Wearing the robot, a user could use one hand to, for instance, hold the base of a bottle while twisting off its cap.
“This is a completely intuitive and natural way to move your robotic fingers,” says Harry Asada, the Ford Professor of Engineering in MIT’s Department of Mechanical Engineering. “You do not need to command the robot, but simply move your fingers naturally. Then the robotic fingers react and assist your fingers.”
Ultimately, Asada says, with some training people may come to perceive the robotic fingers as part of their body — “like a tool you have been using for a long time, you feel the robot as an extension of your hand.” He hopes that the two-fingered robot may assist people with limited dexterity in performing routine household tasks, such as opening jars and lifting heavy objects. He and graduate student Faye Wu presented a paper on the robot this week at the Robotics: Science and Systems conference in Berkeley, Calif.
Biomechanical Synergy
The robot, which the researchers have dubbed “supernumerary robotic fingers,” consists of actuators linked together to exert forces as strong as those of human fingers during a grasping motion.
To develop an algorithm to coordinate the robotic fingers with a human hand, the researchers first looked to the physiology of hand gestures, learning that a hand’s five fingers are highly coordinated. While a hand may reach out and grab an orange in a different way than, say, a mug, just two general patterns of motion are used to grasp objects: bringing the fingers together, and twisting them inwards. A grasp of any object can be explained through a combination of these two patterns.
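To make the two-pattern idea concrete, consider a toy example. Written as vectors of joint angles, the two patterns span a space of postures, and any particular grasp is a weighted blend of them. The short Python sketch below uses invented numbers purely for illustration; the real patterns would be extracted from measured hand data.

    import numpy as np

    # Two illustrative "synergy" vectors, one joint angle (radians) per finger.
    # These numbers are made up; real patterns come from measured hand data.
    bring_together = np.array([0.50, 0.60, 0.55, 0.60, 0.40])  # fingers closing
    twist_inward   = np.array([0.10, 0.30, 0.35, 0.30, 0.20])  # fingers curling in

    def grasp_posture(a, b):
        """A grasp as a weighted combination of the two basic patterns."""
        return a * bring_together + b * twist_inward

    print(grasp_posture(1.0, 0.2))  # a wide grasp, mostly closing
    print(grasp_posture(0.6, 1.0))  # a tighter, curled grasp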
The researchers hypothesized that a similar “biomechanical synergy” may exist not only among the five human fingers, but also across all seven: the five human fingers plus the two robotic ones. To test the hypothesis, Wu wore a glove outfitted with multiple position-recording sensors, with the robotic fingers attached to her wrist via a light brace. She then scavenged the lab for common objects, such as a box of cookies, a soda bottle, and a football.
Wu grasped each object with her hand, then manually positioned the robotic fingers to support the object. She recorded both hand and robotic joint angles multiple times with various objects, then analyzed the data and found that every grasp could be explained by a combination of two or three general patterns among all seven fingers.
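The researchers’ analysis code isn’t published, but the finding is the kind that a standard principal component analysis makes visible: if a few patterns underlie the data, a few components explain most of its variance. A minimal Python sketch, substituting synthetic joint-angle recordings for the glove data:

    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical dataset: each row is one recorded grasp, each column a
    # joint angle across all seven fingers. Real rows would come from the
    # sensor glove and the robot's joints; here we synthesize grasps from
    # two underlying patterns plus a little noise.
    rng = np.random.default_rng(0)
    n_grasps, n_joints = 200, 21
    patterns = rng.normal(size=(2, n_joints))          # two latent patterns
    weights = rng.uniform(0, 1, size=(n_grasps, 2))    # per-grasp blend
    angles = weights @ patterns + 0.05 * rng.normal(size=(n_grasps, n_joints))

    pca = PCA(n_components=5).fit(angles)
    print(pca.explained_variance_ratio_)
    # With data of this shape, the first two or three components account for
    # nearly all the variance, mirroring the two-or-three-pattern finding.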
The researchers used this information to develop a control algorithm to correlate the postures of the two robotic fingers with those of the five human fingers. Asada explains that the algorithm essentially “teaches” the robot to assume a certain posture that the human expects the robot to take.
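The article doesn’t spell out the algorithm’s form, but one plausible reading is a learned map from sensed human joint angles to robotic joint angles, fit from the paired recordings described above. A hedged Python sketch of that flavor of controller, with invented dimensions standing in for the real hardware:

    import numpy as np

    # Paired training data (synthetic stand-ins): X holds human joint angles,
    # Y the robotic joint angles posed manually for the same grasps.
    rng = np.random.default_rng(1)
    X = rng.uniform(0.0, 1.5, size=(200, 15))            # 5 fingers x 3 joints
    true_map = rng.normal(size=(15, 6))
    Y = X @ true_map + 0.02 * rng.normal(size=(200, 6))  # 2 fingers x 3 joints

    # Fit a linear posture map by least squares.
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)

    def robot_posture(hand_angles):
        """Predict the robotic fingers' joint angles from the hand's."""
        return hand_angles @ W

    # At run time: read the glove, predict a posture, command the actuators.
    print(robot_posture(X[0]))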
Bringing Robots Closer to Humans
For now, the robot mimics the grasping of a hand, closing in and spreading apart in response to a human’s fingers. But Wu would like to take the robot one step further, controlling not just position, but also force.
“Right now we’re looking at posture, but it’s not the whole story,” Wu says. “There are other things that make a good, stable grasp. With an object that looks small but is heavy, or is slippery, the posture would be the same, but the force would be different, so how would it adapt to that? That’s the next thing we’ll look at.”
Wu also notes that certain gestures — such as grabbing an apple — may differ slightly from person to person, and ultimately, a robotic aid may have to account for personal grasping preferences. To that end, she envisions developing a library of human and robotic gesture correlations. As a user works with the robot, it could learn to adapt to match his or her preferences, discarding the correlations from the library that don’t apply. She likens this machine learning to that of voice-command systems, like Apple’s Siri.
“After you’ve been using it for a while, it gets used to your pronunciation so it can tune to your particular accent,” Wu says. “Long-term, our technology can be similar, where the robot can adjust and adapt to you.”
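What such a library might look like in code is an open question; the Python sketch below is one hypothetical shape for it, where each stored correlation carries a weight that rises when the user accepts the robot’s posture and decays when they correct it.

    import numpy as np

    # Hypothetical gesture library: each entry pairs a hand posture with a
    # preferred robotic posture and a weight tuned by use. The structure and
    # names here are illustrative, not drawn from the paper.
    library = [
        {"hand": np.array([0.8, 0.7, 0.6]), "robot": np.array([0.5, 0.4]), "w": 1.0},
        {"hand": np.array([0.2, 0.3, 0.4]), "robot": np.array([0.9, 0.8]), "w": 1.0},
    ]

    def pick_correlation(hand):
        """Return the stored gesture nearest the sensed posture, weighted by preference."""
        scores = [e["w"] / (1e-6 + np.linalg.norm(hand - e["hand"])) for e in library]
        return library[int(np.argmax(scores))]

    def reinforce(entry, accepted):
        """Strengthen correlations the user accepts; fade the ones they override."""
        entry["w"] *= 1.1 if accepted else 0.9

    match = pick_correlation(np.array([0.75, 0.65, 0.55]))
    reinforce(match, accepted=True)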
“This is breaking new ground on the question of how humans and robots interact,” says Matthew Mason, director of the Robotics Institute at Carnegie Mellon University, who was not involved in the research. “It is a novel vision, and adds to the many ways that robotics can change our perceptions of ourselves.”
Down the road, Asada says the robot may also be scaled down to a less bulky form.
“This is a prototype, but we can shrink it down to one-third its size, and make it foldable,” Asada says. “We could make this into a watch or a bracelet where the fingers pop up, and when the job is done, they come back into the watch. Wearable robots are a way to bring the robot closer to our daily life.”