‘Finger

May 13, 2023

From Boston Dynamics’ Atlas to Google's SayCan, most hand-wielding robots don't have the dexterity necessary to "feel" what they’re holding. (If they did, maybe a 7-year-old boy wouldn't have had his finger snapped by a chess robot last year.) Getting a robot to "see," move toward, and grasp an object is complicated enough; adding the ability to feel that object and adjust its grip accordingly is a whole other challenge. But after five years of experimentation, a group of researchers at Columbia University appear to have done just that.

In a paper shared via arXiv (a repository for preprints of academic papers), computer scientists and mechanical engineers say they’ve built a robot hand that uses tactile and proprioceptive feedback. Proprioception is the ability to sense one's own movement and position, and while it's typically discussed in reference to living creatures’ muscles and joints, the robot hand proves it isn't exclusive to animals. Paired with tactile feedback, proprioception allows the robot hand to sense the object it's holding and adjust its grip accordingly without help from a passive support surface, like a table.

The team writes that they trained the robot using reinforcement learning (RL) paired with sampling-based planning (SBP) algorithms. Using RL, the robot was given "reward" signals when it did something the researchers wanted and was similarly "scolded" when it did something it wasn't supposed to do. Technically, the team could have used RL alone, but because that technique leaves room for error (the tiniest deviation from its anticipated "shoulds" and "shouldn'ts" would throw it off), they used SBP as a supplement. Every time the robot was rewarded for doing something it was supposed to do, SBP enabled it to add a branch to an ever-expanding digital web, which serves as a set of choices the robot can run through when presented with a new situation.
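The reward-plus-branching idea above can be sketched in a few lines of Python. This is a toy illustration, not the Columbia team's implementation: the one-dimensional "state," the random actions, and the reward function are all hypothetical stand-ins chosen to show the pattern of rewarded transitions growing a tree of known-good choices.

```python
import random

class PlanningTree:
    """Tree of explored states; each node records the action that reached it."""
    def __init__(self, root_state):
        # id -> (state, parent_id, action); the root has no parent or action.
        self.nodes = {0: (root_state, None, None)}
        self.next_id = 1

    def add_branch(self, parent_id, action, new_state):
        node_id = self.next_id
        self.nodes[node_id] = (new_state, parent_id, action)
        self.next_id += 1
        return node_id

def reward(state):
    # Stand-in reward signal: states closer to a target grip value of 1.0
    # score higher (hypothetical, just to make the sketch runnable).
    return -abs(state - 1.0)

def train(steps=200, seed=0):
    rng = random.Random(seed)
    tree = PlanningTree(root_state=0.0)
    for _ in range(steps):
        # Sampling-based step: pick a known node and try a random action.
        parent_id = rng.choice(list(tree.nodes))
        parent_state = tree.nodes[parent_id][0]
        action = rng.uniform(-0.2, 0.2)
        new_state = parent_state + action
        # RL-style step: only rewarded transitions become new branches.
        if reward(new_state) > reward(parent_state):
            tree.add_branch(parent_id, action, new_state)
    return tree

tree = train()
best = max(state for state, _, _ in tree.nodes.values())
```

After training, the tree holds only transitions that improved the reward, so replaying a path from the root gives a sequence of vetted choices, which is the role the "digital web" plays in the description above.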

This training paves the way for a robot hand that does far more than just pick up objects. After it grasps something, the Columbia team's robot can use its proprioceptive skills to get the gist of what it's holding. This allows the robot to adjust how much pressure it's using to maintain a grip. The robot can also engage in "finger-gaiting," in which it moves individual fingers to better grip what it holds. While it adjusts, the robot keeps at least three fingers on the object to prevent it from falling, eliminating the need for tables or other surfaces. Because the robot doesn't rely on a visual sensor, it's just as capable of adjusting and maintaining grip in the dark as in a well-lit area.
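The three-fingers-on-the-object rule can be expressed as a simple safety check. A minimal sketch follows, assuming a made-up five-finger hand; the finger names, scalar grip positions, and the `can_lift`/`gait_step` helpers are hypothetical illustrations, not the paper's actual controller.

```python
def can_lift(contacts, finger):
    """A finger may be lifted only if at least 3 others stay in contact."""
    remaining = sum(1 for f, touching in contacts.items()
                    if touching and f != finger)
    return contacts.get(finger, False) and remaining >= 3

def gait_step(contacts, finger, new_position, positions):
    """Relocate one finger to a new grip point while keeping the object held."""
    if not can_lift(contacts, finger):
        return False  # refusing the move is what keeps the grasp stable
    contacts[finger] = False          # lift the finger off the object
    positions[finger] = new_position  # move it to a better grip point
    contacts[finger] = True           # re-establish contact
    return True

contacts = {f: True for f in ["thumb", "index", "middle", "ring", "pinky"]}
positions = {f: 0.0 for f in contacts}
ok = gait_step(contacts, "index", 0.5, positions)
```

With all five fingers touching, lifting one leaves four in contact, so the step succeeds; if only three fingers were touching, `can_lift` would refuse and the grip would stay put.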

The robot hand is just that—a hand—so we’ve got a long way to go before we see humanoid robots that use similar techniques to "feel" what they’re holding. Once we get closer to that point, though, we might see more capable helper robots that can grasp, hold, and adjust the position of objects better than today's robots.