Free to fidget, robotic hands manipulate everyday objects


In newly published research, Yale researchers are teaching robotic hands to fidget, increasing their adaptability to objects and their control over movement.

Staff reporter

Yale Daily News

Robotic hands are traditionally designed to perform perfectly pre-calculated steps. But when the Yale GRAB Lab programs them to continuously alter their grip, the resulting fidgeting promotes adaptability to the variety of objects and movements needed to navigate the real world.

Forty-five percent of the human motor cortex is dedicated to manual manipulation. Designing a robotic hand to mimic human dexterity is a difficult task, but Yale’s GRAB Lab, led by professor Aaron Dollar, has made a significant breakthrough. By using soft hands, which can bend to their surroundings without receiving motor commands, the robot can adapt to tiny errors in real time. This allows the hands to manipulate many types of objects, unlike the majority of published hand designs, which are trained for a single task. In addition, the hand is multimodal: it combines several types of movements, such as rotation and translation. The new design is published in IEEE Robotics and Automation Letters.

“Our system works on several different objects, such as toy bunnies and miniature cars, which have different surfaces, topologies and textures,” said Bowen Wen, Rutgers computer science PhD student and GRAB Lab collaborator. “This means that our system is robust to different scenarios.”

Typically, robotic hands manipulate objects through fixed contact points, according to Andrew Morgan, a doctoral student at Yale’s GRAB Lab and the paper’s first author. Humans, however, fidget, adjusting their grip and continually lifting and repositioning their fingers to complete a movement. The paper describes new algorithms for determining when to change contact points with an object, a technique known as finger gaiting, resulting in robots that better mimic human movement patterns.

Adaptability is crucial for the human hand. Morgan offered the thought experiment of a finger resting on a wooden table. Because there is friction with the surface, the finger does not slip. If there were a sheet of paper between the fingertip and the wood, however, the finger would suddenly slip much farther. Such properties are very difficult for a computer to predict. Given the unstructured nature of the real world, robotic hands cannot rely solely on sensors to function.

“The real world is always going to act differently than a simulation,” Morgan said. “Things change depending on many conditions, such as the humidity in the air. Building better hardware that can compensate is incredibly helpful to robotics.”

The softness of the underactuated hands used in this work enabled the development of complex manipulation algorithms that require few system parameters to be known, making the robotic hands generalizable to a wide range of in-hand manipulation tasks. The hand successfully picked up and rotated a beach ball, a toy car, a toy bunny and a rubber duck into a goal configuration, even though each object had a different color, texture, compressibility and geometry.

In traditional rigid hands, moreover, a small error can cause an object to be ejected from the grasp. Soft hands are much more forgiving of mistakes without sacrificing control. It is impossible to perfectly mimic the 21 degrees of freedom of the human hand with current technology, so a successful robotic hand must be responsive to environmental feedback.

Decoupling the perception and planning aspects of programming not only promotes adaptability but also decreases training time. While a traditional model may need weeks of training on supercomputers, this system required only two days of computing time on a personal computer.

“I believe that compliance, also called softness, of robot manipulators will bring many new possibilities for robot manipulation that would otherwise be impossible,” said Kaiyu Hang, assistant professor of computer science at Rice University and former postdoctoral fellow at the GRAB Lab. “If we look at how humans manipulate objects, we don’t always try to manipulate everything very precisely at every step. Instead, we almost always use many imprecise actions and fix small mistakes along the way to ultimately achieve the manipulation goals.”

The Yale GRAB Lab created a video showing their robotic hand in action.

The article describing the new finger gait algorithm was written by Andrew S. Morgan, Kaiyu Hang, Bowen Wen, Kostas Bekris, and Aaron M. Dollar.


Valentina Simon covers astronomy, computer science and engineering stories. She is a freshman at Timothy Dwight College majoring in Data Science and Statistics.

