By Ashwini Sakharkar 28 Jun, 2024

Collected at: https://www.techexplorist.com/breakthrough-development-tactile-robotic-hands/85650/

Improving the dexterity of robot hands could have a significant impact on automating tasks such as handling products in supermarkets or sorting recyclable materials.

Under the guidance of Professor Nathan Lepora, who specializes in Robotics and AI, the University of Bristol team has developed a robotic hand with four fingers and artificial tactile fingertips. This hand is capable of rotating objects like balls and toys in any direction and orientation, even when the hand is upside down – a feat that has never been achieved before.

In 2019, OpenAI achieved a landmark demonstration of human-like dexterity with a robot hand. Yet despite the widespread attention it attracted, OpenAI disbanded its 20-person robotics team shortly afterwards. To train the large neural networks that controlled the hand, OpenAI used a cage fitted with 19 cameras and over 6,000 CPUs, an approach that came at substantial expense.

Professor Lepora and his colleagues set out to investigate whether comparable results could be achieved using simpler and cheaper methods.

Over the past year, four university teams, at MIT, Berkeley, Columbia (New York), and Bristol, have demonstrated impressive feats of robot hand dexterity, including picking up and passing rods and rotating children’s toys in hand. Remarkably, they accomplished these tasks using simple setups and desktop computers.

The teams achieved this advance by integrating a sense of touch into their robot hands.

High-resolution tactile sensors were made possible by progress in smartphone camera technology, which has reached the point where the tiny cameras can easily be fitted inside a robot’s fingertip.
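
To make the idea concrete, here is a minimal sketch of how an image from a camera-based tactile fingertip could be turned into contact data, assuming an OpenCV-style marker-tracking pipeline. The blob-detection approach, parameter values, and function names below are illustrative assumptions, not the Bristol team’s actual code.

```python
# Illustrative sketch only: converting a fingertip-camera image into contact
# features by tracking the dark marker dots on the inside of the tactile skin.
import cv2
import numpy as np

def extract_marker_positions(tactile_image: np.ndarray) -> np.ndarray:
    """Locate the pin/marker dots visible to the fingertip camera."""
    gray = cv2.cvtColor(tactile_image, cv2.COLOR_BGR2GRAY)

    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 5      # ignore specks smaller than a marker
    params.maxArea = 200    # ignore glare patches larger than a marker
    detector = cv2.SimpleBlobDetector_create(params)

    keypoints = detector.detect(gray)
    return np.array([kp.pt for kp in keypoints], dtype=np.float32)

def contact_features(rest: np.ndarray, current: np.ndarray) -> np.ndarray:
    """Summarise contact as the mean marker displacement from the unloaded state.

    Assumes markers are detected in the same order in both frames; a real
    pipeline would match markers between frames before differencing.
    """
    n = min(len(rest), len(current))
    return (current[:n] - rest[:n]).mean(axis=0)
```

When an object presses on the skin, the markers shift in the camera image, so the displacement field acts as a high-resolution proxy for contact location and force.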

“In Bristol, our artificial tactile fingertip uses a 3D-printed mesh of pin-like papillae on the underside of the skin, based on copying the internal structure of human skin,” Professor Lepora explains.

“These papillae are made on advanced 3D printers that can mix soft and hard materials to create complicated structures like those found in biology.

“The first time this worked on a robot hand upside-down was hugely exciting as no one had done this before. Initially, the robot would drop the object, but we found the right way to train the hand using tactile data, and it suddenly worked even when the hand was being waved around on a robotic arm.”
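
As a rough picture of what “training the hand using tactile data” can look like at run time, the sketch below shows a control loop in which a learned policy maps tactile readings and joint angles to finger commands for in-hand rotation. The network architecture, observation layout, and dimensions are assumptions for illustration, not the implementation described in the AnyRotate paper.

```python
# Illustrative sketch only: a tactile-conditioned policy queried in a control loop.
import numpy as np
import torch
import torch.nn as nn

class RotationPolicy(nn.Module):
    """Small MLP standing in for a policy trained in simulation on tactile data."""
    def __init__(self, obs_dim: int = 40, act_dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, act_dim), nn.Tanh(),  # normalised joint targets
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)

def control_step(policy: RotationPolicy,
                 joint_angles: np.ndarray,
                 tactile_features: np.ndarray) -> np.ndarray:
    """One step: stack the observation, query the policy, return joint targets."""
    obs = torch.as_tensor(
        np.concatenate([joint_angles, tactile_features]), dtype=torch.float32
    )
    with torch.no_grad():
        action = policy(obs)
    return action.numpy()

# Example usage with dummy data: 16 joints plus 4 fingertips x 6 contact features.
policy = RotationPolicy(obs_dim=16 + 24, act_dim=16)
targets = control_step(policy, np.zeros(16), np.zeros(24))
```

Because the observation includes touch rather than vision alone, a policy of this kind can, in principle, keep regrasping an object regardless of how the hand is oriented relative to gravity, which is the behaviour the Bristol demonstration highlights.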

The technology’s next phase involves moving beyond pick-and-place and rotation tasks to more complex demonstrations of dexterity, such as assembling items like Lego in hand.

Journal reference:

  1. Max Yang et al. AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch. arXiv, 2024; DOI: 10.48550/arXiv.2405.07391
