‘Breakthrough’ in robotic hands development

Researchers at the University of Bristol have made a “key breakthrough” in the development of tactile robotic hands.

The research team has created a four-fingered robotic hand with artificial tactile fingertips capable of rotating objects like balls and toys in any direction and orientation.

It can do this even when the hand is upside down, something that has not been achieved before, according to the University of Bristol.

Improving the dexterity of robot hands could have “significant implications” for automating tasks such as handling goods for supermarkets or sorting through waste for recycling.

OpenAI, the company behind ChatGPT, was the first to demonstrate human-like feats of dexterity with a robotic hand in 2019, but disbanded its robotics team soon after.

Its set-up included a cage holding 19 cameras and more than 6,000 central processing units (CPUs) to train huge neural networks that could control the hand.

Prof Nathan Lepora, lead author of the University of Bristol study, wanted to see if similar results could be achieved using simpler and more cost-efficient methods.

University teams from the Massachusetts Institute of Technology, the University of California Berkeley, Columbia University in New York, and Bristol managed to achieve complex feats of robot hand dexterity using simple set-ups and desktop computers.

The key was building a “sense of touch” into the robot hands. Developing high-resolution tactile sensors became possible thanks to advances in smartphone cameras, which are now small enough to fit comfortably inside a robot fingertip.

According to Prof Lepora, his team in Bristol created an artificial tactile fingertip using a 3D-printed mesh of pin-like papillae on the underside of the skin, copying the internal structure of human skin.

“These papillae are made on advanced 3D-printers that can mix soft and hard materials to create complicated structures like those found in biology,” Lepora said.

He added that it was “hugely exciting” when it first worked on an upside-down robot hand, as no one had done it before.

“Initially the robot would drop the object, but we found the right way to train the hand using tactile data and it suddenly worked even when the hand was being waved around on a robotic arm,” he said.

Going forward, the researchers will work on extending the technology beyond pick-and-place and rotation tasks to more advanced examples of dexterity, such as manually assembling items like Lego.