DeepMind researchers have trained a robot to play table tennis. The results were impressive – it was able to beat some of its human opponents.
DeepMind, the artificial intelligence (AI) division of Alphabet's subsidiary Google, has trained a robotic arm to play table tennis. The experiment showed that the robot reached solid amateur-level play, winning 13 of 29 full competitive matches against human opponents of varying skill levels.
Google DeepMind’s researchers trained the system using a two-part method: first, they used computer simulations to develop its hitting skills, and then fine-tuned it with real-world data for continuous improvement.
They created a dataset of table tennis ball states, including position, spin, and speed, and used this data in a simulation mimicking real table tennis physics. The robot learned techniques like returning serves and hitting topspins.
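DeepMind has not published its data schema, but a ball-state record of the kind described above, together with a deliberately simplified physics step, might be sketched as follows (all names are illustrative, not DeepMind's; the step models gravity only, ignoring drag and the Magnus effect):

```python
from dataclasses import dataclass

@dataclass
class BallState:
    """One observation of the ball: position (m), velocity (m/s), spin (rad/s)."""
    position: tuple[float, float, float]
    velocity: tuple[float, float, float]
    spin: tuple[float, float, float]

def step(state: BallState, dt: float, g: float = 9.81) -> BallState:
    """Advance the ball by dt seconds under gravity alone, a stand-in
    for the far richer physics the real simulation would model."""
    px, py, pz = state.position
    vx, vy, vz = state.velocity
    return BallState(
        position=(px + vx * dt, py + vy * dt, pz + vz * dt),
        velocity=(vx, vy, vz - g * dt),
        spin=state.spin,  # spin left unchanged in this simplified model
    )
```

A simulator built on records like these can replay thousands of serve-and-return situations far faster than a physical table allows.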
During human matches, the robot collects performance data with cameras and a motion capture system on the opponent’s paddle. This data feeds back into the simulation, allowing the robot to refine its skills, adjust tactics, and improve both during matches and over time.
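The play-collect-refine loop described above can be sketched in heavily simplified form; every function and field name here is hypothetical, and the match data is random numbers standing in for real camera and motion-capture measurements:

```python
import random

def play_match(policy: dict, opponent_skill: float) -> list[float]:
    """Stand-in for a real match: return per-rally 'success scores' that
    cameras and motion capture would collect (random numbers here)."""
    return [random.random() * policy["skill"] / opponent_skill for _ in range(10)]

def refine_in_simulation(policy: dict, match_data: list[float]) -> dict:
    """Stand-in for simulated fine-tuning: nudge the policy toward
    better performance based on the observed scores."""
    avg = sum(match_data) / len(match_data)
    refined = dict(policy)
    refined["skill"] += 0.1 * (1.0 - avg)  # adjust more when performance was poor
    return refined

# The loop: play, feed data back into simulation, refine, repeat.
policy = {"skill": 0.5}
for _ in range(3):
    data = play_match(policy, opponent_skill=1.0)
    policy = refine_in_simulation(policy, data)
```

The point of the sketch is the structure, not the arithmetic: real-world data flows back into simulation, and the refined policy returns to the table.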
The researchers' work is an impressive step forward in robotics, but the robot is far from competing in table tennis championships. It defeated all beginner-level players and won 55% of matches against amateurs, yet it struggled against advanced opponents, losing every game.
Scientists acknowledge that training a robot to handle all possible scenarios in a simulated environment is a significant challenge. However, they believe these limitations can be overcome by developing predictive AI models to anticipate the ball’s trajectory and by implementing advanced collision-detection algorithms.
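A predictive model of the kind the researchers mention could, in its simplest physics-based form, look like the ballistic extrapolation below (an assumption for illustration: it ignores drag, spin, and bounces, which the real models would have to capture):

```python
def predict_trajectory(pos, vel, dt=0.01, steps=50, g=9.81):
    """Extrapolate future ball positions from the current position and
    velocity using simple ballistic motion (gravity only)."""
    x, y, z = pos
    vx, vy, vz = vel
    points = []
    for _ in range(steps):
        x += vx * dt
        y += vy * dt
        z += vz * dt
        vz -= g * dt  # gravity pulls the ball down each step
        points.append((x, y, z))
    return points

# Predict half a second of flight for a ball leaving the paddle.
path = predict_trajectory(pos=(0.0, 0.0, 1.0), vel=(3.0, 0.0, 1.0))
```

Anticipating where the ball will be, rather than reacting to where it is, is what gives a robot arm time to move its paddle into place.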
“Even a few months back, we projected that, realistically, the robot may not be able to win against people it had not played before. The system certainly exceeded our expectations,” Pannag Sanketi, a senior staff software engineer at Google DeepMind who led the project, told MIT Technology Review.
“The way the robot outmaneuvered even strong opponents was mind-blowing.”
Mastering robot kinesthetics is a challenge that occupies many scientists. Last month, the Massachusetts Institute of Technology (MIT) presented a humanoid robot trained to perform a variety of movements on different surfaces, including responding to human movements, which enables it to dance with a partner.
The scientists highlighted that robots capable of such fluid, human-friendly movements would be better accepted in environments shared with people, such as factory assembly lines or hospitals.
In June, humanoid robot developers in China found a way to incorporate enhanced facial expressions and emotions into their already human-like robots.
In April, another Chinese robotics company, Stardust Intelligence, released a video showing a humanoid robot doing chores, like uncorking a bottle of wine, flipping toast in a pan, and then ironing and folding a T-shirt.