Watch These AI-Powered Robots Play Soccer With Each Other


While the age of robots might seem like a strictly 2020s phenomenon, humanoid machines have existed for decades as scientists continually push for technological advances. Soccer in particular is an activity that robotics engineers have long used to refine their machines; the RoboCup, founded in 1996 as a parallel to the World Cup, was created so that research teams from around the world could test their work against each other.

Google is getting in on the action, ramping up its robots' machine learning through its DeepMind lab. In a new study published in the journal Science Robotics, a team of Google DeepMind researchers in the U.K. trained Robotis OP3 robots to play soccer using a "deep reinforcement learning" technique that combines several different AI training methods. They found that the AI-trained robots walked 181 percent faster, turned 302 percent faster, kicked the ball 34 percent faster and took 63 percent less time to recover from falls compared to robots not trained with this technique.
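To give a sense of what "learning from trial and reward" means in practice, here is a minimal, purely illustrative sketch of tabular Q-learning on an invented 1-D "reach the ball" task. DeepMind's actual system trains neural-network policies with deep reinforcement learning in simulation; the environment, reward and hyperparameters below are assumptions made up solely to show the core idea.

```python
import random

N = 10              # positions 0..9; the "ball" sits at position 9
ACTIONS = [-1, +1]  # step left or step right

def step(pos, action):
    """Deterministic toy environment: reward 1.0 only on reaching the ball."""
    new_pos = max(0, min(N - 1, pos + action))
    done = new_pos == N - 1
    return new_pos, (1.0 if done else 0.0), done

def train(episodes=500, alpha=0.5, gamma=0.9, seed=0):
    """Off-policy Q-learning with a purely random exploration policy."""
    q = [[0.0, 0.0] for _ in range(N)]   # q[state][action index]
    rng = random.Random(seed)
    for _ in range(episodes):
        pos = rng.randrange(N - 1)       # random non-terminal start
        for _ in range(50):              # cap episode length
            a = rng.randrange(2)         # explore at random
            new_pos, reward, done = step(pos, ACTIONS[a])
            # Q-learning update: bootstrap from the best next-state value
            q[pos][a] += alpha * (reward + gamma * max(q[new_pos]) - q[pos][a])
            pos = new_pos
            if done:
                break
    return q

def greedy_policy(q, pos):
    """Act greedily with respect to the learned values."""
    return ACTIONS[q[pos].index(max(q[pos]))]
```

After training, the greedy policy picks the rightward action (toward the ball) from every position, even though nothing was ever programmed except the reward signal. The robots in the study learn in the same spirit, only with high-dimensional sensor inputs and motor outputs instead of a ten-cell number line.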

Soccer players have to master a range of dynamic skills, from turning and kicking to chasing a ball. How could robots do the same? ⚽

We trained our AI agents to demonstrate a range of agile behaviors using reinforcement learning.

Here’s how. https://t.co/RFBxLG6SMn pic.twitter.com/4B4S2YiVLh

— Google DeepMind (@GoogleDeepMind) April 11, 2024

Even more interesting is that scientists observed “a variety of emergent behaviors” that the robots seemingly taught themselves, as they would have been extremely difficult for humans to program. This includes “agile movement behaviors such as getting up from the ground, quick recovery from falls, running, and turning; object interaction such as ball control and shooting, kicking a moving ball, and blocking shots; and strategic behaviors such as defending by consistently placing itself between the attacking opponent and its own goal and protecting the ball with its body.” Not only that, but the authors wrote that the robots “transitioned between all of these behaviors fluidly” in a one-on-one match with each other. 

Our players were able to walk, turn, kick and stand up faster than manually programmed skills on this type of robot.

They could also combine movements to score goals, anticipate ball movements and block opponent shots – thereby developing a basic understanding of a 1v1 game. pic.twitter.com/1Bty4q9tDN

— Google DeepMind (@GoogleDeepMind) April 11, 2024

More research needs to be done to determine just how effective this AI training method can be. The team has already started experimenting with two-on-two soccer games and found that the robots "learned division of labor," with each robot avoiding the ball when its teammate was closer to it.

Still, it might be a long time before the RoboCup replaces the World Cup. 
