**Boston Dynamics’ Next-Gen Humanoid Robot Just Got a Serious AI Boost from Google DeepMind**
The stage was set for a robotics revolution at CES 2026, and Boston Dynamics has taken a major leap forward with its next-generation humanoid robot, Atlas. The company, known for its quadruped robot Spot and its warehouse robot Stretch, has partnered with Google’s AI research lab, DeepMind, to develop what the companies describe as the world’s most advanced robotic foundation model.
The partnership aims to make Atlas more natural in its interactions with people, and the potential implications are significant. We’re talking about a robot that can learn from experience, generalize to new conditions, and get better over time – just like humans do. The goal is a robot capable of performing truly general-purpose tasks.
“We’re trying to combine our cutting-edge AI foundation models with Boston Dynamics’ new Atlas robots, and we’ll aim to develop the world’s most advanced robotic foundation model to meet the promise of true general-purpose humanoids,” said Carolina Parada, senior director of robotics at Google DeepMind.
This partnership is huge news for the robotics world, and it’s not just about the tech – it’s about real-world applications. Boston Dynamics already has products like Spot, which is in customers’ hands in over 40 countries, and Stretch, which has unloaded over 20 million containers globally since its launch in 2023. Now, the company is gearing up for the next generation of robots, starting with Atlas.
The Atlas prototype made its debut onstage at the Hyundai press conference, showcasing its impressive mobility. But as Alberto Rodriguez, director of Atlas behavior at Boston Dynamics, noted, turning a humanoid robot into a product requires more than athletic performance – it also requires natural interaction with people.
That’s where DeepMind’s expertise comes in. Its AI foundation models will help Atlas learn to interact with people more naturally. It’s not just about the hardware – it’s about the software that lets the robot understand the physical world.
“Rather than having a set of predefined tasks loaded onto the robot, we think robots should understand the physical world the same way we do,” Parada said. “They should be able to learn from their experience. They should be able to generalize to new conditions and get better over time. So whether it’s to assemble a new car part or to tie your shoelaces, robots should learn the same way we do from a handful of examples, and then get better quickly with a little bit of practice.”
The implications of this technology are far-reaching. Imagine a robot that can safely interact with people, perform repetitive tasks, and learn from experience. It’s not just about the robots themselves – it’s about the industries that will be transformed by them.
Hyundai, which plans to bring Atlas to its manufacturing facility this year and eventually deploy the robots for tasks like parts sequencing by 2028, has also developed protocols to increase safety and efficiency. The company is opening a new facility called the Robot Metaplant Application Center, or RMAC, which will train robots on actions like lifts and turns. This training data will be combined with real-world data collected through a software platform used in its Georgia manufacturing facility to continually improve the robots.
This partnership between Boston Dynamics and DeepMind is a game-changer for the robotics industry. It’s a reminder that the future of work is not just about humans, but also about robots that can work alongside us. The possibilities are endless, and we can’t wait to see what the future holds.
