Robotics company Figure just dropped a demo video showing its flagship robot in action—and that robot has a brain powered by models from OpenAI.
In the video, the robot seamlessly completes tasks like understanding spoken words, connecting what it sees to what it's told, responding verbally to commands, and even reasoning to fulfill requests like handing a human an apple when told "I'm hungry."
It's just the latest powerful signal that the robotics explosion could be arriving on schedule.
So what does the swift arrival of intelligent robots mean for the future of work and society?
I got the answers from Marketing AI Institute founder/CEO Paul Roetzer on Episode 88 of The Artificial Intelligence Show.
On Episode 87 of The Artificial Intelligence Show, Roetzer predicted we'd see a "robotics explosion" anywhere between 2026 and 2030.
He envisioned major investments in humanoid robots from players like OpenAI, Tesla and Figure, coupled with advancements in both robotic hardware and the "brains," or multimodal AI models.
The Figure robotics demo released this week aligns perfectly with that vision. The company has built an impressive humanoid robot and teamed up with OpenAI to power it with advanced AI. The result is a major step towards the future of intelligent robots Roetzer sees on the horizon.
But remember, says Roetzer, it's still very early.
The Figure demo using OpenAI's models is incredibly impressive. But it doesn't change his timeline for the real explosion in robotics development.
And Figure isn't the only player in this space. Roetzer expects Tesla, to name just one likely robotics heavyweight, to infuse its Optimus robot with its own large language model, Grok.
What's truly groundbreaking about the Figure demo is the way it combines cutting-edge robotics with state-of-the-art AI. By embodying OpenAI's multimodal AI in a physical robot, Figure has created a machine that can understand and generate language, reason, plan, and learn.
For instance, in one segment of the demo video, a human simply tells the robot "I'm hungry." The bot is able to deduce that of the objects in front of it, only an apple is edible. It picks up the apple and hands it to the person.
"A robot couldn't do that a year or two ago," says Roetzer. "The fact that it can now see things, understand things, that's really what's going on here."
For a more technical analysis of how Figure's system works, Roetzer points to a great X thread from Corey Lynch, Figure's Senior AI Engineer of Robot Manipulation, which breaks down the key details of the demo.
While the Figure demo doesn't change Roetzer's 2026 to 2030 timeline, he believes it's a powerful demonstration of the exponential pace of technological progress in AI and robotics—a demonstration we should take seriously.
"I talk to people all the time who think things are decades away," says Roetzer. "What I tell people is you can't say anything is decades away right now, because we don't comprehend what an exponential growth curve feels like."