A team of researchers at Duke University has developed a program that uses written prompts to build personalized robots.
Born in Duke’s General Robotics Lab, the program, “Text2Robot,” merges AI-generated models with real-world physics. Duke student and co-author Zachary Charlick said the rise of generative AI inspired the project.
“We were really excited about the advent of some of these new large language models and generative tools that were being introduced to consumers to kind of let everybody use these technologies and democratize AI in a really awesome way,” he said. “We wanted to kind of take that approach, but with robotics.”
The team didn’t invent the underlying generative model, but Duke Professor Boyuan Chen, who led the project, said it’s the combination of parts that makes the program unique.
“I think the way we use (AI) is very novel, that people really haven't tried to combine the generative model, have 3-D meshes, and try to figure out how to place the motors and actuate its body together with that co-design algorithm,” Chen said. “And so that's why, you know, even though the idea has been there, there are really no existing tools. We can’t just download the software and combine them together. We literally had to write thousands of lines of code just to enable that functionality and give it a taste of more than science.”
The process starts when someone enters a description of their robot: what it should look like and what it should be able to do. The AI program then generates 3-D models, gives them joints, and trains them in simulation to perform an action, like walking.
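In very rough terms, that pipeline can be pictured as the toy sketch below. This is not the team’s code; every function in it, from generate_mesh to train_to_walk, is a hypothetical stand-in for the stages described above, with random numbers in place of a real generative model and physics simulator.

```python
import random

# Toy, self-contained stand-in for the pipeline the article describes.
# All functions here are hypothetical; real text-to-3-D generation and
# physics-based gait training are far more involved.

def generate_mesh(prompt: str) -> list[float]:
    """Stand-in for the text-to-3-D generative model: fake body dimensions."""
    random.seed(hash(prompt) % (2**32))
    return [random.uniform(0.1, 1.0) for _ in range(6)]  # e.g., leg lengths, torso size

def add_joints(mesh: list[float]) -> dict:
    """Stand-in for placing joints and motors on the generated body."""
    return {"body": mesh, "joints": [f"hip_{i}" for i in range(4)]}

def train_to_walk(robot: dict) -> float:
    """Stand-in for simulation training; returns a made-up walking score."""
    return sum(robot["body"]) + random.random()

robot = add_joints(generate_mesh("a frog-like robot that can walk"))
print("walking score:", train_to_walk(robot))
```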
But Charlick said the program doesn't just generate one robot model.
"Training one robot to walk only takes a few minutes, but our algorithm actually trains, like, tens of thousands of robots to walk and chooses the best walking robot amongst those,” he said. “So the actual full ‘evolutionary loop,’ that can take several hours, up to a full day, depending on if you want really, really good performance.”
Once the algorithm chooses the best model, it's 3-D printed, and the parts are snapped together with mechanical components to create a walking robot.
Text2Robot has so far focused on quadrupedal robots, meaning many of the completed designs are based on animals, like frogs or bugs. But to make the models walk, the algorithm often has to borrow traits from other species. Chen said these designs could inspire more innovation in robotics.
“This process also tells us, ‘what is even a robot?’” he said. “So, it actually triggers human creativity by rethinking and prompts us about what are the better designs we can borrow from these AI-created robots.”
Last year, Text2Robot won first place in the innovation category of the Artificial Life Conference's Virtual Creatures Competition. It's now set to appear in May at the Institute of Electrical and Electronics Engineers' International Conference on Robotics and Automation, one of the biggest events of its kind.