Designing robots is usually a laborious process, but researchers at MIT have developed a system that automates the task. Once it has been told which parts are available – wheels, joints, and body segments, for example – and what terrain the robot will need to navigate, RoboGrammar gets to work, generating both structures and optimized control programs.
To rule out “absurd” designs, the researchers developed an animal-inspired “graph grammar” – a set of rules for how parts can be connected, says Allan Zhao, a doctoral student at the Computer Science and Artificial Intelligence Laboratory (CSAIL). The rules were informed in particular by the anatomy of arthropods such as insects and lobsters, all of which have a central body made up of a varying number of segments, each of which may have legs attached. (The grammar also allows wheels.)
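To make the idea of a graph grammar concrete, here is a minimal sketch in Python. The rules below are hypothetical stand-ins, not RoboGrammar's actual rule set; the point is only that every derivation expands placeholder symbols into parts, so structurally invalid robots are never generated in the first place.

```python
import random

# Toy grammar (hypothetical): a robot is a chain of body segments,
# and each segment may carry a pair of legs, a pair of wheels, or nothing.
RULES = {
    "ROBOT": [["SEGMENT"], ["SEGMENT", "ROBOT"]],
    "SEGMENT": [["body"], ["body", "leg", "leg"], ["body", "wheel", "wheel"]],
}

def derive(symbol, rng):
    """Recursively expand a symbol into a flat list of terminal parts."""
    if symbol not in RULES:          # terminal part: body, leg, or wheel
        return [symbol]
    parts = []
    for child in rng.choice(RULES[symbol]):
        parts.extend(derive(child, rng))
    return parts

rng = random.Random(0)
robot = derive("ROBOT", rng)
print(robot)  # e.g. a list like ['body', 'leg', 'leg', 'body', ...]
```

Because the rules only ever attach legs or wheels to a body segment, a derivation can never produce a disconnected or nonsensical arrangement – which is the role the grammar plays in the actual system.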
RoboGrammar can generate thousands of potential structures from these rules. To choose among them, the system simulates the performance of each robot with a controller – the instructions that govern the sequence of movements of a robot’s motors. Using an algorithm that prioritizes rapid forward movement, the researchers developed an individual controller for each robot. They then set the simulated robots loose and let a neural network determine which designs were the most efficient.
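The overall search loop described above – generate candidates, tune a controller for each, simulate, and rank by forward progress – can be sketched as follows. Note that the simulator and controller here are trivial stand-ins of my own invention, not RoboGrammar's physics engine or its actual control-optimization method.

```python
import random

def simulate(robot, controller, steps=100, rng=None):
    """Toy stand-in for a physics simulation: returns forward distance."""
    rng = rng or random.Random()
    x = 0.0
    for _ in range(steps):
        x += controller(robot, rng)   # each motor command nudges the robot
    return x

def make_controller(gain):
    """A trivial parameterized controller; the gain is what gets tuned."""
    def controller(robot, rng):
        return gain * rng.uniform(0.0, 1.0) / max(len(robot), 1)
    return controller

rng = random.Random(42)
# Hypothetical candidate structures (chains of legged body segments).
candidates = [["body", "leg", "leg"] * n for n in (1, 2, 3)]

scored = []
for robot in candidates:
    # Tune the controller per robot: try a few gains, keep the best score.
    best = max(simulate(robot, make_controller(g), rng=rng)
               for g in (0.5, 1.0, 2.0))
    scored.append((best, robot))

scored.sort(reverse=True)             # rank designs by distance covered
print(scored[0][1])                   # the design that traveled farthest
```

In the real system, the ranking step is handled by a learned heuristic rather than exhaustive simulation of every candidate, which is what makes searching thousands of structures tractable.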
Zhao, whose team plans to test some of the winning designs in the real world, describes RoboGrammar as “a tool for robot designers to expand the space of robot structures they rely on.” To his surprise, however, most of the structures the system came up with were four-legged, as are the majority of robots designed by humans.