Computer-aided creativity in robot design by Daniel Ackerman for MIT News Boston MA (SPX) Dec 01, 2020
So, you need a robot that climbs stairs. What shape should that robot be? Should it have two legs, like a person? Or six, like an ant?

Choosing the right shape will be vital for your robot's ability to traverse a particular terrain. And it's impossible to build and test every potential form. But now an MIT-developed system makes it possible to simulate those forms and determine which design works best.

You start by telling the system, called RoboGrammar, which robot parts are lying around your shop - wheels, joints, etc. You also tell it what terrain your robot will need to navigate. And RoboGrammar does the rest, generating an optimized structure and control program for your robot. The advance could inject a dose of computer-aided creativity into the field.

"Robot design is still a very manual process," says Allan Zhao, a PhD student in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and the paper's lead author. He describes RoboGrammar as "a way to come up with new, more inventive robot designs that could potentially be more effective."

Zhao will present the paper at this month's SIGGRAPH Asia conference. Co-authors include PhD student Jie Xu, postdoc Mina Konakovic-Lukovic, postdoc Josephine Hughes, PhD student Andrew Spielberg, and professors Daniela Rus and Wojciech Matusik, all of MIT.
Ground rules

Zhao's team speculated that more innovative design could improve functionality. So they built a computer model for the task - a system that wasn't unduly influenced by prior convention. And while inventiveness was the goal, Zhao did have to set some ground rules.

The universe of possible robot forms is "primarily composed of nonsensical designs," Zhao writes in the paper. "If you can just connect the parts in arbitrary ways, you end up with a jumble," he says. To avoid that, his team developed a "graph grammar" - a set of constraints on the arrangement of a robot's components. For example, adjoining leg segments should be connected with a joint, not with another leg segment. Such rules ensure each computer-generated design works, at least at a rudimentary level.

Zhao says the rules of his graph grammar were inspired not by other robots but by animals - arthropods in particular. These invertebrates include insects, spiders, and lobsters. As a group, arthropods are an evolutionary success story, accounting for more than 80 percent of known animal species. "They're characterized by having a central body with a variable number of segments. Some segments may have legs attached," says Zhao. "And we noticed that that's enough to describe not only arthropods but more familiar forms as well," including quadrupeds.

Zhao adopted the arthropod-inspired rules thanks in part to this flexibility, though he did add some mechanical flourishes. For example, he allowed the computer to conjure wheels instead of legs.
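To give a concrete sense of how such a grammar works, here is a minimal sketch in Python. The rule set and part names below are illustrative assumptions, not the actual RoboGrammar rules from the paper; the real system derives full graphs with physical parameters, while this toy version only flattens a derivation into a list of parts. It does, however, encode the kind of constraint described above: leg links attach through joints, and a segment may carry a leg, a wheel, or nothing.

```python
# A minimal, hypothetical sketch of a graph-grammar derivation for robot parts.
# Rule and part names are assumptions for illustration, not the paper's rule set.
import random

# Each rule rewrites one nonterminal (in caps) into a list of children;
# lowercase names are concrete parts that are kept as-is.
RULES = {
    "ROBOT":    [["body", "SEGMENTS"]],
    "SEGMENTS": [["segment", "LIMB", "SEGMENTS"],            # grow another segment
                 ["segment", "LIMB"]],                        # ...or stop here
    "LIMB":     [["joint", "leg_link", "joint", "leg_link"],  # legs joined by joints
                 ["joint", "wheel"],                          # wheel instead of a leg
                 []],                                         # bare segment, no limb
}

def derive(symbol, rng):
    """Recursively expand nonterminals until only concrete parts remain."""
    if symbol not in RULES:           # terminal part: keep as-is
        return [symbol]
    expansion = rng.choice(RULES[symbol])
    parts = []
    for child in expansion:
        parts.extend(derive(child, rng))
    return parts

rng = random.Random(0)
for _ in range(3):
    print(derive("ROBOT", rng))   # e.g. ['body', 'segment', 'joint', 'wheel', ...]
```

Because every expansion comes from a rule, each sampled design is at least mechanically sensible - the "jumble" of arbitrary connections is ruled out by construction.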
A phalanx of robots

With these inputs, RoboGrammar then uses the rules of the graph grammar to design hundreds of thousands of potential robot structures. Some look vaguely like a racecar. Others look like a spider, or a person doing a push-up. "It was pretty inspiring for us to see the variety of designs," says Zhao. "It definitely shows the expressiveness of the grammar."

But while the grammar can crank out quantity, its designs aren't always of optimal quality. Choosing the best robot design requires controlling each robot's movements and evaluating its function. "Up until now, these robots are just structures," says Zhao. The controller is the set of instructions that brings those structures to life, governing the movement sequence of the robot's various motors. The team developed a controller for each robot with an algorithm called Model Predictive Control, which prioritizes rapid forward movement. "The shape and the controller of the robot are deeply intertwined," says Zhao, "which is why we have to optimize a controller for every given robot individually."

Once each simulated robot is free to move about, the researchers seek high-performing robots with a "graph heuristic search." This neural network algorithm iteratively samples and evaluates sets of robots, and it learns which designs tend to work better for a given task. "The heuristic function improves over time," says Zhao, "and the search converges to the optimal robot." This all happens before the human designer ever picks up a screw.

"This work is a crowning achievement in the 25-year quest to automatically design the morphology and control of robots," says Hod Lipson, a mechanical engineer and computer scientist at Columbia University, who was not involved in the project. "The idea of using shape-grammars has been around for a while, but nowhere has this idea been executed as beautifully as in this work. Once we can get machines to design, make and program robots automatically, all bets are off."

Zhao intends the system as a spark for human creativity. He describes RoboGrammar as a "tool for robot designers to expand the space of robot structures they draw upon." To show its feasibility, his team plans to build and test some of RoboGrammar's optimal robots in the real world. Zhao adds that the system could be adapted to pursue robotic goals beyond terrain traversing. And he says RoboGrammar could help populate virtual worlds. "Let's say in a video game you wanted to generate lots of kinds of robots, without an artist having to create each one," says Zhao. "RoboGrammar would work for that almost immediately."

One surprising outcome of the project? "Most designs did end up being four-legged in the end," says Zhao. Perhaps manual robot designers were right to gravitate toward quadrupeds all along. "Maybe there really is something to it."
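To make the design-search loop described above concrete, here is a minimal Python sketch. The function names and the scoring rule are hypothetical placeholders: `sample_design` stands in for a grammar derivation, and `evaluate_with_mpc` stands in for running a Model Predictive Control controller in a physics simulator and measuring forward progress. A simple running value estimate plays the role of the learned neural-network heuristic, so this is a toy analogue of the search, not the paper's algorithm.

```python
# A hypothetical sketch of a design-search loop: sample candidate robots,
# score them with a stand-in for MPC-based evaluation, and let a simple
# heuristic estimate (a toy analogue of the learned heuristic) guide the search.
import random

def sample_design(rng):
    """Stand-in for sampling a structure from the graph grammar."""
    return tuple(rng.choice(["leg_pair", "wheel_pair", "none"]) for _ in range(3))

def evaluate_with_mpc(design, rng):
    """Stand-in for optimizing a controller and measuring forward movement."""
    return rng.random() + 0.5 * design.count("leg_pair")   # fake reward

def search(iterations=200, seed=0, epsilon=0.3):
    rng = random.Random(seed)
    heuristic = {}   # running value estimate per design
    best_design, best_score = None, float("-inf")
    for _ in range(iterations):
        # Explore a fresh sample, or exploit the current best estimate.
        if not heuristic or rng.random() < epsilon:
            design = sample_design(rng)
        else:
            design = max(heuristic, key=heuristic.get)
        score = evaluate_with_mpc(design, rng)
        # Nudge the heuristic toward the observed score.
        old = heuristic.get(design, 0.0)
        heuristic[design] = old + 0.1 * (score - old)
        if score > best_score:
            best_design, best_score = design, score
    return best_design, best_score

print(search())
```

The key idea mirrored here is that the structure and the controller are evaluated together: every candidate design is scored only after its own controller has been fitted to it, and the heuristic gradually concentrates the search on the designs that move best.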
Research Report: "RoboGrammar: Graph Grammar for Terrain-Optimized Robot Design"