Twisted soft robots navigate mazes without human or computer guidance
by Staff Writers
Raleigh NC (SPX) May 24, 2022
Researchers from North Carolina State University and the University of Pennsylvania have developed soft robots that are capable of navigating complex environments, such as mazes, without input from humans or computer software.

"These soft robots demonstrate a concept called 'physical intelligence,' meaning that structural design and smart materials are what allow the soft robot to navigate various situations, as opposed to computational intelligence," says Jie Yin, corresponding author of a paper on the work and an associate professor of mechanical and aerospace engineering at NC State.

The soft robots are made of liquid crystal elastomers in the shape of a twisted ribbon, resembling translucent rotini. When the ribbon is placed on a surface that is at least 55 degrees Celsius (131 degrees Fahrenheit), which is hotter than the ambient air, the portion of the ribbon touching the surface contracts, while the portion exposed to the air does not. This induces a rolling motion in the ribbon, and the warmer the surface, the faster it rolls.

"This has been done before with smooth-sided rods, but that shape has a drawback: when it encounters an object, it simply spins in place," says Yin. "The soft robot we've made in a twisted ribbon shape is capable of negotiating these obstacles with no human or computer intervention whatsoever."

The ribbon robot does this in two ways. First, if one end of the ribbon encounters an object, the ribbon rotates slightly to get around the obstacle. Second, if the central part of the robot encounters an object, it "snaps." The snap is a rapid release of stored deformation energy that causes the ribbon to jump slightly and reorient itself before landing. The ribbon may need to snap more than once before finding an orientation that allows it to negotiate the obstacle, but ultimately it always finds a clear path forward.

"In this sense, it's much like the robotic vacuums that many people use in their homes," Yin says. "Except the soft robot we've created draws energy from its environment and operates without any computer programming."

"The two actions, rotating and snapping, that allow the robot to negotiate obstacles operate on a gradient," says Yao Zhao, first author of the paper and a postdoctoral researcher at NC State. "The most powerful snap occurs if an object touches the center of the ribbon. But the ribbon will still snap if an object touches it away from the center; it's just less powerful. And the further you are from the center, the less pronounced the snap, until you reach the last fifth of the ribbon's length, which does not produce a snap at all."

The researchers conducted multiple experiments demonstrating that the ribbon-like soft robot is capable of navigating a variety of maze-like environments. They also demonstrated that the soft robots would work well in desert environments, showing they were capable of climbing and descending slopes of loose sand.

"This is interesting, and fun to look at, but more importantly it provides new insights into how we can design soft robots that are capable of harvesting heat energy from natural environments and autonomously negotiating complex, unstructured settings such as roads and harsh deserts," Yin says.

The paper, "Twisting for Soft Intelligent Autonomous Robot in Unstructured Environments," will be published the week of May 23 in the Proceedings of the National Academy of Sciences. The paper was co-authored by NC State Ph.D. students Yinding Chi, Yaoye Hong and Yanbin Li, as well as Shu Yang, the Joseph Bordogna Professor of Materials Science and Engineering at the University of Pennsylvania.

Video of the ribbon-like soft robots can be found here.
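As a rough, purely illustrative reading of the gradient Zhao describes, the short Python sketch below models relative snap strength as a function of where an obstacle contacts the ribbon. The linear falloff, the interpretation of "the last fifth of the ribbon's length" as the outer fifth near either end, and the snap_strength function itself are assumptions made for illustration; they are not taken from the paper.

# Toy model of the snap-strength gradient described above. Only the
# qualitative facts come from the article: the snap is strongest at the
# ribbon's center, weakens toward the ends, and vanishes in the last
# fifth of the ribbon's length. The linear falloff is assumed.

def snap_strength(distance_from_center: float, ribbon_length: float = 1.0) -> float:
    """Return a relative snap strength in [0, 1].

    distance_from_center: how far from the ribbon's midpoint the obstacle
    makes contact, in the same units as ribbon_length.
    """
    half_length = ribbon_length / 2.0
    d = min(abs(distance_from_center), half_length)
    # Assumed reading of "the last fifth of the ribbon's length": contact
    # within ribbon_length / 5 of either end produces no snap at all.
    snap_cutoff = half_length - ribbon_length / 5.0
    if d >= snap_cutoff:
        return 0.0
    # Assumed linear falloff from full strength at the center to zero
    # at the cutoff.
    return 1.0 - d / snap_cutoff


if __name__ == "__main__":
    # Strongest snap at the center, weaker toward the ends, none in the
    # outer fifth of the ribbon.
    for d in (0.0, 0.1, 0.2, 0.3, 0.4, 0.5):
        print(f"contact {d:.1f} from center -> relative snap {snap_strength(d):.2f}")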
Research Report: Twisting for Soft Intelligent Autonomous Robot in Unstructured Environments