DyRET robot can rearrange its body to walk in new environments
by David Howard and Charles Martin | Researchers, ANU
Canberra, Australia (SPX) Mar 18, 2021
Imagine running along a cement footpath and then suddenly onto dry sand. Just to stay upright, you would have to slow down and change the way you run. In the same way, a walking robot has to change its gait to handle different surfaces. Generally, we humans and most robots can only change how we run. But what if we could also change the shape of our bodies, to run as fast and safely as possible on any surface?

We would like to rely on robots for difficult and dangerous tasks, from inspecting failed nuclear reactors to space exploration. For such tasks, a static body can limit the robot's adaptability, and a shape-shifting body could make the difference between success and failure in unpredictable environments. Even better, a shape-shifting robot could learn the best body shape for different environments and adapt to new ones as it encounters them.

In collaboration with the University of Oslo, we have successfully tested this idea with a four-legged robot that adapts its body to walk on new surfaces as it encounters them, performing better than a static-body robot. Our research is published in Nature Machine Intelligence.
A shape-shifting quadruped

The robot at the centre of our research is called DyRET (Dynamic Robot for Embodied Testing). Its four legs contain telescopic sections whose lengths are set by motors. The motors can change DyRET's height by around 20%, from 60cm to 73cm tall. That 13cm makes a dramatic difference to the robot's walk. With short legs, DyRET is stable but slow, with a low centre of gravity. In its tallest mode, DyRET is less stable while it walks, but its stride is much longer, allowing it to travel faster and to step over obstacles.

DyRET also has sensors to keep track of what it's walking on. Each of DyRET's feet has a force sensor that can feel how hard the ground is, and a 3D camera points at the ground between DyRET's front legs to estimate how rough the ground is.
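As a concrete illustration of that sensing step, here is a minimal Python sketch of how the two sensor streams might be boiled down to single hardness and roughness numbers. The function names, the mean and standard-deviation proxies, and the example readings are all our own assumptions for illustration, not DyRET's published processing pipeline.

```python
# Hypothetical terrain-descriptor sketch: reduce foot-force and 3D-camera
# readings to the (hardness, roughness) pair used by the adaptation model.
from statistics import mean, pstdev

def estimate_hardness(peak_foot_forces):
    """Harder ground pushes back with higher peak force at each footfall;
    take the mean peak force across the four feet as a simple proxy."""
    return mean(peak_foot_forces)

def estimate_roughness(depth_readings):
    """Rougher ground shows more height variation in the camera's view of
    the patch between the front legs; use depth standard deviation."""
    return pstdev(depth_readings)

# Example values (newtons and metres) are made up for demonstration.
print(estimate_hardness([41.0, 39.5, 42.3, 40.1]))   # ~40.7
print(estimate_roughness([0.02, 0.05, 0.01, 0.04]))  # ~0.016
```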
Learning to adapt

We explored two ways for DyRET to learn the best leg configuration for different situations: a controlled environment indoors with known surfaces, and a real-world test outside.

In our controlled tests, DyRET walked inside boxes about 5 metres long, each containing a different walking surface: sand, gravel, or hard fibre-cement sheeting. The robot walked on each material in 25 different leg configurations to record the efficiency of its movement. Using this data, we tested the robot's ability to automatically sense a change in the walking surface within the boxes, and to choose the best body shape in response.

While our controlled experiments showed DyRET could adapt its body to surfaces it had walked on before, the real world is a much more variable and unpredictable place. We showed the method could be extended to unseen terrain by estimating the best body shape for any surface the robot encounters.

In our outdoor experiments, DyRET used a machine learning model seeded with knowledge from the controlled tests about the best leg configuration for a given combination of terrain hardness and roughness. As the robot walks, it continuously predicts the best body shape for the terrain it encounters, while updating its model with measurements of how well it is actually walking. In our experiments, DyRET's predictions improved as it walked, allowing it to quickly generate efficient movements even for terrain it had not seen before.
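The adapt-as-you-walk loop just described can be sketched compactly. The code below is a hedged illustration only: the 5 x 5 grid of leg configurations, the nearest-neighbour predictor and the random stand-in measurements are our assumptions that mimic the shape of the method (seed a model with indoor data, predict the best configuration for the sensed terrain, walk, then fold the measured result back into the model), not the authors' actual implementation.

```python
# Illustrative online-adaptation loop for a shape-shifting walker.
import math
import random

# Each sample: (hardness, roughness, (femur_setting, tibia_setting), efficiency)
samples = []

def seed_from_indoor_tests():
    """Stand-in for the controlled tests: 25 leg configurations on each of
    3 known surfaces, with random numbers in place of measured efficiency."""
    surfaces = [(0.9, 0.1), (0.5, 0.6), (0.2, 0.4)]  # (hardness, roughness)
    for hardness, roughness in surfaces:
        for femur in range(5):
            for tibia in range(5):
                samples.append((hardness, roughness, (femur, tibia),
                                random.random()))

def predict_efficiency(hardness, roughness, config, k=5):
    """Predict efficiency (higher = better) as the average of the k most
    similar recorded samples. A real system would normalise the features."""
    scored = sorted(
        (math.dist((h, r, *c), (hardness, roughness, *config)), eff)
        for h, r, c, eff in samples)
    return sum(eff for _, eff in scored[:k]) / k

def best_config(hardness, roughness):
    """Search all candidate leg configurations for the predicted best."""
    candidates = [(f, t) for f in range(5) for t in range(5)]
    return max(candidates,
               key=lambda c: predict_efficiency(hardness, roughness, c))

def adaptation_step(sense_terrain, walk_and_measure):
    """One cycle: sense the terrain, reshape, walk, learn from the result."""
    hardness, roughness = sense_terrain()
    config = best_config(hardness, roughness)
    efficiency = walk_and_measure(config)  # set leg lengths, then walk
    samples.append((hardness, roughness, config, efficiency))  # online update
    return config, efficiency

if __name__ == "__main__":
    random.seed(0)
    seed_from_indoor_tests()
    cfg, eff = adaptation_step(
        sense_terrain=lambda: (0.7, 0.3),            # pretend sensor values
        walk_and_measure=lambda c: random.random())  # pretend walk outcome
    print("chosen config:", cfg, "measured efficiency:", round(eff, 3))
```

In this pattern the model improves exactly as the article describes: every outdoor step adds a new measured sample, so later predictions lean on real experience rather than only the seeded indoor data.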
Are shape-shifting robots the future?

The ability to adapt body shape is incredibly beneficial when we can't predict the exact environmental conditions beforehand, which makes picking a single "good" robot shape very challenging. Instead of being designed around one set of conditions, shape-shifting robots could adapt to a wide variety of environments. Our proof of concept has powerful implications for the future of robotic design, unlocking environments that are currently out of reach because they are so challenging and variable. Future shape-shifting robots might be used on the sea floor, or for long-term missions in space.