Tiny robots, weighing from tens of grams to a few hundred grams, hold promise for real-world applications: their low weight makes them extremely safe even if they accidentally bump into something, and their small size lets them navigate through narrow spaces. If they can be produced cheaply, they can also be deployed in large numbers to quickly cover extensive areas, such as greenhouses for early pest or disease detection.
The challenge, however, lies in making these tiny robots operate autonomously, since their sensing, processing, and memory resources are far more limited than those of larger robots. External infrastructure such as GPS satellites or wireless communication beacons can help, but it cannot always be relied upon: GPS is unavailable indoors and becomes inaccurate in cluttered environments, while installing and maintaining beacons is expensive or outright impractical, for instance in search-and-rescue scenarios.
Autonomous navigation AI designed for larger robots, such as self-driving cars, often relies on heavy, power-intensive sensors like LiDAR, which are unsuitable for tiny robots. Vision-based approaches, though more power-efficient, typically require detailed 3D maps of the environment, necessitating large amounts of processing power and memory, which small robots lack.
Learning from Nature: Step Counting and Visual Snapshots
Nature provides a solution through insects, which navigate effectively over distances relevant to many applications despite their limited sensing and computing resources. Insects combine odometry (tracking their own motion) with visually guided behaviors based on their low-resolution, yet nearly omnidirectional visual systems. While odometry is well understood, the precise mechanisms of visual memory remain less clear. One theory, the "snapshot" model, suggests insects take periodic visual snapshots of their environment, which they later use to navigate by minimizing the visual difference between their current view and the snapshot.
"Snapshot-based navigation is akin to Hansel's method in the fairy tale of Hansel and Gretel, where stones dropped on the ground served as markers to find the way back home," explained Tom van Dijk, first author of the study. "For a robot, snapshots act as these markers. If the visual environment changes too much from the snapshot location, the robot may navigate incorrectly. Thus, enough snapshots are needed, but too many consume excessive memory."
"Our main insight is that spacing snapshots further apart is possible if the robot uses odometry between snapshots," noted Guido de Croon, Full Professor in bio-inspired drones and co-author of the article. "This approach allows the robot to travel further by relying on odometry to get close to a snapshot location before using visual homing, reducing the frequency of snapshot updates."
The researchers demonstrated their insect-inspired navigation strategy on a 56-gram "Crazyflie" drone equipped with an omnidirectional camera, successfully covering distances of up to 100 meters using only 0.65 kilobytes of memory. All visual processing ran on a small microcontroller of the kind found in many inexpensive electronic devices.
Practical Applications of Tiny Robot Technology
"The proposed insect-inspired navigation strategy is a significant step towards the real-world application of tiny autonomous robots," said Guido de Croon. "Although this strategy does not generate detailed maps and only facilitates returning to the starting point, it is sufficient for many applications. For example, in warehouse stock tracking or greenhouse crop monitoring, drones could collect data and return to a base station, storing mission-relevant images on a small SD card for later analysis."
Research Report: Visual Route-following for Tiny Autonomous Robots
Related Links
Delft University of Technology