NeuroMechFly: A digital twin of Drosophila
by Staff Writers
Lausanne, Switzerland (SPX) May 12, 2022
"We used two kinds of data to build NeuroMechFly," says Professor Pavan Ramdya at EPFL's School of Life Sciences. "First, we took a real fly and performed a CT scan to build a morphologically realistic biomechanical model. The second source of data were the real limb movements of the fly, obtained using pose estimation software that we've developed in the last couple of years that allow us to precisely track the movements of the animal." Ramdya's group, working with the group of Professor Auke Ijspeert at EPFL's Biorobotics Laboratory, has published a paper in Nature Methods showcasing the first ever accurate "digital twin" of the fly Drosophila melanogaster, dubbed "NeuroMechFly".
Time flies
Continuing with deep learning, in 2021 Ramdya's team published LiftPose3D, a method for reconstructing 3D animal poses from 2D images taken with a single camera. Breakthroughs like these have given the rapidly growing fields of neuroscience and animal-inspired robotics tools whose usefulness is hard to overstate. In many ways, NeuroMechFly represents a culmination of all those efforts. Constrained by morphological and kinematic data from these previous studies, the model features independent computational modules that simulate different parts of the insect's body: a biomechanical exoskeleton with articulating body parts (head, legs, wings, abdominal segments, proboscis, antennae, and halteres, the organs that help the fly sense its own rotations during flight), together with neural network "controllers" that produce motor output.
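As a rough illustration only (not the LiftPose3D code or architecture), the short sketch below shows the kind of network such a method might use to "lift" 2D keypoints from a single camera view into 3D coordinates; the keypoint count, layer sizes, and random inputs are all assumptions.

```python
# Minimal, hypothetical sketch of the 2D-to-3D pose "lifting" idea behind
# tools like LiftPose3D. This is NOT the LiftPose3D code or architecture;
# the keypoint count, layer sizes, and random inputs are illustrative only.
import torch
import torch.nn as nn

N_KEYPOINTS = 30  # assumed number of tracked limb keypoints

class PoseLifter(nn.Module):
    """Maps flattened 2D keypoints (x, y) from one camera to 3D (x, y, z)."""
    def __init__(self, n_kp: int = N_KEYPOINTS, hidden: int = 256):
        super().__init__()
        self.n_kp = n_kp
        self.net = nn.Sequential(
            nn.Linear(2 * n_kp, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3 * n_kp),
        )

    def forward(self, kp2d: torch.Tensor) -> torch.Tensor:
        # kp2d: (batch, n_kp, 2) -> (batch, n_kp, 3)
        return self.net(kp2d.flatten(1)).view(-1, self.n_kp, 3)

poses_2d = torch.randn(8, N_KEYPOINTS, 2)   # stand-in for tracked 2D poses
poses_3d = PoseLifter()(poses_2d)           # shape: (8, N_KEYPOINTS, 3)
```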
Why build a digital twin of Drosophila?
"When we do experiments, we are often motivated by hypotheses," he adds. "Until now, we've relied upon intuition and logic to formulate hypotheses and predictions. But as neuroscience becomes increasingly complicated, we rely more on models that can bring together many intertwined components, play them out, and predict what might happen if you made a tweak here or there."
The testbed
The researchers first made 3D measurements of real walking and grooming flies. They then replayed those behaviors using NeuroMechFly's biomechanical exoskeleton inside a physics-based simulation environment. As they show in the paper, the model can predict movement parameters that are otherwise unmeasured, such as the legs' torques and their contact reaction forces with the ground. Finally, they used NeuroMechFly's full neuromechanical capabilities to discover neural network and muscle parameters that allow the fly to "run" in ways optimized for both speed and stability. "These case studies built our confidence in the model," says Ramdya. "But we are most interested in when the simulation fails to replicate animal behavior, pointing out ways to improve the model." Thus, NeuroMechFly represents a powerful testbed for understanding how behaviors emerge from interactions between complex neuromechanical systems and their physical surroundings.
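As a rough sketch of this kinematic-replay workflow, the example below drives one joint of a simulated body through a recorded angle trajectory in PyBullet, the open-source physics engine used for the published NeuroMechFly simulation, and reads back the joint torque and ground contact forces that the solver computes. The model file name, joint index, and angle trajectory are placeholders, and the actual NeuroMechFly package wraps these steps in its own interface.

```python
# Rough sketch of kinematic replay in a PyBullet physics simulation.
# The URDF path, joint index, and angle trajectory are placeholders;
# NeuroMechFly's own package exposes a higher-level interface.
import pybullet as p

p.connect(p.DIRECT)                       # headless physics server
p.setGravity(0, 0, -9.81)
fly = p.loadURDF("neuromechfly.urdf")     # hypothetical biomechanical model file
joint = 0                                 # e.g. one femur-tibia joint
p.enableJointForceTorqueSensor(fly, joint, 1)

recorded_angles = [0.001 * t for t in range(1000)]   # stand-in for 3D-tracked kinematics

for target in recorded_angles:
    # Drive the joint to the measured angle; the engine solves the dynamics.
    p.setJointMotorControl2(fly, joint, p.POSITION_CONTROL, targetPosition=target)
    p.stepSimulation()

    # Quantities that are hard to measure on a real fly fall out of the simulation:
    _, _, reaction_forces, applied_torque = p.getJointState(fly, joint)
    ground_forces = [c[9] for c in p.getContactPoints(bodyA=fly)]  # contact normal forces

p.disconnect()
```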
A community effort
"More and more, progress in science depends on a community effort," he adds. "It's important for the community to use the model and improve it. But one of the things NeuroMechFly already does is raise the bar. Before, because models were not very realistic, we didn't ask how they could be directly informed by data. Here we've shown how you can do that; you can take this model, replay behaviors, and infer meaningful information. So this, I think, is a big step forward."
Research Report: NeuroMechFly, a neuromechanical model of adult Drosophila melanogaster