Scientists have made significant improvements to their NeuroMechFly model, which simulates the movement of fruit flies in realistic settings. The enhanced version, NeuroMechFly v2, incorporates both vision and olfaction, deepening our understanding of how the brain and body coordinate and paving the way for neuroengineering applications in robotics and AI.
All creatures, regardless of size, must move with remarkable accuracy to interact effectively with their environment. A key question in neuroscience is how the brain manages movement. Larger animals are hard to study because of their intricate brains and nervous systems; the fruit fly Drosophila melanogaster, by contrast, has a simpler brain that can be mapped far more easily, giving scientists valuable insight into how its nervous system produces behavior.
To explore how the nervous system governs actions, researchers led by Pavan Ramdya at EPFL have developed a simulated environment in which a virtual fly can behave much like a real one. Their software, NeuroMechFly v2, features a neuromechanical model that goes well beyond basic motor actions. By integrating visual and olfactory perception, varied terrain, and fine motor responses, NeuroMechFly v2 simulates how a fruit fly navigates its surroundings while responding to what it sees and smells, and to the obstacles in its path.
Ramdya’s work has focused on digitally modeling the mechanisms behind Drosophila motor control. In 2019, his team introduced DeepFly3D, a deep learning tool that estimates fly leg poses from images captured by multiple cameras. The following year, they unveiled LiftPose3D, a technique for reconstructing 3D animal poses from single-camera images. These advances culminated in the 2022 release of the original NeuroMechFly, a highly accurate digital model of Drosophila.
In the latest version of NeuroMechFly, the researchers enhanced the model with detailed features that more faithfully reflect real fly anatomy and physiology. They refined the leg and joint angles so the model’s kinematics align more closely with the biomechanics of real fruit fly movements. The model’s “brain” can now also process visual and olfactory input from its virtual eyes and antennae, giving it a sensory experience akin to that of a real fruit fly.
This setup enables NeuroMechFly v2 to simulate various control strategies for everyday tasks, such as walking over uneven surfaces or turning in response to odors and visual signals. The team has demonstrated realistic fruit fly behaviors across different scenarios: the model can visually track a moving object, for example, or navigate toward a smell while avoiding obstacles along the way.
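The closed-loop logic behind such odor-guided steering can be sketched in a few lines of Python. The toy below illustrates the principle rather than the published controller: it reduces the fly to a 2D point with two antennae sensing a Gaussian odor plume (both are simplifying assumptions) and steers toward whichever antenna reports the stronger signal.

```python
import numpy as np

# Toy odor-taxis loop in the spirit of NeuroMechFly v2's closed-loop
# demos. Illustrative sketch only: the fly is a 2D point with two
# antennae, and the plume is a Gaussian around a hypothetical source.

ODOR_SOURCE = np.array([5.0, 3.0])  # assumed plume center

def odor_intensity(pos):
    """Gaussian plume: intensity decays with distance to the source."""
    return np.exp(-np.sum((pos - ODOR_SOURCE) ** 2) / 10.0)

def antenna_positions(pos, heading, spread=0.2, length=0.3):
    """Left/right antenna tips, angled off the heading direction."""
    tip = lambda a: pos + length * np.array([np.cos(a), np.sin(a)])
    return tip(heading + spread), tip(heading - spread)

pos = np.array([0.0, 0.0])
heading = 0.0
for step in range(200):
    left_tip, right_tip = antenna_positions(pos, heading)
    i_left, i_right = odor_intensity(left_tip), odor_intensity(right_tip)
    # Braitenberg-style steering: turn toward the stronger antenna.
    heading += 2.0 * (i_left - i_right)
    pos += 0.05 * np.array([np.cos(heading), np.sin(heading)])

print(f"final distance to source: {np.linalg.norm(pos - ODOR_SOURCE):.3f}")
```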
NeuroMechFly v2 also allows the research team to infer neural activity within the fly’s brain from its experiences in the simulated world. “By linking NeuroMechFly v2 with a new computational model of the fly’s visual system, researchers can ascertain not just what the fly perceives in the simulation, but also how real neurons may react,” explains Sibo Wang-Chen, who led the research.
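Conceptually, that pipeline takes the light intensities rendered for each simulated eye and feeds them to a separate network that predicts the responses of real visual neurons. The sketch below mirrors only the data flow; `render_retina`, `visual_system_model`, and the retina size are placeholder assumptions, not the published interface.

```python
import numpy as np

# Data flow of the vision pipeline described above, with placeholder
# functions standing in for the simulator and the neural model.

def render_retina(n_ommatidia=721):
    """Stand-in for the simulator's per-ommatidium light intensities
    for one compound eye (the size here is an assumption)."""
    return np.random.rand(n_ommatidia)

def visual_system_model(left_eye, right_eye):
    """Stand-in for a network mapping retinal input to predicted
    activity of real neuron types; the crude left/right contrast is
    labeled with T4, a real motion-sensitive cell type."""
    return {"T4 (motion)": float(left_eye.mean() - right_eye.mean())}

# One simulated "frame": render both eyes, then predict responses.
responses = visual_system_model(render_retina(), render_retina())
print(responses)
```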
With insights into these neural activities, scientists modeled how a fly might pursue another fly, as during courtship, in a biologically plausible manner. This was made possible by the model’s hierarchical control structure, in which higher-level “brain” functions drive lower-level motor functions, an organization that closely resembles how real animals process sensory information and control their movements.
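A minimal sketch of such a hierarchy, under toy assumptions (two layers, six phase oscillators for the legs, made-up gains), might look as follows: the high-level layer converts a goal into a coarse descending command, and the low-level layer expands that command into per-leg rhythms.

```python
import numpy as np

# Minimal two-level controller: a "brain" layer emits a descending
# command (speed + turn bias), and a "motor" layer expands it into
# six leg oscillators. Gains and structure are illustrative, not
# the published architecture.

def brain_layer(target_bearing, heading):
    """High level: steer toward a target bearing, walking forward."""
    turn = np.clip(target_bearing - heading, -1.0, 1.0)
    return 1.0, turn  # (speed, turn bias)

def motor_layer(speed, turn, phases, dt=0.01, base_freq=12.0):
    """Low level: advance six leg oscillators; turning is expressed
    as a left/right frequency asymmetry (inside legs slow down)."""
    left_gain = speed * (1.0 - 0.5 * turn)
    right_gain = speed * (1.0 + 0.5 * turn)
    gains = np.array([left_gain] * 3 + [right_gain] * 3)
    return phases + 2 * np.pi * base_freq * gains * dt

phases = np.array([0.0, np.pi, 0.0, np.pi, 0.0, np.pi])  # tripod gait
heading = 0.0
for _ in range(100):
    speed, turn = brain_layer(target_bearing=0.8, heading=heading)
    phases = motor_layer(speed, turn, phases)
    heading += 0.02 * turn  # toy kinematics: turning follows the bias

print(f"heading after 1 s: {heading:.2f} rad (target 0.80)")
```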
Researchers can also use NeuroMechFly v2 to investigate how the brain combines sensory signals to maintain an awareness of the animal’s condition. To illustrate this, Ramdya’s team reproduced the fly’s ability to use feedback from its leg movements to keep track of its position, a behavior known as path integration. This lets the simulated fly “know” where it is even with limited visual input. Such closed-loop sensory processing is a hallmark of biological intelligence and a significant achievement for neuroengineering.
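In code, path integration amounts to dead reckoning: accumulate each stride in the direction of the current heading estimate. The sketch below uses synthetic stride and turn signals with added noise as stand-ins for the leg-derived feedback, and shows that the integrated estimate drifts only slowly from the true path.

```python
import numpy as np

# Path integration sketch: estimate position by accumulating
# self-motion cues (stride length and heading change) without
# visual landmarks. The cues are synthetic stand-ins for the
# proprioceptive leg feedback used in the real experiments.

rng = np.random.default_rng(0)

true_pos, est_pos = np.zeros(2), np.zeros(2)
true_heading, est_heading = 0.0, 0.0

for step in range(500):
    stride = 0.1                      # distance advanced this step
    turn = 0.01 * np.sin(step / 50)   # scripted meandering walk

    # Ground-truth motion.
    true_heading += turn
    true_pos += stride * np.array([np.cos(true_heading), np.sin(true_heading)])

    # Integrator: same cues, corrupted by proprioceptive noise.
    est_heading += turn + rng.normal(0, 0.002)
    noisy_stride = stride + rng.normal(0, 0.005)
    est_pos += noisy_stride * np.array([np.cos(est_heading), np.sin(est_heading)])

# Small drift means the home vector (-est_pos) still points home.
print(f"drift after 500 steps: {np.linalg.norm(est_pos - true_pos):.3f}")
```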
In summary, NeuroMechFly v2 lets researchers analyze how the brain manages crucial behaviors through computational models. It opens the door to a deeper understanding of brain-body coordination, especially in species with complex sensory-motor systems. Looking ahead, the model could serve as a framework for developing robots that rely on sensory signals, such as following scents or adjusting their movements to stabilize images, much as real animals do when exploring their environments.
By refining the machine learning models that govern these simulations, researchers can also uncover how animal intelligence might inform AI systems that are more autonomous, resilient, and adaptable to their surroundings.