
Exploring the Insect Mind: How Virtual Reality Gaming Influences Bug Behavior

Humans often find excitement in the realms of gaming and augmented reality, but now researchers are exploring how these cutting-edge technologies affect small animals, such as hoverflies and crabs. This study aims to better understand the aerodynamic abilities of flying insects and delve into behaviors of animals that remain largely unexplored. The research is shedding light on how invertebrates engage with and navigate through virtual environments crafted by sophisticated entertainment technology.

Led by Flinders University, the research seeks to enhance our understanding of the aerodynamic abilities of flying insects and other less studied animal behaviors. This study is uncovering fresh insights into how invertebrates react to, engage with, and navigate virtual ‘worlds’ fashioned by modern entertainment tech.

This research was published in the journal Methods in Ecology and Evolution. The innovative gaming software was developed by professionals at Flinders University, in partnership with coauthor Professor Karin Nordström, who heads the Hoverfly Motion Vision Lab there, and collaborators from Western Australia and Germany.

The study also aims to support ongoing work on advanced technologies, including aviation and precision instruments, by giving researchers worldwide access to this unique software platform.

Team members comprised biologists, neuroscientists, and software specialists, including Dr. Yuri Ogawa, Dr. Richard Leibbrandt, and Raymond Aoukar from Flinders University, along with Jake Manger and his team from The University of Western Australia.

“We created software that offers a virtual reality experience for the animals to explore,” explained Dr. Ogawa, a Research Fellow in Neuroscience at the Flinders Health and Medical Research Institute.

“By employing machine learning and computer vision techniques, we were able to monitor the animals and analyze their actions, such as a hoverfly trying to turn left during flight, or a fiddler crab dodging a virtual bird overhead.

“The software dynamically adjusts the visual environment to reflect the movements of the animal.”
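The feedback loop Dr. Ogawa describes, tracking an animal's movement and updating the virtual scene to match, can be sketched in a few lines. This is a minimal illustration, not the published software: the `Scene` and `closed_loop_step` names are hypothetical, and a real system would use camera-based pose estimation rather than a supplied turn angle.

```python
# Minimal sketch of one frame of a closed-loop virtual-reality update.
# Assumption: an external tracker reports the animal's change in heading
# (degrees) each frame; these class and function names are illustrative.

class Scene:
    """Holds the heading of the virtual world relative to the animal."""
    def __init__(self):
        self.heading_deg = 0.0

    def rotate(self, delta_deg):
        # Counter-rotate the world so the animal's own turn is
        # reflected in what it sees, as in natural vision.
        self.heading_deg = (self.heading_deg - delta_deg) % 360.0


def closed_loop_step(scene, measured_turn_deg, gain=1.0):
    """One frame of the loop: the measured turn (e.g. a hoverfly
    banking left) drives an opposite rotation of the virtual
    environment, scaled by an experimenter-chosen gain."""
    scene.rotate(gain * measured_turn_deg)
    return scene.heading_deg


scene = Scene()
# Simulate a hoverfly turning 10 degrees left on three successive frames.
for turn in (-10.0, -10.0, -10.0):
    heading = closed_loop_step(scene, turn)
print(round(heading, 1))  # world has counter-rotated to 30.0 degrees
```

The `gain` parameter is the interesting experimental knob: setting it above or below 1.0 lets researchers probe how strongly the animal relies on visual feedback to steer.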

Coauthor Dr. Richard Leibbrandt, a lecturer at Flinders University’s College of Science and Engineering, remarked that the machine learning tools used in this study are already transforming sectors like agriculture, which benefit from automated crop and livestock monitoring and the development of agricultural robots.

“Virtual and augmented reality are also crucial in various fields, including healthcare, architecture, and transportation,” he added.

“This innovative virtual environment for invertebrates is paving the way for a deeper examination of animal behavior than has ever been possible,” stated Mr. Aoukar, a computer science graduate from Flinders University.

“The past twenty years have witnessed rapid advancements in algorithms and computer tech, such as virtual reality, gaming, artificial intelligence, and high-speed computations utilizing specialized graphics hardware,” explained Mr. Aoukar.

“These technologies have matured and become sufficiently user-friendly to run on consumer-grade computers, allowing researchers to observe animal behavior in a meticulously controlled but more natural environment than standard lab settings.”

This new observational technique helps identify visual stimuli that trigger specific behaviors in animals.

Professor Nordström noted that other research teams are already interested in utilizing this new platform, which is detailed in and available for download from the recent publication.

“This effort has been collaborative, with each author playing a critical role in the development of the VR technology.

“We are excited to use VR to examine the processes involved in decision-making among insects,” Professor Nordström concluded.

CAVE, an open-source project developed by the Hoverfly Motion Vision Lab, is designed to simplify the setup of a tethered flight arena. Its intuitive Unity Editor interface streamlines experimental design and data management without requiring programming skills.