Revolutionizing Flight: A New Tool for Evaluating Pilot Performance and Mental Workload in Augmented Reality

Researchers have created HuBar, an innovative visual analytics tool that analyzes and summarizes task performance sessions in augmented reality (AR), focusing on the behavior and mental workload of participants. Using aviation as an example, the team showcased how HuBar sheds light on pilot behavior and psychological states, aiding trainers and researchers in identifying trends, recognizing challenges, and enhancing AR-assisted training programs to boost learning and performance in real-world situations.

In aviation, a pilot’s ability to manage stress is critical; it can mean the difference between a safe flight and a catastrophe. Hence, comprehensive training is essential to equip pilots with the necessary skills for handling stressful scenarios.

Pilot instructors utilize augmented reality (AR) systems for training by guiding pilots through various scenarios, allowing them to learn appropriate responses. These systems perform optimally when they are customized to the mental conditions of the individual trainee.

This is where HuBar comes into play. It is a cutting-edge visual analytics tool created to summarize and compare task performance sessions in AR, like AR-assisted flight simulations, by evaluating performer behavior and cognitive workload.

HuBar offers valuable insights into pilot behavior and mental states, which allows researchers and trainers to detect patterns, identify problem areas, and refine AR-assisted training programs to enhance learning and real-world performance.

Developed by a team from NYU Tandon School of Engineering, HuBar will be presented at the 2024 IEEE Visualization and Visual Analytics Conference on October 17, 2024.

“While pilot training is one application, HuBar is not limited to aviation,” said Claudio Silva, the lead investigator and NYU Tandon Institute Professor in the Computer Science and Engineering (CSE) Department, who conducted the research in partnership with Northrop Grumman Corporation (NGC). “HuBar visualizes a wide range of data from AR-assisted tasks, and this extensive analysis enhances performance and learning results across various complex scenarios.”

“HuBar could significantly benefit training in fields like surgery, military operations, and industrial tasks,” added Silva, who co-directs the Visualization and Data Analytics Research Center (VIDA) at NYU.

The team introduced HuBar in a paper that illustrates its capabilities using aviation as a case study, analyzing data from various helicopter co-pilots in an AR flying simulation. Additionally, they produced a video to demonstrate the system.

By focusing on two pilot subjects, the analysis revealed notable contrasts: one subject displayed mostly optimal attention levels with minimal errors, whereas the other frequently made mistakes and exhibited underload conditions.

HuBar’s in-depth analysis, enriched with video evidence, showed that the underperforming co-pilot often referred to a manual, hinting at a lack of familiarity with the tasks. Ultimately, HuBar enables trainers to target specific areas where co-pilots may face challenges, offering insights to enhance AR-assisted training methodologies.

What sets HuBar apart is its capability to analyze non-linear tasks associated with varying sequences of actions leading to success, while simultaneously integrating and visualizing multiple complex data streams.

These streams include brain activity measured with functional near-infrared spectroscopy (fNIRS), body motion from inertial measurement units (IMUs), gaze tracking, task procedures, errors, and cognitive workload classifications. This comprehensive approach gives researchers and trainers an end-to-end view of performer behavior in AR-assisted tasks, so they can identify links between cognitive states, physical actions, and task performance across multiple completion strategies.
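To make the data-fusion idea concrete, here is a minimal Python sketch of how such streams might be aligned on a shared timeline for joint analysis. The file names, column layout, and 0.5-second alignment window are illustrative assumptions, not HuBar's actual format or code.

```python
# A minimal sketch (not HuBar's code) of aligning multimodal AR session
# streams onto one time-indexed table. All file names and columns are
# hypothetical.
import pandas as pd

def load_session(session_dir: str) -> pd.DataFrame:
    """Merge fNIRS workload labels, IMU motion, gaze targets, and error
    events from one session into a single time-aligned table."""
    fnirs = pd.read_csv(f"{session_dir}/fnirs_workload.csv")  # time_s, workload (underload/optimal/overload)
    imu = pd.read_csv(f"{session_dir}/imu.csv")               # time_s, accel_x, accel_y, accel_z
    gaze = pd.read_csv(f"{session_dir}/gaze.csv")             # time_s, target (e.g. "manual", "instrument_panel")
    errors = pd.read_csv(f"{session_dir}/errors.csv")         # time_s, error_type

    # Align every stream to the fNIRS timestamps, taking the nearest
    # sample within 0.5 seconds of each workload reading.
    merged = fnirs.sort_values("time_s")
    for other in (imu, gaze, errors):
        merged = pd.merge_asof(merged, other.sort_values("time_s"),
                               on="time_s", tolerance=0.5, direction="nearest")
    return merged
```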

HuBar’s interactive visualization system also enables comparisons across different performance sessions and individuals, helping to unveil patterns and irregularities in complicated, non-linear processes that would likely be overlooked using traditional analysis techniques.
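Continuing the hypothetical layout from the sketch above, a cross-session comparison of the kind described here could start from simple per-session summaries: the share of time spent in each workload state and the number of errors. HuBar's actual visual comparisons are far richer; this is only an illustration of the idea.

```python
# Hypothetical summary table for comparing sessions and individuals.
import pandas as pd

def summarize_sessions(sessions: dict[str, pd.DataFrame]) -> pd.DataFrame:
    """`sessions` maps a session id to the merged table from load_session()."""
    rows = []
    for sid, df in sessions.items():
        shares = df["workload"].value_counts(normalize=True)
        rows.append({
            "session": sid,
            "pct_underload": shares.get("underload", 0.0),
            "pct_optimal": shares.get("optimal", 0.0),
            "pct_overload": shares.get("overload", 0.0),
            "n_errors": df["error_type"].notna().sum(),
        })
    return pd.DataFrame(rows).set_index("session")
```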

“We can now pinpoint exactly when and why someone might experience mental overload or underload during a task,” mentioned Sonia Castelo, a VIDA Research Engineer and Ph.D. student who authored the HuBar paper. “Such detailed analysis has never been achievable across such a broad range of applications; it’s akin to having X-ray vision into a person’s mind and body during a task, providing data to customize AR assistance systems to cater to individual user needs.”
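As a rough illustration of the idea Castelo describes, the sketch below flags sustained stretches of overload or underload in a per-sample workload score. The score scale, thresholds, and minimum run length are assumptions for this example, not the study's actual classifier.

```python
# Flag sustained overload/underload episodes in a workload score series.
# Thresholds and minimum episode length are illustrative assumptions.
def flag_workload_episodes(workload_scores, low=0.25, high=0.75, min_len=5):
    """Return (start_idx, end_idx, label) for runs of at least `min_len`
    consecutive samples below `low` (underload) or above `high` (overload)."""
    episodes, start, label = [], None, None
    for i, w in enumerate(list(workload_scores) + [0.5]):  # sentinel closes any open run
        current = "underload" if w < low else "overload" if w > high else None
        if current != label:
            if label is not None and i - start >= min_len:
                episodes.append((start, i, label))
            start, label = i, current
    return episodes
```

For example, flag_workload_episodes([0.9]*10 + [0.5]*5 + [0.1]*8) returns [(0, 10, 'overload'), (15, 23, 'underload')]: an early overload stretch followed by a later underload stretch.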

As AR technologies, including headsets like Microsoft HoloLens, Meta Quest, and Apple Vision Pro, evolve, tools like HuBar will be essential for understanding how these innovations influence human performance and cognitive load.

“Future AR training systems could adapt in real-time based on a user’s mental state,” observed Joao Rulff, a Ph.D. student in VIDA involved with the project. “HuBar is helping us understand how such adaptations can work across various applications and intricate task structures.”
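One speculative way such an adaptation could be wired up is a loop that polls a workload estimate and adjusts how much guidance the AR system shows. The estimator, guidance levels, and thresholds below are placeholders for illustration, not an interface described by the researchers.

```python
# Speculative sketch of real-time guidance adaptation driven by an
# estimated workload score in [0, 1]. Both callables are placeholders.
import time

def adapt_guidance(estimate_workload, set_guidance, period_s=2.0):
    """Poll a workload estimate and pick a guidance level: underloaded
    trainees get leaner cues, overloaded ones get more structure."""
    while True:  # runs until the training session ends (interrupt to stop)
        w = estimate_workload()
        if w < 0.25:
            set_guidance("minimal_cues")            # underload: reduce hand-holding
        elif w > 0.75:
            set_guidance("step_by_step_prompts")    # overload: give more support
        else:
            set_guidance("standard_checklist")
        time.sleep(period_s)
```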

HuBar is part of the research Silva is conducting under the Defense Advanced Research Projects Agency (DARPA) Perceptually-enabled Task Guidance (PTG) program. Supported by a $5 million DARPA contract, the NYU team aims to develop AI technologies that help people perform complex tasks more flexibly and with fewer errors. The pilot data for this study came from NGC as part of the DARPA PTG initiative.