Enhancing Prosthetic Hand Precision Through Thought-Controlled Technology

Researchers at the German Primate Center – Leibniz Institute for Primate Research in Göttingen have developed a new training protocol for brain-computer interfaces in a study with rhesus monkeys. The protocol enables accurate control of prosthetic hands using brain signals alone. For the first time, the study showed that the neural signals encoding different hand postures, rather than the previously emphasized signals governing movement velocity, are essential for this control. The finding is key to improving the fine motor precision of neural hand prostheses, which could help paralyzed patients regain some or all of their mobility. The study was published in Neuron.

Carrying shopping bags, threading a needle — power and precision grips are fundamental to our daily activities. We often take our hands for granted until we can no longer use them due to conditions like paraplegia or diseases such as ALS that lead to progressive muscle paralysis.

To assist patients, researchers have dedicated decades to developing neuroprostheses. These artificial limbs can help individuals with disabilities regain mobility. Brain-computer interfaces facilitate the process by bridging damaged nerve connections, translating brain signals into movements that control the prosthetics. However, many hand prostheses have historically struggled with the fine motor skills needed for everyday use.

“The effectiveness of a prosthesis relies primarily on the neural data that the computer interface reads,” explains Andres Agudelo-Toro, a scientist in the Neurobiology Laboratory at the German Primate Center and the lead author of the study. “Previous research on arm and hand movements has emphasized the signals that dictate the velocity of a grasping action. Our goal was to determine if signals related to hand postures might be more effective in controlling neuroprostheses.”
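The paper's decoder details are not given here, but the posture-versus-velocity distinction can be illustrated with a toy sketch: if population firing rates predominantly encode joint posture, a simple linear readout recovers posture far better than velocity. Everything below (the synthetic data, the ridge regression, the `r2` helper) is a hypothetical illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: neural population activity that (by construction)
# encodes hand *posture* more strongly than movement *velocity*.
T, n_neurons, n_joints = 400, 60, 5
posture = np.cumsum(rng.normal(size=(T, n_joints)), axis=0)   # joint angles over time
velocity = np.diff(posture, axis=0, prepend=posture[:1])      # their rate of change

W = rng.normal(size=(n_joints, n_neurons))
rates = posture @ W + 0.1 * rng.normal(size=(T, n_neurons))   # posture-dominated firing

def ridge_fit(X, Y, lam=1e-2):
    """Closed-form ridge regression: decode kinematics Y from firing rates X."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

def r2(Y, Y_hat):
    """Coefficient of determination of the decoded signal."""
    ss_res = ((Y - Y_hat) ** 2).sum()
    ss_tot = ((Y - Y.mean(axis=0)) ** 2).sum()
    return 1.0 - ss_res / ss_tot

# Train on the first half of the session, evaluate on the second half.
tr, te = slice(0, T // 2), slice(T // 2, T)
B_pos = ridge_fit(rates[tr], posture[tr])
B_vel = ridge_fit(rates[tr], velocity[tr])

r2_pos = r2(posture[te], rates[te] @ B_pos)
r2_vel = r2(velocity[te], rates[te] @ B_vel)
print(f"posture decoding R^2:  {r2_pos:.3f}")
print(f"velocity decoding R^2: {r2_vel:.3f}")
```

On this synthetic data the posture readout scores near-perfect R² while the velocity readout does not, mirroring the intuition behind the study: a decoder works best when the decoded variable matches what the recorded neurons actually encode.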

The research involved rhesus monkeys (Macaca mulatta). Like humans, they possess sophisticated nervous and visual systems and advanced fine motor skills, making them ideal subjects for studying grasping movements.

To prepare for the main experiment, the scientists trained two rhesus monkeys to maneuver a virtual hand displayed on a screen. During this training phase, the monkeys moved their own hands while observing the corresponding movements of the virtual avatar on the screen. A data glove filled with magnetic sensors captured the monkeys’ hand movements throughout the task.

After the monkeys mastered the task, they progressed to controlling the virtual hand by “imagining” the grip. The researchers monitored the activity of groups of neurons in the brain’s cortical areas that directly oversee hand movements. Their focus was on signals representing various hand and finger positions, prompting them to adjust the brain-computer interface’s algorithm to better interpret this neural data for movement.

“By deviating from the traditional protocol, we modified the algorithm to account not only for the end goal of a movement but also for the path taken to reach it,” says Andres Agudelo-Toro. “This adjustment ultimately yielded the most accurate results.”
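One can sketch what fitting "the whole journey" might mean for a linear decoder: train it on every time point of each movement rather than only on the final posture, then compare how well each version tracks the hand mid-movement. This is a synthetic toy model under invented parameters, not the study's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy trials that move smoothly from a neutral posture toward a target grip.
n_trials, T, n_neurons, n_joints = 40, 50, 30, 3
t = np.linspace(0.0, 1.0, T)[:, None]              # normalized time within a trial
targets = rng.normal(size=(n_trials, n_joints))
trajs = t[None] * targets[:, None, :]              # (trials, T, joints)

W = rng.normal(size=(n_joints, n_neurons))
rates = trajs @ W + 0.1 * rng.normal(size=(n_trials, T, n_neurons))

def ridge(X, Y, lam=1e-2):
    """Closed-form ridge regression."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

# Decoder A: fit on every sample along every trajectory.
B_all = ridge(rates.reshape(-1, n_neurons), trajs.reshape(-1, n_joints))
# Decoder B: fit on the movement endpoints only.
B_end = ridge(rates[:, -1, :], trajs[:, -1, :])

# How well does each decoder track the hand halfway through the movement?
mid = T // 2
err_all = np.abs(rates[:, mid] @ B_all - trajs[:, mid]).mean()
err_end = np.abs(rates[:, mid] @ B_end - trajs[:, mid]).mean()
print(f"mid-trajectory error, trajectory-trained: {err_all:.3f}")
print(f"mid-trajectory error, endpoint-trained:   {err_end:.3f}")
```

In this sketch the trajectory-trained decoder sees far more data along the path and tracks intermediate postures more faithfully, which is the flavor of improvement the quote describes.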

The researchers then compared the actions of the virtual hand with the previously recorded movements of the real hand, confirming that both were executed with similar precision.
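A comparison between real and decoded ("virtual") movements can be quantified, for instance, with per-joint correlation and RMSE between the two trajectories. The metric choice and data below are illustrative assumptions, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(2)

# A recorded real trajectory and a decoded "virtual" version of it,
# simulated here as the real one plus decoding noise.
T, n_joints = 200, 5
real = np.cumsum(rng.normal(size=(T, n_joints)), axis=0)
virtual = real + 0.5 * rng.normal(size=(T, n_joints))

def trajectory_similarity(a, b):
    """Mean per-joint Pearson correlation and overall RMSE of two trajectories."""
    corrs = [np.corrcoef(a[:, j], b[:, j])[0, 1] for j in range(a.shape[1])]
    rmse = np.sqrt(((a - b) ** 2).mean())
    return float(np.mean(corrs)), float(rmse)

corr, rmse = trajectory_similarity(real, virtual)
print(f"mean correlation: {corr:.3f}, RMSE: {rmse:.3f}")
```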

“Our study illustrates that the signals governing hand posture are particularly vital for controlling a neuroprosthesis,” notes Hansjörg Scherberger, head of the Neurobiology Laboratory and senior author of the study. “These findings can now be utilized to enhance future brain-computer interfaces and improve the fine motor capabilities of neural prostheses.”

This research received support from the German Research Foundation (DFG, grants FOR-1847 and SFB-889), as well as the European Union Horizon 2020 project B-CRATOS (GA 965044).