Experts believe that the future of human evolution may involve merging technology with the human body. Using virtual reality, researchers have now tested whether humans can experience a sense of embodiment toward prosthetic “hands” that resemble a pair of tweezers. They found that participants felt an equal degree of embodiment for the tweezer-hands and were also faster and more accurate at motor tasks in virtual reality than when they were equipped with a virtual human hand. The findings were reported on June 6 in the journal iScience.
“For our biology to merge seamlessly with tools, we need to feel that the tools are part of our body,” said first author and cognitive neuroscientist Ottavia Maddaluno, who carried out the work at the Sapienza University of Rome and the Santa Lucia Foundation IRCCS with Viviana Betti. The results suggest that humans can perceive a grafted tool as a part of their own body.
Previous research has shown that tool use can change the human brain, as can the use of prosthetic limbs that resemble human body parts. However, it has remained unclear whether humans can feel embodiment toward bionic tools or prostheses that do not resemble human anatomy.
To explore this question, the researchers used virtual reality to carry out a series of experiments on healthy participants. In the virtual setting, participants were equipped with either a virtual human hand or a “bionic tool” resembling a large pair of tweezers attached to the wrist. To test their motor skills and dexterity, they were then asked to pop virtual bubbles of a specific color by pinching them, using either the tweezers or their index finger and thumb. The researchers found that participants were faster and more accurate at popping bubbles when they used the tweezer-hands for this simple task.
Next, the team used the “cross-modal congruency task” to compare participants’ unconscious sense of embodiment of the virtual hand and the bionic tool. In this experiment, the researchers applied small vibrations to the participants’ fingertips and asked them to identify which finger had been stimulated. At the same time, a light flickered in the virtual reality display, either on the same finger as the tactile stimulus or on a different one. By comparing participants’ accuracy and reaction times on trials with matching and mismatching stimuli, the researchers could gauge how much the visual stimulus interfered with the tactile judgment.
“This indicates the level of discrepancy in your brain between what you sense and what you see,” Maddaluno explains. “But this discrepancy could only occur if your brain recognizes what you see as a part of your own body; if I do not perceive the bionic tool that I am viewing through virtual reality as part of my own body, the visual input should not cause any interference.”
In both cases, participants were faster and more accurate at identifying which of their real fingers had been stimulated during trials with matching tactile and visual stimuli, indicating that they felt a sense of embodiment toward both the virtual human hand and the tweezer-hands.
However, the difference between matched and mismatched trials was even more pronounced when participants had tweezer-hands rather than virtual human hands, suggesting that the non-human-like prosthesis elicited an even stronger sense of embodiment. The researchers speculate that this may be because the tweezer-hands are less complex than human-like hands, making them easier for the brain to process and accept.
“In terms of the pinching task, the tweezers are functionally similar to a human hand, but simpler, and simple is also better computationally for the brain,” says Maddaluno.
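To make the congruency comparison concrete, the sketch below shows how a cross-modal congruency effect could in principle be computed from trial-level data: the average reaction time on matching (congruent) trials is subtracted from that on mismatching (incongruent) trials. This is only an illustrative toy example; the trial values, variable names, and analysis details are hypothetical and not taken from the study itself.

```python
# Illustrative sketch only: a toy calculation of a cross-modal congruency
# effect (CCE). All numbers and names here are hypothetical, not data
# from the study.

from statistics import mean

# Each trial records whether the flickering light appeared on the same
# (congruent) or a different (incongruent) finger as the vibration, plus
# the participant's reaction time in milliseconds and response accuracy.
trials = [
    {"congruent": True,  "rt_ms": 540, "correct": True},
    {"congruent": True,  "rt_ms": 565, "correct": True},
    {"congruent": False, "rt_ms": 610, "correct": True},
    {"congruent": False, "rt_ms": 655, "correct": False},
]

def congruency_effect(trials):
    """Return mean RT on incongruent minus congruent correct trials.

    A larger positive difference means the visual stimulus interfered
    more with the tactile judgment, which is taken as a marker of
    embodiment of the viewed virtual limb or tool.
    """
    congruent_rts = [t["rt_ms"] for t in trials if t["correct"] and t["congruent"]]
    incongruent_rts = [t["rt_ms"] for t in trials if t["correct"] and not t["congruent"]]
    return mean(incongruent_rts) - mean(congruent_rts)

print(f"Cross-modal congruency effect: {congruency_effect(trials):.1f} ms")
```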
They also suggest that this could be related to the “uncanny valley” hypothesis: the virtual human hands may have been similar to, yet noticeably different from, a perfect representation of the participants’ own hands.
In addition to the tweezer-hands, the researchers tested a wrench-shaped bionic tool and a virtual human hand holding a pair of tweezers. They found evidence of embodiment in all cases, but participants felt a stronger sense of embodiment and performed more skillfully when the tweezers were attached directly to their virtual wrists rather than held in a virtual hand.
Participants also felt a stronger sense of embodiment for the bionic tools when they were able to explore the virtual reality environment before taking the cross-modal congruency test. “In the cross-modal congruency task, participants had to remain still, while in the motor task, they actively interacted with the virtual environment, which can create a sense of agency,” Maddaluno explains.
The researchers believe that the study could have implications for robotics and prosthetic limb design. “The next step is to determine if these bionic tools can be integrated into patients who have lost limbs,” Maddaluno states. “We also want to explore the neurological changes that this type of bionic tool can cause in the brains of both healthy individuals and amputees.”