AI-Powered Robotics: Training Dogs to Respond to Their Masters

An international team is working to transform the way a robot dog interacts with its owner by combining AI with edge computing, an approach known as edge intelligence.

The project is driven by a one-year seed grant from the Institute for Future Technologies (IFT), a collaboration between New Jersey Institute of Technology (NJIT) and Ben-Gurion University of the Negev (BGU).

Assistant Professor Kasthuri Jayarajah from NJIT’s Ying Wu College of Computing is leading the research to create a socially interactive model of the Unitree Go2 robotic dog. This model will adjust its behavior and communication style based on the unique characteristics of the individuals it interacts with.

The ultimate goal of the project is to make the robot dog more responsive to each person it meets by using wearable sensors that pick up physiological and emotional cues tied to individual attributes such as introversion, pain levels, and comfort preferences.
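To picture how such personalization might work in practice, here is a minimal sketch, not the project's actual code, in which a hypothetical user profile inferred from wearable readings adjusts the robot's approach speed, standoff distance, and how often it performs attention-seeking gestures. All names and thresholds below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    # Hypothetical attributes inferred from wearable sensors.
    introversion: float      # 0.0 (outgoing) .. 1.0 (reserved)
    pain_level: float        # 0.0 (none) .. 1.0 (severe)
    comfort_radius_m: float  # preferred personal-space distance

def plan_interaction(profile: UserProfile) -> dict:
    """Map inferred user attributes to robot behavior parameters.

    A toy rule-based mapping; the project would presumably learn such
    mappings from data rather than hand-code them.
    """
    # Reserved users get a slower approach and fewer attention-seeking actions.
    approach_speed = 0.5 * (1.0 - profile.introversion)           # m/s, kept low
    # Users in pain get a wider berth.
    standoff_distance = profile.comfort_radius_m + 0.5 * profile.pain_level
    gesture_frequency = max(0.1, 1.0 - profile.introversion)      # relative rate

    return {
        "approach_speed_mps": round(approach_speed, 2),
        "standoff_distance_m": round(standoff_distance, 2),
        "gesture_frequency": round(gesture_frequency, 2),
    }

if __name__ == "__main__":
    quiet_user = UserProfile(introversion=0.8, pain_level=0.3, comfort_radius_m=1.2)
    print(plan_interaction(quiet_user))
    # {'approach_speed_mps': 0.1, 'standoff_distance_m': 1.35, 'gesture_frequency': 0.2}
```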

This innovation is expected to have a significant impact on addressing loneliness among the elderly in home and healthcare environments and on supporting therapy and rehabilitation efforts. Jayarajah's initial work, which focuses on robotic dogs that interpret and respond to gestures from their human partners, will be showcased at the upcoming International Conference on Intelligent Robots and Systems (IROS).

Co-principal investigator Shelly Levy-Tzedek, an associate professor in the Department of Physical Therapy at BGU, brings expertise in rehabilitation robotics and specializes in studying how age and health conditions affect body control.

The researchers emphasize the growing accessibility of wearable devices, noting that common items like earphones can be repurposed to monitor wearers’ brain activity and subtle facial expressions for a deeper understanding of individual states. The project aims to combine these wearable sensors with traditional robot sensors, such as vision and audio capabilities, to effectively and unobtrusively track user attributes.
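A rough way to picture that fusion step: each sensing modality (earphone-derived physiological signals, the robot's camera, its microphones) yields a feature vector, and a late-fusion stage weights the streams by how reliable each one currently is. The sketch below is a simplified, hypothetical illustration of that idea, not the team's implementation; the modality names and reliability scores are made up for the example.

```python
import numpy as np

def late_fusion(features: dict, reliabilities: dict) -> np.ndarray:
    """Combine per-modality feature vectors into one user-state embedding.

    features:      modality name -> feature vector (already projected to a
                   common dimensionality, e.g. by a small per-modality encoder)
    reliabilities: modality name -> scalar in [0, 1] for how much the stream
                   can currently be trusted (e.g. low for an occluded camera)
    """
    names = list(features)
    weights = np.array([reliabilities[n] for n in names], dtype=float)
    weights = weights / weights.sum()            # normalize to a convex combination
    stacked = np.stack([features[n] for n in names])
    return (weights[:, None] * stacked).sum(axis=0)

# Toy example: 8-dimensional embeddings from three hypothetical streams.
rng = np.random.default_rng(0)
fused = late_fusion(
    features={
        "earbud_physio": rng.normal(size=8),   # e.g. heart-rate / EEG-derived features
        "robot_vision": rng.normal(size=8),    # e.g. facial-expression features
        "robot_audio": rng.normal(size=8),     # e.g. voice-tone features
    },
    reliabilities={"earbud_physio": 0.9, "robot_vision": 0.4, "robot_audio": 0.7},
)
print(fused.shape)  # (8,)
```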

While socially assistive robots hold promise, Jayarajah acknowledges that cost and scalability limits make long-term use a challenge. Robots like the Unitree Go2 lack the processing power, memory, and battery life of large GPU clusters, which constrains the complex AI workloads they can run on board.
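One common way to work within such constraints, and not necessarily the approach this project will take, is to split work between the robot and a nearby edge server: run a small model on board and offload heavier inference only when the battery and the network link allow it. The thresholds in this sketch are purely illustrative assumptions.

```python
def choose_execution_site(model_size_mb: float,
                          battery_pct: float,
                          link_latency_ms: float,
                          onboard_limit_mb: float = 200.0) -> str:
    """Decide where to run an inference request for an edge-constrained robot.

    Illustrative fixed cut-offs; a real system would profile energy and
    latency per model instead.
    """
    if model_size_mb <= onboard_limit_mb and battery_pct > 20.0:
        return "onboard"          # small model, enough battery: keep it local
    if link_latency_ms < 50.0:
        return "edge_server"      # heavy model but a fast local link: offload
    return "degraded_onboard"     # fall back to a distilled/quantized local model

print(choose_execution_site(model_size_mb=850, battery_pct=65, link_latency_ms=18))
# -> "edge_server"
```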

In its initial stages, the project will build on traditional sensor fusion techniques and explore deep learning models that can capture user attributes from affordable wearable sensors and refine the robot's motion controls.
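To make that last point concrete, one plausible shape for such a model, again a hypothetical sketch rather than the project's architecture, is a small network that takes a fused user-state embedding and outputs bounded motion-control adjustments that a controller on the robot could consume.

```python
import numpy as np

class TinyFusionHead:
    """A deliberately small two-layer network: fused user-state features in,
    motion-control adjustments out. Sized to run comfortably on an embedded CPU."""

    def __init__(self, in_dim=8, hidden=16, out_dim=3, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(scale=0.1, size=(in_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(scale=0.1, size=(hidden, out_dim))
        self.b2 = np.zeros(out_dim)

    def forward(self, x: np.ndarray) -> np.ndarray:
        h = np.maximum(0.0, x @ self.w1 + self.b1)    # ReLU hidden layer
        return np.tanh(h @ self.w2 + self.b2)         # adjustments bounded to [-1, 1]
        # e.g. [speed_adjust, distance_adjust, gesture_rate_adjust]

head = TinyFusionHead()
print(head.forward(np.zeros(8)))   # untrained weights, zero input -> all zeros
```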