Can we interpret emotions from movement? And how can emotions be analyzed externally, using empirical methods? To address these questions, an international research team led by the Max Planck Institute for Empirical Aesthetics (MPIEA) in Germany has developed an integrated scientific approach. Using motion-capture technology, the researchers created the EMOKINE software to quantify the objective kinematic features of emotional body movements. The findings were recently published in the journal Behavior Research Methods.
A professional dancer was asked to convey various emotions – anger, contentment, fear, happiness, neutrality, and sadness – in short dance choreographies in front of a green screen. To turn these movements into data, the dancer wore a motion-capture suit embedded with 17 sensors while a camera recorded her dynamic body movements. The researchers then extracted the movement parameters and programmed the EMOKINE software to compute and output these parameters automatically.
Computerized Tracking of Whole-Body Movements
A set of 32 statistics was extracted from 12 movement parameters in a pilot dance dataset. The parameters included speed, acceleration, and limb contraction, among others.
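To make the idea concrete, here is a minimal sketch of how parameters of this kind might be computed from motion-capture position data and summarized into statistics. It is not EMOKINE's actual API; the function name, the sampling rate, and the simple contraction proxy are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration (not EMOKINE's actual API): derive basic
# kinematic parameters from motion-capture data and summarize them.
# Assumed input: positions with shape (frames, sensors, 3), in meters,
# sampled at a fixed rate, e.g. 60 Hz.

def kinematic_features(positions: np.ndarray, fps: float = 60.0) -> dict:
    dt = 1.0 / fps

    # Speed: magnitude of the frame-to-frame velocity of each sensor.
    velocity = np.diff(positions, axis=0) / dt          # (frames-1, sensors, 3)
    speed = np.linalg.norm(velocity, axis=-1)

    # Acceleration: rate of change of the velocity vector.
    accel = np.diff(velocity, axis=0) / dt              # (frames-2, sensors, 3)
    accel_mag = np.linalg.norm(accel, axis=-1)

    # Limb contraction (simple proxy): mean distance of all sensors from
    # the body centroid in each frame; smaller values = more contracted.
    centroid = positions.mean(axis=1, keepdims=True)    # (frames, 1, 3)
    contraction = np.linalg.norm(positions - centroid, axis=-1).mean(axis=1)

    # Summarize each parameter with a few statistics, in the spirit of
    # the "32 statistics from 12 parameters" feature set described above.
    stats = {}
    for name, series in [("speed", speed.ravel()),
                         ("acceleration", accel_mag.ravel()),
                         ("contraction", contraction)]:
        stats[f"{name}_mean"] = float(series.mean())
        stats[f"{name}_max"] = float(series.max())
    return stats

# Example with random data standing in for a real recording:
rng = np.random.default_rng(0)
demo = rng.normal(size=(300, 17, 3))  # 5 s at 60 Hz, 17 sensors
print(kinematic_features(demo))
```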
Lead author Julia F. Christensen of the MPIEA explained: “We pinpointed 12 kinematic features of emotional whole-body movements previously studied separately and incorporated them into the EMOKINE software.”
Objective movement tracking has gained popularity in many fields because it offers insight into people’s intentions, emotions, and states of mind. However, a theory-based methodology is essential for drawing meaningful conclusions from the recorded data.
Co-first author Andrés Fernández of the Max Planck Institute for Intelligent Systems added: “This research showcases how art, psychology, and computer science can collaborate effectively to develop methods for studying human cognition.”
The software package’s methodology, which employs dance movements to explore emotions, stands out from prior approaches that often used videos of emotional actions like hand gestures or walking.
Senior author Gemma Roig, a Professor at Goethe University, expressed excitement about the project, stating, “This work, involving experts from various disciplines like psychology, neuroscience, computer science, empirical aesthetics, dance, and film, marks a significant milestone.”
Open-Source Software Package Freely Available
EMOKINE is freely available on ZENODO and GitHub. Because it can be adapted to other motion-capture systems with minor adjustments, it offers a valuable tool for analyzing emotional expression in dancers and other artists, as well as in everyday movement.
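As an illustration of what such a minor adjustment might look like in practice, the sketch below maps another system’s CSV export onto a common array layout before feature extraction. The sensor labels and file schema are hypothetical, not taken from EMOKINE or any real motion-capture vendor.

```python
import numpy as np
import pandas as pd

# Hypothetical adaptation of a recording from a different motion-capture
# system. We assume a CSV export with one row per frame and columns such
# as "Hips_x", "Hips_y", "Hips_z", ... (illustrative names only).

SENSORS = ["Hips", "Spine", "Head", "LeftArm", "RightArm",
           "LeftForeArm", "RightForeArm", "LeftHand", "RightHand",
           "LeftUpLeg", "RightUpLeg", "LeftLeg", "RightLeg",
           "LeftFoot", "RightFoot", "LeftShoulder", "RightShoulder"]  # 17 sensors

def load_mocap_csv(path: str) -> np.ndarray:
    """Return positions with shape (frames, sensors, 3) from a CSV export."""
    df = pd.read_csv(path)
    cols = [f"{s}_{axis}" for s in SENSORS for axis in ("x", "y", "z")]
    return df[cols].to_numpy().reshape(len(df), len(SENSORS), 3)
```

Once the data are in this common layout, the same downstream feature extraction can run unchanged regardless of which capture system produced the recording.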
The researchers anticipate that EMOKINE will be applied in fields such as experimental psychology, affective neuroscience, and computer vision, particularly in AI-assisted analysis of visual media. The software could help to clarify how the kinematic parameters of body movements convey intentions and emotions to observers.