Computer Laboratory

Body movement analysis

Daniel Bernhardt & Peter Robinson

Motion primitives for knocking action

Full-body human-computer interaction is attracting growing interest from industry and academia alike. It is widely regarded as more engaging and natural than traditional input methods, and affordable sensing hardware is rapidly becoming available.

This project goes beyond commonly implemented gesture-based interfaces by examining the fuzzier concept of emotional content in human body motion. Our goal was to create a system that affords an emotionally engaging experience: it lets users control a mix of emotional music pieces through unconstrained, natural body motion.

It was important to us not to prescribe particular body motions (gestures) to represent particular emotions: as our experiments show, people express emotions in very diverse ways. To support fluid and natural interaction, we use machine learning to capture how different people naturally express different emotions. The individually trained models then drive the music control system.

We asked six untrained subjects to express a set of five emotions using only their bodies. The expressions were elicited using emotional music and emotion labels. All performances were recorded with a motion capture system. We then used the captured data to train Support Vector Machines and found that:

  • Emotions expressed through unconstrained body motions can be distinguished significantly above chance level.
  • To achieve classification accurate enough for practical applications, classifiers need to be trained for each user individually.
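The per-user training described above can be sketched in code. The snippet below is a minimal, dependency-free illustration, not the project's actual pipeline: the feature names (mean joint speed, movement extent) and the synthetic training data are assumptions for illustration, and the SVM is trained with a simple Pegasos-style sub-gradient method rather than whatever solver the authors used.

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Train a binary linear SVM via Pegasos-style sub-gradient descent.

    X: list of feature vectors (last component is a constant 1.0 bias term).
    y: labels in {-1, +1}. Returns the learned weight vector w.
    """
    dim = len(X[0])
    w = [0.0] * dim
    t = 0
    rng = random.Random(0)
    idx = list(range(len(X)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            t += 1
            eta = 1.0 / (lam * t)          # decaying learning rate
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            scale = 1.0 - eta * lam        # shrink from the L2 regularizer
            if margin < 1.0:               # sample violates the margin
                w = [scale * wj + eta * y[i] * xj
                     for wj, xj in zip(w, X[i])]
            else:
                w = [scale * wj for wj in w]
    return w

def predict(w, x):
    """Classify a feature vector by the sign of the decision function."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Hypothetical per-user data: (mean joint speed, movement extent, bias).
# +1 = energetic expression (e.g. anger), -1 = subdued (e.g. sadness).
X_train = [
    (0.90, 0.80, 1.0), (0.85, 0.90, 1.0), (0.95, 0.70, 1.0), (0.80, 0.85, 1.0),
    (0.10, 0.20, 1.0), (0.15, 0.10, 1.0), (0.20, 0.25, 1.0), (0.05, 0.15, 1.0),
]
y_train = [1, 1, 1, 1, -1, -1, -1, -1]

w = train_linear_svm(X_train, y_train)
```

Training one such model per user, rather than one shared model, reflects the finding above that expression styles vary too much between people for a single classifier to be accurate enough in practice.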

The system is designed to target both the cognitive and physical aspects of a user's emotional experience. By mirroring the emotional body expressions through music, it creates a positive feedback loop that deepens the user's emotional immersion.

Selected publications