Department of Computer Science and Technology

Technical reports

Emotion inference from human body motion

Daniel Bernhardt

October 2010, 227 pages

This technical report is based on a dissertation submitted in January 2010 by the author for the degree of Doctor of Philosophy to the University of Cambridge, Selwyn College.

DOI: 10.48456/tr-787


The human body has evolved to perform sophisticated tasks, from locomotion to the use of tools. At the same time, our body movements can carry information indicative of our intentions, interpersonal attitudes and emotional states. Because our body is specialised to perform a variety of everyday tasks, in most situations emotional effects are visible only through subtle changes in the qualities of movements and actions. This dissertation focuses on the automatic analysis of emotional effects in everyday actions.

In the past, most efforts to recognise emotions from the human body have focused on expressive gestures: archetypal, exaggerated expressions of emotion. While these are easier for both humans and computational pattern recognisers to identify, they rarely occur in natural scenarios. The principal contribution of this dissertation is therefore the inference of emotional states from everyday actions such as walking, knocking and throwing. The implementation of the system draws inspiration from a variety of disciplines, including psychology, character animation and speech recognition. Complex actions are modelled using Hidden Markov Models and motion primitives. The manifestation of emotions in everyday actions is very subtle; even humans are far from perfect at picking up and interpreting the relevant cues, because emotional influences are usually minor compared to constraints arising from the action context or differences between individuals.
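The idea of modelling an action as a Hidden Markov Model over motion primitives can be illustrated with Viterbi decoding: given per-frame likelihoods of each primitive, the decoder recovers the most likely primitive sequence. The sketch below is a minimal, self-contained illustration of that general technique; the primitive labels, transition probabilities and toy numbers are my own assumptions, not values from the dissertation.

```python
import numpy as np

def viterbi(log_start, log_trans, log_emit):
    """Most likely hidden-state (motion-primitive) sequence.

    log_start: (N,) log initial-state probabilities
    log_trans: (N, N) log transition probabilities
    log_emit:  (T, N) log-likelihood of each observed frame under each state
    """
    T, N = log_emit.shape
    delta = log_start + log_emit[0]          # best log-score ending in each state
    back = np.zeros((T, N), dtype=int)       # best predecessor for each state/time
    for t in range(1, T):
        scores = delta[:, None] + log_trans  # scores[i, j]: come from i, land in j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emit[t]
    # Backtrack from the best final state.
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy example with two hypothetical primitives (0 = "reach", 1 = "retract"):
# the first two frames favour primitive 0, the last favours primitive 1.
path = viterbi(np.log([0.9, 0.1]),
               np.log([[0.8, 0.2], [0.2, 0.8]]),
               np.log([[0.9, 0.1], [0.9, 0.1], [0.1, 0.9]]))
print(path)  # [0, 0, 1]
```

In a real system the emission log-likelihoods would come from Gaussian models over motion-capture features rather than hand-set numbers.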

This dissertation describes a holistic approach which models emotional, action and personal influences in order to maximise the discriminability of different emotion classes. A pipeline is developed which incrementally removes the biases introduced by different action contexts and individual differences. The resulting signal is described in terms of posture and dynamic features and classified into one of several emotion classes using statistically trained Support Vector Machines. The system also goes beyond isolated expressions and is able to classify natural action sequences: I use Level Building to segment action sequences and combine component classifications using an incremental voting scheme suitable for online applications. The system is comprehensively evaluated along a number of dimensions using a corpus of motion-captured actions. For isolated actions I evaluate the generalisation performance to new subjects; for action sequences I study the effects of reusing models trained on the isolated cases versus adapting models to connected samples. The dissertation also evaluates the role of modelling the influence of individual user differences, for which I develop and evaluate a regression-based adaptation scheme. The results bring us an important step closer to recognising emotions from body movements, embracing the complexity of body movements in natural scenarios.
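An incremental voting scheme of the kind described above can be sketched very simply: as each segmented action is classified, its vote is added to a running tally, and the current majority emotion is available at any point in the sequence. This is a minimal illustration of the general idea; the class names and the confidence-weighted votes are my own assumptions, not the dissertation's exact rule.

```python
from collections import Counter

class IncrementalVoter:
    """Running, online combination of per-action emotion classifications."""

    def __init__(self):
        self.scores = Counter()  # accumulated vote weight per emotion class

    def add(self, label, weight=1.0):
        """Record one component classification and return the current winner."""
        self.scores[label] += weight
        return self.current()

    def current(self):
        """Current majority emotion, or None before any votes arrive."""
        if not self.scores:
            return None
        return self.scores.most_common(1)[0][0]

# Hypothetical sequence: each segmented action contributes one weighted vote,
# and the estimate can be read off after every action (suitable for online use).
voter = IncrementalVoter()
voter.add("neutral")
voter.add("angry", 2.0)
print(voter.current())  # angry
```

Because the tally updates one segment at a time, the scheme never needs to see the whole sequence before producing an estimate, which is what makes it suitable for online applications.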

Full text

PDF (8.7 MB)

BibTeX record

@TechReport{UCAM-CL-TR-787,
  author =       {Bernhardt, Daniel},
  title =        {{Emotion inference from human body motion}},
  year =         2010,
  month =        oct,
  url =          {},
  institution =  {University of Cambridge, Computer Laboratory},
  doi =          {10.48456/tr-787},
  number =       {UCAM-CL-TR-787}
}