Computer Laboratory

Rhythmic gestures corpus

Abstract

Prolonged durations of rhythmic body gestures have been shown to correlate with different types of psychological disorders. To date, there is no automatic descriptor that can robustly detect such behaviours. In this paper, we propose a cyclic gestures descriptor that detects and localises rhythmic body movements by taking advantage of both colour and depth modalities. We show experimentally that our rhythmic descriptor can successfully localise rhythmic gestures such as hands fidgeting, legs fidgeting and rocking, performing significantly better than the majority-vote classification baseline. Our experiments also demonstrate the importance of fusing both modalities, with a significant increase in performance compared to either modality alone.
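
No code accompanies this page, so the Python sketch below is an illustration only: it flags rhythmic (cyclic) motion by autocorrelating a simple per-frame motion-energy signal and fuses colour and depth by averaging their two scores. The function names, the motion-energy measure, the autocorrelation test and the averaging fusion are our own assumptions, not the descriptor proposed in the paper.

# Illustrative sketch only: NOT the descriptor from the paper. It checks for
# repetitive motion via the autocorrelation of a frame-difference signal,
# assuming colour and depth streams are given as NumPy arrays of frames.
import numpy as np

def motion_energy(frames):
    """Mean absolute frame-to-frame difference, one value per frame transition."""
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
    return diffs.reshape(diffs.shape[0], -1).mean(axis=1)

def periodicity_score(signal):
    """Strength of the strongest non-zero-lag autocorrelation peak (0..1)."""
    s = signal - signal.mean()
    if not s.any():                       # constant signal: no motion, no rhythm
        return 0.0
    ac = np.correlate(s, s, mode="full")[len(s) - 1:]
    ac /= ac[0]                           # normalise so the lag-0 value is 1
    return float(ac[1:].max())            # high value -> repetitive motion

def is_rhythmic(colour_frames, depth_frames, threshold=0.5):
    """Late fusion (assumed here): average the scores of both modalities."""
    score = 0.5 * (periodicity_score(motion_energy(colour_frames)) +
                   periodicity_score(motion_energy(depth_frames)))
    return score > threshold, score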

Corpus

We evaluate our approach on a dataset of acted gestures containing different rhythmic gestures (hands fidgeting, legs fidgeting and rocking). The labelled corpus is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License. Please send an e-mail to Peter Robinson confirming your agreement to these terms and we will send you a password to download the corpus.

Citing our work

If you use any of the resources provided on this page in any of your publications, please cite the following work.

Automatic Multimodal Descriptors of Rhythmic Body Movement
Marwa Mahmoud, Louis-Philippe Morency, and Peter Robinson
in International Conference on Multimodal Interaction, Sydney, Australia, December 2013

Bibtex

@inproceedings{Mahmoud2013,
  author    = {Marwa Mahmoud and Louis-Philippe Morency and Peter Robinson},
  title     = {Automatic Multimodal Descriptors of Rhythmic Body Movement},
  booktitle = {International Conference on Multimodal Interaction},
  address   = {Sydney, Australia},
  month     = {December},
  year      = {2013}
}