Cam3D corpus
3D corpus of spontaneous complex mental states
Abstract
Hand-over-face gestures, a subset of emotional body language, are overlooked by automatic affect inference systems. We propose the use of hand-over-face gestures as a novel affect cue for automatic inference of cognitive mental states. Moreover, affect recognition systems rely on the existence of publicly available datasets: an approach is often only as good as the data it is trained on. We present the collection and annotation methodology of a 3D multimodal corpus of 108 audio/video segments of natural complex mental states. The corpus includes spontaneous facial expressions and hand gestures, labelled using crowd-sourcing, and is publicly available.
Corpus
The labelled corpus is available on the University's open access repository, licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License.
Citing our work
If you use any of the resources provided on this page in any of your publications, please cite the following work.
3D corpus of spontaneous complex mental states
Marwa Mahmoud, Tadas Baltrušaitis, Peter Robinson, and Laurel Riek
in Conference on Affective Computing and Intelligent Interaction, Memphis, TN, October 2011
Bibtex
@inproceedings{Mahmoud2011,
  author    = {Marwa Mahmoud and Tadas Baltru\v{s}aitis and Peter Robinson and Laurel Riek},
  title     = {3D corpus of spontaneous complex mental states},
  booktitle = {Conference on Affective Computing and Intelligent Interaction},
  year      = {2011},
}