Facial expressions, head gestures and gaze play a central role in everyday human-human communication. People are remarkably good at recognising faces and at reading and interpreting the non-verbal cues of others. We signal our inner mental states through these non-verbal channels; they also help manage turn taking in conversation, convey intent, and signal affection.
My current research interests are threefold:
- First, I am interested in automatic ways of tracking facial expressions and head pose using computer vision. I am currently exploring deformable models (Active Appearance Models, Constrained Local Models) for tracking facial expressions in naturalistic environments. I am also very interested in using depth data alongside colour to improve expression tracking.
- Secondly, I am interested in the automatic interpretation of such tracked features from an emotional perspective. How do we train machines to recognise facial expressions using both static and dynamic signals?
- Finally, I am exploring ways of synthesising such tracked expressions and emotions on virtual characters.
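At the heart of both Active Appearance Models and Constrained Local Models is a linear point distribution model: landmark configurations are expressed as a mean shape plus a few PCA modes, and fitting constrains candidate landmarks to stay inside that learned subspace. The sketch below illustrates only this shared shape-model step, on synthetic data; all shapes and dimensions here are invented for illustration, not taken from any particular tracker.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: 50 "faces", each with 5 landmarks (x, y)
# flattened to a 10-D vector, jittered around a hand-made mean shape.
mean_shape = np.array([0, 0, 1, 0, 0.5, 1, 0.2, 2, 0.8, 2], dtype=float)
shapes = mean_shape + 0.1 * rng.standard_normal((50, 10))

# Build the PCA shape model: mean plus a few orthonormal modes.
mu = shapes.mean(axis=0)
U, s, Vt = np.linalg.svd(shapes - mu, full_matrices=False)
basis = Vt[:3]  # keep the 3 largest modes of shape variation

def constrain(candidate):
    """Project noisy landmark positions onto the learned shape subspace,
    yielding the nearest plausible face shape."""
    b = basis @ (candidate - mu)   # shape parameters
    return mu + basis.T @ b        # reconstructed, model-consistent shape

# A noisy landmark estimate (e.g. from a local detector) gets pulled
# back towards a plausible face configuration.
noisy = mu + 0.5 * rng.standard_normal(10)
plausible = constrain(noisy)
```

In a full CLM, this projection alternates with local appearance-based landmark searches; the subspace constraint is what keeps per-landmark detections from drifting into implausible configurations.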
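One common route for the synthesis side is blendshape animation: a virtual character's expression is a weighted sum of displacement deltas added to a neutral mesh, with the weights driven by tracked expression intensities. A minimal sketch, assuming invented blendshape names and toy geometry:

```python
import numpy as np

# Neutral mesh: 4 vertices with xyz coordinates (toy example).
neutral = np.zeros((4, 3))

# Hypothetical blendshape deltas: per-vertex displacements for each
# named expression component. Names and values are illustrative only.
blendshapes = {
    "smile":      np.array([[0, 0,   0], [0.1, 0.2, 0], [-0.1, 0.2, 0], [0, 0,   0]]),
    "brow_raise": np.array([[0, 0.3, 0], [0,   0,   0], [0,    0,   0], [0, 0.3, 0]]),
}

def synthesise(weights):
    """Blend tracked expression intensities onto the character mesh."""
    mesh = neutral.copy()
    for name, w in weights.items():
        mesh += w * blendshapes[name]
    return mesh

# Intensities from a tracker (e.g. 80% smile, 50% brow raise) map
# directly onto the character's vertices.
expr = synthesise({"smile": 0.8, "brow_raise": 0.5})
```

The appeal of this representation is that the tracking and synthesis stages share a common parameterisation: expression intensities estimated from video can drive any character rigged with matching blendshapes.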