Facial affect inference
Analysing a discouraging expression
Can you read minds? Most likely, yes. You may not think of it as mind reading, but our ability to understand what people are thinking and feeling from their facial expressions and gestures is exactly that. People express their mental states all the time through facial expressions, vocal nuances and gestures. We have built this ability into computers to make them emotionally aware.
The ability to attribute mental states to others from their behaviour, and then to use that information to guide our own actions or to predict those of others, is known as the ‘theory of mind’. Although research on this theory has been conducted since the 1970s, it has recently gained attention because of the growing number of people diagnosed with autism spectrum conditions, who are thought to be ‘mind-blind’; that is, they have difficulty interpreting others’ emotions and feelings from facial expressions and other non-verbal cues.
Our computer system is based on the latest research in the theory of mind by Professor Simon Baron-Cohen, Director of the Autism Research Centre at Cambridge. His research provides a taxonomy of facial expressions and the emotions they represent. In 2004, his group published the Mind Reading DVD, an interactive computer-based guide to reading emotions from the face and voice. The DVD contains videos of people showing 412 different mental states. We have developed computer programs that can read facial expressions using machine vision, and then infer emotions using probabilistic machine learning trained by examples from the DVD.
Machine vision means getting machines to ‘see’: giving them the ability to extract, analyse and make sense of information from images or video, in this case footage of facial expressions. Probabilistic machine learning means giving a machine the ability to learn, from training examples, an association between features of an image (such as a facial expression) and other classes of information (in this case, emotions). The most likely interpretation of the observed facial expressions is then computed using probability theory.
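The probabilistic step can be illustrated with a minimal sketch. The feature names, mental states and probability values below are illustrative assumptions, not values taken from the Mind Reading DVD or the actual system (which uses richer models); the sketch only shows how feature evidence and prior probabilities combine, here via a simple naive-Bayes rule, to yield the most likely mental state.

```python
# Illustrative sketch: inferring a mental state from observed facial
# features using probability theory. All names and numbers are assumed
# for the example, not drawn from the actual mind-reading system.

FEATURES = ["brow_lower", "lip_corner_pull", "head_nod"]

# P(state): prior probability of each mental state (assumed uniform)
PRIORS = {"agreeing": 0.5, "disagreeing": 0.5}

# P(feature present | state): in practice these would be learned from
# labelled training examples; here they are hand-picked for illustration
LIKELIHOODS = {
    "agreeing":    {"brow_lower": 0.1, "lip_corner_pull": 0.7, "head_nod": 0.8},
    "disagreeing": {"brow_lower": 0.8, "lip_corner_pull": 0.2, "head_nod": 0.1},
}

def most_likely_state(observed):
    """Combine the evidence for each feature (present or absent) with the
    prior, then normalise to get a posterior over mental states."""
    scores = {}
    for state, prior in PRIORS.items():
        p = prior
        for feature in FEATURES:
            p_present = LIKELIHOODS[state][feature]
            p *= p_present if feature in observed else (1.0 - p_present)
        scores[state] = p
    total = sum(scores.values())
    posterior = {s: v / total for s, v in scores.items()}
    return max(posterior, key=posterior.get), posterior

# A lowered brow with no smile or nod points strongly to 'disagreeing'
state, posterior = most_likely_state({"brow_lower"})
```

In the example call, the observed lowered brow makes ‘disagreeing’ the most probable interpretation under these assumed probabilities.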
The project was featured at the Royal Society’s Summer Science Exhibitions in July and September 2006.
- Real-time inference of complex mental states from facial expressions and head gestures: a book chapter giving technical details of the mind-reading system
- Generalization of a vision-based computational model of mind-reading: a conference paper describing tests with members of the public
- Mind-reading machines: Rana el Kaliouby’s PhD dissertation
- Affective video data collection using an automobile simulator: a conference paper discussing an application to monitoring car drivers
- The emotional hearing aid – an assistive tool for autism: a conference paper describing another possible application