Computer Laboratory

Affective robotics

Laurel Riek, Andra Adams & Peter Robinson

One of our robots, Charles, making various facial expressions

As robots start to leave factories and begin to enter our schools, workplaces, and homes, it is important that people are able to interact with them in a way that feels comfortable and natural. Eventually this might happen through natural language dialogue, but given the complexities of language that may not be available for a while. In the meantime, another approach is to let people communicate with robots through non-verbal channels such as gestures and facial expressions. This requires robots both to accurately sense what humans are expressing (recognition) and to generate such expressions themselves (synthesis).
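To make the recognition/synthesis distinction concrete, here is a minimal illustrative sketch rather than a description of our actual system: it assumes a hypothetical detector reporting facial action unit intensities, classifies a basic expression with a simple rule, and maps that label onto hypothetical servo targets for a robot face. All names, thresholds, and servo values are assumptions made for illustration.

```python
# Illustrative sketch only: a toy recognition/synthesis loop.
# The action-unit inputs, thresholds, and servo names are hypothetical.

from typing import Dict

def recognise_expression(aus: Dict[str, float]) -> str:
    """Classify a basic expression from facial action unit intensities (0..1)."""
    if aus.get("AU12_lip_corner_puller", 0.0) > 0.5:        # smiling
        return "happiness"
    if aus.get("AU4_brow_lowerer", 0.0) > 0.5:              # frowning
        return "anger"
    if (aus.get("AU1_inner_brow_raiser", 0.0) > 0.5
            and aus.get("AU15_lip_corner_depressor", 0.0) > 0.3):
        return "sadness"                                     # raised brows + downturned mouth
    return "neutral"

# Hypothetical servo targets (degrees) for rendering each expression on a robot face.
EXPRESSION_TO_SERVOS = {
    "happiness": {"mouth_corners": 30, "brows": 0},
    "anger":     {"mouth_corners": -10, "brows": -25},
    "sadness":   {"mouth_corners": -20, "brows": 15},
    "neutral":   {"mouth_corners": 0, "brows": 0},
}

def synthesise_expression(label: str) -> Dict[str, int]:
    """Return servo targets that would render the expression on the robot."""
    return EXPRESSION_TO_SERVOS[label]

if __name__ == "__main__":
    observed = {"AU12_lip_corner_puller": 0.8}   # e.g. the person is smiling
    label = recognise_expression(observed)
    print(label, synthesise_expression(label))   # happiness {'mouth_corners': 30, 'brows': 0}
```

A real system would of course replace the hand-written rules with models trained on natural affect data, but the same two-stage structure (sense, then render) applies.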

Our research uses natural affect data to synthesize realistic facial expressions and gestures on zoomorphic, humanoid, and android robots. Then, to validate these expressions, we perform empirical human-robot interaction experiments. We explore aspects of emotional interaction such as empathy, rapport building, and cooperation.

However, expressions aren’t the whole story! To sustain interaction with people, interactive robots must also express the right thing at the right time. This is a very difficult problem: it requires fundamental research into communication patterns in human-human interaction, and then testing whether those patterns carry over to human-robot interaction. To do this we are using techniques from the emerging field of social signal processing.
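As a deliberately simplified illustration of the "right thing at the right time" problem, the sketch below encodes one candidate timing rule of the kind studied in social signal processing: a listening robot might nod when the speaker pauses while looking at it. The signal names and thresholds are assumptions for illustration, not the rules used in our studies.

```python
# Illustrative sketch only: a toy timing rule for when a listening robot
# should produce a backchannel nod. Signal names and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class SocialSignals:
    speech_pause_ms: float    # length of the speaker's current pause
    mutual_gaze: bool         # is the speaker looking at the robot?
    ms_since_last_nod: float  # time since the robot last nodded

def should_nod(signals: SocialSignals) -> bool:
    """Nod only at a pause, while being looked at, and not too frequently."""
    return (signals.speech_pause_ms > 400
            and signals.mutual_gaze
            and signals.ms_since_last_nod > 3000)

if __name__ == "__main__":
    print(should_nod(SocialSignals(speech_pause_ms=600,
                                   mutual_gaze=True,
                                   ms_since_last_nod=5000)))  # True
```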

Selected publications