Computer Laboratory

Project suggestions from the Graphics & Interaction Group

Computational Analysis of Personality

Originator: Hatice Gunes

Individuals' interactions with others are shaped by their personalities and by their impressions of others' behaviours and personalities. Automatic personality analysis from nonverbal behaviours therefore has an impact on improving humans' interaction experience with socially intelligent systems, including humanoid robots. There is already a significant body of work on personality analysis from the video and audio modalities. This project aims to explore other data modalities, such as bio-signals, and novel multimodal fusion methods in a dyadic interaction scenario.

Requirements: This project would benefit from studying machine learning.
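As a rough, hypothetical sketch of what a multimodal fusion baseline for this project might look like (all feature names and data below are placeholders, and scikit-learn is an assumed dependency rather than a prescribed toolkit):

    # Hypothetical sketch: fusing audio, video and bio-signal features for
    # personality trait prediction; every feature and label here is synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 200                            # annotated interaction segments

    audio = rng.normal(size=(n, 32))   # e.g. prosodic statistics
    video = rng.normal(size=(n, 64))   # e.g. facial action unit activations
    bio   = rng.normal(size=(n, 16))   # e.g. heart rate / skin conductance stats

    # Feature-level fusion: concatenate per-segment modality features.
    X = np.hstack([audio, video, bio])
    y = rng.normal(size=n)             # e.g. an annotated extraversion score

    model = RandomForestRegressor(n_estimators=100, random_state=0)
    print(cross_val_score(model, X, y, cv=5).mean())

Exploring alternatives to this simple feature concatenation, such as decision-level fusion or learned fusion weights, would be one natural direction for the project.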


Originator: Hatice Gunes

Joint rhythms have been found to foster a sense of belonging, kindness to strangers, and liking. Synchronised movement such as singing, dancing, and even laughing is now understood as a social cue that benefits communities as well as individuals' emotions and well-being. This project focuses on computational data mining of a recorded public event called Transference, in which around 100 people wearing robot-lets were recorded using multiple sensors. The computational analysis will focus on changes in rhythm and in well-being indicators (relaxation, calmness, boredom), for which the recordings have been annotated by experts.

Requirements: This project would benefit from previous experience with computer vision.
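One possible starting point for the rhythm analysis is sketched below, assuming accelerometer-like movement signals at a known sampling rate (both assumptions, since the exact sensor outputs are not specified here):

    # Minimal sketch: windowed cross-correlation between two participants'
    # movement signals as a crude synchrony measure; the signals are synthetic.
    import numpy as np

    def max_crosscorr(a, b):
        """Peak normalised cross-correlation and the lag at which it occurs."""
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        c = np.correlate(a, b, mode="full") / len(a)
        return c.max(), np.argmax(c) - (len(a) - 1)

    fs = 50                              # assumed sensor sampling rate (Hz)
    t = np.arange(0, 60, 1 / fs)         # one minute of data
    p1 = np.sin(2 * np.pi * 2 * t) + 0.1 * np.random.randn(len(t))
    p2 = np.sin(2 * np.pi * 2 * (t - 0.05)) + 0.1 * np.random.randn(len(t))

    win = 5 * fs                         # slide a five-second window
    for start in range(0, len(t) - win, win):
        r, lag = max_crosscorr(p1[start:start + win], p2[start:start + win])
        print(f"t={start / fs:4.0f}s  peak r={r:.2f}  lag={lag / fs:+.2f}s")

Such per-window synchrony scores could then be compared against the expert annotations of relaxation, calmness and boredom.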

Live coded video processing architecture

Originator: Alan Blackwell with Sam Aaron

Sam Aaron's Sonic Pi language for live-coded music is a local product that has achieved huge popularity and media coverage. Funded by the Raspberry Pi Foundation, it is also available for Mac and PC platforms.

The goal of this project is to create an architecture for interacting with multiple video streams, modifiable in real time through live-coding styles of software development. The starting point will be modelled on the SuperCollider architecture for real-time processing of audio and event streams, extended to support image and video frame data. It may not be necessary to use the SuperCollider source code, although it will provide useful guidance. The implementation language is likely to be C++.
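As a loose illustration of the intended structure (a sketch only, written in Python with OpenCV rather than the proposed C++, and not based on the SuperCollider code itself): frames flow from a source through a chain of filter nodes that a live coder could rebuild on the fly.

    # Illustrative sketch: a chain of frame "unit generators", loosely analogous
    # to an audio-synthesis graph, applied to a live camera stream via OpenCV.
    import cv2

    def grayscale(frame):
        return cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    def edges(frame):
        return cv2.Canny(frame, 80, 160)

    # The "graph" is an ordered list of filters; in a live-coding setting this
    # list would be rebuilt while the stream keeps running.
    graph = [grayscale, edges]

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        for node in graph:
            frame = node(frame)
        cv2.imshow("video sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()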

Evaluation can be done in the Lab, based on video throughput, time resolution, latency and jitter from filter operations. As an extension, it would also be possible to package "Video Pi" (or choose your own name if you like) into the Sonic Pi editing environment so that kids can create their own video mixers or filters connected to an external camera. There will also be opportunities for future funded research in this area.
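A measurement harness for these criteria could be as simple as the following hypothetical sketch, which times a stand-in filter per frame and reports mean latency and jitter:

    # Hypothetical benchmark: per-frame latency and jitter for a filter chain.
    import statistics
    import time
    import numpy as np

    def blur(frame):
        # Stand-in filter; in practice this would be an OpenCV or GPU operation.
        return frame * 0.5

    frames = [np.random.rand(480, 640, 3) for _ in range(200)]
    latencies = []
    for frame in frames:
        t0 = time.perf_counter()
        blur(frame)
        latencies.append((time.perf_counter() - t0) * 1000)   # milliseconds

    print(f"mean latency {statistics.mean(latencies):.2f} ms, "
          f"jitter (stdev) {statistics.stdev(latencies):.2f} ms")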

Novel interactive data visualisations

Originators: Alan Blackwell with Advait Sarkar and Ben Azvine (BT research)

Large networks of the kind operated by BT are a source of huge quantities of time-varying data with many variables. A wealth of information can be extracted from such data, but initial exploration of a dataset can be daunting, particularly when its features are not known in advance. There are a few standard means of data visualisation, including trend graphs, bubble diagrams, network diagrams, pie charts, geographical maps, sun ray diagrams, and radial views, but these represent a relatively limited range of statistical methods. The goal of this project is to build on the capabilities of statistical analysis packages such as R and NumPy to create new visualisations that offer an intuitive alternative to existing statistics packages and open-source visualisation tools.
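As one hypothetical example of a visualisation that goes beyond the standard chart types (the data, and the NumPy, scikit-learn and matplotlib dependencies, are assumptions for illustration): many time-varying network variables can be projected onto two principal components so that the whole system's state over a week appears as a single trajectory.

    # Minimal sketch: a week of 50 synthetic network variables shown as one
    # trajectory in principal-component space.
    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    hours = 24 * 7
    data = rng.normal(size=(hours, 50))                          # 50 monitored variables
    data += np.sin(np.linspace(0, 14 * np.pi, hours))[:, None]   # shared daily cycle

    xy = PCA(n_components=2).fit_transform(data)
    plt.plot(xy[:, 0], xy[:, 1], "-o", markersize=2)
    plt.xlabel("PC 1")
    plt.ylabel("PC 2")
    plt.title("Network state trajectory over one week (synthetic data)")
    plt.show()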

Musical interfaces to programming languages

Originators: Alan Blackwell and Sam Aaron

Live Coding is an exciting musical genre found not only in experimental art music but also in jazz improvisation and Algoraves (featured in Wired Magazine in August). Cambridge is one of the international centres of Live Coding research and performance. The goal of this project is to integrate some newly acquired musical equipment (including a MIDI drumkit) into the programming environments used in improvised programming.
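As a hedged sketch of the integration step (the mido library is an assumption here; the actual environments may use MIDI or OSC bindings of their own): drum-pad hits arrive as MIDI note messages that the live-coding runtime can react to.

    # Sketch: turning MIDI drum-pad hits into events for a live-coding runtime.
    import mido

    with mido.open_input() as port:          # first available MIDI input device
        for msg in port:
            if msg.type == "note_on" and msg.velocity > 0:
                # e.g. trigger a sample, re-seed a generative pattern, or
                # forward the event to the programming environment.
                print(f"pad {msg.note} hit, velocity {msg.velocity}")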

Control room attention models

Originator: Peter Robinson

Many complex industrial systems are run from central control rooms where operators monitor information on multiple screens to identify anomalous conditions. Current design tools for control rooms are limited to 3D models of the hardware, which can be used to assess the physical ergonomics but do not help in understanding the work of the human operators.

This project focuses on developing computational models for predicting the operators' attention so that the human-machine interface (HMI) can be evaluated and configured properly during control room design. These models are expected to improve the arrangement of information presented through the HMI and reduce the risk of operators missing important information in critical situations. This will involve predicting visual search patterns over an extended display.
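One illustrative (not prescribed) starting point is a contrast-based saliency map over the display, scanned by repeatedly selecting the most salient location and then suppressing it (inhibition of return), as in the sketch below; the display data are synthetic.

    # Illustrative sketch: saliency-driven prediction of a fixation sequence.
    import numpy as np
    from scipy.ndimage import uniform_filter

    rng = np.random.default_rng(2)
    display = rng.random((60, 100))          # placeholder screen intensity map

    # Local contrast as a crude saliency estimate.
    saliency = np.abs(display - uniform_filter(display, size=9))

    fixations = []
    for _ in range(5):
        y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
        fixations.append((y, x))
        # Inhibition of return: suppress a neighbourhood around the fixation.
        yy, xx = np.ogrid[:saliency.shape[0], :saliency.shape[1]]
        saliency[(yy - y) ** 2 + (xx - x) ** 2 < 15 ** 2] = 0

    print("predicted fixation sequence:", fixations)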

Visual services

Originator: Peter Robinson

Tangible user interfaces could be built using a standard mobile phone and 2D bar-codes [Controlled availability of pervasive Web services, IEEE ICDCS 2003]. Design, build and evaluate such a system.
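The recognition step of such a system might look something like the following sketch (assuming OpenCV and a QR-style 2D bar-code; the cited work does not prescribe either):

    # Minimal sketch: decoding a 2D bar-code (QR code) from a camera frame.
    import cv2

    cap = cv2.VideoCapture(0)
    detector = cv2.QRCodeDetector()

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        text, points, _ = detector.detectAndDecode(frame)
        if text:
            print("decoded service identifier:", text)   # e.g. a service URL
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()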

Propose your own project

The Graphics & Interaction Group has a range of interesting hardware. Consider the useful research that could be done if you had access to this and propose something novel and interesting.