MPhil and Part II projects 2012/2013

Physical Control of a Programming Editor for Live Coding

[Image: Monome drum kit]

Background
Although a wide range of interaction peripherals is now available for workstations, most programming editors still rely on the keyboard and mouse. Do these standard peripherals offer the most effective interface for programming, or are they simply a surviving relic? What might be the requirements of a programming environment for more esoteric physical interfaces, and how might we measure their efficacy?
Objective
This project will explore the opportunities that alternative interfaces offer to the programming workflow. As the context for this exploration we shall consider "live coding", where programmers construct and manipulate code on-the-fly for artistic purposes such as making music: http://vimeo.com/22798433. Tools for live coding a musical performance impose a novel set of constraints on programming workflow, in which live, highly reactive and feedback-oriented interfaces are an important requirement.
Suggested work programme
The technical starting point will be to expand the typical REPL-oriented approach to performing music with live coding toolkits such as Overtone. This will require a basic understanding of the Clojure programming language and of live musical performance with algorithmic synthesisers. One possible approach would be to extend the programmer's editor Emacs with a range of new syntax-directed editing interfaces controlled by external devices such as rotary controllers, monomes, MIDI surfaces or instruments.
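To give a flavour of the kind of binding involved, here is a minimal sketch in Clojure (assuming Overtone is installed and a MIDI controller is attached; the [:midi :control-change] event key and :data2 field follow Overtone's conventions, but exact names may vary by device) that maps a rotary controller onto a parameter of a running synth:

    (use 'overtone.live)

    ;; A simple saw-wave instrument with a controllable low-pass cutoff.
    (definst buzz [freq 110 cutoff 1000]
      (lpf (saw freq) cutoff))

    (def node (buzz))   ; start one running synth node

    ;; Each control-change message rescales the knob's 0-127 range to
    ;; roughly 100-5000 Hz and updates the running node immediately.
    (on-event [:midi :control-change]
              (fn [{:keys [data2]}]
                (ctl node :cutoff (+ 100 (* data2 38))))
              ::knob->cutoff)

A similar handler could instead drive an editing command in Emacs, for example adjusting a numeric literal at point.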
As an extension, it would be interesting to explore how this new approach can be directly included in performance workflows such that new interfaces may be designed and created at runtime with little or no interference to the ongoing performance. It would also be interesting to explore how these interface techniques might be applicable to more traditional programming contexts.
Contact:
sja55@cl.cam.ac.uk, or afb21@cl.cam.ac.uk for MPhil

Functional Representation of Time for Control Applications

[Image: Conveyor ladder diagram]

Background
Clojure is a new functional language that runs on the JVM, and has recently come into high demand as a professional implementation platform (see http://onlinelabor.blogspot.co.uk/2012/02/high-wage-skills-on-odesk-or-why-you.html).
Recent work by Sam Aaron and Jeff Rose has created a Clojure extension for real-time applications called Overtone (see http://vimeo.com/42540495).
Although originally developed for time-based description of musical structure, Overtone includes explicit models of time that potentially make it well-suited to industrial real-time control applications.
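For example, the following is a common Overtone pattern for scheduling against explicit timestamps (a minimal sketch, assuming Overtone and a running SuperCollider server): actions are queued at absolute times derived from a metronome, rather than fired as soon as possible.

    (use 'overtone.live)

    (definst tick []                       ; a short, self-freeing blip
      (* (env-gen (perc 0.01 0.1) :action FREE)
         (sin-osc 880)))

    (def nome (metronome 120))             ; 120 beats per minute

    (defn looper [nome sound]
      (let [beat (nome)]                   ; current beat number
        (at (nome beat) (sound))           ; (nome beat) -> absolute time of that beat
        (apply-by (nome (inc beat)) looper nome sound [])))

    (looper nome tick)

The same scheduling primitives could equally queue control actions for a machine rather than notes.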
Objective
The goal of this project is to create a graphical front-end to Overtone programs, for example in a format related to the ladder logic diagrams that are traditionally used by engineers to program factory automation systems. Ladder diagrams are poor in their representation of time (for example, they can suffer from race conditions), meaning that an Overtone/Clojure back end may be a good choice to express and debug execution and timing constraints.
Suggested work programme
The first step will be to learn Clojure, and create an Overtone simulation of the mechanical behaviour of a simple assembly line - for example a rotating stamp, short conveyor belt and packing robot. Timing and mechanical variability will be expressed in Overtone, resulting in a time sequence of simulated sensor outputs to be used by an automated controller.
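A minimal sketch of such a simulation (the event name and parameters below are hypothetical; it assumes Overtone's event and scheduling functions): one conveyor sensor fires once per cycle, with random jitter standing in for mechanical variability.

    (use 'overtone.live)

    (def cycle-ms 2000)    ; nominal conveyor cycle time
    (def jitter-ms 150)    ; mechanical variability

    (defn conveyor
      "Simulate one sensor on the conveyor: a part passes it once per cycle,
       with random timing jitter, firing a :part-at-sensor event."
      [t]
      (at t (event :part-at-sensor :time t))
      (let [next-t (+ t cycle-ms (- (rand-int (* 2 jitter-ms)) jitter-ms))]
        (apply-by next-t conveyor next-t [])))

    (conveyor (+ (now) 1000))

    ;; Any interested component can subscribe to the simulated sensor stream:
    (on-event :part-at-sensor
              (fn [{t :time}] (println "part detected at" t))
              ::debug-listener)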
The next step will be to create a graphical interface in ladder logic form that can be edited by the user to define responses to the sensor outputs. The interface itself can be created in Java, with a back end that translates the ladder logic into equivalent Clojure code.
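To indicate what such a back end might produce (the rung data format below is hypothetical), a rung can be represented as data emitted by the GUI and compiled into an executable Clojure function: contacts along a branch are ANDed, parallel branches are ORed, and the result drives the coil.

    ;; A rung: two parallel branches of contacts driving one coil.
    (def rung
      {:coil     :motor
       :branches [[:part-at-sensor :guard-closed]      ; branch 1: both contacts closed
                  [[:not :auto-mode] :manual-run]]})   ; branch 2: NC contact + NO contact

    (defn contact-closed? [inputs c]
      (if (and (vector? c) (= :not (first c)))
        (not (get inputs (second c)))   ; normally-closed contact
        (get inputs c)))                ; normally-open contact

    (defn compile-rung
      "Return a function from a map of sensor inputs to a map of coil outputs."
      [{:keys [coil branches]}]
      (fn [inputs]
        {coil (boolean (some #(every? (partial contact-closed? inputs) %) branches))}))

    ;; One scan:
    ((compile-rung rung) {:part-at-sensor true :guard-closed true
                          :auto-mode true :manual-run false})
    ;; => {:motor true}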
Finally, the ladder logic controller will be connected to the assembly line simulation, in order to test the ladder logic program under various conditions of machine performance and simulated faults.
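A sketch of this connection (reusing the hypothetical :part-at-sensor event and compile-rung function from the sketches above): each simulated sensor event updates the input image, runs one controller scan, and the resulting coil states would drive the simulated actuators.

    (def inputs (atom {:guard-closed true :auto-mode true
                       :manual-run false :part-at-sensor false}))

    (def controller (compile-rung rung))

    (on-event :part-at-sensor
              (fn [_]
                (swap! inputs assoc :part-at-sensor true)
                (let [outputs (controller @inputs)]
                  (println "scan ->" outputs)   ; here: drive the simulated stamp/robot
                  (swap! inputs assoc :part-at-sensor false)))
              ::ladder-scan)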
As an extension, the simulated assembly line can also be presented to the user as a Java animation, so that the performance of the controller can be viewed as a live animation. In principle, the assembly line could also be configured via a graphical interface, although this would most likely be beyond the scope of a Part II project.
Evaluation:
Take a small number of test cases based on specifications of actual machines, and demonstrate that these can be simulated, and the real-time simulation correctly controlled, using the system. Conduct simulated experiments in which the relative timing of machine components is varied, with randomly injected fault conditions corresponding to component failure.
The success criterion is that the control logic should operate as expected, within a defined safety/performance envelope, over a range of simulation conditions. The evaluation phase will collect simulation data in order to demonstrate that this goal has been reached.
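One way such a check might look (again reusing the hypothetical rung and controller from the sketches above): drive the controller with randomly faulted sensor scans and assert a safety invariant over the whole run.

    (defn random-scan []
      {:part-at-sensor (< (rand) 0.5)
       :guard-closed   (< (rand) 0.9)   ; 10% of scans simulate a guard fault
       :auto-mode      true
       :manual-run     false})

    (defn safe? [in out]
      ;; safety envelope: the motor must never run while the guard reads open
      (not (and (:motor out) (not (:guard-closed in)))))

    (every? (fn [_] (let [in (random-scan)]
                      (safe? in (controller in))))
            (range 10000))
    ;; => true if the controller stayed inside the envelope on 10,000 scans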
Design of evaluation scenarios will involve some discussion with one or more engineers from a specialist industrial control company such as Quin Systems or TAP Biosystems. An optional additional evaluation exercise could evaluate the relative usability of Clojure and ladder logic with respect to other automation languages used by companies like these.
Contact:
sja55@cl.cam.ac.uk, or afb21@cl.cam.ac.uk for MPhil

Video Processing Language for the Raspberry Pi

[Image: Raspberry Pi Kodu]

Background
The Raspberry Pi essentially provides a set-top box with an end-user programmable architecture. However, most users of the device are not yet exploiting its video processing capabilities. In part, this is because the programming APIs for video codecs under Linux are not accessible via educational programming languages like Scratch.
Objective
The objective of this project is to extend an existing visual programming language with syntax to support live video processing. The general syntactic style could be based on systems such as Microsoft Research Kodu (pictured), although several alternatives are also available within the Computer Laboratory.
Suggested work programme
The first step will be to experiment with the existing API to the video codecs used on the Raspberry Pi, and to evaluate the performance of the GPU under live transformations such as image scaling and warping, streaming from online video sources, processing of input from a camera module, and compositing of video data with 2D or 3D graphic renderings. Based on this evaluation, a selection of end-user programmable functions will be designed that are suitable for the Raspberry Pi graphics architecture.
A visual language syntax will then be selected, derived from one of a number of data flow, functional and constraint languages that have been used for similar signal processing and image processing tasks in the past.
A minimal visual syntax will be defined. In order to keep the scope of the project manageable, the initial syntax need only support the programmable functions and their parameters, with enough expressiveness to modify and bind alternative values under user-program control (possibly extended via external I/O to allow users to create novel physical controllers).
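As an illustration of what such a minimal syntax might capture (a language-agnostic sketch, written in Clojure only for consistency with the proposals above; the node, operation and parameter names are hypothetical), a user program can be represented as a small dataflow graph whose node parameters may be re-bound at runtime:

    ;; A pipeline: camera -> scale -> warp -> display.
    (def program
      {:nodes {:cam   {:op :camera-input}
               :scale {:op :scale   :params {:factor 0.5}}
               :warp  {:op :warp    :params {:angle 0}}
               :out   {:op :display}}
       :edges [[:cam :scale] [:scale :warp] [:warp :out]]})

    (defn bind-param
      "Re-bind one node parameter under user-program control."
      [program node param value]
      (assoc-in program [:nodes node :params param] value))

    ;; e.g. a rotary controller mapped onto the warp angle:
    (bind-param program :warp :angle 45)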
A basic editor for this visual syntax will be implemented using the graphics facilities on the Raspberry Pi. Ideally, this should allow dynamic drawing and drag and drop of syntax components, but if necessary can be keyboard controlled. Dynamic composition of programs can be considered as an extension in the project plan.
A more ambitious extension would be to enhance the language syntax with more general purpose compute functions, for example sufficient to express a simple algorithm animation such as Quicksort using basic video blocks, or create turtle-graphics style geometry.
Evaluation:
A test suite will be defined for the language, representing a range of end-user video-processing scenarios (mashups, live VJ performance, creation of titles and fades, etc.). This will be validated by interviews with a small number of sample users. The main success criterion will be the successful creation of sample programs in the new language that demonstrate the video processing behaviours in the test suite.
As an extension, a user study could be carried out with a small sample of typical users, comparing the usability of the new language to others in this class.
Contact:
alan.blackwell@cl.cam.ac.uk

