
CRASSH (Centre for Research in the
Arts, Social Sciences and Humanities)
http://www.crassh.cam.ac.uk/
|
AHRC
ICT Methods Network Workshop
Co-sponsored
by CRASSH
(Centre for Research in the Arts, Social Sciences and Humanities)
Collaborative
Performance
Technology
Cambridge University / Anglia Ruskin University
Cambridge, 20-21 December 2006
|
ORGANISED BY |
Julio
d'Escrivan, Anglia Ruskin University, j.d'escrivan [at] anglia.ac.uk
Richard Hoadley, Anglia Ruskin University, r.j.hoadley
[at] anglia.ac.uk
Nick Collins, University of Sussex, nc272 [at] cam.ac.uk
Ian Cross, Centre for Music & Science, University of Cambridge, ic108 [at] cam.ac.uk
Alan Blackwell, Crucible Network for Research in Interdisciplinary Design
(Computer Laboratory, University of Cambridge), Alan.Blackwell [at] cl.cam.ac.uk
Catherine Hurley, CRASSH, University of Cambridge, ch335 [at] cam.ac.uk
Neil Grindley, Senior Project Officer, Methods Network, neil.grindley [at] kcl.ac.uk
|
RECORDED BY |
Chris
Nash, University
of Cambridge, cmn34 [at] cam.ac.uk
|
PARTICIPANTS |
Simon
Blackmore, Artist, simon [at] simonblackmore.net
Alan Blackwell, Computer Lab., Cambridge, Alan.Blackwell
[at] cl.cam.ac.uk
Nick Collins, University of Sussex, nc272 [at] cam.ac.uk
Ian Cross, Centre for Music and Science, Cambridge,
ic108 [at] cam.ac.uk
Jason Dixon, University of East Anglia, jason [at]
mutantsounds.com
Julio d'Escrivan, Anglia Ruskin University, j.d'escrivan
[at] anglia.ac.uk
Jamie Forth, Royal College of Music, jforth [at] rcm.ac.uk
Iris Garrelfs, Artist, iris [at] sprawl.org.uk
Owen Green, City University, London, owen [at] owengreen.net
Tom Hall, Anglia Ruskin University, t.r.hall [at] anglia.ac.uk
Alex Harker, University of York, ajharker [at] gmail.com
Richard Hoadley, Anglia Ruskin University, r.j.hoadley
[at] anglia.ac.uk
Paul Jones, Anglia Ruskin University, pauljones_mds
[at] yahoo.com
Jin Hyun Kim, University of Cologne, jinhyun.kim [at]
uni-koeln.de
Andrew Lovett, Composer, andrew.lovett [at] ntlworld.com
Anton Lukoszevieze, King’s College, Cambridge, al449
[at] cam.ac.uk
Chris Nash, Computer Laboratory, Cambridge, cmn34 [at]
cam.ac.uk
Rui Penha, University of Aveiro (Portugal), ruipenha
[at] momentumensemble.org
Sam Salem, Manchester University, sam.salem [at] gmail.com
Dan Tidhar,
University of Cambridge, dut20 [at] cam.ac.uk
|
APOLOGIES |
Ian Cross, Centre for Music and Science, Cambridge,
ic108 [at] cam.ac.uk
Sam Salem, Manchester University, sam.salem [at] gmail.com |
|
|
PROGRAMME
|
20 December
Afternoon    Out-of-town participants check in at Newnham College
17:00        Convene at Darwin College Old Library for introductions and a planning session, at which the final programme for the 21st will be confirmed, based on shared interests, resources and ambitions
19:30        Arrive for dinner at St Catharine's College
|
21 December
09:00        Convene at Anglia Ruskin University
10:30        Morning break (and review presentation)
13:00        Lunch (and review presentation)
15:00        Afternoon break (and review presentation)
16:30        Closing review session
17:00        Formal close (but work can continue for another hour or two, if tasks remain to be finished)
|
|
|
OVERVIEW
The aim of this workshop
was to train practice-based researchers in the performing arts
to make better use of ICT tools that support live collaboration
in performance situations. The workshop leaders were Julio d'Escrivan
and Richard Hoadley of Anglia Ruskin University; Ian Cross of
the Cambridge University Centre for Music and Science; and Alan
Blackwell of the Crucible Network for Research in Interdisciplinary
Design.
The workshop was based
in the recently upgraded music technology teaching facility at
Anglia Ruskin University, which contained two group studios, each
equipped with nine G5 dual-processor Macintoshes. Collaborative
facilities included networked sound processing with studio monitors,
local MIDI keyboards and audio processors on each workstation,
shared headphones for pair work, and central video projection
facilities.
The workshop was structured
to include a range of participants: technical specialists in the
use of SuperCollider and Max/MSP, professional exponents of
mixed-genre performance, and a small selection of practitioners
from other performance genres such as poetry and live video art.
Participants were provided at the start of the day with
a broad range of inexpensive sensors and a short hands-on introduction
to the process of interfacing these to performance software such
as SuperCollider. The workshop was then divided into mixed-discipline
teams for hands-on development and instruction, ensuring
that programmers did not 'race ahead' of their performing collaborators.
In the course of the day, the whole group shared experiences during
unstructured breaks, with a more structured exchange of experiences
in the final session. Throughout the day, work in progress was
captured and shared by facilitators moving from team to team.
The overall ambition was to emulate a 'collaboration masterclass'
as the most appropriate model for practice-based research workshops
applying technology in the performing arts.
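As a purely illustrative aside (not part of the original workshop materials), the kind of sensor-to-software bridge introduced in that hands-on session can be sketched as follows in Python, assuming an Arduino-style board that prints one integer reading per line over USB serial and an sclang instance listening on its default OSC port (57120); the device path, OSC address and scaling are assumptions, not details from the workshop.

# Hypothetical sketch only: forward serial sensor readings to SuperCollider over OSC.
# Assumes an Arduino-style board printing one integer (0-1023) per line over USB serial,
# and sclang listening on its default port, 57120. Requires pyserial and python-osc.
import serial
from pythonosc.udp_client import SimpleUDPClient

SERIAL_PORT = "/dev/tty.usbserial"              # adjust to the board's device path
client = SimpleUDPClient("127.0.0.1", 57120)    # sclang's default OSC port

with serial.Serial(SERIAL_PORT, 9600, timeout=1) as board:
    while True:
        line = board.readline().strip()
        if not line:
            continue
        try:
            raw = int(line)                     # e.g. a slider or bend-sensor reading
        except ValueError:
            continue                            # skip partial or garbled lines
        # Normalise to 0..1 and send to an (assumed) control address in SuperCollider.
        client.send_message("/sensor/slider", raw / 1023.0)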
For more information,
please contact Neil Grindley, neil.grindley [at] kcl.ac.uk.
|
|
|
PEOPLE
Those attending the workshop came from a wide variety of backgrounds
and roles, representing cultures and musical traditions from across
the globe. Everyone brought a fresh perspective on musical
collaboration and technology, as well as a wealth of experience in many
composition and performance techniques and technologies...
|
 |
Alan Blackwell ORGANISER / OBSERVER
Crucible Centre for Research in Interdisciplinary Design / Computer
Laboratory, University of Cambridge |
Alan's long and distinguished career in academic
and industrial research has taken in many seemingly diverse fields,
including computer science, music, psychology, religion and medieval
history. From his current base in the Computer Laboratory at Cambridge,
he has recently been attempting to bring them together, and to encourage
work in interdisciplinary subjects, through Crucible. He saw the
workshop as an opportunity to explore practice-based research
and to bring both similar and differing minds together to collaborate
on a meeting of music and technology. Alan, in his role as organiser,
also looked forward to honing his skills as a facilitator. |
 |
Simon Blackmore PARTICIPANT
Exhibiting / Performing Artist |
Simon has several years' experience of integrating
micro-controllers and other electronics into installations and
musical performances, and has used SuperCollider on many occasions.
He has worked with children in educational settings and is eager
to adapt non-musical objects for use in video and music art. He
would also like to see a move from the loop- and sample-based
aesthetic of modern electronica to an approach more akin to re-orchestration.
His goal at the workshop was to investigate what other
people in the field were (and, critically, weren't) working on. |
 |
Nick Collins ORGANISER / PARTICIPANT
Lecturer in Music Informatics, Department of Informatics, University
of Sussex |
Having recently completed his PhD, Nick brings
his considerable experience in live computer music to the workshop.
His knowledge extends to both audio and video, and he specialises in
SuperCollider, for which he has developed several machine-listening
plug-ins. He also has interests in algorithmic composition, psychoacoustics
and beat tracking, and is himself a live performer. |
 |
Jason Dixon PARTICIPANT
Live Performer / Composer / PhD Student, University of East Anglia
(UEA) |
Jason is an Irish composer, currently studying for a PhD in composition
in Norwich, England. Previously at the Sonic Arts Research Centre
(SARC) in Belfast, he has significant performance experience using
electronics and music technology. His aim at the workshop was to
define his artistic focus and also build a network of contacts to
collaborate with on future works. |
 |
Julio d'Escrivan ORGANISER / PARTICIPANT
Senior Lecturer in Music Technology, Department of Music, Anglia
Ruskin University (ARU) |
Originally from Venezuela and now a Pathway
Leader at ARU, Julio has significant experience in music technology
and is an expert in electro-acoustic composition. As such, he
has a considerable knowledge of SuperCollider and Max/MSP. He
has written for TV and games - in the latter, specialising in
interactive music, or music 'with a goal'. In addition to his
role as host and organiser of the event, he was very interested
to learn and to draw inspiration from what others were doing in the
field. |
 |
Jamie Forth PARTICIPANT
PhD Student in Composition, Royal College of Music (RCM) |
Representing the Royal College of Music,
Jamie's studies in composition take him beyond the normal role
of practising composer and have involved him in informatics and
the development of music systems, for performance, improvisation
and live coding. Admitting that he has not yet delved deeply into
live performance himself, he is eager to move into this area and
explore what technology has to offer in this regard. He has much
experience in SuperCollider and was looking forward to extending and
sharing this at the workshop. He was also interested to find out
what strategies and paradigms are in use for musical instrument
and interface design. |
 |
Iris Garrelfs PARTICIPANT
Exhibiting / Performing Artist |
Iris is a popular, prolific and celebrated
professional performing artist and composer on the London electro-acoustic
scene, who has collaborated with many other artists throughout
the world. Her aesthetic draws not only from the more earnest
(ernste) electro-acoustic styles but also from more popular
modern music. She has much experience of using technology in performance
and installation work, including real-time sound processing with
Max/MSP, but limited experience with SuperCollider and interactive
computer music methods such as live coding. At the workshop,
she was interested in exploring how her interaction with the computer
might be improved in live performance. |
 |
Owen Green PARTICIPANT
PhD Student in Music, City University (London) |
Owen's current studies in composition centre
on dynamic, long-term collaborative endeavours. A specific
focus of his is expanding the scope for improvisation in established
popular music fields, such as hip-hop. He is hopeful that the
answer lies in technologies such as Max/MSP,
which he has been exploring for several years. At the workshop,
he was looking for further collaborations and hoped to extend
his experience to SuperCollider and the use of micro-controllers.
|
 |
Neil Grindley ORGANISER / OBSERVER
AHRC ICT Methods Network |
Neil represented the AHRC ICT Methods Network,
with whose invaluable assistance the workshop was made possible.
He has significant experience of organising, pioneering and supporting
similar projects for the Arts and Humanities Research Council
(AHRC), designed to foster creativity and originality between
disciplines. Although he has past experience with musical collaborations,
the format of this workshop was something new, and Neil was very
interested to see not only how people got along, but also
how the endeavour might become a model for future events. |
 |
Tom Hall PARTICIPANT / SUPPORT
Senior Lecturer in Music Technology, Department of Music, Anglia
Ruskin University (ARU) |
As well as lecturing at ARU, which hosted the
event, Tom is an active composer and performer of electro-acoustic
music, and researcher in the fields of algorithmic composition,
sonification and multimedia work. He brings his musical and technological
experience to the workshop, and also his invaluable inside knowledge
of the equipment and facilities at ARU. |
 |
Alex Harker PARTICIPANT
PhD Student in Composition, Department of Music, University of York |
Currently based in York, Alex's composition
studies take in elements of both the acoustic and electro-acoustic.
His aesthetic centres on contemporary classical music, and he is eager to exploit
the possibilities that electronics afford in this genre, especially
in the case of live performance and improvisation. With a deep
knowledge of Max/MSP, he has on occasion delved into C programming,
but is interested in extending his experience of working with electronics,
sensors and micro-controllers. In addition, he saw the workshop
as an opportunity to see how people combine different technologies,
such as Max/MSP, SuperCollider and hardware. |
 |
Richard Hoadley ORGANISER / PARTICIPANT
Senior Lecturer in Music Technology, Department of Music, Anglia
Ruskin University (ARU) |
A lecturer and researcher at ARU, Richard's
interests lie equally in performance, composition and the supporting
technologies. He is especially interested in fostering technologies
and interfaces that allow for spontaneity and originality in musical
practices, and that ensure the increasing use of technology does not
detract from the enjoyment of music. Helping to organise
and host the event, Richard was keen to see where the field stood
in terms of technology and musical aesthetics, and how people
were using the available tools. |
 |
Paul Jones PARTICIPANT
Student in Audio and Creative Music Technology, Department of Music,
Anglia Ruskin University (ARU) |
Formerly a DJ, Paul has been using technology
in musical performance for a number of years. He has a working
knowledge of both audio and video processing in Max/MSP, Jitter
and SuperCollider, often using the tools in combination. His current
interest is in the use of electronics and technology to improve
musical interfaces for performance. At the workshop, he looked
forward to extending his knowledge of music software (notably
SuperCollider) and hardware, and developing skills that would
allow him to collaborate on future projects. |
 |
Jin Hyun Kim PARTICIPANT
Researcher, Department of Systematic Musicology, University of Cologne
(Köln) |
Arriving from Cologne, Kim brings a different
perspective to the field. Her research focuses
on interaction in music, specifically the gestural aspects of
coupling sound and movement, but she states that her approach
has been more theoretical than practical to date, only touching
on technologies like Max/MSP and SuperCollider. As such, she saw
the workshop as an opportunity to experience musical interactions
first hand, and also to exchange views on artistic, technological,
scientific, psychological and philosophical aspects of physical
interaction in music. |
 |
Andrew Lovett PARTICIPANT / PERFORMER
Pianist / Composer |
As a composer, Andrew has been studying and
practising for years, oscillating between Cambridge and London.
From his background as a pianist, he developed an increasing interest
in all forms of music technology, recently culminating in his
first major opera, Abraham On Trial, in 2005, which used
both audio and video technologies and included well over 1000
different sounds in the score. In addition to his considerable
compositional experience, Andrew was able to bring a performer's
perspective to the workshop. For him, the event provided insights
into alternative ways of working with electronics in music. |
 |
Anton Lukoszevieze PARTICIPANT
/ PERFORMER
Cellist / Composer / Photographer |
A popular and renowned performer, Anton's
main focus is the cello, but his passions extend further. He is
a celebrated video and sound artist, talented photographer, and
the subject of Jayne Parker's 2005 film, FoxFire Eins.
His aesthetic tends toward the acoustic and the analogue, though
not necessarily by conventional means. The workshop, for
him, was an opportunity to see how people used technology in artistic
endeavours and to work with artists of differing, yet complementary,
aesthetics. |
 |
Chris Nash RECORDER / OBSERVER
/ SUPPORT
PhD Student in Music HCI, Computer Laboratory and Faculty of Music,
University of Cambridge |
Chris's background began in computers, but
an enduring passion for music led to the combination of music
and technology in both his professional and academic pursuits.
Through his own efforts in computer music and his experience of
writing software for others, he has developed a specific interest
in the interfaces of music software, and is currently researching
ways to improve the spontaneity and dynamism of computer-based
composition tools. To this end, one of his goals at the workshop
was to see how musicians and people from different backgrounds
used existing software and, specifically, how expertise in such
software develops. |
 |
Rui Penha PARTICIPANT
PhD Student in Music, Department of Communication and Art, University
of Aveiro (Portugal) |
Rui's studies in composition focus on the electro-acoustic,
going slightly beyond the remit of his more traditionally oriented
institute. As such, his considerable experience with music technology
(including Max/MSP) is self-taught. He is eager to unite all creative
minds in collaborative exercises - acoustic, electro-acoustic,
electronic, young, old, etc. - and sees technology as the key
to this goal. His objective at the workshop was to become more
familiar with technologies that afforded collaboration, such as
SuperCollider, and share knowledge with others. |
 |
Dan Tidhar PARTICIPANT
Harpsichordist / Post-doctoral Researcher in Computational Linguistics,
University of Cambridge |
With an extensive and varied background in
music and technology, Dan's current focus is information retrieval
in music systems, with a view to using the computer as performer.
A harpsichordist himself, Dan is keen to integrate art and technology,
often combining his passions for baroque music and computing.
To this end, he has experience of numerous technologies, including
Max/MSP and Jitter, but is looking forward to learning more about
the role of SuperCollider in performance, and its use in combination
with other technologies. |
|
|
PROJECTS
On the first day, participants met to get to know each other,
talk about their expertise and backgrounds, and establish directions
and goals to pursue on the workshop day to follow. Several participants
had worked with each other before; some tended towards the artistic
perspective, others towards the more technical; and each participant
had experience and knowledge of different technologies to varying
degrees. Participants were therefore divided into three groups
according to their backgrounds and interests, with efforts made
to ensure that each group had similar goals but varied experience
and perspectives.
Technologies and equipment were available from several
sources. In addition to the facilities available at ARU, many
participants brought their own software and hardware. A large
variety of sensors, micro-controllers and interface circuit
boards was available for use with each participant's own Apple
Macintosh laptop. These were at times linked with more traditional
acoustic instruments, belonging to the department and to the musicians,
including a cello, a harpsichord and (most of) a piano. With
so many diverse technologies combined, some technical problems were
to be expected, but aside from occasional computer glitches, difficulties
were minimal and participants were quick to adapt.
The workshop day was divided into several sessions, and
each group proceeded with a similar methodology for each session:
set up hardware, experiment with different combinations of hardware
and software, establish a goal and realise it. The objective of
the day wasn't to produce a polished concert or a market-ready interaction
tool, but to explore the possibilities of collaboration. As such,
the fruits of the participants' labours came in the form of small
presentations during the scheduled breaks, in which a group would
demonstrate and explain their ideas.
Some projects built on participants' previous work,
utilising software plug-ins or hardware of their own design.
In several cases, it is expected that the collaborations and working
relationships formed here will, in turn, fuel future projects...
|
PROJECT
I : "A Little Light Music"
Simon Blackmore, Jason Dixon,
Richard Hoadley, Paul Jones, Anton Lukoszevieze
Lit by computer screens, ARU's darkened computer
room was the venue for the workshop's son et lumière.
The project, building on Paul's previous experience with video
tracking, placed Anton centre-stage with his cello. In his hand,
he clasped both the bow (a custom design, crafted for greater
friction and thicker sonic textures) and a glow ball. A few feet
away, a computer recorded and processed both the sound and the
image. A FireWire camera tracked the coloured light of the glow
balls and used it to control different elements of the
sound processing, with, for example, the ball's vertical height
affecting the volume, pitch and other parameters of the recorded
acoustic sound. Similarly, experiments were made using the sound
to drive its own manipulation, such as detecting the recorded pitch
and using it to re-synthesise the recording at an inversely proportional
frequency, then layering the result over the original sound. To achieve
this, the group used a combination of software technologies, including
Max/MSP, Jitter and SuperCollider.
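Purely as an illustration of the mapping involved (the group's actual patches were built in Max/MSP, Jitter and SuperCollider), a Python sketch of the control layer might receive the tracked ball height and the detected pitch over OSC, then forward an amplitude value and an inversely proportional resynthesis frequency to the synthesis machine; the addresses, network details and reference frequency below are assumptions.

# Illustrative only: the workshop version of this mapping lived in Max/MSP and SuperCollider.
# Receives tracker and pitch-follower data over OSC, maps ball height to amplitude,
# inverts the detected pitch for the resynthesised layer, and forwards both controls.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

synth = SimpleUDPClient("192.168.0.20", 57120)   # hypothetical address of the synthesis machine
REF = 440.0                                      # reference frequency for the inversion

def on_ball_height(address, y):
    # Vertical ball position (0 = bottom of frame, 1 = top) drives playback amplitude.
    synth.send_message("/cello/amp", max(0.0, min(1.0, y)))

def on_pitch(address, freq):
    if freq > 0:
        # Resynthesise at a frequency inversely proportional to the detected pitch,
        # so higher played notes yield a lower resynthesised layer, and vice versa.
        synth.send_message("/cello/resynth_freq", REF * REF / freq)

dispatcher = Dispatcher()
dispatcher.map("/tracker/ball_y", on_ball_height)   # assumed address from the video tracker
dispatcher.map("/analysis/pitch", on_pitch)         # assumed address from the pitch follower
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()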
Thanks to the expertise and experience of the
participants, the project progressed quickly and the group was
able to demonstrate their ideas during the morning break. After
the presentation, they were able to spend time widening their
exploration, experimenting with different synthesis models. Different
input and control devices were also explored, including a multiple-axis
game controller and a pressure-sensitive glove, the latter of which allowed
some of the group to shift their focus from software and consumer
hardware to lower-level electronics.
Equipment: 3 Apple Mac PowerBooks (running SuperCollider,
Max/MSP and Jitter), cello (with custom bow), light/glow balls,
Shure SM57 microphone, Arduino USB interface board, game controller,
pressure-sensitive glove. |
 |
|
|
(Images © 2006 Cambridge Performance Technology Workshop)
PROJECT
II : "DVI VIXI TACVI MORTVA DULCE CANO"
Nick Collins, Julio d'Escrivan, Jamie Forth, Dan
Tidhar (and Iris Garrelfs)
This project takes its title from the Latin inscription
on the department's harpsichord, which translates as "In
life, I was silent; in death, I sweetly sing", originally
an epitaph for the wood used in the instrument's construction.
Giving the instrument a further lease of life, and perhaps
compensating for its lack of tuning, the group combined the
harpsichord with computer-based sound and video processing.
The earlier sessions had been spent testing how
various sensors and micro-controllers interacted with each other
and with software such as SuperCollider and Max/MSP. Each member
demonstrated and explained technologies (for some, of their own making)
to the others, and the group explored the potential of uniting
them. This exchange continued throughout the day and, in the later
sessions, culminated with Nick and Dan leading an effort to exploit
the potential of the disused harpsichord lying nearby, drawing on
the ideas explored earlier.
With Dan at the keyboard, a FireWire camera was
used to monitor the movement of his hands over the keys,
by tracking the white light reflected from his fingers on a computer
running Max/MSP. Simultaneously, a microphone was positioned under
the instrument's soundboard to record the acoustic output,
which was fed into a second computer running SuperCollider, where
it drove a software (re-)synthesiser. Then, using Open Sound Control
(OSC), the data from the first computer (the video tracker) was
passed to the second to manipulate the synthesis process, allowing
the keyboardist to shape the sound not only directly, through
the harpsichord's plectra and strings, but also indirectly, through
his fingers and the computer.
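To give a flavour of the OSC link between the two machines (a sketch only; the actual exchange ran directly from Max/MSP into SuperCollider, and all addresses, ports and parameter names below are assumed), a small Python relay could smooth the rather noisy camera-derived hand positions before passing them on to the synthesis computer.

# Illustrative sketch of the tracker-to-synthesiser OSC link; the workshop sent the data
# straight from Max/MSP to SuperCollider. Incoming hand positions are smoothed with a
# simple one-pole (exponential) filter before being forwarded, since raw camera tracking
# tends to jitter. Requires python-osc.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

sc = SimpleUDPClient("192.168.0.21", 57120)   # hypothetical SuperCollider machine
state = {"x": 0.5, "y": 0.5}                  # smoothed hand position, normalised 0..1
ALPHA = 0.2                                   # smoothing factor: higher = more responsive

def on_hands(address, x, y):
    state["x"] += ALPHA * (x - state["x"])
    state["y"] += ALPHA * (y - state["y"])
    # Forward as two synthesis controls, e.g. filter cutoff and resynthesis mix.
    sc.send_message("/harpsichord/ctl", [state["x"], state["y"]])

dispatcher = Dispatcher()
dispatcher.map("/tracker/hands", on_hands)    # assumed address from the Max/MSP patch
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()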
Equipment: 4 Apple Mac PowerBooks (networked,
running SuperCollider, Max/MSP and Jitter, communicating via OSC), harpsichord,
microphone, Unibrain Fire-i webcam, Edirol FA-101 audio interface,
Korg Kaoss control pad, Logitech WingMan gamepad, custom sensor
breakout box (wrapped into SuperCollider), USB interface board
and various sensors and micro-controllers (including sliders,
benders, rotary knobs, a gyroscope and accelerometers). |
 |
|
|
|
(Images © 2006 Cambridge Performance Technology Workshop)
PROJECT
III : "The Prosthetic Piano Party"
Owen Green, Tom Hall, Alex Harker, Jin Hyun Kim,
Andrew Lovett, Rui Penha (and Iris Garrelfs)
This project was inspired by the hulk of an old
upright piano, sitting neglected in ARU's corridor. Mute, due
to the removal of its hammers, the piano was otherwise in good
condition, and the team set about restoring its musical status,
attempting to retain its input control (the keyboard) and resonant
chamber (strings and chassis) while replacing the driver (the hammers) with
computers and electronics.
The group took a methodical approach, taking
time in the morning session to discuss and plan their attack.
They then broke into pairs and threes to tackle the problems
of interface design, electronics and computer processing.
A number of electronic sensors and micro-controllers,
such as sliders and force sensors, were integrated into the keyboard
and pedals of the instrument, feeding a Phidgets USB interface
board connected to a computer running Max/MSP. The computer
used the input data to control low-frequency, noise-based
sound generators, whose output was sent to loudspeakers. The
speakers, however, did not address the audience, but were placed
against the piano's soundboard, so as to induce vibrations in
the piano's chassis and strings. Because of the chassis's construction,
a direct coupling was difficult to achieve, and the vibrations, though
present, were not of the volume required for performance. Undeterred,
the group placed a contact microphone against the strings of the
instrument and piped the recorded sound back to the speakers,
setting up a feedback loop that not only amplified the sound but
also produced a new sonic texture, original yet not entirely
divorced from the familiar piano timbre.
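The control path can be sketched as follows, in Python rather than the Max/MSP the group actually used, and with assumed OSC addresses and audio settings: a low-pass filtered noise source whose level follows an incoming sensor reading, standing in for the low-frequency generators that drove the speakers against the soundboard.

# Illustrative sketch only: the group's noise generators ran in Max/MSP. Here a low-pass
# filtered noise source has its level driven by a (hypothetical) force-sensor reading
# received over OSC, much as the key and pedal sensors drove the generators feeding the
# speakers. Requires numpy, sounddevice and python-osc.
import threading
import numpy as np
import sounddevice as sd
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

SAMPLE_RATE = 44100
level = 0.0        # updated by the OSC handler, read by the audio callback
lp_state = 0.0     # one-pole low-pass filter memory
COEFF = 0.995      # heavy smoothing keeps the noise rumbling at low frequencies

def on_force(address, value):
    global level
    level = max(0.0, min(1.0, value))          # normalised 0..1 sensor reading

def callback(outdata, frames, time, status):
    global lp_state
    noise = np.random.uniform(-1.0, 1.0, frames)
    out = np.empty(frames)
    s = lp_state
    for i in range(frames):                    # one-pole low-pass: keep only the low end
        s = COEFF * s + (1.0 - COEFF) * noise[i]
        out[i] = s
    lp_state = s
    outdata[:, 0] = np.clip(out * level * 20.0, -1.0, 1.0)   # rough make-up gain

dispatcher = Dispatcher()
dispatcher.map("/piano/force", on_force)       # assumed address for the force sensor
server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()

with sd.OutputStream(samplerate=SAMPLE_RATE, channels=1, callback=callback):
    input("Noise generator running; press Enter to stop.\n")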
Andrew, as an experienced pianist and composer
of contemporary music, was able to exploit the piano's new character
in the group's presentation. In addition to using the electronic
control methods, he would also rap on the piano's chassis and
pluck the strings directly to further excite the sound caught
in the feedback loop. Following the presentation, the group experimented
with variations on the system, including different synthesis and
re-synthesis techniques, using Max/MSP and SuperCollider. Although
a full concert was not possible, the overwhelming opinion was
that there was a lot of potential for future collaboration
in both the project and the group.
Equipment: 2 Apple Mac PowerBooks (running Max/MSP),
1 Apple MacBook Pro (running SuperCollider), Mobile I/O audio
interface, Edirol UA-25 audio interface, piano chassis, Phidgets
USB interface board, contact microphone, Impact game controller,
Marshall mini guitar amplifier, Genelec monitors, speakers and various
sensors and micro-controllers (including force sensors, sliders and an
Apple HDD tilt sensor). |
 |
|
|
(Images © 2006 Cambridge Performance Technology Workshop)
|
|
FEEDBACK AND CONCLUSIONS
At the end of the day, all participants met one last time over
coffee to talk about the workshop: what they had achieved, whether
it met their expectations and whether it was a good model for
future events.
The response was overwhelmingly positive - everyone had
found the event both educational and enjoyable. Many had made
contacts that they looked forward to working with in the future;
others had got a taste of technologies that could take their work
and art in new directions.
One criticism shared by nearly all the participants, however,
was that the workshop was simply too short: one day was not enough.
As well as making the day feel a little rushed, the limited time led,
participants felt, to a focus on developing and evaluating
technologies, leaving little room for artistic exploration of
how such technologies could be used. Many would have liked
longer, so that they could refine their projects, perhaps
with the aim of preparing polished performances or a concert.
The general consensus was that the event should last between two
days and one week. With the extra days, several participants
said they would also welcome masterclass sessions from experts
in different fields on different subjects.
These factors aside, as an initial foray into the field,
the event proved highly successful and demonstrated the value
of practice-based research in live coding and performance, as
well as in music (and media) technology in general. |
|
|