
Live Coding, Algorave & 3D Visuals

In the months since ICLC I have been reconfiguring the way I perform my live coding sets. As live coding forms an integral part of the research and work I am doing for my MA, I have chosen to improve my skills at improvising from a greater pool of musical materials during performance, as well as incorporating a more cohesive and considered visual element into my performances.

I have recently played for OXJAM Newcastle/Gateshead takeover, NARC. Magazine Presents… and Manchester Algorave, and I am due to perform at Jam Jah Minifest, London Algorave and others.

I am currently developing a program in openFrameworks to create a visual backdrop to my live coding sets. The impetus for this development was the NARC. Magazine night, which focused on audiovisual performance (featuring analogue film collective Filmbee). The program currently displays a number of sound-responsive elements, and can be sequenced easily using Tasks within SuperCollider. As I have recently switched to Ubuntu as my main system, I can display the visuals behind my sets using the transparency option in Compiz.
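As a rough illustration of the sequencing idea, a SuperCollider Task can drive an external visuals program over OSC. This is a hypothetical sketch, not my actual setup code: the port number and the `/visuals/scene` message path are assumptions standing in for whatever the openFrameworks app listens for.

```supercollider
// Hypothetical sketch: sequencing a visual scene change from SuperCollider.
// Assumes the openFrameworks app listens for OSC on port 12000 and
// understands a "/visuals/scene" message -- both are illustrative names.
(
~visuals = NetAddr("127.0.0.1", 12000);
~sceneTask = Task({
    inf.do { |i|
        ~visuals.sendMsg("/visuals/scene", i % 4); // cycle through four scenes
        4.wait;                                    // hold each scene for four beats
    };
});
~sceneTask.play;
)
```

Because the Task runs on SuperCollider's clock alongside the audio patterns, the visuals stay in step with the music without any extra synchronisation machinery.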

I will be uploading a full sound and screen capture of a live coding set to my Vimeo account soon.

I have also been re-developing the sounds I use when performing. I am experimenting with tonal ideas based on the harmonic series, synthesis using Nicholas Collins's chaos theory UGens, and generally taking a more flexible approach to tempo and rhythm, inspired by Drum and Bass and Hip-Hop rather than the rigid techno tempos I have been using in my music for the past few years. A couple of my most recent sets are uploaded to Bandcamp here. Check back for a rolling archive of recorded live sets.
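To give a flavour of the harmonic-series idea, here is a minimal sketch, not material from my sets: it stacks the first eight partials over an assumed fundamental and mixes in one of SuperCollider's standard chaos UGens (`LorenzL`, used here as a stand-in for the kind of chaotic source described above).

```supercollider
// Hypothetical sketch: a drone built from the harmonic series,
// with a chaotic partial layered on top. The 50 Hz fundamental,
// partial count and mix levels are all arbitrary choices.
(
{
    var fund = 50;
    var partials = (1..8).collect { |n|
        SinOsc.ar(fund * n, 0, 1 / n) // each partial quieter than the last
    }.sum * 0.1;
    var chaos = LorenzL.ar(SampleRate.ir) * 0.05; // chaotic noise layer
    (partials + chaos) ! 2 // duplicate to stereo
}.play;
)
```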

I have also been using live coding as a tool for improvising with other musicians, including the techno duo Motmot with Tim Shaw, as well as using Tidal in a number of free improvisation sessions featuring Elvin Brandhi, John Bowers and Charlie Bramley.

co¥ᄀpt live releases

Here are a few ‘live albums’ of live coding sets I’ve been performing. I’m hoping this will form a rolling archive of sets I perform, so keep an eye out for future releases.

My final degree recital will be taking place on the 8th of May at Culture Lab as part of a gig alongside SNAILS WITH NAILS and COOKING WITH FAYE, starting at 7pm.

I’m going to be performing a 40 minute set of live coded dance music and live coded lighting.

BABBLE – 27th April 2015


I’ll be performing at BALTIC on the 27th of April for BABBLE, an event focusing on interconnection between sound and text. I’ll be live-coding vocal processing of poems read by Charlie Dearnley and Lauren Vevers using SuperCollider, accompanied by music from TURC.

Gateshead Algorave – 14th March 2015

This is a screen and audio recording of my SuperCollider techno set for the Gateshead Algorave, which also featured Ballotts, JoAnne, Miss Blueberry, Section_9, Tim Shaw, Yeah You, Alo Allik, Sick Lincoln and Mariam Rezaei.

“Algorave is made from ‘sounds wholly or predominantly characterised by the emission of a succession of repetitive conditionals’. These days just about all electronic music is made using software, but with artificial barriers between the people creating the software algorithms and the people making the music. Using systems built for creating algorithmic music, such as IXI Lang, overtone, puredata, Max/MSP, SuperCollider, Impromptu, Fluxus and Tidal, these barriers are broken down, and musicians are able to compose and work live with their music as algorithms. This has good and bad sides, but a different approach leads to interesting places.”

‘And all we said was “Saved”!’ – for on-the-fly codepoetry


Site about the event by RI RV –

photo by RI RV

‘And all we said was “Saved”!’ is a recreation of a set performed at the ‘on-the-fly codepoetry’ event at Wharf Chambers in Leeds, hosted by RI RV on the 25th of January.

This live coding set uses a pool of sixteen public domain poems, which are cut into lines, sequenced from SuperCollider using Tasks, and displayed in Processing as a shifting, live-codeable, generative seven-line poem.

I built up the poem over time, adding a Task for each line to perform. As the poem built up I live coded a soundtrack based on words I observed in the fleeting dialogue, in an attempt to capture those words through mimetic sound. This version of the set ascends and descends over half an hour, and when everything has stopped we are left with a final poem, a snapshot of the shifting dialogue seen throughout the set.

The final poem was:


And dead men, dead men,

New fetters on our hoped-for liberty:

Hardening to rage in a flame of chiselled stone,

There came a dreadful, piercing sound,

Far in the hillside camp, in slumber lies

A slow sweet comer,

And all we said was “Saved”!


The poems prepared for the set were:

Aldous Huxley – Stanzas
Charles Baudelaire – The Albatross
Felix Jung – Anemophobia
Felix Jung – Indecision
Laurence Hope – Mahomed Akram’s Appeal to the Stars
Walt Whitman – A Hand-Mirror
Bertolt Brecht – A Worker Reads History
Christina Georgina Rossetti – May
Christina Georgina Rossetti – Tempus Fugit
Elizabeth Jennings – Absence
Ella Wheeler Wilcox – Europe
Emily Elizabeth Dickinson – Saved!
Johann Wolfgang von Goethe – To Originals
Madison Julius Cawein – Imperfection
Paul Cameron Brown – Skootematta
Thomas Fredrick Young – A Dream

The setup code, and the resulting code created during the performance, can be downloaded here.
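The line-sequencing mechanism described above can be sketched roughly as follows. This is not the downloadable setup code: the OSC port, the `/poem/line` message path and the sample lines are all illustrative assumptions; the real set used a pool of sixteen full poems.

```supercollider
// Hypothetical sketch: a Task draws lines from a pool and sends them,
// with a slot index, to a display program (e.g. a Processing sketch)
// listening for OSC. Address, port and message path are assumptions.
(
~display = NetAddr("127.0.0.1", 12000);
~lines = [
    "And dead men, dead men,",
    "There came a dreadful, piercing sound,",
    "A slow sweet comer,"
];
~poemTask = Task({
    inf.do { |i|
        var slot = i % 7;                            // one of seven poem lines
        ~display.sendMsg("/poem/line", slot, ~lines.choose);
        8.wait;                                      // replace a line every 8 beats
    };
});
~poemTask.play;
)
```

Because each slot is re-sent with a fresh line on every pass, the displayed poem keeps shifting while always reading as a complete seven-line text, and stopping the Task freezes whatever snapshot is on screen.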

State of Grace – Dance City Training Lab 17th-18th January

I was invited back as the musical director for a State of Grace psychophysical training lab on the weekend of the 17th–18th of January. Focusing on developing emergent narratives through improvisation, a group of performers (dancers, poets, artists) took part in several exercises to develop character-based narratives over the course of two days, leading up to a half-hour open improvisation at the conclusion of the weekend.

For the weekend I was improvising musical dialogues with the performers, engaged in a symbiotic relationship, both drawing on and contributing to the group's physical performance. I used live coding extensively throughout, using SuperCollider to create organic but very flexible musical structures, with sets of simple rules and mimetic sound-performance relationships developing and dissolving into and out of complexity over time, all in direct reaction to the performative strands occurring at the time.

This video is an edited set of footage of the improvisation and the events surrounding it, taken by Matt Jamie. The weekend was led by Ben Ayerton and Lizzie Klotz.

Live Coding – 13th Jan 2015 – The ‘setup’ file used (the filepaths at the bottom are specific to my system, pointing to folders filled with percussive samples) – The code written during the performance. Samples b, c, e and f were live-recorded voice snippets.

Here is a video, and the code produced, of my last live coding performance at Newcastle University on the 13th of January. The performance was a response to two briefs given for a first-semester project: ‘the voice’ and ‘repetition’. The first half of the performance builds up a texture by manipulating and layering four live-recorded voice samples; the second half introduces only percussive elements, layering conditional-based percussive patterns on top of the skeleton of the ‘voice’ section to create very little verbatim repetition, but a repetitive framework based on simple mathematical structures.
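A conditional-based percussive pattern of the kind described above might look something like this. This is a minimal sketch rather than the performance code: `\perc` is an assumed sample-playback SynthDef, and the modular-arithmetic rule is just one example of the "simple mathematical structures" idea.

```supercollider
// Hypothetical sketch: a percussive layer where simple modular
// arithmetic on a step counter decides which hits actually sound.
// \perc is an assumed SynthDef; the 4/7 rule is illustrative.
(
Pbind(
    \instrument, \perc,
    \dur, 0.25,                        // sixteenth-note grid
    \count, Pseries(0, 1, inf),        // running step counter
    \amp, Pfunc { |ev|
        // sound every 4th step, plus every 7th step, silent otherwise
        if ((ev[\count] % 4 == 0) or: { ev[\count] % 7 == 0 }) { 0.8 } { 0 }
    }
).play;
)
```

Because 4 and 7 share no common factor, the combined pattern only repeats every 28 steps, giving a framework that feels repetitive without much verbatim repetition.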

I’ve also uploaded a piece of live coding based on the harmonic series of a 200Hz drone,

along with code – 

On the 25th of January I’ll be going to Leeds for a live coding performance using cut-up public domain poetry and semantically derived musical content – ‘on-the-fly codepoetry’ at Wharf Chambers, Leeds –


Live Coding – 11th December 2014 – ICMuS Student Concert

For my Major Project this year I am developing the discipline of Live Coding using the SuperCollider programming language.

After practicing live electronic music for a few years, I’ve decided to do this partly to perform using open source software, but also because live coding allows me a much more tactile approach to electronic performance. Previously my performances had been very predetermined, in the sense that most of the material I played was fixed, with relatively little scope for determining the arc of the performance on the fly.

With live coding I can merge electronic music performance and improvisation to a much greater degree. Playing music determined by code and algorithms written live gives me scope to improvise performances, and to use programmed randomness to perform electronic music that grows organically and can be edited on the fly, in real time.