In the months since ICLC I have been reconfiguring the way I perform my live coding sets. As live coding forms an integral part of the research and work I am doing for my MA, I have chosen to improve my skills at improvising from a greater pool of musical materials during performance, as well as to incorporate a more cohesive and considered visual element into my performances.
I have recently played for OXJAM Newcastle/Gateshead takeover, NARC. Magazine Presents… and Manchester Algorave, and I am due to perform at Jam Jah Minifest, London Algorave and others.
I am currently developing a program in openFrameworks to create a visual backdrop to my live coding sets. The impetus for this development was the NARC. Magazine night, which was focused on audiovisual performance (featuring analogue film collective Filmbee). The program currently displays a number of sound-responsive elements, and can be sequenced easily using Tasks within SuperCollider. As I have recently switched to Ubuntu as my main system, I can display the visuals behind my sets using the transparency option in Compiz.
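A common way to drive visuals like these from SuperCollider Tasks is OSC messaging over UDP. As an illustration only (this is not my actual patch, and the address and port are hypothetical), here is a minimal stdlib-Python sketch of encoding the kind of OSC message a Task might send to an openFrameworks listener:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as the OSC spec requires."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "f" * len(args)).encode())  # type-tag string
    for a in args:
        msg += struct.pack(">f", a)  # big-endian float32
    return msg

# e.g. set a (hypothetical) visual element's brightness to 0.8:
packet = osc_message("/visuals/brightness", 0.8)
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 12000))
```

In practice SuperCollider's built-in NetAddr does this encoding for you; the sketch just shows what travels over the wire.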
I will be uploading a full sound and screen capture of a live coding set to my Vimeo account soon.
I have also been redeveloping the sounds I use when performing. I am experimenting with tonal ideas based on the harmonic series, with synthesis using Nicholas Collins’s chaos theory UGens, and generally with a more flexible approach to tempo and rhythm, inspired by drum and bass and hip-hop rather than the rigid techno tempos I have used in my music for the past few years. A couple of my most recent sets are uploaded to Bandcamp; check co34pt.bandcamp.com for a rolling archive of recorded live sets.
I have also been using live coding as a tool for improvising with other musicians, including techno duo Motmot with Tim Shaw, as well as using Tidal in a number of free improvisation sessions featuring Elvin Brandhi, John Bowers and Charlie Bramley.
Here are a few ‘live albums’ of live coding sets I’ve been performing. I’m hoping this will form a rolling archive of sets I perform, so keep an eye out for future releases.
My final degree recital will take place on the 8th of May at Culture Lab, as part of a gig alongside SNAILS WITH NAILS and COOKING WITH FAYE, starting at 7pm.
I’m going to be performing a 40 minute set of live coded dance music and live coded lighting.
This is a screen and audio recording of my SuperCollider techno set for the Gateshead Algorave, which also featured Ballotts, JoAnne, Miss Blueberry, Section_9, Tim Shaw, Yeah You, Alo Allik, Sick Lincoln and Mariam Rezeai.
“Algorave is made from ‘sounds wholly or predominantly characterised by the emission of a succession of repetitive conditionals’. These days just about all electronic music is made using software, but with artificial barriers between the people creating the software algorithms and the people making the music. Using systems built for creating algorithmic music, such as IXI Lang, overtone, puredata, Max/MSP, SuperCollider, Impromptu, Fluxus and Tidal, these barriers are broken down, and musicians are able to compose and work live with their music as algorithms. This has good and bad sides, but a different approach leads to interesting places.”
Site about the event by RI RV – http://cargocollective.com/onfcopoe/otfCp00
photo by RI RV
‘And all we said was “Saved”!’ is a recreation of a set performed at the ‘on-the-fly codepoetry’ event at Wharf Chambers in Leeds, hosted by RI RV on the 25th of January.
This live coding set uses a pool of sixteen public domain poems, which are cut into lines, then sequenced from SuperCollider using Tasks and displayed using Processing as a shifting, live-codeable, generative seven-line poem.
I built up the poem over time, adding a Task for each line, and as the poem built up I live coded a soundtrack based on words I observed in the fleeting dialogue, in an attempt to capture those words through mimetic sound. This version of the set ascends and descends over half an hour, and when everything has stopped we are left with a final poem: a snapshot of the shifting dialogue seen throughout the set.
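The cut-up-and-resequence idea can be sketched outside SuperCollider and Processing. This illustrative Python fragment (the function names and the slot-replacement rule are my assumptions here, not the performance code) cuts a pool of poems into lines and repeatedly swaps one of seven displayed slots:

```python
import random

def cut_up(poems: list) -> list:
    """Cut a pool of poem texts into individual non-empty lines."""
    return [line.strip() for poem in poems
            for line in poem.splitlines() if line.strip()]

def step(slots: list, pool: list, rng: random.Random) -> None:
    """Replace one of the displayed lines with a random line from the pool."""
    slots[rng.randrange(len(slots))] = rng.choice(pool)

pool = cut_up(["And dead men, dead men,\nA slow sweet comer,",
               'And all we said was "Saved"!'])
slots = [""] * 7          # the seven-line poem starts empty
rng = random.Random(0)
for _ in range(20):       # each step shifts the dialogue a little
    step(slots, pool, rng)
```

In the actual set each replacement was triggered by a SuperCollider Task, so the poem's pacing could itself be live coded.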
The final poem was:
And dead men, dead men,
New fetters on our hoped-for liberty:
Hardening to rage in a flame of chiselled stone,
There came a dreadful, piercing sound,
Far in the hillside camp, in slumber lies
A slow sweet comer,
And all we said was “Saved”!
The poems prepared for the set were:
Aldous Huxley – Stanzas
Charles Baudelaire – The Albatross
Felix Jung – Anemophobia
Felix Jung – Indecision
Laurence Hope – Mahomed Akram’s Appeal to the Stars
Walt Whitman – A Hand-Mirror
Bertolt Brecht – A Worker Reads History
Christina Georgina Rossetti – May
Christina Georgina Rossetti – Tempus Fugit
Elizabeth Jennings – Absence
Ella Wheeler Wilcox – Europe
Emily Elizabeth Dickinson – Saved!
Johann Wolfgang von Goethe – To Originals
Madison Julius Cawein – Imperfection
Paul Cameron Brown – Skootematta
Thomas Fredrick Young – A Dream
The setup code and the resulting code created during the performance can be downloaded here: https://www.dropbox.com/s/9jmtlvjfvhzn724/on-the-fly%20codepoetry%20Sean%20Cotterill.zip?dl=0
For the ZENDEH Micro Award I was commissioned by a panel of young people to produce pieces of work for an event at the Literary and Philosophical Society based on the script for Zendeh’s new piece CINEMA, a personal and intimate play surrounding the tragic Cinema Rex fire in Abadan, Iran.
The event showcased installations by the artists who received the award, as well as excerpts from CINEMA, poetry readings, an Iranian tar recital and public workshops and activities also hosted by the artists.
The video above is the culmination of the workshop that I held for the ZENDEH Micro Award. I hosted a ten-person workshop using small, self-contained synthesizers with speakers, based on 555 timers and LM386 audio amplifiers and built on breadboards. Each synthesizer had one potentiometer to control frequency, which simply ranged from low to high. Through the course of the workshop I introduced the participants to the synthesizers, and guided them through customising the frequency range to their liking by switching out capacitors in the circuit.
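The standard astable-mode frequency formula for the 555 shows why swapping capacitors shifts the whole range. The component values below are hypothetical, not necessarily those used in the workshop circuits:

```python
def astable_freq(r1_ohms: float, r2_ohms: float, c_farads: float) -> float:
    """Approximate output frequency of a 555 timer in astable mode:
    f = 1.44 / ((R1 + 2*R2) * C)."""
    return 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)

# Hypothetical example: R1 = 1k, R2 = 100k pitch pot, C = 100 nF
lowest = astable_freq(1000, 100000, 100e-9)   # pot at full resistance, ~72 Hz
highest = astable_freq(1000, 0, 100e-9)       # pot near zero, ~14.4 kHz
# Swapping in a 1 uF capacitor scales the entire range down tenfold,
# which is exactly what changing capacitors in the workshop did.
```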
Once everyone had customised their instrument, I introduced the graphic score for the performance: a graph of the population of Abadan, Iran (the site of the Cinema Rex catastrophe). The idea was to use the graph as inspiration rather than as a dogmatic 1:1 score that must be obeyed; we then gave a five-minute performance with the synthesizers, inspired by the graphic score.
The workshop was designed to be simple and open, with no obligation at any point to submit to any fixed rules surrounding the performance. The result was an engaging performance which everyone enjoyed, and an incredible sound in the rich acoustic space of the James Knott room in the Lit and Phil. The synthesizers were taken home by the workshop participants, along with information about the construction of the circuit, and further modifications that were possible.
The other part of my work for the ZENDEH Micro Award was a proximity-activated, four-channel sound and light installation, a personal response to the script and the Cinema Rex fire, which responded dynamically to interaction with participants and observers, creating a personal, immersive and interactive experience. Documentation of this piece is forthcoming.
Thanks to ZENDEH for curating, commissioning and organising the event, and Matt Jamie for the video/audio documentation.
http://pastebin.com/WTxh15uM – The ‘setup’ file used; the file paths at the bottom are specific to my system and point to folders of percussive samples.
http://pastebin.com/vp4qG8K0 – The code written during the performance. Samples b,c,e,f were live-recorded voice snippets.
Here is a video, and the code produced, of my last live coding performance at Newcastle University on the 13th of January. The performance was a response to two briefs given for a first-semester project: ‘the voice’ and ‘repetition’. The first half of the performance builds up a texture by manipulating and layering four live-recorded voice samples. The second half introduces only percussive elements, layering conditional-based percussive patterns on top of the skeleton of the ‘voice’ section to create very little verbatim repetition, but rather a repetitive framework based on simple mathematical structures.
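Conditional-based percussion of this kind can be sketched as boolean tests on a step index. The rules below are illustrative stand-ins in Python, not the SuperCollider code from the set (which is in the pastebin above):

```python
def pattern(steps: int):
    """Generate hit/rest patterns for three percussion voices from simple
    conditionals on the step index: repetitive structure, little verbatim
    repetition when the rules interact over longer cycles."""
    kick  = [i % 4 == 0 for i in range(steps)]                 # four-to-the-floor
    hat   = [i % 2 == 1 for i in range(steps)]                 # offbeat hats
    snare = [i % 8 == 4 or i % 16 == 15 for i in range(steps)] # backbeat + fill
    return kick, hat, snare

kick, hat, snare = pattern(16)
```

Because each voice's rule has a different period, the combined pattern only repeats at the least common multiple of the periods, which is how simple conditionals yield a repetitive framework without literal looping.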
I’ve also uploaded a piece of live coding based on the harmonic series of a 200Hz drone, along with code – pastebin.com/6SSbtEnd.
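The arithmetic behind harmonic-series material is simple: partial n sits at n times the fundamental, so a 200Hz drone implies the pitch pool sketched here in Python:

```python
def harmonic_series(fundamental: float, n_partials: int) -> list:
    """Frequencies (Hz) of the first n partials of a harmonic series."""
    return [fundamental * n for n in range(1, n_partials + 1)]

harmonic_series(200.0, 6)  # -> [200.0, 400.0, 600.0, 800.0, 1000.0, 1200.0]
```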
On the 25th of January I’ll be going to Leeds for a live coding performance using cut-up public domain poetry and semantically derived musical content – ‘on-the-fly codepoetry’ at Wharf Chambers, Leeds – http://www.wharfchambers.org/events/icalrepeat.detail/2015/01/25/549/78/on-the-fly-codepoetry.html
A fantastic audio/video recording by Gustav Thomas and Elvin Brandhi of a Red Pools set I performed for October’s Blue Rinse as part of Newcastle University Music Festival.
Probably the last set of its kind, performing material from all three current Red Pools releases. In addition to the music, I designed a simple Max for Live patch which filtered the audio into three bands and turned the amplitude of each band into a DMX RGB value, creating a very simple but effective light show in which two LED lights responded dynamically to the music; this can be seen in the video.
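The patch itself was built in Max for Live, but the core band-to-colour mapping can be sketched in Python; the clamping and 8-bit scaling here are my illustrative assumptions, not the patch's exact scaling:

```python
def bands_to_dmx(low: float, mid: float, high: float):
    """Map three band amplitudes (0.0-1.0) to 8-bit DMX RGB channel values,
    clamping out-of-range amplitudes so the fixture always gets valid data."""
    def clamp(a: float) -> float:
        return max(0.0, min(1.0, a))
    return tuple(int(clamp(a) * 255) for a in (low, mid, high))

bands_to_dmx(1.0, 0.5, 0.0)  # -> (255, 127, 0)
```

With low frequencies driving red, mids driving green and highs driving blue, a kick-heavy passage washes the room red while hi-hats push it towards blue.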
For my Major Project this year I am developing the discipline of Live Coding using the SuperCollider programming language.
After practicing live electronic music for a few years, I’ve decided to do this partly to perform using open source software, but also because live coding allows me a much more tactile approach to electronic performance. Previously my performances had been very predetermined, in the sense that most of the material I played was fixed, with relatively little scope for determining the arc of the performance on the fly.
With live coding I can, to a much greater degree, merge electronic music performance and improvisation. Playing music determined by code and algorithms written live gives me scope to improvise performances, and programmed randomness lets me perform electronic music that grows organically and can be edited on the fly, in real time.
After a long gestation period, I have finally released the latest Red Pools album on We Are All Ghosts.
Conceived in the manner of a two-‘sided’ release, Corrugate comprises two expansive tracks, which expand and contract organically using found song, recorded samples, feedback and dense electroacoustic sculpting, mediated through a muscular rhythmic pulse.