Gateshead Algorave – 14th March 2015

This is a screen and audio recording of my SuperCollider techno set for the Gateshead Algorave, which also featured Ballotts, JoAnne, Miss Blueberry, Section_9, Tim Shaw, Yeah You, Alo Allik, Sick Lincoln and Mariam Rezeai.

“Algorave is made from “sounds wholly or predominantly characterised by the emission of a succession of repetitive conditionals”. These days just about all electronic music is made using software, but with artificial barriers between the people creating the software algorithms and the people making the music. Using systems built for creating algorithmic music, such as IXI Lang, overtone, puredata, Max/MSP, SuperCollider, Impromptu, Fluxus and Tidal, these barriers are broken down, and musicians are able to compose and work live with their music as algorithms. This has good and bad sides, but a different approach leads to interesting places.”

http://algorave.com

‘And all we said was “Saved”!’ – for on-the-fly codepoetry


Site about the event by RI RV – http://cargocollective.com/onfcopoe/otfCp00

photo by RI RV

‘And all we said was “Saved”!’ is a recreation of a set performed at the ‘on-the-fly codepoetry’ event at Wharf Chambers in Leeds, hosted by RI RV on the 25th of January 2015.

This live coding set uses a pool of sixteen public domain poems, which are cut into lines, sequenced from SuperCollider using Tasks, and displayed in Processing as a shifting, live-codeable, generative seven-line poem.
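The full setup and performance code can be downloaded at the end of this post; purely as a sketch of the mechanism, one Task per line might repeatedly choose a line from its pool and send it to the Processing sketch over OSC (the address, port and eight-beat interval here are my own assumptions, not the values actually used):

(
// Sketch only: a Task that repeatedly picks a line from a pool of cut-up
// poem lines and sends it to a Processing sketch listening on OSC.
// The address, port and timing are assumptions.
~processing = NetAddr("127.0.0.1", 12000);
~linePool = [
	"And dead men, dead men,",
	"A slow sweet comer,"
]; // cut-up lines from the pool of poems

~line1 = Task({
	loop {
		~processing.sendMsg("/line/1", ~linePool.choose);
		8.wait;
	}
});
~line1.play;
)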

I built up the poem over time, adding a Task for each line to perform, and as the poem built up I live coded a soundtrack based on words I observed in the fleeting dialogue, attempting to capture those words through mimetic sound. This version of the set ascends and descends over half an hour, and when everything has stopped we are left with a final poem, a snapshot of the shifting dialogue seen throughout the set.

The final poem was:

/*

And dead men, dead men,

New fetters on our hoped-for liberty:

Hardening to rage in a flame of chiselled stone,

There came a dreadful, piercing sound,

Far in the hillside camp, in slumber lies

A slow sweet comer,

And all we said was “Saved”!

*/

The poems prepared for the set were:

Aldous Huxley – Stanzas
Charles Baudelaire – The Albatross
Felix Jung – Anemophobia
Felix Jung – Indecision
Laurence Hope – Mahomed Akram’s Appeal to the Stars
Walt Whitman – A Hand-Mirror
Bertolt Brecht – A Worker Reads History
Christina Georgina Rossetti – May
Christina Georgina Rossetti – Tempus Fugit
Elizabeth Jennings – Absence
Ella Wheeler Wilcox – Europe
Emily Elizabeth Dickinson – Saved!
Johann Wolfgang von Goethe – To Originals
Madison Julius Cawein – Imperfection
Paul Cameron Brown – Skootematta
Thomas Fredrick Young – A Dream

The setup code and the resulting code created during the performance can be downloaded here: https://www.dropbox.com/s/9jmtlvjfvhzn724/on-the-fly%20codepoetry%20Sean%20Cotterill.zip?dl=0

State of Grace – Dance City Training Lab 17th-18th January

I was invited back as the musical director for a State of Grace Psychophysical training lab on the weekend of the 17th-18th of January. Focusing on developing emergent narratives through improvisation, a group of performers (dancers, poets, artists) took part in several exercises to develop character-based narratives over the course of two days, leading up to a half-hour open improvisation at the conclusion of the weekend.

For the weekend I was improvising musical dialogues with the performers, in a symbiotic relationship both drawing on and contributing to the group's physical performance. I used live coding extensively throughout the weekend, using SuperCollider to create organic but very flexible musical structures, with sets of simple rules and mimetic sound-performance relationships developing and dissolving into and out of complexity over time, all in direct reaction to the performative strands occurring at the time.

This video is an edited selection of footage of the improvisation and the events surrounding it, taken by Matt Jamie. The weekend was led by Ben Ayerton and Lizzie Klotz.

ZENDEH Micro Award – Abadan 10-06

For the ZENDEH Micro Award I was commissioned by a panel of young people to produce pieces of work for an event at the Literary and Philosophical Society based on the script for Zendeh’s new piece CINEMA, a personal and intimate play surrounding the tragic Cinema Rex fire in Abadan, Iran.

The event showcased installations by the artists who received the award, as well as excerpts from CINEMA, poetry readings, an Iranian tar recital and public workshops and activities also hosted by the artists.

The video above is the culmination of the workshop that I held for the ZENDEH Micro Award. I hosted a ten-person workshop using small, self-contained synthesizers with speakers, based on 555 timers and LM386 audio amplifier chips and built on breadboards. Each synthesizer had a single potentiometer controlling frequency, ranging from low to high. Through the course of the workshop I introduced the participants to the synthesizers, and guided them through customising the frequency range to their liking by switching out capacitors in the circuit.
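For context, assuming the usual 555 astable configuration, the oscillator frequency is roughly f = 1.44 / ((R1 + 2·R2) · C), which is why swapping the timing capacitor shifts the whole frequency range while the potentiometer sweeps within it. The component values below are purely illustrative, not the ones used in the workshop circuits:

(
// Approximate 555 astable frequency: f ~= 1.44 / ((R1 + 2*R2) * C).
// The resistor and capacitor values here are hypothetical examples,
// just to show how the capacitor choice shifts the range.
var r1 = 1000, r2 = 10000; // ohms
[100e-9, 10e-9].do { |c|
	("C = % F -> approx. % Hz".format(c, (1.44 / ((r1 + (2 * r2)) * c)).round(1))).postln;
};
)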

Once everyone had customised their instrument, I introduced the graphic score for the performance – a graph of the population of Abadan, Iran (the site of the Cinema Rex catastrophe). The idea was to use the graph as inspiration rather than as a dogmatic 1:1 score which must be obeyed; the group then gave a five-minute performance with the synthesizers, inspired by the graphic score.

The workshop was designed to be simple and open, with no obligation at any point to submit to any fixed rules surrounding the performance. The result was an engaging performance which everyone enjoyed, and an incredible sound in the rich acoustic space of the James Knott room in the Lit and Phil. The synthesizers were taken home by the workshop participants, along with information about the construction of the circuit, and further modifications that were possible.

The other part of my work for the ZENDEH Micro Award was a proximity-activated, four-channel sound and light installation, a personal response to the script and the Cinema Rex fire, which responded dynamically to interaction with participants and observers, creating a personal, immersive and interactive experience. Documentation of this piece is forthcoming.

Thanks to ZENDEH for curating, commissioning and organising the event, and Matt Jamie for the video/audio documentation.

Live Coding – 13th Jan 2015

http://pastebin.com/WTxh15uM – The ‘setup’ file used; the filepaths at the bottom are specific to my system and point to folders of percussive samples.

http://pastebin.com/vp4qG8K0 – The code written during the performance. Samples b, c, e and f were live-recorded voice snippets.

Here is a video, along with the code produced, of my latest live coding performance at Newcastle University on the 13th of January. The performance was a response to two briefs given for a project in the first semester of the year – ‘the voice’ and ‘repetition’. The first half of the performance builds up a texture by manipulating and layering four live-recorded voice samples; the second half introduces only percussive elements, layering conditional-based percussive patterns on top of the skeleton of the ‘voice’ section. This creates very little verbatim repetition, but rather a repetitive framework based on simple mathematical structures.
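The actual performance code is in the pastebin links above; purely as an illustration of what a conditional-based percussive pattern can look like in SuperCollider (\kick here is a placeholder for a loaded SynthDef, not one from the set):

(
// Sketch only: the hit sounds when a simple arithmetic condition on the
// beat counter is true, giving a repetitive framework without verbatim
// repetition. \kick is assumed to be a SynthDef already on the server.
~beat = 0;
t = TempoClock(132/60);
t.sched(0, {
	if((~beat % 4 == 0) or: { (~beat % 16) > 12 }, {
		Synth(\kick);
	});
	~beat = ~beat + 1;
	0.25 // come back every sixteenth note
});
)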

I’ve also uploaded a piece of live coding based on the harmonic series of a 200 Hz drone, along with the code.
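As an illustration of the general idea (not the code from that recording), a harmonic-series drone over a 200 Hz fundamental might look something like this:

(
// Sketch only: sine partials at integer multiples of a 200 Hz
// fundamental, with amplitude falling off per partial.
{
	var fund = 200;
	Mix.fill(8, { |i|
		SinOsc.ar(fund * (i + 1), 0, 0.1 / (i + 1))
	}).dup
}.play;
)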

On the 25th of January I’ll be going to Leeds for a live coding performance using cut-up public domain poetry and semantically derived musical content – ‘on-the-fly codepoetry’ at Wharf Chambers, Leeds – http://www.wharfchambers.org/events/icalrepeat.detail/2015/01/25/549/78/on-the-fly-codepoetry.html


Red Pools – Live at Blue Rinse 30-10-14

A fantastic audio/video recording by Gustav Thomas and Elvin Brandhi of a Red Pools set I performed for October’s Blue Rinse as part of Newcastle University Music Festival.

Probably the last set of its kind, performing material from all three current Red Pools releases. In addition to the music, I designed a simple Max for Live patch which filtered the audio being played into three bands and turned the amplitude of each band into DMX RGB values, creating a very simple but effective light show using two LED lights that responded dynamically to the music; this can be seen in the video.
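The patch itself was built in Max for Live, but the core idea – split the audio into three bands, follow each band's amplitude, and treat the three values as R, G and B – is straightforward. A rough SuperCollider sketch of just the analysis side (the band edges are my own assumptions, and the DMX output is replaced here by an OSC reply):

(
// Sketch only: three band amplitudes sent back to the language, where
// they could be scaled into DMX RGB values. Band edges are assumptions.
{
	var in = SoundIn.ar(0);
	var low = Amplitude.kr(LPF.ar(in, 200));
	var mid = Amplitude.kr(BPF.ar(in, 1000, 1));
	var high = Amplitude.kr(HPF.ar(in, 4000));
	SendReply.kr(Impulse.kr(30), '/rgb', [low, mid, high]);
	Silent.ar;
}.play;

OSCdef(\rgb, { |msg|
	// msg[3..5] holds the three band amplitudes (roughly 0-1)
	msg[3..5].round(0.01).postln;
}, '/rgb');
)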

RE/CEPTOR – TUSK Festival 2014


This is the video (taken by Harry Wheeler) from the TUSK Archive of the first of three performances by RE/CEPTOR at Blank Studios.

RE/CEPTOR was a bespoke project for TUSK Festival comprising the long-established duo Trans/Human (with whom I’ve worked before), contemporary dancer and movement artist Nicole Vivien Watson and myself.

The project is a multifaceted one. The three performances for TUSK Festival involved myself and Nicole performing in Blank Studios, with Trans/Human performing concurrently in Marseille. The audio from Trans/Human’s performance was streamed live to Blank Studios via Nicecast, an OS X internet radio streaming application, and then fed into the performance setup in Blank Studios – more on this later. The performances used a four-speaker ‘surround’ sound setup.

The main focal point for the performances was Nicole and her movement. I designed a custom wireless sensor array for her to wear throughout the performance, tracking light, body capacitance, temperature and movement. The sensors were attached to Nicole at relevant points (hands for movement, fingers for body capacitance, and so on) and read by an Arduino Nano, which collected the data and sent it wirelessly, via an inexpensive nRF24L01 chip, to a second Arduino connected to my laptop, for which I built a shield housing another nRF24L01 chip. The data received by this second Arduino was read into Max 6, and formed a crucial aspect of the performance as a whole.

Here are some photos of the prototyping, building and wearing of the sensor array.


Top to bottom: Prototyped version of the sensor array, the Arduino and radio chip, the completed array, and one of Nicole’s hands showing an accelerometer and body capacitance sensors.

Once the data was in Max 6 it was used to control a number of aspects of the performance. Nicole’s movement controlled effects on the stereo feed from Marseille – changing filter poles and adding delay and decay to the feed. This effects-processed stereo feed was broadcast on two of the four speakers in the room, the ones in the back two corners. The other two speakers carried two granular synthesisers, one controlled by Nicole’s movement and one controlled by me. Nicole’s body temperature and body capacitance were also summed to create a low-frequency sine wave drone played on all four speakers. Nicole’s data also controlled, via DMX, the strobe light which was the main source of light in the space. For the first performance (the one in the video) the strobe frequency was a summation of body temperature, light and body capacitance; however, after discussion with Nicole we felt that this obfuscated the relationship between her and the strobe, so we changed the program for the other two performances so that the strobe was directly controlled by the position of one of her hands, which was much more effective.
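The mapping itself was implemented in Max 6; purely as an illustration of the kind of mapping described above, the temperature + capacitance drone could be sketched in SuperCollider like this (the OSC address and the assumption that the sensor values arrive normalised 0–1 are mine):

(
// Illustration only: sum two incoming sensor values and map the result
// onto the frequency of a low sine drone. Address and ranges assumed.
~drone = { |freq = 40| SinOsc.ar(freq, 0, 0.2).dup }.play;

OSCdef(\sensors, { |msg|
	var temp = msg[1], cap = msg[2];
	~drone.set(\freq, (temp + cap).linlin(0, 2, 30, 80));
}, '/sensors');
)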

All of this was tied together for me by a TouchOSC interface, which gave me control over various parameters of the processes Nicole was affecting (unstable parameters such as filter resonance, et cetera) that did not lend themselves to control by movement, as well as control of one of the granular synthesisers, for a kind of synthesiser duet with Nicole.


The result of this was four channels of sound, two coming from Marseille and two from the room, but all dependent on Nicole’s movements, locking all performers to a central groove. Trans/Human could not themselves hear what was going on in Blank Studios, but in Blank we were very much improvising with their material. Being able to mesh all of us in this way was perfect – Trans/Human’s tour and the audio they collected and performed, and my building/patching/sound generation mediated through Nicole’s movement. The binary state between darkness and light, combined with full-bodied four-channel sound, created an immersive experience and environment for both performers and audience, and yielded, to my mind, three distinct, varied performances.

Thanks to Lee Etherington, Harry Wheeler and Hugs Bison for organisation and documentation, and thanks to the folks at Blank Studios for being immensely helpful in accommodating our audio/internet needs.

Nicole is the creative director of Surface Area Dance Theatre –

http://www.surfacearea.org.uk/company.html

Trans/Human is Adam Denton and Luke Twyman –

http://transhuman.bandcamp.com/

http://imboredofbastards.tumblr.com/

http://dentone.tumblr.com/

Live Coding – 11th December 2014 – ICMuS Student Concert

For my Major Project this year I am developing a live coding practice using the SuperCollider programming language.

After practising live electronic music for a few years, I’ve decided to do this partly to perform using open source software, but also because live coding allows me a much more tactile approach to electronic performance. Previously my performances had been very predetermined, in the sense that most of the material I played was fixed, with relatively little scope for determining the arc of the performance on the fly.

With live coding I can merge electronic music performance and improvisation to a much greater degree. By playing music determined by code and algorithms written live, I have scope to improvise performances, as well as to use programmed randomness to perform electronic music that grows organically and can be edited on the fly, in real time.
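As a small illustration of what ‘editable on the fly’ means in SuperCollider (not code from a performance – \ping is a placeholder for a loaded SynthDef):

(
// A pattern with programmed randomness; because it lives in a Pdef,
// it can be redefined while it plays and the change takes over
// seamlessly. \ping is a placeholder SynthDef name.
Pdef(\loop,
	Pbind(
		\instrument, \ping,
		\degree, Prand([0, 2, 4, 7], inf),
		\dur, Prand([0.25, 0.5], inf)
	)
).play;
)

// later, re-evaluate with different values to edit it live, e.g.:
// Pdef(\loop, Pbind(\instrument, \ping, \degree, Pseq([0, 3, 5], inf), \dur, 0.25));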

Protected: Gold Lines Are Mineral Veins // Benjamin Freeth and Sean Cotterill

This content is password protected. To view it please enter your password below: