In the months since ICLC I have been reconfiguring the way I perform my live coding sets. As live coding forms an integral part of the research and work I am doing for my MA, I have chosen to improve my skills in improvising from a greater pool of musical materials during performance, as well as to incorporate a more cohesive and considered visual element into my performances.
I have recently played for OXJAM Newcastle/Gateshead takeover, NARC. Magazine Presents… and Manchester Algorave, and I am due to perform at Jam Jah Minifest, London Algorave and others.
I am currently developing a program in openFrameworks to create a visual backdrop to my live coding sets. The impetus for this development was the NARC. Magazine night, which focused on audiovisual performance (featuring analogue film collective Filmbee). The program currently displays a number of sound-responsive elements and can be sequenced easily using Tasks within SuperCollider. As I have recently switched to Ubuntu as my main system, I can display the visuals behind my sets using the transparency option in Compiz.
I will be uploading a full sound and screen capture of a live coding set to my Vimeo account soon.
I have also been re-developing the sounds I use when performing. I am experimenting with tonal ideas based on the harmonic series and synthesis using Nicholas Collins’s chaos theory UGens, and generally taking a more flexible approach to tempo and rhythm, inspired by drum and bass and hip-hop rather than the rigid techno tempos I have used in my music for the past few years. A couple of my most recent sets are uploaded to Bandcamp here; check co34pt.bandcamp.com for a rolling archive of recorded live sets.
I have also been using live coding as a tool for improvising with other musicians, including techno duo Motmot with Tim Shaw, as well as using Tidal in a number of free improvisation sessions featuring Elvin Brandhi, John Bowers and Charlie Bramley.
In March 2015 I was part of a collaborative commission for NICAP alongside Helen Shaddock, Ed Wainwright, Corbin Wood, Jack Lines and Mags Margetts to produce a piece of work responding to Newcastle University’s Hatton Gallery collection of Victor Pasmore’s work, in particular focusing on his seminal spatial work ‘An Exhibit’.
This group commission produced an interactive audiovisual response to Pasmore’s work, informed by the disciplines of each person involved, including music, digital interactivity, architecture, visual art, creative coding and fabrication.
‘An Exhibit’ comprised a number of coloured plexiglass sheets hung in the Hatton Gallery, creating the visual effect of coloured windows that filtered observers’ vision, much like a filter on a photograph. As there were several of these plexiglass sheets, visitors moving around the gallery space could experiment with how the sheets’ physical placement changed their view of the world through these filters. Although the work was produced in 1957, it had a strong focus on interactivity, and this was the driving force for our response, ‘A Third Exhibit’.
‘A Third Exhibit’ took inspiration from these plexiglass sheets: weighted cloth sheets were suspended in the air of the main Hatton Gallery space using helium balloons, moving freely around the space according to the air turbulence created by foot traffic through the exhibition. These cloth sheets were lit by eight DMX lights, controlled from my laptop by two distance-sensing ‘nodes’ placed at central points of the gallery space and triggered by the actions of visitors. A musical piece composed by Corbin Wood, using sound recordings taken throughout the collaborative process, was also triggered by the activation of these nodes.
My responsibility for the work was designing the interactive system that turned the movement of visitors into changes in lighting throughout the space. The distance-sensing ‘nodes’ took the form of camera tripods with Arduino Mega boards mounted on them, each board hosting a ring of eight HC-SR04 ultrasonic distance sensors. The information generated by these nodes was fed back to a laptop over a serial connection and read by Pd-extended, which processed the distance readings into DMX values. These values were sent to the eight lights in the room using Q Light Controller and a USB-DMX interface, and also controlled the volume of a stereo sound piece. I developed this interactive system on Linux.
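The core of that distance-to-light mapping can be sketched as follows. This is an illustrative Python version of logic that actually lived in a Pd-extended patch; the sensing range and the inverse mapping (nearer visitor, brighter light) are assumptions rather than the installation’s actual values.

```python
def distance_to_dmx(distance_cm, min_cm=20.0, max_cm=300.0):
    """Map an ultrasonic distance reading to a DMX channel value (0-255).

    The range limits are hypothetical: HC-SR04 sensors read roughly
    2 cm to 400 cm, and the thresholds actually used are not documented.
    """
    # Clamp the reading into the assumed sensing range
    d = max(min_cm, min(max_cm, distance_cm))
    # Invert and normalise: closest reading -> 255, farthest -> 0
    return round(255 * (max_cm - d) / (max_cm - min_cm))
```

One such value per sensor, refreshed on every serial frame, is enough to drive a light channel continuously as visitors move through the space.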
The effect of the interactive system and the shifting cloth sheets created an evolving, generative artwork which was directly (distance to nodes changing the colour of lights) as well as indirectly (air turbulence gradually changing the position of sheets) controlled by the visitors to the gallery. The eight lights in different colours shining at different angles on different cloth sheets produced a number of subtle colour changes throughout the room, with colours regularly being intersected and changed by the cloth sheets.
Visitors responded very well to the interactivity in the piece, keen to see how their movements would change the colours in the space and to experience the work through discovery and participation.
‘A Third Exhibit’ was shown for The Late Shows 2015.
Photography by Corbin Wood and Helen Shaddock.
This is the video (taken by Harry Wheeler) from the TUSK Archive of the first of three performances by RE/CEPTOR at Blank Studios.
RE/CEPTOR was a bespoke project for TUSK Festival comprising long-established duo Trans/Human (with whom I’ve worked before), contemporary dancer and movement artist Nicole Vivien Watson, and myself.
The project is a multifaceted one. The three performances for TUSK Festival involved myself and Nicole performing in Blank Studios, with Trans/Human performing concurrently in Marseille. The audio from Trans/Human’s performance was streamed live to Blank Studios via Nicecast, an OS X internet radio streaming application, and then fed into the performance setup at Blank Studios (more on this later). The performances used a four-speaker ‘surround’ sound setup.
The main focal point for both performances was Nicole and her movement. I designed a custom wireless sensor array for her to wear throughout the performance, tracking light, body capacitance, temperature and movement. These sensors were attached to Nicole at relevant points (hands for movement, fingers for body capacitance, and so on) and read by an Arduino Nano, which collected the data and transmitted it wirelessly, via an inexpensive nRF24L01 chip, to a second Arduino connected to my laptop, for which I built a shield housing another nRF24L01 chip. The data this second Arduino received was read by Max 6, and formed a crucial aspect of the performance as a whole.
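As a rough sketch of how a sensor frame like this might be packed for the radio link: the nRF24L01 carries payloads of at most 32 bytes, so the readings need a compact, fixed layout. The field names, ordering and 16-bit scaling below are hypothetical, illustrating the constraint rather than reproducing the original Arduino sketch.

```python
import struct

# Hypothetical frame layout: three unsigned 16-bit readings
# (light, capacitance, temperature) and three signed 16-bit
# accelerometer axes. Total: 12 bytes, well under the 32-byte limit.
SENSOR_FORMAT = "<HHHhhh"

def pack_frame(light, capacitance, temperature, ax, ay, az):
    """Pack one sensor frame into bytes for radio transmission."""
    return struct.pack(SENSOR_FORMAT, light, capacitance, temperature,
                       ax, ay, az)

def unpack_frame(payload):
    """Recover the sensor frame on the receiving side."""
    return struct.unpack(SENSOR_FORMAT, payload)
```

A fixed binary layout like this keeps the radio traffic small and lets the receiving side parse every frame the same way before handing values on to the sound software.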
Here are some photos of the prototyping, building and wearing of the sensor array.
Top to bottom: Prototyped version of the sensor array, the Arduino and radio chip, the completed array, and one of Nicole’s hands showing an accelerometer and body capacitance sensors.
Once the data was in Max 6 it was used to control a number of aspects of the performance. Nicole’s movement controlled effects on the stereo feed from Marseille: changing filter poles and adding delay and decay to the feed. This effected stereo feed was broadcast on two of the four speakers in the room, the ones in the back two corners. The other two speakers carried two granular synthesisers, one controlled by Nicole’s movement and one controlled by me. Nicole’s body temperature and body capacitance were also summed to create a low-frequency sine wave drone played on all four speakers. Nicole’s data also controlled, via DMX, the strobe light that was the main source of light in the space. For the first performance (the one in the video) the strobe frequency was a summation of body temperature, light and body capacitance; however, after discussion with Nicole we felt that this obfuscated the relationship between her and the strobe, and for the other two performances we changed the program so that the strobe was directly controlled by the position of one of her hands, which was much more effective.
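The shape of these sensor-to-sound mappings can be sketched in outline. The real mappings ran in Max 6; this Python sketch only shows the summing and scaling, and all the ranges (a 40–80 Hz drone, a 12 Hz maximum strobe rate) are assumptions, not the performance’s actual settings.

```python
def drone_frequency(temperature, capacitance, base_hz=40.0, span_hz=40.0):
    """Sum two normalised readings (0.0-1.0 each) into a low sine
    drone pitch. The 40-80 Hz range is a placeholder."""
    mix = (temperature + capacitance) / 2.0
    return base_hz + span_hz * mix

def strobe_rate(hand_position, max_hz=12.0):
    """Map a normalised hand position (0.0-1.0) directly to a strobe
    flash rate, as in the second and third performances."""
    return max(0.0, min(1.0, hand_position)) * max_hz
```

The key design point survives even in this toy form: summed slow-moving signals (temperature, capacitance) suit a drone, while a single fast signal (hand position) makes the strobe relationship legible to the audience.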
All of this was tied together for me by a TouchOSC interface, which gave me control over those parameters of Nicole’s effects (unstable parameters such as filter resonance et cetera) that did not lend themselves to control by movement, as well as control of one of the granular synthesisers, allowing a kind of synthesiser duet with Nicole.
The result was four channels of sound, two coming from Marseille and two from the room, but all dependent on Nicole’s movements, locking all performers to a central groove. Trans/Human could not themselves hear what was going on in Blank Studios, but in Blank we were very much improvising with their material. Being able to mesh all of us in this way was perfect: Trans/Human’s tour and the audio they collected and performed, and my building/patching/sound generation, mediated through Nicole’s movement. The binary state between darkness and light, combined with full-bodied four-channel sound, created an immersive experience and environment for both performers and audience, and yielded, to my mind, three distinct, varied performances.
Thanks to Lee Etherington, Harry Wheeler and Hugs Bison for organisation and documentation, and thanks to the folks at Blank Studios for being immensely helpful in accommodating our audio/internet needs.
Nicole is the creative director of Surface Area Dance Theatre –
Trans/Human is Adam Denton and Luke Twyman –
Over the past couple of years I have been experimenting with creating visual art based on some of my own photos, processed using various forms of non-image editing. These processing methods (raw data editing, hex editing, sound processing and so forth) exploit the liminal state between order and disorder in the data architecture of image formats, with small changes, omissions and insertions in the raw data creating effects determined by the literal composition of the digital image itself, rather than by any aesthetic rules or preferences I have.
Creating these images is a matter of ‘curating’ these glitches, moderating them within the preset bounds of the image’s architecture so as not to break the file entirely.
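A minimal sketch of this kind of curated databending, assuming a simplified format where a fixed-size header must survive intact for the file to remain decodable. Real formats (JPEG, PNG and so on) have more intricate structures, and the header size and glitch count here are placeholders.

```python
import random

def databend(data, header_size=512, n_glitches=16, seed=None):
    """Overwrite random bytes in a file's body while leaving the header
    untouched, so decoders still accept the file. Safe offsets differ
    per format; 512 bytes is an arbitrary illustrative header size."""
    rng = random.Random(seed)
    bent = bytearray(data)
    for _ in range(n_glitches):
        # Only touch bytes past the protected header region
        i = rng.randrange(header_size, len(bent))
        bent[i] = rng.randrange(256)
    return bytes(bent)
```

Keeping the glitches inside the body region is what makes this curation rather than destruction: the file still opens, but what it displays has been pushed off its intended course.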
Full album (a growing collection of the more prominent examples I have developed) here: http://s1382.photobucket.com/user/theseancotterill/library/Databending-Glitch%20art-Raw%20editing
All licensed CC BY-SA
Over the past few months I have been involved in the Superdream project, coordinated by ISIS Arts, The Trinity Session, Sticky Situations, The Swallows Foundation, Gateshead Council and others, to produce art for public spaces in both Gateshead and Jeppestown, Johannesburg, and to create links between UK and South African artists.
For the Superdream project I produced digital media art for two events, one in Windmill Hills Park, Gateshead and one in Jeppe Park, Johannesburg. Both areas were ‘undeveloped’, and one of the aims of the interventions in these spaces was to redefine public space through art, technology, innovation and a sense of community.
In Windmill Hills on May 2nd 2014 I presented an interactive audiovisual work, in stages of development.
‘Polyfields’ used four HC-SR04 ultrasonic distance sensors on an Arduino Mega 2560, programmed in the Arduino IDE, to output the distance of objects from the sensors; these distances were then used by Pd-extended as the controls for a four-channel audio piece and a four-channel video piece.
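For context on what the Arduino was actually computing: the HC-SR04 reports distance as the width of an echo pulse covering the sound’s round trip to the object and back, so converting a reading to centimetres looks roughly like this sketch (libraries such as NewPing perform an equivalent calculation internally).

```python
# Speed of sound at roughly 20 degrees C, in cm per microsecond
SPEED_OF_SOUND_CM_PER_US = 0.0343

def echo_to_cm(echo_us):
    """Convert an HC-SR04 echo pulse width (microseconds) to distance
    in centimetres. The pulse covers the round trip, so halve it."""
    return (echo_us * SPEED_OF_SOUND_CM_PER_US) / 2
```

An object about 10 cm away produces an echo pulse of roughly 583 microseconds; dividers of 58 (for cm) seen in many Arduino examples are just this same formula rearranged.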
The audio and video data was taken from four sites around the park: audio using conventional microphones, contact microphones and wall sensor modules, and video also taken from the same four sites, with one sensor representing one ‘site’.
In this way the participants (purposefully or accidentally) would introduce ‘channels’ of video and audio, to create their own experience of this supernormal locus of identity.
I was then invited to go to Johannesburg with ISIS arts and The Trinity Session along with Lucy Carolan, Richard Glynn, Andrea Masala, Marek Gabrysch and Charlotte Gregory to produce work for the Superdream event in Jeppe Park as part of the British Council’s Connect ZA programme.
For this event I presented a redesigned version of my work for Windmill Hills, using material gathered from Jeppe Park, working in collaboration with South African artists Em Kay and KMSN, and re-coded the installation using Max/MSP/Jitter and the NewPing library. The four sensors were nailed to a tree in Jeppe Park and participants were invited to interact with the piece as they pleased, again creating their own experience of the artwork, which lay dormant until interaction was detected.
For this event I also created a piece based around a table football table, sourced from a workshop next to the park and close to a state of disrepair. With the help of Marek Gabrysch the table was restored to its full working condition, and I augmented it with sensors in the goals, as well as small buffers next to the goals, to wire the table for sound. Using a small selection panel on the side of the table, players could select sounds that were triggered by hitting the buffers, and a random ‘Goal!’ sound was triggered whenever a goal was scored.
This installation was created using an Arduino Mega 2560 and a Raspberry Pi running Stanford University’s Satellite CCRMA operating system. I designed a patch in Pd-extended that interfaced with the Arduino via Firmata, run from a headless Raspberry Pi which handled the sound. It was very much a prototype: a bespoke installation for the park.
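The table’s trigger logic, which actually lived in that Pd-extended patch, can be sketched like this; the sample names and panel numbering are placeholders, not the sounds used in the park.

```python
import random

# Placeholder sample banks; the actual sounds used in Jeppe Park
# are not documented here.
GOAL_SOUNDS = ["goal_cheer.wav", "goal_horn.wav", "goal_whistle.wav"]
BUFFER_SOUNDS = {1: "thud.wav", 2: "bell.wav", 3: "spring.wav"}

def on_goal(rng=random):
    """A goal sensor firing triggers one randomly chosen 'Goal!' sample."""
    return rng.choice(GOAL_SOUNDS)

def on_buffer_hit(selected):
    """A buffer hit plays whichever sample the side panel has selected."""
    return BUFFER_SOUNDS[selected]
```

Splitting the logic this way (random choice for goals, player choice for buffers) keeps the surprise of scoring while giving players direct, repeatable control over the sounds of ordinary play.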
The installation was very popular with the local children and thanks to some fantastic refereeing by one of the South African artists there were quite a few hotly contested games!
archo data is a YouTube channel I will be using to upload databent, glitched and wrecked digital videos. These are videos I have edited with brute-force methods such as raw data editing and incorrect format processing.
The results of these edits are ephemeral by their very nature, and will manifest differently across different programs, codecs and platforms. As such, what I upload to YouTube will not be what I saw before uploading it. Once databent in this way, the videos become linguistic anomalies of the digital world, rendered differently by different platforms and often making the platforms themselves function incorrectly.
I will upload to this channel intermittently, whenever I produce new videos.
CAUTION: Due to the nature of how glitches are read by video programs, there are often very loud sounds and flashing lights.
This video serves as documentation of a composition I wrote with Dario Lozano-Thornton for sine waves and computer-controlled strobe. Aether is a nine-minute audiovisual composition, with a score for an Arduino-based variable-rate and variable-intensity strobe composed in response to a composition for pure tones by Dario.
The piece is reflective of demanding physical data extracted from extremely simple conditions: pure frequencies and binary visual states. Due to their nature and spectral placement, the audio frequencies appear at certain points to be distorting, or coming from multiple sources; this is not the case, but simply our ears’ inability to accurately represent that data. Similarly with the strobe light, it is difficult for our eyes (and cameras) to accurately represent binary states past certain frequencies, so visual hallucination and distortion can occur. Unfortunately, this composition is difficult to represent accurately on camera due to the extremely fast strobes.
The video is from a showing of Aether at Broadacre House, at the Function event. The composition was adapted for two flashing laptop screens for this event.
Feric is an umbrella name under which I create various sound and visual art works, ranging from tactile contact microphone instruments through to site-specific generative noise pieces.
I have various one-off audio/visual projects under this name, and I am continually producing new works.
A guerrilla site-specific piece of sound art based on a bespoke semi-generative feedback construction created in Ableton. Source material derived from laptop microphone recordings in Newcastle City Library.
Tactile electroacoustic improvisation created from several pre-recorded object samples fed into a generative music setup in Ableton. Manipulated through generations of processing into forced feedback.
Played on zoviet*france’s A Duck In A Tree, 27th April 2013.
A Feric audiovisual work, comprising several pieces of artwork and two pieces of music composed in mid-2013; this video was created on the 9th of March 2014.
https://soundcloud.com/feric-1/sets/table-music (Unfortunately I cannot embed this SoundCloud link)
A trial of a tactile contact-microphone instrument. ‘Table Music’ is the precursor to Spawn, Thot and Ghast: a table with attached contact microphones, used with objects to become a tactile, malleable instrument.