From the 18th to the 20th of November 2015 I took part in a residency at Access Space organised by Alex McLean as part of the Sonic Pattern and the Textility of Code series of events from Inhabiting the Hack.
As part of this residency I worked with sonic, digital and textile artists Magdalena Halay, Nora O Murchú and Toni Buckby. The residency was open-ended, with no set goals other than to investigate ideas surrounding our disciplines and relate them back to the broader themes of Sonic Pattern.
The residency began with a workshop in Tablet Weaving, a technique using rotations of square cards to create patterned weaves. By following simple procedures, tablet weaving can produce interesting geometrical patterns with immediate results (my own weave is pictured). I plan to extend my openFrameworks-based live coding visual software with a programmable emulation of tablet weaving controlled by musical data.
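The core procedure is easy to model in code. As a rough illustration (not the planned openFrameworks implementation), here is a minimal Python sketch of a toy tablet-weaving model, where each card holds four coloured threads and a quarter-turn brings a new thread to the top; the colours and turning sequence are hypothetical:

```python
# Toy model of tablet weaving: each card carries four threads, and each
# pick turns every card a quarter-rotation forward (+1) or backward (-1).
# The thread at the top of each card after a turn is what shows in the weave.

def weave(cards, turns):
    """cards: list of 4-thread colour lists; turns: +1/-1 per pick.
    Returns the visible top thread of each card after each pick."""
    positions = [0] * len(cards)
    rows = []
    for turn in turns:
        positions = [(p + turn) % 4 for p in positions]
        rows.append([card[p] for card, p in zip(cards, positions)])
    return rows

# Two cards threaded with offset colour orders, turned forward twice
# then backward once -- simple rotations already yield a pattern:
cards = [["A", "B", "C", "D"], ["B", "C", "D", "A"]]
pattern = weave(cards, [1, 1, -1])
```

Repeating and reversing the turn sequence is what generates the characteristic geometric motifs; mapping musical data onto the turn sequence would be one way to drive such an emulation.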
The rest of the residency was used to explore particular projects. I collaborated with interdisciplinary textile artist and weaver Toni Buckby to create a gesture-tracking glove using Arduino (pictured), which would graph Toni’s weaving hand gestures over the course of a live performance. The performance itself was dictated by a second gesture sensor working in tandem with a program written by Toni and Alex McLean. During the performance the glove also fed data into SuperCollider, and I live-coded the sounds produced in response to Toni’s hand gestures in the context of a larger group improvisation. The graph produced as a result of the performance will then be laser-cut to form a performance artefact (a test pressing is pictured), and possibly source material for further performances.
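The glove firmware and graphing code aren’t reproduced here, but the step from a stream of gesture readings to a laser-cuttable outline can be sketched simply. This is a hypothetical Python illustration (the sample data and dimensions are placeholders, not data from the glove):

```python
# Sketch: turn a time series of gesture readings (e.g. one hand-orientation
# value per sample) into a 2D polyline suitable for export to a laser cutter.
# The readings and output dimensions below are illustrative placeholders.

def gesture_to_polyline(readings, width=200.0, height=50.0):
    """Place samples at evenly spaced x positions and scale values
    into a fixed drawing height (units are arbitrary, e.g. mm)."""
    if len(readings) < 2:
        raise ValueError("need at least two samples to draw a line")
    lo, hi = min(readings), max(readings)
    span = (hi - lo) or 1.0          # avoid division by zero on flat input
    step = width / (len(readings) - 1)
    return [(i * step, (r - lo) / span * height)
            for i, r in enumerate(readings)]

# A simple rise-and-fall gesture becomes a triangular outline:
points = gesture_to_polyline([0.0, 0.5, 1.0, 0.5, 0.0])
```

A polyline like this could then be exported as SVG or DXF for the cutter.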
Through the residency I experimented with the production of performance artefacts and sonic objects through laser cutting, as well as with accurately representing gestures in space using Euler angles derived from accelerometer and gyroscope data (previously I had only used raw sensor values for comparatively rudimentary control of sound). I also gained experience in weaving techniques and knowledge of textile skills through collaborating with Toni.
Thanks to Alex McLean and everyone at Access Space.
In the months since ICLC I have been reconfiguring the way in which I perform my live coding sets. As Live Coding forms an integral part of the research and work I am doing for my MA, I have chosen to improve my ability to improvise from a greater pool of musical materials during performance, and to incorporate a more cohesive and considered visual element into my performances.
I have recently played for OXJAM Newcastle/Gateshead takeover, NARC. Magazine Presents… and Manchester Algorave, and I am due to perform at Jam Jah Minifest, London Algorave and others.
I am currently developing a program in openFrameworks to create a visual backdrop for my live coding sets. The impetus for this development was the NARC. Magazine night, which was focused on audiovisual performance (featuring analogue film collective Filmbee). The program currently displays a number of sound-responsive elements, and can be sequenced easily using tasks within SuperCollider. As I have recently switched to Ubuntu as my main system, I can display the visuals behind my sets using the transparency option in Compiz.
I will be uploading a full sound and screen capture of a live coding set to my Vimeo account soon.
I have also been re-developing the sounds I use when performing. I am experimenting with tonal ideas based on the harmonic series, synthesis using Nicholas Collins’s chaos theory UGens, and a generally more flexible approach to tempo and rhythm, inspired by Drum and Bass and Hip-Hop rather than the rigid techno tempos I have used in my music for the past few years. A couple of my most recent sets are uploaded to Bandcamp here; check co34pt.bandcamp.com for a rolling archive of recorded live sets.
I have also been using Live Coding as a tool for improvising with other musicians, including techno duo Motmot with Tim Shaw, as well as using Tidal in a number of free improvisation sessions featuring Elvin Brandhi, John Bowers and Charlie Bramley.
In March 2015 I was part of a collaborative commission for NICAP alongside Helen Shaddock, Ed Wainwright, Corbin Wood, Jack Lines and Mags Margetts to produce a piece of work responding to Newcastle University’s Hatton Gallery collection of Victor Pasmore’s work, in particular focusing on his seminal spatial work ‘An Exhibit’.
This group commission produced an interactive audiovisual response to Pasmore’s work, informed by the disciplines of each person involved, including music, digital interactivity, architecture, visual art, creative coding and fabrication.
‘An Exhibit’ comprised a number of coloured plexiglass sheets hung in the Hatton Gallery, creating the visual effect of coloured windows through which observers’ vision was filtered, much like a filter on a photograph. As there were a number of these plexiglass sheets, visitors moving around the gallery space could experiment with how the sheets’ physical placement affected their view of the world through these filters. Although this work was produced in 1957, it had a strong focus on interactivity, and this was the driving force for our response, ‘A Third Exhibit’.
‘A Third Exhibit’ took inspiration from these plexiglass sheets, replacing them with weighted cloth sheets suspended in the air of the main Hatton Gallery space by helium balloons, free to move around the space according to the air turbulence created by foot traffic throughout the exhibition. These weighted cloth sheets were lit by eight DMX lights, controlled from my laptop via two distance-sensing ‘nodes’ placed at central points of the gallery space and triggered by the actions of visitors. A musical piece composed by Corbin Wood, using sound recordings taken throughout the collaborative process, was also triggered by the activation of these nodes.
My responsibility for the work was designing the interactive system which would turn the movement of visitors into changes in lighting through the space. The distance-sensing ‘nodes’ took the form of camera tripods with Arduino Mega boards mounted on them, each Mega board hosting a ring of eight HC-SR04 ultrasonic distance sensors. The information generated by these nodes was fed back to a laptop over a serial connection, which was read using Pd-extended. Pd processed the distance information gathered by the sensors into DMX values, which were then sent to the eight lights in the room using Q Lighting Controller and a USB-DMX interface, and also controlled the volume of a stereo sound piece. I developed this interactive system using Linux.
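The actual distance-to-DMX mapping ran in Pd-extended, so it isn’t reproduced here; as an illustration of the idea, this Python sketch scales a single HC-SR04 reading into a DMX channel value (the range limits are hypothetical, not the values used in the gallery):

```python
# Sketch: scale an HC-SR04 distance reading (cm) into a DMX channel value.
# The near/far limits here are hypothetical; the installation's mapping
# was implemented in Pd-extended and may have differed.

def distance_to_dmx(distance_cm, near=20.0, far=300.0):
    """Clamp the reading to [near, far] cm and invert it so that closer
    visitors produce higher values (DMX channels run 0-255)."""
    clamped = max(near, min(far, distance_cm))
    fraction = (far - clamped) / (far - near)   # 1.0 at near, 0.0 at far
    return int(round(fraction * 255))

# Readings inside, at, and beyond the sensing range:
values = [distance_to_dmx(d) for d in (10, 20, 160, 300, 400)]
```

With eight sensors per node, eight such values per frame would be enough to drive one channel on each of the room’s lights.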
The effect of the interactive system and the shifting cloth sheets created an evolving, generative artwork which was directly (distance to nodes changing the colour of lights) as well as indirectly (air turbulence gradually changing the position of sheets) controlled by the visitors to the gallery. The eight lights in different colours shining at different angles on different cloth sheets produced a number of subtle colour changes throughout the room, with colours regularly being intersected and changed by the cloth sheets.
Visitors responded very well to the interactivity in the piece, keen to see how their movements would change the colours in the space and to experience the work through discovery and participation.
‘A Third Exhibit’ was shown for The Late Shows 2015.
Photography by Corbin Wood and Helen Shaddock.