In March 2015 I was part of a collaborative commission for NICAP alongside Helen Shaddock, Ed Wainwright, Corbin Wood, Jack Lines and Mags Margetts to produce a piece of work responding to Newcastle University’s Hatton Gallery collection of Victor Pasmore’s work, in particular focusing on his seminal spatial work ‘An Exhibit’.
This group commission produced an interactive audiovisual response to Pasmore’s work, informed by the disciplines of each person involved, including music, digital interactivity, architecture, visual art, creative coding and fabrication.
‘An Exhibit’ comprised a number of coloured plexiglass sheets hung in the Hatton Gallery, creating the effect of coloured windows that filtered observers’ view of the space, much like a filter on a photograph. Because there were several of these sheets, visitors could move around the gallery and experiment with how their own position changed what they saw through the filters. Although the work was produced in 1957, it had a strong focus on interactivity, and this was the driving force for our response, ‘A Third Exhibit’.
‘A Third Exhibit’ reinterpreted these plexiglass sheets as weighted cloth sheets, suspended in the air of the main Hatton Gallery space by helium balloons so that they drifted freely on the air turbulence created by foot traffic through the exhibition. The cloth sheets were lit by eight DMX lights, controlled from my laptop by two distance-sensing ‘nodes’ placed at central points of the gallery space and triggered by the actions of visitors. Activating the nodes also triggered a musical piece composed by Corbin Wood from sound recordings taken throughout the collaborative process.
My responsibility was designing the interactive system which turned the movement of visitors into changes in lighting through the space. The distance-sensing ‘nodes’ took the form of camera tripods, each with an Arduino Mega board mounted on it hosting a ring of eight HC-SR04 ultrasonic distance sensors. Each node fed its readings back to a laptop over a serial connection, which was read by Pd-extended. Pd-extended processed the distance information into DMX values, sent them to the eight lights in the room via Q Light Controller and a USB-DMX interface, and also controlled the volume of a stereo sound piece. I developed the whole system on Linux.
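The core of a distance-to-light mapping like this can be sketched very simply. The following is an illustrative Python sketch, not the actual Pd patch: the sensing range and the near-is-brighter inversion are assumptions for the example.

```python
def distance_to_dmx(distance_cm, min_cm=20.0, max_cm=300.0):
    """Map one HC-SR04 distance reading (cm) to a DMX channel value (0-255).

    A nearer visitor produces a higher DMX value; readings outside the
    sensed range are clamped. The range limits here are illustrative,
    not the values used in the installation.
    """
    clamped = max(min_cm, min(max_cm, distance_cm))
    # Invert so that a near object gives a bright light.
    normalised = 1.0 - (clamped - min_cm) / (max_cm - min_cm)
    return round(normalised * 255)
```

In Pd the same shape is a `[clip]` followed by a `[scale]`-style expression per sensor, with the eight resulting values packed into a DMX frame.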
The effect of the interactive system and the shifting cloth sheets created an evolving, generative artwork which was directly (distance to nodes changing the colour of lights) as well as indirectly (air turbulence gradually changing the position of sheets) controlled by the visitors to the gallery. The eight lights in different colours shining at different angles on different cloth sheets produced a number of subtle colour changes throughout the room, with colours regularly being intersected and changed by the cloth sheets.
Visitors responded very well to the interactivity in the piece, keen to see how their movements would change the colours in the space and to experience the work through discovery and participation.
‘A Third Exhibit’ was shown for The Late Shows 2015.
Photography by Corbin Wood and Helen Shaddock.
‘This work builds on a number of previous pieces using spatial and environmental sensing to re-present collected environmental data in a dynamic and immersive way.
This piece is centred around a pendulum with a DIY electronic weight containing a three-axis accelerometer and a digital light sensor, tracking directional motion and ambient light levels. When the pendulum is swung, three streams of audio, video and light are activated according to the speed, angle and orientation of the pendulum. The audio and video channels are three sound recordings (indoor, outdoor and electromagnetic) and three video recordings (timelapse footage of transport, foot traffic and cloud movement in the sky) taken from around Newcastle, and the LED lighting responds very directly to the X, Y and Z axes by creating corresponding Red, Green and Blue values.
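The axis-to-colour mapping described above is essentially three parallel scalings. A minimal sketch, assuming a ±2 g accelerometer full-scale range (an assumption for illustration, not a documented value from the piece):

```python
def axes_to_rgb(x, y, z, full_scale=2.0):
    """Map three accelerometer axes (in g, +/- full_scale) to RGB 0-255.

    X, Y and Z map to red, green and blue respectively, as in the piece;
    the +/-2 g full-scale range is an illustrative assumption.
    """
    def channel(value):
        clamped = max(-full_scale, min(full_scale, value))
        return round((clamped + full_scale) / (2 * full_scale) * 255)
    return (channel(x), channel(y), channel(z))
```

A resting weight therefore sits at mid-grey, and any swing pushes the colour towards one corner of the RGB cube.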
By moving the pendulum weight these situations can be interacted with in a fleeting, exploratory manner. Without any interaction the installation lies dormant, waiting for energy to be added by a passer-by.
There is a fourth channel of video, audio and light which can be activated by shining a strong light directly onto the light-sensitive part of the weight (Phone LED torches work particularly well).’
This interactive piece was produced for Square One, and the video shows both raw footage from the installation and footage of myself and others interacting with the ‘pendulum’. It proved a popular installation, with people enjoying how directly they could interact with it. I purposely designed the installation to be as responsive and durable as possible, which led to bursts of colour coming from the installation space throughout the event as people shook, swung and spun the pendulum around.
The pendulum comprised an Arduino Nano interfaced with an MPU-6050 accelerometer and a BH1750 light sensor, transmitting its data wirelessly via an nRF24L01 radio transceiver. The data was picked up by another nRF24L01 on a custom-built Arduino ‘radio shield’, then read and parsed in Max/MSP, which handled sound, light and video.
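Receiving from an nRF24L01 comes down to unpacking a small fixed-size payload into sensor readings. The sketch below models that parsing step in Python; the packet layout (three signed 16-bit accelerometer axes plus one unsigned 16-bit light level) is a hypothetical stand-in, since the actual payload format used in the piece is not documented here.

```python
import struct

# Hypothetical payload layout: three int16 accelerometer axes followed
# by one uint16 light level, little-endian.
PACKET_FORMAT = "<hhhH"
PACKET_SIZE = struct.calcsize(PACKET_FORMAT)  # 8 bytes

def parse_packet(payload):
    """Unpack one radio payload into named sensor readings."""
    if len(payload) != PACKET_SIZE:
        raise ValueError("unexpected payload length")
    x, y, z, light = struct.unpack(PACKET_FORMAT, payload)
    return {"accel": (x, y, z), "light": light}
```

On the real hardware the receiving Arduino forwarded the parsed values over serial, and Max/MSP routed them to the sound, light and video layers.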
Photos and event by Josh Borom and Matt Pickering.
Site about the event by RI RV – http://cargocollective.com/onfcopoe/otfCp00
Photo by RI RV.
‘And all we said was “Saved”!’ is a recreation of a set performed at the ‘on-the-fly codepoetry’ event at Wharf Chambers in Leeds, hosted by RI RV on the 25th of January.
This live coding set uses a pool of sixteen public domain poems, which are cut into lines, then sequenced from SuperCollider using Tasks and displayed in Processing as a shifting, live-codeable, generative seven-line poem.
I built the poem up over time, adding a Task to perform each line, and as the poem grew I live coded a soundtrack based on words I observed in the fleeting dialogue, attempting to capture those words through mimetic sound. This version of the set ascends and descends over half an hour, and when everything has stopped we are left with a final poem, a snapshot of the shifting dialogue seen throughout the set.
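The Task-per-line idea can be modelled outside SuperCollider: each running task periodically overwrites its own slot of the displayed poem with a line drawn from the pool, so the poem drifts as tasks are added. A toy Python model of one round of that behaviour, with illustrative names and structure:

```python
import random

def next_poem_state(poem, line_pool, slot, rng=random):
    """Replace one slot of a seven-line poem with a line from the pool.

    Models one firing of a per-line SuperCollider Task; the real set
    scheduled these firings over time.
    """
    new_poem = list(poem)
    new_poem[slot] = rng.choice(line_pool)
    return new_poem

# Start with an empty seven-line poem and let three "tasks" fire once each.
pool = [
    "And dead men, dead men,",
    "A slow sweet comer,",
    'And all we said was "Saved"!',
]
poem = [""] * 7
for slot in (0, 3, 6):
    poem = next_poem_state(poem, pool, slot)
```

Stopping all the tasks freezes whatever lines happen to be on screen, which is how the final poem below came about.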
The final poem was:
And dead men, dead men,
New fetters on our hoped-for liberty:
Hardening to rage in a flame of chiselled stone,
There came a dreadful, piercing sound,
Far in the hillside camp, in slumber lies
A slow sweet comer,
And all we said was “Saved”!
The poems prepared for the set were:
Aldous Huxley – Stanzas
Charles Baudelaire – The Albatross
Felix Jung – Anemophobia
Felix Jung – Indecision
Laurence Hope – Mahomed Akram’s Appeal to the Stars
Walt Whitman – A Hand-Mirror
Bertolt Brecht – A Worker Reads History
Christina Georgina Rossetti – May
Christina Georgina Rossetti – Tempus Fugit
Elizabeth Jennings – Absence
Ella Wheeler Wilcox – Europe
Emily Elizabeth Dickinson – Saved!
Johann Wolfgang von Goethe – To Originals
Madison Julius Cawein – Imperfection
Paul Cameron Brown – Skootematta
Thomas Fredrick Young – A Dream
The setup code, and the resulting code created during the performance can be downloaded here https://www.dropbox.com/s/9jmtlvjfvhzn724/on-the-fly%20codepoetry%20Sean%20Cotterill.zip?dl=0
For the ZENDEH Micro Award I was commissioned by a panel of young people to produce pieces of work for an event at the Literary and Philosophical Society based on the script for Zendeh’s new piece CINEMA, a personal and intimate play surrounding the tragic Cinema Rex fire in Abadan, Iran.
The event showcased installations by the artists who received the award, as well as excerpts from CINEMA, poetry readings, an Iranian tar recital and public workshops and activities also hosted by the artists.
The video above is the culmination of the workshop I held for the ZENDEH Micro Award. I hosted a ten-person workshop using small, self-contained synthesizers with speakers, based on 555 timers and LM386 audio amplifiers and built on breadboards. Each synthesizer had one potentiometer to control frequency, ranging simply from low to high. Through the course of the workshop I introduced the participants to the synthesizers and guided them through customising the frequency range to their liking by switching out capacitors in the circuit.
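Swapping the timing capacitor changes the pitch range because the 555's astable frequency is inversely proportional to the capacitance, per the standard textbook approximation f = 1.44 / ((R1 + 2·R2)·C). The resistor and capacitor values below are illustrative, not the ones used in the workshop circuits:

```python
def astable_555_frequency(r1_ohms, r2_ohms, c_farads):
    """Approximate output frequency of a 555 timer in astable mode.

    Uses the standard approximation f = 1.44 / ((R1 + 2*R2) * C).
    Component values passed in below are illustrative only.
    """
    return 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)

# Swapping a 100 nF timing capacitor for 10 nF raises the pitch tenfold.
low = astable_555_frequency(1_000, 10_000, 100e-9)   # ~686 Hz
high = astable_555_frequency(1_000, 10_000, 10e-9)   # ~6.86 kHz
```

In the workshop the potentiometer stood in for part of the resistance, sweeping the pitch continuously within whatever range the chosen capacitor set.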
Once everyone had customised their instrument, I introduced the graphic score for the performance – a graph of the population of Abadan, Iran (the site of the Cinema Rex catastrophe). The graph was intended as inspiration rather than a dogmatic 1:1 score to be obeyed: a five-minute performance with the synthesizers, informed by the graphic score.
The workshop was designed to be simple and open, with no obligation at any point to submit to any fixed rules surrounding the performance. The result was an engaging performance which everyone enjoyed, and an incredible sound in the rich acoustic space of the James Knott room in the Lit and Phil. The synthesizers were taken home by the workshop participants, along with information about the construction of the circuit, and further modifications that were possible.
The other part of my work for the ZENDEH Micro Award was a proximity-activated four-channel sound and light installation, a personal response to the script and the Cinema Rex fire, which responded dynamically to interaction with participants and observers, creating a personal, immersive and interactive experience. Documentation of this piece is forthcoming.
Thanks to ZENDEH for curating, commissioning and organising the event, and Matt Jamie for the video/audio documentation.
This is the video (taken by Harry Wheeler) from the TUSK Archive of the first of three performances by RE/CEPTOR at Blank Studios.
RE/CEPTOR was a bespoke project for TUSK Festival comprising the long-established duo Trans/Human (with whom I’ve worked before), contemporary dancer and movement artist Nicole Vivien Watson, and myself.
The project is a multifaceted one. The three performances for TUSK Festival involved myself and Nicole performing in Blank Studios, with Trans/Human performing concurrently in Marseille. The audio from Trans/Human’s performance was streamed live to Blank Studios via Nicecast, an OS X internet radio streaming application, and fed into the performance setup there – more on this later. The performances happened over four-speaker ‘surround’ sound.
The main focal point of the performances was Nicole and her movement. I designed a custom wireless sensor array for her to wear throughout the performance, tracking light, body capacitance, temperature and movement. The sensors were attached to Nicole at relevant points (hands for movement, fingers for body capacitance, and so on) and read by an Arduino Nano, which collected the data and transmitted it wirelessly via an inexpensive nRF24L01 chip. A second Arduino connected to my laptop, fitted with a shield I made to house another nRF24L01, received the data Nicole was creating, which was read into Max 6 and formed a crucial aspect of the performance as a whole.
Here are some photos of the prototyping, building and wearing of the sensor array.
Top to bottom: Prototyped version of the sensor array, the Arduino and radio chip, the completed array, and one of Nicole’s hands showing an accelerometer and body capacitance sensors.
Once the data was in Max 6 it was used to control a number of aspects of the performance. Nicole’s movement controlled effects on the stereo feed from Marseille – changing filter poles, adding delay and decay. This processed stereo feed was broadcast on two of the four speakers in the room, the ones in the back two corners. The other two speakers carried two granular synthesisers, one controlled by Nicole’s movement and one controlled by me. Nicole’s body temperature and body capacitance were also summed to create a low-frequency sinewave drone played on all four speakers. Nicole’s data also controlled, via DMX, the strobe light which was the main source of light in the space. For the first performance (the one in the video) the strobe frequency was a summation of body temperature, light and body capacitance, but after discussion with Nicole we felt this obfuscated the relationship between her and the strobe, so for the other two performances we changed the program so that the strobe was directly controlled by the position of one of her hands, which was much more effective.
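The difference between the two strobe mappings is easy to see side by side. A sketch of both, assuming sensor values normalised to 0.0-1.0 and an illustrative 1-20 Hz strobe range (neither the normalisation nor the range is a documented value from the Max patch):

```python
def strobe_rate_summed(temperature, light, capacitance):
    """First-performance mapping: strobe rate from an average of three
    normalised sensor values (each assumed 0.0-1.0)."""
    total = (temperature + light + capacitance) / 3.0
    return 1.0 + total * 19.0  # illustrative 1-20 Hz range

def strobe_rate_direct(hand_position):
    """Revised mapping for the later performances: the rate follows one
    hand position (normalised 0.0-1.0) directly."""
    return 1.0 + hand_position * 19.0
```

The direct mapping is legible to an audience in a way the summed one is not: moving one hand visibly speeds up or slows down the strobe, one gesture to one result.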
All of this was tied together for me by a TouchOSC interface, which gave me control over parameters of the processes Nicole was driving that did not lend themselves to control by movement (unstable parameters such as filter resonance, et cetera), as well as control of one of the granular synthesisers, allowing a kind of synthesiser duet with Nicole.
The result was four channels of sound, two coming from Marseille and two from the room, but all dependent on the movements of Nicole, locking all performers to a central groove. Trans/Human could not themselves hear what was going on in Blank Studios, but in Blank we were very much improvising with their material. Being able to mesh all of us in this way was perfect – Trans/Human’s tour and the audio they collected and performed, and my building/patching/sound generation, mediated through Nicole’s movement. The binary state between darkness and light, combined with full-bodied four-channel sound, created an immersive experience and environment for both performers and audience, and yielded, to my mind, three distinct, varied performances.
Thanks to Lee Etherington, Harry Wheeler and Hugs Bison for organisation and documentation, and thanks to the folks at Blank Studios for being immensely helpful in accommodating our audio/internet needs.
Nicole is the creative director of Surface Area Dance Theatre –
Trans/Human is Adam Denton and Luke Twyman –
Over the past few months I have been involved in the Superdream project, coordinated by ISIS Arts, The Trinity Session, Sticky Situations, The Swallows Foundation, Gateshead Council and others to produce art for public spaces in both Gateshead and Jeppestown, Johannesburg and to create links between UK and South African artists.
For the Superdream project I produced digital media art for two events, one in Windmill Hills Park, Gateshead and one in Jeppe Park, Johannesburg. Both areas were ‘undeveloped’, and one of the aims of the interventions in these spaces was to redefine public space through art, technology, innovation and a sense of community.
In Windmill Hills on May 2nd 2014 I presented an interactive audiovisual work, in stages of development.
‘Polyfields’ used four HC-SR04 ultrasonic distance sensors on an Arduino Mega 2560, programmed in the Arduino IDE to output the distance of objects from the sensors. These distances were used by Pd-extended as the controls for a four-channel audio piece and a four-channel video piece.
The audio and video material was taken from four sites around the park – audio using conventional microphones, contact microphones and wall sensor modules, and video shot at the same four sites – with one sensor representing one ‘site’.
In this way the participants (purposefully or accidentally) would introduce ‘channels’ of video and audio, to create their own experience of this supernormal locus of identity.
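The one-sensor-per-site idea means each ultrasonic reading gates and scales its own channel. A minimal sketch of that behaviour, with an assumed activation threshold (the actual threshold and curve in the Pd patch are not documented here):

```python
def channel_volumes(distances_cm, active_below_cm=150.0):
    """Turn four ultrasonic readings into four channel volumes (0.0-1.0).

    A channel wakes when something comes closer than the threshold and
    gets louder as it approaches; with nothing in range all channels are
    silent, so the installation lies dormant. Threshold is illustrative.
    """
    volumes = []
    for d in distances_cm:
        if d >= active_below_cm:
            volumes.append(0.0)
        else:
            volumes.append(1.0 - d / active_below_cm)
    return volumes
```

Each returned volume fades one site's audio and video layer in or out, so a visitor standing by one sensor foregrounds that site's material.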
I was then invited to go to Johannesburg with ISIS Arts and The Trinity Session along with Lucy Carolan, Richard Glynn, Andrea Masala, Marek Gabrysch and Charlotte Gregory to produce work for the Superdream event in Jeppe Park as part of the British Council’s Connect ZA programme.
For this event I presented a redesigned version of my work for Windmill Hills, using material gathered from Jeppe Park, working in collaboration with South African artists Em Kay and KMSN, and re-coding the installation using Max/MSP/Jitter and the NewPing library. The four sensors were nailed to a tree in Jeppe Park and participants were invited to interact with the work as they pleased, again creating their own experience of the artwork, which lay dormant until interaction was detected.
For this event I also created a piece based around a table football table, sourced from a workshop next to the park and close to a state of disrepair. With the help of Marek Gabrysch the table was restored to its full working condition, and I augmented it for sound with sensors in the goals and small buffers next to them. Using a small selection panel on the side of the table, players could choose sounds which were triggered by hitting the buffers, and a random ‘Goal!’ sound was triggered whenever a goal was scored.
This installation was created using an Arduino Mega 2560 and a Raspberry Pi running Stanford University’s Satellite CCRMA operating system. I designed a Pd-extended patch interfacing with the Arduino via Firmata, run on the headless Raspberry Pi, which handled the sound. It was very much a prototype – a bespoke installation for the park.
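The event-to-sample logic the Pd patch implemented can be sketched in a few lines. Event names and sample lists below are hypothetical stand-ins for illustration:

```python
import random

def table_event_sound(event, selected_sound, goal_sounds, rng=random):
    """Choose which sample to trigger for a table-football event.

    Hitting a buffer plays the sound currently chosen on the side
    panel; scoring a goal plays a random 'Goal!' sample; anything else
    is ignored. A toy model of the Pd patch's routing, not its code.
    """
    if event == "goal":
        return rng.choice(goal_sounds)
    if event == "buffer":
        return selected_sound
    return None
```

In the installation the Arduino reported the goal and buffer sensor hits over Firmata, and the patch played the chosen sample on the Pi's audio output.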
The installation was very popular with the local children and thanks to some fantastic refereeing by one of the South African artists there were quite a few hotly contested games!
This video serves as documentation of a composition I wrote with Dario Lozano-Thornton for sine waves and computer-controlled strobe. Aether is a nine-minute audiovisual composition, with a score for an Arduino-based variable-rate, variable-intensity strobe, composed in response to Dario’s composition for pure tones.
The piece reflects on the demanding perceptual experiences that can be extracted from extremely simple conditions – pure frequencies and binary visual states. Due to their nature and spectral placement, the audio frequencies appear at certain points to be distorting, or coming from multiple sources; this is not the case, it is simply our ears’ inability to accurately represent that data. Similarly with the strobe light, it is difficult for our eyes (and cameras) to accurately represent binary states past certain frequencies, so visual hallucination and distortion can occur. Unfortunately, this composition is difficult to represent accurately on camera due to the extremely fast strobes.
The video is from a showing of Aether at Broadacre House, at the Function event. The composition was adapted for two flashing laptop screens for this event.
Supported by Sound and Music, Arborescent commissioned three graphic scores from Richard Dawson, Yvette Hawkins and Mawson Kerr, inspired by legendary composer Iannis Xenakis.
Bringing together artists from multiple disciplines, the commissioned artworks were performed by us according to our own interpretations and the artists’ directions.
Sean Cotterill – Violin & Electronics
John Pope – Double Bass
Mariam Rezaei – Electronics, Tar