‘This work builds on a number of previous pieces using spatial and environmental sensing to re-present collected environmental data in a dynamic and immersive way.
This piece is centred on a pendulum with a DIY electronic weight containing a three-axis accelerometer and a digital light sensor, tracking directional motion and ambient light levels. When the pendulum is swung, three streams of audio, video and light are activated according to the speed, angle and orientation of the pendulum. The audio and video channels are three sound recordings (indoor, outdoor and electromagnetic) and three video recordings (timelapse footage of transport, foot traffic and cloud movement in the sky) taken from around Newcastle, and the LED lighting responds directly to the X, Y and Z axes, mapping them to corresponding Red, Green and Blue values.
By moving the pendulum weight, these situations can be interacted with in a fleeting, exploratory manner. Without any interaction the installation lies dormant, waiting for energy to be added by a passer-by.
There is a fourth channel of video, audio and light which can be activated by shining a strong light directly onto the light-sensitive part of the weight (phone LED torches work particularly well).’
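For anyone curious about the nuts and bolts, the X/Y/Z-to-RGB mapping described in the quoted text could look something like this. This is a minimal sketch in plain C++; the function names and the scaling are my own illustrative assumptions, not the exact code from the piece, though it does assume the MPU-6050's usual signed 16-bit axis readings:

```cpp
#include <cstdint>

// Map one signed 16-bit accelerometer axis reading (-32768..32767,
// the MPU-6050's raw count range) onto an 8-bit LED channel (0..255).
uint8_t axisToChannel(int16_t raw) {
    // Shift the signed range up to 0..65535, then scale down to 0..255.
    int32_t shifted = static_cast<int32_t>(raw) + 32768;
    return static_cast<uint8_t>(shifted >> 8);
}

struct Rgb { uint8_t r, g, b; };

// X, Y and Z each drive one colour channel: Red, Green and Blue.
Rgb motionToColour(int16_t x, int16_t y, int16_t z) {
    return { axisToChannel(x), axisToChannel(y), axisToChannel(z) };
}
```

With this mapping, a pendulum at rest sits near mid-grey and any swing pushes the colour towards the face of the RGB cube matching its direction.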
This interactive piece was produced for Square One, and the video shows both raw footage from the installation and footage of myself and others interacting with the ‘pendulum’. It proved a popular installation, with people really enjoying interacting with it in a very direct way. I purposely designed the installation to be as responsive and durable as possible, which led to bursts of colour coming from the installation space throughout the event as people shook, swung and spun the pendulum around.
The pendulum was composed of an Arduino Nano interfacing with an MPU-6050 accelerometer and a BH1750 light sensor, transmitting the data wirelessly via an nRF24L01 radio transceiver. The signal was picked up by a second nRF24L01 on a custom-built Arduino ‘radio shield’, and the data was read and parsed in Max MSP, which handled the sound, light and video.
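One practical detail of this pipeline: the nRF24L01 carries at most 32 bytes per payload, so all of the pendulum's readings fit comfortably into a single fixed-layout packet per transmission. Here is a guess at what that packet might look like; the struct and field names are mine, not the actual sketch:

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical payload for one radio transmission from the pendulum.
// Three accelerometer axes plus the light level fit in 8 bytes, well
// under the nRF24L01's 32-byte payload limit.
struct PendulumPacket {
    int16_t  ax, ay, az;  // MPU-6050 accelerometer counts
    uint16_t lux;         // BH1750 light level
};

// Serialise for something like radio.write(&buf, sizeof buf) on the
// transmitter side (assumes both ends share the same endianness,
// which holds when both are AVR Arduinos)...
void packPacket(const PendulumPacket& p, uint8_t out[sizeof(PendulumPacket)]) {
    std::memcpy(out, &p, sizeof p);
}

// ...and recover the values on the receiving 'radio shield' before
// forwarding them over serial to Max MSP.
PendulumPacket unpackPacket(const uint8_t in[sizeof(PendulumPacket)]) {
    PendulumPacket p;
    std::memcpy(&p, in, sizeof p);
    return p;
}
```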
Photos and event by Josh Borom and Matt Pickering
This is the video (taken by Harry Wheeler) from the TUSK Archive of the first of three performances by RE/CEPTOR at Blank Studios.
RE/CEPTOR was a bespoke project for TUSK Festival comprising the long-established duo Trans/Human (with whom I’ve worked before), contemporary dancer and movement artist Nicole Vivien Watson, and myself.
The project is a multifaceted one. The three performances for TUSK Festival involved myself and Nicole performing at Blank Studios, with Trans/Human performing in Marseille concurrently. The audio from Trans/Human’s performance was streamed live to Blank Studios via Nicecast, an OS X internet radio streaming application, and was then fed into the performance setup at Blank Studios (more on this below). The performances used four-speaker ‘surround’ sound.
The main focal point of the performances was Nicole and her movement. I designed a custom wireless sensor array for her to wear throughout the performance, tracking light, body capacitance, temperature and movement. The sensors were attached to Nicole at relevant points (accelerometers on the hands, body capacitance on the fingers, and so on) and read by an Arduino Nano, which collected the data and fed it wirelessly, via an inexpensive nRF24L01 chip, to a second Arduino connected to my laptop, for which I created a shield housing another nRF24L01. The data Nicole was creating was received by this second Arduino and read by Max 6, and it formed a crucial aspect of the performance as a whole.
Here are some photos of the prototyping, building and wearing of the sensor array.
Top to bottom: Prototyped version of the sensor array, the Arduino and radio chip, the completed array, and one of Nicole’s hands showing an accelerometer and body capacitance sensors.
Once the data was in Max 6 it was used to control a number of aspects of the performance. Nicole’s movement controlled effects on the stereo feed from Marseille, changing filter poles and adding delay and decay to the feed. This effected stereo feed was played on two of the four speakers in the room, the ones in the back two corners. The other two speakers carried two granular synthesisers, one controlled by Nicole’s movement and one controlled by me. Nicole’s body temperature and body capacitance were also summed to create a low-frequency sine-wave drone played on all four speakers. Nicole’s data additionally controlled the strobe light, via DMX, which was the main source of light in the space. For the first performance (the one in the video) the strobe frequency was a summation of body temperature, light and body capacitance; after discussion with Nicole, however, we felt that this obfuscated the relationship between her and the strobe, so for the other two performances we changed the program so that the strobe was directly controlled by the position of one of her hands, which was much more effective.
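The summing used for the drone is simple to sketch. In the following C++ fragment the two sensor values are assumed to be normalised to 0..1 before summing, and the frequency range is illustrative; the real mapping lived inside the Max 6 patch and may have used different weights and limits:

```cpp
// Sum body temperature and body capacitance (both assumed normalised
// to 0..1) into the frequency of a low sine-wave drone. The 30-80 Hz
// range here is an assumption for illustration, not the patch's values.
double droneFrequency(double temperature, double capacitance,
                      double minHz = 30.0, double maxHz = 80.0) {
    double sum = (temperature + capacitance) * 0.5;  // average, 0..1
    if (sum < 0.0) sum = 0.0;                        // clamp out-of-range input
    if (sum > 1.0) sum = 1.0;
    return minHz + sum * (maxHz - minHz);
}
```

The appeal of a summed control like this is that no single sensor dominates: the drone drifts with the performer's overall state rather than tracking any one gesture.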
All of this was tied together for me by a TouchOSC interface, which gave me control over parameters that did not lend themselves to control by movement (unstable ones such as filter resonance, et cetera), as well as over one of the granular synthesisers, allowing a kind of synthesiser duet with Nicole.
The result was four channels of sound, two coming from Marseille and two from the room, but all dependent on Nicole’s movements, locking all performers to a central groove. Trans/Human could not themselves hear what was going on in Blank Studios, but in Blank we were very much improvising with their material. Being able to mesh all of us in this way was perfect: Trans/Human’s tour and the audio they collected and performed, and my building/patching/sound generation mediated through Nicole’s movement. The binary state between darkness and light, combined with full-bodied four-channel sound, created an immersive experience and environment for both performers and audience, and yielded, to my mind, three distinct, varied performances.
Thanks to Lee Etherington, Harry Wheeler and Hugs Bison for organisation and documentation, and thanks to the folks at Blank Studios for being immensely helpful in accommodating our audio/internet needs.
Nicole is the creative director of Surface Area Dance Theatre –
Trans/Human is Adam Denton and Luke Twyman –
Over the past few months I have been involved in the Superdream project, coordinated by ISIS Arts, The Trinity Session, Sticky Situations, The Swallows Foundation, Gateshead Council and others to produce art for public spaces in both Gateshead and Jeppestown, Johannesburg, and to create links between UK and South African artists.
For the Superdream project I produced digital media art for two events, one in Windmill Hills Park, Gateshead and one in Jeppe Park, Johannesburg. Both areas were ‘undeveloped’, and one of the aims of the interventions in these spaces was to redefine public space through art, technology, innovation and a sense of community.
In Windmill Hills on May 2nd 2014 I presented an interactive audiovisual work, in stages of development.
‘Polyfields’ used four HC-SR04 ultrasonic distance sensors on an Arduino Mega 2560, programmed in the Arduino IDE, to output the distance of objects from the sensors; these distances were then used by Pd-extended as the controls for a four-channel audio piece and a four-channel video piece.
The audio and video material was gathered from four sites around the park, the audio using conventional microphones as well as contact microphones and wall sensor modules, with each distance sensor representing one ‘site’.
In this way the participants (purposefully or accidentally) would introduce ‘channels’ of video and audio, to create their own experience of this supernormal locus of identity.
I was then invited to go to Johannesburg with ISIS Arts and The Trinity Session, along with Lucy Carolan, Richard Glynn, Andrea Masala, Marek Gabrysch and Charlotte Gregory, to produce work for the Superdream event in Jeppe Park as part of the British Council’s Connect ZA programme.
For this event I presented a redesigned version of my work for Windmill Hills, using material gathered from Jeppe Park, working in collaboration with South African artists Em Kay and KMSN, and re-coded the installation using Max/MSP/Jitter and the NewPing library. The four sensors were nailed to a tree in Jeppe Park and participants were invited to interact with the piece as they pleased, again creating their own experience of the artwork, which lay dormant until interaction was detected.
For this event I also created a piece based around a table-football table, sourced from a workshop next to the park, which was close to a state of disrepair. With the help of Marek Gabrysch the table was restored to its full working condition, and I augmented it with sensors in the goals, as well as small buffers next to the goals, wiring the table for sound. Using a small selection panel on the side of the table, players could select sounds that were triggered by hitting the buffers, and a random ‘Goal!’ sound was triggered when a goal was scored.
This installation was created using an Arduino Mega 2560 and a Raspberry Pi running Stanford University’s Satellite CCRMA operating system. I designed a patch in Pd-extended, interfacing with the Arduino via Firmata, which ran on the headless Raspberry Pi and handled the sound. It was very much a prototype, a bespoke installation for the park.
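One detail worth noting with goal sensors on a fast game like table football is debouncing: a single ball crossing the goal can register as several sensor hits in quick succession, and without a minimum gap between triggers one goal would fire a burst of ‘Goal!’ sounds. A hypothetical sketch of that logic in C++ (the real version lived in the Pd-extended patch; the interval and names here are my own):

```cpp
// Debounce a goal sensor: a hit only counts as a new goal if at least
// minGapMs milliseconds have passed since the last accepted hit.
struct GoalDebounce {
    unsigned long lastTriggerMs = 0;  // time of the last accepted goal
    unsigned long minGapMs;           // required quiet interval

    explicit GoalDebounce(unsigned long gap) : minGapMs(gap) {}

    // Returns true when a sensor hit at time nowMs should count as a goal
    // (and trigger a randomly chosen 'Goal!' sound).
    bool registerHit(unsigned long nowMs) {
        if (lastTriggerMs != 0 && nowMs - lastTriggerMs < minGapMs)
            return false;  // too soon: same ball, ignore
        lastTriggerMs = nowMs;
        return true;
    }
};
```

On the Arduino side, `nowMs` would simply be the value of `millis()` at the moment the goal sensor fired.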
The installation was very popular with the local children and thanks to some fantastic refereeing by one of the South African artists there were quite a few hotly contested games!