In March 2015 I was part of a collaborative commission for NICAP, alongside Helen Shaddock, Ed Wainwright, Corbin Wood, Jack Lines and Mags Margetts, to produce a piece of work responding to Newcastle University’s Hatton Gallery collection of Victor Pasmore’s work, in particular his seminal spatial work ‘An Exhibit’.
This group commission produced an interactive audiovisual response to Pasmore’s work, informed by the disciplines of each person involved, including music, digital interactivity, architecture, visual art, creative coding and fabrication.
‘An Exhibit’ comprised a number of coloured plexiglass sheets hung in the Hatton Gallery, creating the effect of coloured windows that filtered observers’ view of the space, much like a filter on a photograph. Because there were several of these plexiglass sheets, visitors could move around the gallery and experiment with how the sheets’ physical placement changed what they observed through these filters. Although the work was produced in 1957, it had a strong focus on interactivity, and this was the driving force for our response, ‘A Third Exhibit’.
‘A Third Exhibit’ reinterpreted these plexiglass sheets as weighted cloth sheets, suspended in the air of the main Hatton Gallery space by helium balloons and free to drift on the air turbulence created by foot traffic throughout the exhibition. The cloth sheets were lit by eight DMX lights, controlled from my laptop via two distance-sensing ‘nodes’ placed at central points in the gallery and triggered by the movements of visitors. Activating the nodes also triggered a musical piece composed by Corbin Wood from sound recordings taken throughout the collaborative process.
My responsibility for the work was designing the interactive system that turned the movement of visitors into changes in lighting through the space. The distance-sensing ‘nodes’ took the form of camera tripods, each mounted with an Arduino Mega board hosting a ring of eight HC-SR04 ultrasonic distance sensors. Each node fed its readings back to a laptop over a serial connection, which was read in Pd-extended. Pd-extended processed the distance information gathered by the sensors into DMX values, sent to the eight lights in the room via Q Light Controller and a USB-DMX interface, and also controlled the volume of a stereo sound piece. I developed this interactive system on Linux.
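To illustrate the kind of processing the nodes performed, here is a minimal sketch, written in plain C++ rather than the original Pd-extended patch, of how an HC-SR04 echo pulse might be converted to a distance and then scaled into an 8-bit DMX value. The range limits and the near-is-brighter mapping are illustrative assumptions, not the values used in the installation.

```cpp
#include <algorithm>
#include <cstdint>

// Convert an HC-SR04 echo pulse width (microseconds) to distance in cm.
// Sound travels roughly 0.0343 cm per microsecond, and the echo pulse
// covers the round trip, so we halve the result.
double pulseToCm(double pulseUs) {
    return pulseUs * 0.0343 / 2.0;
}

// Map a distance reading to an 8-bit DMX channel value: the closer a
// visitor stands to the node, the higher the value. minCm and maxCm
// are assumed working limits, not the installation's actual settings.
uint8_t distanceToDmx(double cm, double minCm = 20.0, double maxCm = 300.0) {
    double clamped = std::min(std::max(cm, minCm), maxCm);
    double norm = (maxCm - clamped) / (maxCm - minCm);  // 1.0 near, 0.0 far
    return static_cast<uint8_t>(norm * 255.0 + 0.5);
}
```

In the real system this scaling lived inside Pd-extended; the sketch only shows the shape of the sensor-to-DMX transformation.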
The effect of the interactive system and the shifting cloth sheets created an evolving, generative artwork which was directly (distance to nodes changing the colour of lights) as well as indirectly (air turbulence gradually changing the position of sheets) controlled by the visitors to the gallery. The eight lights in different colours shining at different angles on different cloth sheets produced a number of subtle colour changes throughout the room, with colours regularly being intersected and changed by the cloth sheets.
Visitors responded very well to the interactivity in the piece, keen to see how their movements would change the colours in the space and to experience the work through discovery and participation.
‘A Third Exhibit’ was shown for The Late Shows 2015.
Photography by Corbin Wood and Helen Shaddock.
‘This work builds on a number of previous pieces using spatial and environmental sensing to re-present collected environmental data in a dynamic and immersive way.
This piece is centred around a pendulum with a DIY electronic weight containing a three-axis accelerometer and a digital light sensor, tracking directional motion and ambient light levels. When the pendulum is swung, three streams of audio, video and light are activated according to the speed, angle and orientation of the pendulum. The audio and video channels are three sound recordings (indoor, outdoor and electromagnetic) and three video recordings (timelapse footage of transport, foot traffic and cloud movement in the sky) taken from around Newcastle, and the LED lighting responds directly to the X, Y and Z axes, mapping them to corresponding Red, Green and Blue values.
By moving the pendulum weight these situations can be interacted with in a fleeting, exploratory manner. Without any interaction the installation will lie dormant, waiting for energy to be added by a passer by.
There is a fourth channel of video, audio and light which can be activated by shining a strong light directly onto the light-sensitive part of the weight (phone LED torches work particularly well).’
This interactive piece was produced for Square One, and the video shows both raw footage from the installation and footage of myself and others interacting with the ‘pendulum’. It proved a popular installation, with people enjoying interacting with it in a very direct way. I purposely designed the installation to be as responsive and durable as possible, which led to bursts of colour coming from the installation space throughout the event as people shook, swung and spun the pendulum around.
The pendulum weight housed an Arduino Nano interfaced with an MPU-6050 accelerometer and a BH1750 light sensor, transmitting its data wirelessly via an nRF24L01 radio transceiver. A second nRF24L01, on a custom-built Arduino ‘radio shield’, received the data, which was then read and parsed in Max MSP to drive the sound, light and video.
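As a rough illustration of the X/Y/Z-to-RGB mapping described above, here is a minimal C++ sketch. The scaling assumes raw 16-bit MPU-6050 readings and a magnitude-to-brightness mapping; both are assumptions for illustration, not the actual Max MSP patch.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Map one raw 16-bit accelerometer axis reading (as an MPU-6050
// produces) to an 8-bit colour channel. The magnitude of the motion
// drives brightness, so a resting pendulum fades towards black and a
// swinging one bursts into colour.
uint8_t axisToColour(int16_t raw) {
    double norm = std::min(std::abs(static_cast<double>(raw)) / 32768.0, 1.0);
    return static_cast<uint8_t>(norm * 255.0 + 0.5);
}

struct Rgb { uint8_t r, g, b; };

// X, Y and Z become Red, Green and Blue, as in the installation.
Rgb accelToRgb(int16_t x, int16_t y, int16_t z) {
    return { axisToColour(x), axisToColour(y), axisToColour(z) };
}
```

A hard shake saturating one axis pushes its colour channel to full, which is one way the "bursts of colour" behaviour could be produced.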
Photos and event by Josh Borom and Matt Pickering.
This video serves as documentation of a composition I wrote with Dario Lozano-Thornton for sine waves and computer-controlled strobe. Aether is a nine-minute audiovisual composition; its score, for an Arduino-based variable-rate and variable-intensity strobe, was composed in response to a composition for pure tones by Dario.
The piece reflects on the demanding perceptual experiences extracted from extremely simple conditions: pure frequencies and binary visual states. Due to their nature and spectral placement, the audio frequencies at certain points appear to distort, or to come from multiple sources; this is not the case, but simply our ears’ inability to accurately resolve that material. Similarly with the strobe light, it is difficult for our eyes (and cameras) to accurately register binary states past certain frequencies, so visual hallucination and distortion can occur. Unfortunately, this also means the composition is difficult to represent accurately on camera due to the extremely fast strobes.
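To give a sense of how a variable-rate, variable-intensity strobe score can be realised in timing terms, here is a minimal C++ sketch. The `StrobeStep` structure and the duty-cycle scheme are hypothetical, for illustration only, and not the actual Arduino code used for Aether.

```cpp
#include <cstdint>

// One step of a strobe "score": a flash rate in Hz and an intensity
// expressed as a duty cycle (the fraction of each period the lamp is on).
struct StrobeStep {
    double rateHz;
    double duty;  // 0.0 to 1.0
};

// Convert a step into on/off durations in microseconds, the kind of
// values an Arduino sketch would pass to delayMicroseconds() between
// toggling the lamp pin.
void stepToMicros(const StrobeStep& s, uint32_t& onUs, uint32_t& offUs) {
    double periodUs = 1e6 / s.rateHz;
    onUs = static_cast<uint32_t>(periodUs * s.duty);
    offUs = static_cast<uint32_t>(periodUs - onUs);
}
```

Raising the rate past the flicker-fusion threshold while lowering the duty cycle is one way such a binary light source can start to read as dimming rather than flashing, which connects to the perceptual ambiguity the piece explores.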
The video is from a showing of Aether at Broadacre House, at the Function event. The composition was adapted for two flashing laptop screens for this event.
Feric is an umbrella name under which I create various sound and visual art works, ranging from tactile contact-microphone instruments through to site-specific generative noise pieces.
I have various one-off audio/visual projects under this name, and I am continually producing new works.
A guerrilla site-specific piece of sound art based on a bespoke semi-generative feedback construction created in Ableton. Source material derived from laptop microphone recordings in Newcastle City Library.
Tactile electroacoustic improvisation created from several pre-recorded object samples fed into a generative music setup in Ableton. Manipulated through generations of processing into forced feedback.
Played on zoviet*france’s A Duck In A Tree, 27 April 2013.
A Feric audiovisual work comprising several pieces of artwork and two pieces of music composed in mid-2013; this video was created on 9 March 2014.
https://soundcloud.com/feric-1/sets/table-music (Unfortunately I cannot embed this SoundCloud link)
A trial of a tactile contact-microphone instrument. ‘Table Music’ is the precursor to Spawn, Thot and Ghast: a table with attached contact microphones, used with objects to become a tactile, malleable instrument.