Archive | Sound Art

A Third Exhibit, NICAP Commission

http://www.thelateshows.org.uk/home.html

In March 2015 I was part of a collaborative commission for NICAP alongside Helen Shaddock, Ed Wainwright, Corbin Wood, Jack Lines and Mags Margetts to produce a piece of work responding to Newcastle University’s Hatton Gallery collection of Victor Pasmore’s work, in particular focusing on his seminal spatial work ‘An Exhibit’.

http://moussemagazine.it/taac1-b/

This group commission produced an interactive audiovisual response to Pasmore’s work, informed by the disciplines of each person involved, including music, digital interactivity, architecture, visual art, creative coding and fabrication.

‘An Exhibit’ comprised a number of coloured plexiglass sheets hung in the Hatton Gallery, creating the effect of coloured windows that filtered observers’ view of the space, much like a filter on a photograph. Because there were several of these sheets, visitors could move around the gallery and experiment with how their own position changed what they saw through the filters. Although the work was produced in 1957, it had a strong focus on interactivity, and this was the driving force for our response, ‘A Third Exhibit’.

‘A Third Exhibit’ took inspiration from these plexiglass sheets: weighted cloth sheets were suspended in the air of the main Hatton Gallery space from helium balloons, drifting freely around the space on the air turbulence created by foot traffic during the exhibition. The cloth sheets were lit by eight DMX lights, controlled from my laptop via two distance-sensing ‘nodes’ placed at central points of the gallery and triggered by the movements of visitors. Activation of these nodes also triggered a musical piece composed by Corbin Wood from sound recordings made throughout the collaborative process.

My responsibility for the work was designing the interactive system which turned the movement of visitors into changes in lighting throughout the space. The distance-sensing ‘nodes’ took the form of camera tripods, each carrying an Arduino Mega board hosting a ring of eight HC-SR04 ultrasonic distance sensors. Each node fed its readings back to a laptop over a serial connection, which was read by Pd-extended. The patch processed the distance information into DMX values, which were sent to the eight lights in the room using Q Light Controller and a USB-DMX interface, and also controlled the volume of a stereo sound piece. I developed the whole interactive system on Linux.
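To give a sense of how such a node might be read, here is a minimal Arduino sketch that polls a ring of eight HC-SR04 sensors and prints one line of distances per cycle over serial for a Pd patch to parse. The pin assignments and message format are illustrative assumptions, not the original code.

```cpp
// Minimal sketch for one distance-sensing node: an Arduino Mega polling a ring
// of eight HC-SR04 sensors and printing one line of distances per cycle over
// Serial. Pin numbers and output format are illustrative only.

const int NUM_SENSORS = 8;
const int TRIG_PINS[NUM_SENSORS] = {22, 24, 26, 28, 30, 32, 34, 36};
const int ECHO_PINS[NUM_SENSORS] = {23, 25, 27, 29, 31, 33, 35, 37};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_SENSORS; i++) {
    pinMode(TRIG_PINS[i], OUTPUT);
    pinMode(ECHO_PINS[i], INPUT);
  }
}

// Trigger one HC-SR04 and return the measured distance in centimetres
// (0 if no echo returns within the timeout).
long readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);   // 10 us pulse starts a measurement
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 25000);  // timeout ~4 m round trip
  return duration / 58;          // speed of sound: ~58 us per cm, round trip
}

void loop() {
  // One space-separated line per polling cycle, e.g. "112 87 0 64 230 41 19 75"
  for (int i = 0; i < NUM_SENSORS; i++) {
    Serial.print(readDistanceCm(TRIG_PINS[i], ECHO_PINS[i]));
    Serial.print(i < NUM_SENSORS - 1 ? ' ' : '\n');
  }
  delay(50);  // keep the sensors from interfering with one another
}
```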

The interactive system and the shifting cloth sheets together created an evolving, generative artwork which was both directly (distance to the nodes changing the colour of the lights) and indirectly (air turbulence gradually changing the position of the sheets) controlled by visitors to the gallery. The eight lights, in different colours and shining at different angles onto different cloth sheets, produced subtle colour changes throughout the room, with the colours regularly intersected and altered by the sheets.

Visitors responded very well to the interactivity in the piece, keen to see how their movements would change the colours in the space and to experience the work through discovery and participation.

‘A Third Exhibit’ was shown for The Late Shows 2015
Photography by Corbin Wood and Helen Shaddock.

SWINGME @ Square One, 12th March 2015

‘This work builds on a number of previous pieces using spatial and environmental sensing to re-present collected environmental data in a dynamic and immersive way.

This piece is centred around a pendulum with a DIY electronic weight containing a three-axis accelerometer and a digital light sensor, tracking directional motion and ambient light levels. When the pendulum is swung, three streams of audio, video and light are activated according to the speed, angle and orientation of the pendulum. The audio and video channels are three sound recordings (indoor, outdoor and electromagnetic) and three video recordings (timelapse footage of transport, foot traffic and cloud movement in the sky) taken from around Newcastle, and the LED lighting responds very directly to the X, Y and Z axes by mapping them to corresponding red, green and blue values.

By moving the pendulum weight these situations can be interacted with in a fleeting, exploratory manner. Without any interaction the installation will lie dormant, waiting for energy to be added by a passer by.

There is a fourth channel of video, audio and light which can be activated by shining a strong light directly onto the light-sensitive part of the weight (Phone LED torches work particularly well).’

This interactive piece was produced for Square One, and the video shows both raw footage from the installation and footage of myself and others interacting with the ‘pendulum’. It proved a popular installation, with people really enjoying interacting with it in a very direct way. I purposely designed it to be as responsive and durable as possible, which led to bursts of colour coming from the installation space throughout the event as people shook, swung and spun the pendulum around.

The pendulum itself was an Arduino Nano interfaced with an MPU-6050 accelerometer and a BH1750 light sensor, transmitting its data wirelessly over an nRF24L01 radio transceiver. The signal was picked up by another nRF24L01 on a custom-built Arduino ‘radio shield’, and the data was read and parsed in Max MSP, which handled the sound, light and video.
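As a rough illustration of the transmitter side, here is a hypothetical Arduino Nano sketch reading the MPU-6050 and BH1750 over I2C and broadcasting a small packet with the widely used RF24 library. The wiring, addresses and packet layout are assumptions, not the code used in the piece.

```cpp
// Hypothetical transmitter sketch for the pendulum weight: read accelerometer
// and light level over I2C, broadcast them over an nRF24L01 (RF24 library).

#include <SPI.h>
#include <Wire.h>
#include <RF24.h>

RF24 radio(9, 10);                    // CE, CSN pins (assumed wiring)
const byte PIPE_ADDRESS[6] = "PENDL"; // assumed pipe address

struct Packet {                       // 8 bytes: 3 accel axes + light level
  int16_t ax, ay, az;
  uint16_t lux;
};

void setup() {
  Wire.begin();

  Wire.beginTransmission(0x68);       // wake the MPU-6050
  Wire.write(0x6B);                   // PWR_MGMT_1 register
  Wire.write(0);
  Wire.endTransmission();

  Wire.beginTransmission(0x23);       // BH1750: continuous high-res mode
  Wire.write(0x10);
  Wire.endTransmission();

  radio.begin();
  radio.openWritingPipe(PIPE_ADDRESS);
  radio.stopListening();              // this node only transmits
}

void loop() {
  Packet p;

  Wire.beginTransmission(0x68);       // read raw accelerometer X, Y, Z
  Wire.write(0x3B);                   // ACCEL_XOUT_H register
  Wire.endTransmission(false);
  Wire.requestFrom(0x68, 6);
  p.ax = (Wire.read() << 8) | Wire.read();
  p.ay = (Wire.read() << 8) | Wire.read();
  p.az = (Wire.read() << 8) | Wire.read();

  Wire.requestFrom(0x23, 2);          // read ambient light level
  p.lux = (Wire.read() << 8) | Wire.read();

  radio.write(&p, sizeof(p));         // fire-and-forget to the receiver shield
  delay(20);                          // ~50 packets per second
}
```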


Photos and event by Josh Borom and Matt Pickering

ZENDEH Micro Award – Abadan 10-06

For the ZENDEH Micro Award I was commissioned by a panel of young people to produce pieces of work for an event at the Literary and Philosophical Society based on the script for Zendeh’s new piece CINEMA, a personal and intimate play surrounding the tragic Cinema Rex fire in Abadan, Iran.

The event showcased installations by the artists who received the award, as well as excerpts from CINEMA, poetry readings, an Iranian tar recital and public workshops and activities also hosted by the artists.

The video above is the culmination of the workshop that I held for the ZENDEH Micro Award. I hosted a ten-person workshop using small, self-contained synthesizers with speakers, built on breadboards around 555 timers and LM386 audio amplifiers. Each synthesizer had one potentiometer to control frequency, which simply swept from low to high. Over the course of the workshop I introduced the participants to the synthesizers, and guided them through customising the frequency range to their liking by switching out capacitors in the circuit.
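For reference, the output frequency of a 555 timer in astable mode is f = 1.44 / ((R1 + 2*R2) * C), which is why swapping the timing capacitor shifts the whole range of the synth. The short sketch below works through two example capacitor values; the component values are illustrative, not the workshop circuit.

```cpp
// Back-of-envelope check of how swapping the timing capacitor shifts the
// synth's range, using the standard 555 astable formula
// f = 1.44 / ((R1 + 2*R2) * C). Component values are examples only.

double astableFrequencyHz(double r1Ohms, double r2Ohms, double cFarads) {
  return 1.44 / ((r1Ohms + 2.0 * r2Ohms) * cFarads);
}

void setup() {
  Serial.begin(9600);
  double r1 = 1000.0;      // 1 kOhm fixed resistor
  double r2 = 50000.0;     // 100 kOhm frequency pot at its midpoint
  Serial.println(astableFrequencyHz(r1, r2, 100e-9)); // 100 nF cap: ~143 Hz
  Serial.println(astableFrequencyHz(r1, r2, 10e-9));  //  10 nF cap: ~1.4 kHz
}

void loop() {}
```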

Once everyone had customised their instrument, I introduced the graphic score for the performance – a graph of the population of Abadan, Iran, the site of the Cinema Rex catastrophe. The idea was to use the graph as an inspiration rather than as a dogmatic 1:1 score that had to be obeyed; the synthesizers were then played in a five-minute performance guided by the graphic score.

The workshop was designed to be simple and open, with no obligation at any point to follow fixed rules surrounding the performance. The result was an engaging performance which everyone enjoyed, and an incredible sound in the rich acoustics of the James Knott room at the Lit and Phil. The workshop participants took their synthesizers home, along with information about the construction of the circuit and further modifications that were possible.

The other part of my work for the ZENDEH Micro Award was a proximity-activated four-channel sound and light installation, a personal response to the script and to the Cinema Rex fire, which responded dynamically to the interaction of participants and observers, creating a personal, immersive and interactive experience. Documentation of this piece is forthcoming.

Thanks to ZENDEH for curating, commissioning and organising the event, and Matt Jamie for the video/audio documentation.

RE/CEPTOR – TUSK Festival 2014


This is the video (taken by Harry Wheeler) from the TUSK Archive of the first of three performances by RE/CEPTOR at Blank Studios.

RE/CEPTOR was a bespoke project for TUSK Festival comprising the long-established duo Trans/Human (with whom I’ve worked before), contemporary dancer and movement artist Nicole Vivien Watson, and myself.

The project is a multifaceted one. The three performances for TUSK Festival involved myself and Nicole performing at Blank Studios, with Trans/Human performing concurrently in Marseille. The audio from Trans/Human’s performance was streamed live to Blank Studios via Nicecast, an OS X internet radio streaming application, and fed into the performance setup at Blank Studios (more on this below). The performances ran on a four-speaker ‘surround’ sound system.

The main focal point for both of the performances was Nicole and her movement. I designed a custom wireless sensor array for her to wear throughout the performance, tracking light, body capacitance, temperature and movement. The sensors were attached to Nicole at relevant points (hands for movement, fingers for body capacitance, and so on) and read by an Arduino Nano, which collected the data and transmitted it wirelessly via an inexpensive nRF24L01 chip to a second Arduino connected to my laptop, for which I built a shield housing another nRF24L01. The data received by this second Arduino was read into Max 6, and formed a crucial aspect of the performance as a whole.
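To illustrate the receiver side, here is a hypothetical sketch for the Arduino ‘radio shield’: it listens on an nRF24L01 via the RF24 library and relays each packet as a comma-separated line over serial for Max 6 to parse. The packet layout and pins are assumptions rather than the original code.

```cpp
// Hypothetical receiver sketch: listen on an nRF24L01 and relay each packet
// as one CSV line over Serial, where Max 6 can read it with [serial].

#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                    // CE, CSN pins (assumed wiring)
const byte PIPE_ADDRESS[6] = "NIC01"; // assumed pipe address

struct SensorPacket {
  int16_t ax, ay, az;                 // accelerometer (movement)
  uint16_t light;                     // light level
  uint16_t capacitance;               // body capacitance reading
  int16_t temperature;                // body temperature (tenths of a degree)
};

void setup() {
  Serial.begin(115200);
  radio.begin();
  radio.openReadingPipe(1, PIPE_ADDRESS);
  radio.startListening();
}

void loop() {
  if (radio.available()) {
    SensorPacket p;
    radio.read(&p, sizeof(p));

    // One CSV line per packet, e.g. "102,-340,16200,512,87,362"
    Serial.print(p.ax);          Serial.print(',');
    Serial.print(p.ay);          Serial.print(',');
    Serial.print(p.az);          Serial.print(',');
    Serial.print(p.light);       Serial.print(',');
    Serial.print(p.capacitance); Serial.print(',');
    Serial.println(p.temperature);
  }
}
```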

Here are some photos of the prototyping, building and wearing of the sensor array:


Top to bottom: Prototyped version of the sensor array, the Arduino and radio chip, the completed array, and one of Nicole’s hands showing an accelerometer and body capacitance sensors.

Once the data was in Max 6 it was used to control a number of aspects of the performance. Nicole’s movement controlled effects on the stereo feed from Marseille – changing filter poles and adding delay and decay. This effected stereo feed was played on two of the four speakers in the room, the ones in the back two corners. The other two speakers carried two granular synthesisers, one controlled by Nicole’s movement and one controlled by me. Nicole’s body temperature and body capacitance were also summed to create a low-frequency sine wave drone played on all four speakers. Nicole’s data also controlled, via DMX, the strobe light that was the main source of light in the space. For the first performance (the one in the video) the strobe frequency was a summation of body temperature, light and body capacitance; after discussion with Nicole, however, we felt that this obfuscated the relationship between her and the strobe, so for the other two performances we changed the program so that the strobe was directly controlled by the position of one of her hands, which was much more effective.
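The change in strobe mapping between the first and later performances amounted to something like the following, sketched here in plain C++ rather than the original Max 6 patch; the input ranges and DMX scaling are assumptions for illustration.

```cpp
#include <algorithm>
#include <cstdio>

// First performance: strobe rate as a weighted sum of three slow-moving,
// normalised (0..1) signals, which obscured cause and effect for the audience.
int strobeFromSummedSensors(float temperature, float light, float capacitance) {
  float mixed = 0.4f * temperature + 0.3f * light + 0.3f * capacitance;
  return std::clamp(static_cast<int>(mixed * 255.0f), 0, 255);  // DMX value
}

// Later performances: strobe rate driven directly by one hand's position,
// giving a legible one-to-one relationship between movement and light.
int strobeFromHandPosition(float handPosition) {                 // 0..1 input
  return std::clamp(static_cast<int>(handPosition * 255.0f), 0, 255);
}

int main() {
  std::printf("summed: %d\n", strobeFromSummedSensors(0.6f, 0.2f, 0.9f));
  std::printf("direct: %d\n", strobeFromHandPosition(0.75f));
  return 0;
}
```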

All of this was tied together for me by a TouchOSC interface, which gave me control over the parameters of what Nicole was affecting that did not lend themselves to control by movement (unstable parameters such as filter resonance et cetera), as well as control of one of the granular synthesisers, allowing a kind of synthesiser duet with Nicole.


The result was four channels of sound, two coming from Marseille and two from the room, all dependent on Nicole’s movements, locking all the performers to a central groove. Trans/Human could not hear what was going on in Blank Studios, but in Blank we were very much improvising with their material. Being able to mesh all of us in this way was perfect – Trans/Human’s tour and the audio they collected and performed, and my building, patching and sound generation, all mediated through Nicole’s movement. The binary state between darkness and light, combined with full-bodied four-channel sound, created an immersive experience and environment for both performers and audience, and yielded, to my mind, three distinct, varied performances.

Thanks to Lee Etherington, Harry Wheeler and Hugs Bison for organisation and documentation, and thanks to the folks at Blank Studios for being immensely helpful in accommodating our audio/internet needs.

Nicole is the creative director of Surface Area Dance Theatre –

http://www.surfacearea.org.uk/company.html

Trans/Human is Adam Denton and Luke Twyman –

http://transhuman.bandcamp.com/

http://imboredofbastards.tumblr.com/

http://dentone.tumblr.com/

Gold Lines Are Mineral Veins // Benjamin Freeth and Sean Cotterill


Superdream, Connect ZA, Gateshead, Johannesburg and Jeppe Park

http://isitok.net/superdream/

Over the past few months I have been involved in the Superdream project, coordinated by ISIS Arts, The Trinity Session, Sticky Situations, The Swallows Foundation, Gateshead Council and others to produce art for public spaces in both Gateshead and Jeppestown, Johannesburg, and to create links between UK and South African artists.

For the Superdream project I produced digital media art for two events, one in Windmill Hills Park, Gateshead and one in Jeppe Park, Johannesburg. Both areas were ‘undeveloped’, and one of the aims of the interventions in these spaces was to redefine public space through art, technology, innovation and a sense of community.

In Windmill Hills on May 2nd 2014 I presented an interactive audiovisual work, in stages of development.

‘Polyfields’ used four HC-SR04 ultrasonic distance sensors on an Arduino Mega 2560, programmed in the Arduino IDE to output the distance of objects from each sensor. These distances were read by Pd-extended and used as the controls for a four-channel audio piece and a four-channel video piece.

The audio was recorded at four sites around the park using conventional microphones as well as contact microphones and wall sensor modules, and video was shot at the same four sites, with each sensor representing one ‘site’.

In this way the participants (purposefully or accidentally) would introduce ‘channels’ of video and audio, to create their own experience of this supernormal locus of identity.


I was then invited to travel to Johannesburg with ISIS Arts and The Trinity Session, along with Lucy Carolan, Richard Glynn, Andrea Masala, Marek Gabrysch and Charlotte Gregory, to produce work for the Superdream event in Jeppe Park as part of the British Council’s Connect ZA programme.

For this event I presented a redesigned version of my work from Windmill Hills, using material gathered in Jeppe Park, working in collaboration with South African artists Em Kay and KMSN, and re-coding the installation using Max/MSP/Jitter and the NewPing library. The four sensors were nailed to a tree in Jeppe Park and participants were invited to interact with the piece as they pleased, again creating their own experience of the artwork, which lay dormant until interaction was detected.
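As an indication of what the re-coded sensing side might look like, here is an illustrative Arduino sketch using the NewPing library to poll the four sensors and send distances over serial for Max/MSP/Jitter; the pins and message format are assumptions, not the original code.

```cpp
// Illustrative sketch for the Jeppe Park version of 'Polyfields': poll four
// HC-SR04 sensors with NewPing and send the distances over Serial.

#include <NewPing.h>

const int NUM_SENSORS = 4;
const int MAX_DISTANCE_CM = 200;      // ignore anything beyond ~2 m

NewPing sonars[NUM_SENSORS] = {
  NewPing(22, 23, MAX_DISTANCE_CM),   // site 1 (trigger pin, echo pin)
  NewPing(24, 25, MAX_DISTANCE_CM),   // site 2
  NewPing(26, 27, MAX_DISTANCE_CM),   // site 3
  NewPing(28, 29, MAX_DISTANCE_CM)    // site 4
};

void setup() {
  Serial.begin(115200);
}

void loop() {
  // One space-separated line per cycle; 0 means no object in range,
  // i.e. that 'channel' of audio and video stays dormant.
  for (int i = 0; i < NUM_SENSORS; i++) {
    Serial.print(sonars[i].ping_cm());
    Serial.print(i < NUM_SENSORS - 1 ? ' ' : '\n');
  }
  delay(50);                          // let echoes die down between cycles
}
```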


For this event I also created a piece based around a table football table, sourced from a workshop next to the park, which was close to a state of disrepair. With the help of Marek Gabrysch the table was restored to its full working condition, and I augmented it with sensors in the goals and small buffers next to the goals to wire the table for sound. Using a small selection panel on the side of the table, players could select sounds which were triggered by hitting the buffers, and a random ‘Goal!’ sound was triggered whenever a goal was scored.

This installation was created using an Arduino Mega 2560 and a Raspberry Pi running Stanford University’s Satellite CCRMA operating system. I designed a patch in Pd-extended that interfaced with the Arduino via Firmata, and the patch ran on the headless Raspberry Pi, which handled the sound. It was very much a prototype – a bespoke installation for the park.

The installation was very popular with the local children and thanks to some fantastic refereeing by one of the South African artists there were quite a few hotly contested games!


Aether – ‘the upper pure, bright air’

This video serves as documentation of a composition I wrote with Dario Lozano-Thornton for sine waves and computer-controlled strobe. Aether is a nine-minute audiovisual composition, with a score for an Arduino-based variable-rate, variable-intensity strobe composed in response to Dario’s composition for pure tones.
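To give a sense of what an Arduino strobe score can look like, here is a minimal sketch stepping through a list of rate, intensity and duration events; the score values are placeholders, not the actual composition.

```cpp
// Minimal sketch of an Arduino strobe score: each event sets a flash rate,
// an intensity and a duration. The score values here are placeholders.

const int STROBE_PIN = 9;             // PWM-capable pin driving the strobe

struct ScoreEvent {
  float rateHz;                       // flashes per second (0 = darkness)
  byte intensity;                     // PWM duty 0-255 when the strobe is on
  unsigned long durationMs;           // how long this event lasts
};

ScoreEvent score[] = {                // placeholder score, not Aether's
  {  2.0, 255, 20000 },
  {  8.0, 180, 30000 },
  {  0.0,   0, 10000 },
  { 15.0, 255, 40000 },
};
const int NUM_EVENTS = sizeof(score) / sizeof(score[0]);

void setup() {
  pinMode(STROBE_PIN, OUTPUT);
}

void loop() {
  for (int i = 0; i < NUM_EVENTS; i++) {
    unsigned long start = millis();
    while (millis() - start < score[i].durationMs) {
      if (score[i].rateHz <= 0.0) {   // rest: strobe stays off
        analogWrite(STROBE_PIN, 0);
        continue;
      }
      unsigned long period = 1000.0 / score[i].rateHz;
      bool on = ((millis() - start) % period) < (period / 2);  // 50% duty flash
      analogWrite(STROBE_PIN, on ? score[i].intensity : 0);
    }
  }
  while (true) {}                     // hold after the score finishes
}
```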

The piece is reflective of the demanding physical data that can be extracted from extremely simple conditions – pure frequencies and binary visual states. Due to their nature and spectral placement, the audio frequencies appear at certain points to be distorting, or to be coming from multiple sources; this is not the case, it is simply our ears’ inability to accurately represent that data. Similarly with the strobe light, it is difficult for our eyes (and cameras) to accurately represent binary states past certain frequencies, so visual hallucination and distortion can occur. Unfortunately, this also means the composition is difficult to represent accurately on camera, due to the extremely fast strobes.

The video is from a showing of Aether at Broadacre House, at the Function event. The composition was adapted for two flashing laptop screens for this event.

Spawn, Thot and Ghast – NUCME Live Electronics – 4th March 2014

My new trio, Spawn, Thot and Ghast, has played two dates so far this month: once at Northern Improv with the Eliot Smith Dance Company, and once at the NUCME live electronics concert.

Spawn, Thot and Ghast is composed of myself (Spawn) on amplified table, electronics and dictaphones, Thot on electronics, objects and dictaphones and Ghast on voice.

The amplified table is acted upon by objects and processed live using Ableton, controlled with a Launchpad, EMU Xboard 25 and Behringer BCR2000. The live processing can be used to create simple repetition of material, or to push the signal into various sampling procedures, producing first, second, third (…) generation processings of the original sound.

All of this is improvised live on the electronics setup, together with Ghast and Thot.

Performed in the Newcastle University recital room on a 6-channel surround sound system.

We are performing with an updated setup on the 20th at St John’s Church as part of the #UNPITCH_Action_3 /// AEQUUS NOX event.

Videography by Gustav Thomas

Feric

Feric is an umbrella name under which I create various sound and visual art works, ranging from tactile contact-microphone instruments through to site-specific generative noise pieces.

I have various one-off audio/visual projects under this name, and I am continually producing new works.

A guerrilla site-specific piece of sound art based on a bespoke semi-generative feedback construction created in Ableton. Source material derived from laptop microphone recordings made in Newcastle City Library.

Tactile electroacoustic improvisation created from several pre-recorded object samples fed into a generative music setup in Ableton. Manipulated through generations of processing into forced feedback.

Played on zoviet*france’s A Duck In A Tree, 27th April 2013.

A Feric audiovisual work comprising several pieces of artwork and two pieces of music composed in mid-2013; this video was created on 9th March 2014.

https://soundcloud.com/feric-1/sets/table-music (Unfortunately I cannot embed this soundcloud link)

A trial of a tactile contact-microphone instrument. ‘Table Music’ is the precursor to Spawn, Thot and Ghast: a table with attached contact microphones, used with objects to become a tactile, malleable instrument.