Archive | Visual Art

A Third Exhibit, NICAP Commission

http://www.thelateshows.org.uk/home.html

In March 2015 I was part of a collaborative commission for NICAP alongside Helen Shaddock, Ed Wainwright, Corbin Wood, Jack Lines and Mags Margetts to produce a piece of work responding to Newcastle University’s Hatton Gallery collection of Victor Pasmore’s work, in particular focusing on his seminal spatial work ‘An Exhibit’.

http://moussemagazine.it/taac1-b/

This group commission produced an interactive audiovisual response to Pasmore’s work, informed by the disciplines of each person involved, including music, digital interactivity, architecture, visual art, creative coding and fabrication.

‘An Exhibit’ comprised a number of coloured plexiglass sheets hung in the Hatton Gallery, creating the effect of coloured windows that filtered observers’ view of the space, much like a filter on a photograph. Because there were several of these plexiglass sheets, visitors could move around the gallery and experiment with how the sheets’ physical placement affected their view of the world through these filters. Although the work was produced in 1957, it had a strong focus on interactivity, and this was the driving force for our response, ‘A Third Exhibit’.

‘A Third Exhibit’ took inspiration from these plexiglass sheets: weighted cloth sheets were suspended in the air of the main Hatton Gallery space using helium balloons, free to move around the space on the air turbulence created by foot traffic throughout the exhibition. The cloth sheets were lit by eight DMX lights, controlled from my laptop by two distance-sensing ‘nodes’ placed at central points of the gallery space and triggered by the actions of visitors. The activation of these nodes also triggered a musical piece composed by Corbin Wood using sound recordings taken throughout the collaborative process.

My responsibility for the work was designing the interactive system which would turn the movement of visitors into changes in lighting through the space. The distance-sensing ‘nodes’ took the form of camera tripods with Arduino Mega boards mounted on them, each board hosting a ring of eight HC-SR04 ultrasonic distance sensors. The information generated by these nodes was fed back to a laptop over a serial connection and read by Pd-extended, which processed the distance readings into DMX values. These values were sent to the eight lights in the room via Q Light Controller and a USB-DMX interface, and also controlled the volume of a stereo sound piece. I developed this interactive system using Linux.
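As a rough sketch of the distance-to-light mapping described above: the node readings arriving over serial could be turned into 8-bit DMX channel values along these lines. The serial format, sensor range and scaling here are my assumptions for illustration, not the original Pd-extended patch.

```python
def distance_to_dmx(distance_cm, min_cm=5, max_cm=200):
    """Map one HC-SR04 reading (cm) to a 0-255 DMX value (closer = brighter)."""
    clamped = max(min_cm, min(max_cm, distance_cm))
    # Invert so a nearby visitor drives the light towards full intensity.
    norm = 1.0 - (clamped - min_cm) / (max_cm - min_cm)
    return round(norm * 255)

def parse_node_line(line):
    """Parse one serial line of eight comma-separated sensor readings."""
    return [int(v) for v in line.strip().split(",")]

readings = parse_node_line("12,48,200,5,100,150,75,30\n")
dmx_values = [distance_to_dmx(d) for d in readings]
```

In the installation itself this transformation happened inside Pd-extended before the values were handed to the lighting software; the sketch only illustrates its shape.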

The interactive system and the shifting cloth sheets together created an evolving, generative artwork, controlled by visitors to the gallery both directly (distance to the nodes changing the colour of the lights) and indirectly (air turbulence gradually changing the position of the sheets). The eight lights in different colours, shining at different angles on different cloth sheets, produced a range of subtle colour changes throughout the room, with colours regularly intersected and altered by the sheets.

Visitors responded very well to the interactivity in the piece, keen to see how their movements would change the colours in the space and to experience the work through discovery and participation.

‘A Third Exhibit’ was shown for The Late Shows 2015.
Photography by Corbin Wood and Helen Shaddock.


SWINGME @ Square One, 12th March 2015

‘This work builds on a number of previous pieces using spatial and environmental sensing to re-present collected environmental data in a dynamic and immersive way.

This piece is centred on a pendulum with a DIY electronic weight containing a three-axis accelerometer and a digital light sensor, tracking directional motion and ambient light levels. When the pendulum is swung, three streams of audio, video and light are activated according to the speed, angle and orientation of the pendulum. The audio channels are three sound recordings (indoor, outdoor and electromagnetic) and the video channels are three timelapse recordings (transport, foot traffic and cloud movement in the sky) taken from around Newcastle, and the LED lighting responds very directly to the X, Y and Z axes by creating corresponding Red, Green and Blue values.

By moving the pendulum weight these situations can be interacted with in a fleeting, exploratory manner. Without any interaction the installation will lie dormant, waiting for energy to be added by a passer-by.

There is a fourth channel of video, audio and light which can be activated by shining a strong light directly onto the light-sensitive part of the weight (Phone LED torches work particularly well).’
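The direct axis-to-colour mapping mentioned in the statement above could look something like this sketch. The signed 16-bit input range matches typical accelerometer output, but the mapping itself is my illustration, not the actual Max patch:

```python
def axis_to_channel(raw):
    """Map a signed 16-bit axis reading (-32768..32767) to a 0..255 colour value."""
    return (raw + 32768) * 255 // 65535

def accel_to_rgb(x, y, z):
    """X, Y and Z become Red, Green and Blue directly."""
    return tuple(axis_to_channel(v) for v in (x, y, z))

# At rest on a +-2 g scale, Z reads roughly +1 g (~16384),
# so the LED would sit at mid-red, mid-green and a warmer blue.
r, g, b = accel_to_rgb(0, 0, 16384)
```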

This interactive piece was produced for Square One, and the video shows both raw footage from the installation and footage of myself and others interacting with the ‘pendulum’. It proved a popular installation, with people enjoying interacting with it in a very direct way. I purposely designed the installation to be as responsive and durable as possible, which led to bursts of colour coming from the installation space throughout the event as people shook, swung and spun the pendulum around.

The pendulum comprised an Arduino Nano interfacing with an MPU-6050 accelerometer and a BH1750 light sensor, transmitting the data wirelessly using an nRF24L01 radio transceiver. The data was picked up by another nRF24L01 on a custom-built Arduino ‘radio shield’, then read and parsed in Max/MSP, which handled the sound, light and video.
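The radio link might carry a payload shaped like the following sketch. The byte layout here is hypothetical (the actual format used between the Nano and the shield isn’t documented in this post): three little-endian signed 16-bit accelerometer axes plus an unsigned 16-bit lux reading in one 8-byte nRF24L01 payload.

```python
import struct

# x, y, z as signed 16-bit; light level as unsigned 16-bit, little-endian.
PACKET_FORMAT = "<hhhH"

def pack_reading(x, y, z, lux):
    """Build the 8-byte payload the pendulum would transmit."""
    return struct.pack(PACKET_FORMAT, x, y, z, lux)

def unpack_reading(payload):
    """Parse a received payload back into named values."""
    x, y, z, lux = struct.unpack(PACKET_FORMAT, payload)
    return {"x": x, "y": y, "z": z, "lux": lux}

payload = pack_reading(-1200, 340, 16384, 850)
reading = unpack_reading(payload)
```

An 8-byte packet sits comfortably inside the nRF24L01’s 32-byte maximum payload, which is one reason a compact fixed layout like this is a natural choice.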


Photos and event by Josh Borom and Matt Pickering

RE/CEPTOR – TUSK Festival 2014


This is the video (taken by Harry Wheeler) from the TUSK Archive of the first of three performances by RE/CEPTOR at Blank studios.

RE/CEPTOR was a bespoke project for TUSK Festival comprising the long-established duo Trans/Human (with whom I’ve worked before), contemporary dancer and movement artist Nicole Vivien Watson, and myself.

The project is a multifaceted one. The three performances for TUSK Festival involved myself and Nicole performing in Blank Studios, with Trans/Human performing concurrently in Marseille. The audio from Trans/Human’s performance was streamed live to Blank Studios via Nicecast, an OS X internet radio streaming application, and fed into the performance setup there; more on this later. The performances used four-speaker ‘surround’ sound.

The main focal point of the performances was Nicole and her movement. I designed a custom wireless sensor array for her to wear through the performance which tracked light, body capacitance, temperature and movement. These sensors were attached to Nicole at relevant points (accelerometers on the hands, body capacitance contacts on the fingers, and so on) and read by an Arduino Nano, which collected the data and fed it wirelessly via an inexpensive nRF24L01 chip to a second Arduino connected to my laptop, for which I built a shield housing another nRF24L01. The data this second Arduino received from Nicole was read by Max 6, and formed a crucial aspect of the performance as a whole.

Here are some photos of the prototyping, building and wearing of the sensor array.


Top to bottom: Prototyped version of the sensor array, the Arduino and radio chip, the completed array, and one of Nicole’s hands showing an accelerometer and body capacitance sensors.

Once the data was in Max 6 it was used to control a number of aspects of the performance. Nicole’s movement controlled effects on the stereo feed from Marseille, changing filter poles and adding delay and decay. This processed stereo feed was played on two of the four speakers in the room, the ones in the back two corners. The other two speakers carried two granular synthesisers, one controlled by Nicole’s movement and one controlled by me. Nicole’s body temperature and body capacitance were also summed to create a low-frequency sine wave drone played on all four speakers. Nicole’s data also controlled, via DMX, the strobe light which was the main source of light in the space. For the first performance (the one in the video) the strobe frequency was a summation of body temperature, light and body capacitance; however, after discussion with Nicole we felt that this obfuscated the relationship between her and the strobe, and for the other two performances we changed the program so that the strobe was directly controlled by the position of one of her hands, which was much more effective.
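Two of the mappings above can be sketched as follows. The scaling constants and normalised 0..1 inputs are assumptions for illustration, not values taken from the Max 6 patch:

```python
import math

def drone_frequency(temperature, capacitance, base_hz=40.0, span_hz=80.0):
    """Sum two normalised (0..1) sensor values into a low sine-drone frequency."""
    summed = (temperature + capacitance) / 2.0
    return base_hz + summed * span_hz

def sine_sample(freq_hz, t_seconds):
    """One sample of the resulting drone at time t."""
    return math.sin(2 * math.pi * freq_hz * t_seconds)

def strobe_rate(hand_position, max_hz=12.0):
    """Second and third performances: one hand's position (0..1) drives the strobe directly."""
    return hand_position * max_hz
```

The appeal of the direct hand-to-strobe mapping is visible even in this sketch: one input, one output, so the audience can read the causal link, whereas the summed version buries each sensor’s contribution.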

All of this was tied together for me by a TouchOSC interface, which gave me control over the parameters Nicole was affecting that did not lend themselves to control by movement (unstable parameters such as filter resonance, et cetera), as well as control of one of the granular synthesisers, allowing a kind of synthesiser duet with Nicole.

10644019_652692151516993_1099358373_n

The result was four channels of sound, two coming from Marseille and two from the room, all dependent on the movements of Nicole, locking all performers to a central groove. Trans/Human could not themselves hear what was going on in Blank Studios, but in Blank we were very much improvising with their material. Being able to mesh all of us in this way was perfect: Trans/Human’s tour and the audio they collected and performed, and my building/patching/sound generation mediated through Nicole’s movement. The binary state between darkness and light, combined with full-bodied four-channel sound, created an immersive experience and environment for both performers and audience, and yielded, to my mind, three distinct, varied performances.

Thanks to Lee Etherington, Harry Wheeler and Hugs Bison for organisation and documentation, and thanks to the folks at Blank Studios for being immensely helpful in accommodating our audio/internet needs.

Nicole is the creative director of Surface Area Dance Theatre –

http://www.surfacearea.org.uk/company.html

Trans/Human is Adam Denton and Luke Twyman –

http://transhuman.bandcamp.com/

http://imboredofbastards.tumblr.com/

http://dentone.tumblr.com/

Databending, Glitched images, Raw editing

Over the past couple of years I have been experimenting with creating visual art based on some of my own photos, processed using various forms of non-image editing. These processing methods (raw data editing, hex editing, sound processing and so forth) exploit the liminal state between order and disorder in the data architecture of image formats, with small changes, omissions and insertions in the raw data creating effects determined by the literal composition of the digital image itself, rather than by any aesthetic rules or preferences I have.

Creating these images is a matter of ‘curating’ these glitches, moderating them inside the preset bounds of the image’s architecture so as not to break the files entirely.
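The ‘curated’ quality of these edits can be illustrated with a sketch like the one below: corrupt bytes only after a safe header offset so the file still opens. The offset, stride and XOR mask are arbitrary choices for illustration, not a recipe for any particular format.

```python
def glitch(data, header_bytes=512, stride=97, mask=0x2A):
    """XOR every `stride`-th byte after the header region, leaving the header intact."""
    out = bytearray(data)
    for i in range(header_bytes, len(out), stride):
        out[i] ^= mask
    return bytes(out)

original = bytes(range(256)) * 8   # stand-in for raw image data
glitched = glitch(original)
```

Because XOR is its own inverse, applying the same edit twice restores the original bytes, which makes it easy to step a glitch back if it breaks the image.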

Full album (a growing collection of the more prominent examples I have developed) here: http://s1382.photobucket.com/user/theseancotterill/library/Databending-Glitch%20art-Raw%20editing

All licensed CC BY-SA


Superdream, Connect ZA, Gateshead, Johannesburg and Jeppe Park

http://isitok.net/superdream/

Over the past few months I have been involved in the Superdream project, coordinated by ISIS Arts, The Trinity Session, Sticky Situations, The Swallows Foundation, Gateshead Council and others to produce art for public spaces in both Gateshead and Jeppestown, Johannesburg, and to create links between UK and South African artists.

For the Superdream project I produced digital media art for two events, one in Windmill Hills Park, Gateshead and one in Jeppe Park, Johannesburg. Both areas were ‘undeveloped’, and one of the aims of the interventions in these spaces was to redefine public space through art, technology, innovation and a sense of community.

In Windmill Hills on May 2nd 2014 I presented an interactive audiovisual work, in stages of development.

‘Polyfields’ used four HC-SR04 ultrasonic distance sensors on an Arduino Mega 2560, programmed in the Arduino IDE, to output the distance of objects from the sensors. Pd-extended then used these distances as the controls for a four-channel audio piece as well as a four-channel video piece.

The audio was taken from four sites around the park using conventional microphones as well as contact microphones and wall sensor modules, and video was also taken from these four sites, with one sensor representing one ‘site’.

In this way the participants (purposefully or accidentally) would introduce ‘channels’ of video and audio, to create their own experience of this supernormal locus of identity.
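The channel-per-sensor behaviour described above, including lying dormant until someone approaches, can be sketched like this. The activation threshold and the linear fade are my assumptions, not the original patch:

```python
ACTIVATION_CM = 150  # assumed trigger distance for a 'site'

def active_channels(distances_cm, threshold=ACTIVATION_CM):
    """Return indices of the sites whose sensor currently detects a participant."""
    return [i for i, d in enumerate(distances_cm) if d < threshold]

def channel_gains(distances_cm, threshold=ACTIVATION_CM):
    """Closer participants fade their channel in more strongly; idle channels stay silent."""
    return [max(0.0, 1.0 - d / threshold) for d in distances_cm]

# With no one near any sensor the installation lies dormant:
quiet = active_channels([300, 280, 400, 350])
```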


I was then invited to go to Johannesburg with ISIS Arts and The Trinity Session, along with Lucy Carolan, Richard Glynn, Andrea Masala, Marek Gabrysch and Charlotte Gregory, to produce work for the Superdream event in Jeppe Park as part of the British Council’s Connect ZA programme.

For this event I presented a redesigned version of my work for Windmill Hills, using material gathered from Jeppe Park, working in collaboration with South African artists Em Kay and KMSN, and re-coding the installation using Max/MSP/Jitter and the NewPing library. The four sensors were nailed to a tree in Jeppe Park and participants were invited to interact with the work as they pleased, again creating their own experience of the artwork, which lay dormant until interaction was detected.


For this event I also created a piece based around a table football table, sourced from a workshop next to the park, which was close to a state of disrepair. With the help of Marek Gabrysch the table was restored to its full working condition, and I augmented it with sensors in the goals, as well as small buffers next to the goals, to wire the table for sound. Using a small selection panel on the side of the table, players could select sounds to be triggered by hitting the buffers, and a random ‘Goal!’ sound was triggered when a goal was scored.

This installation was created using an Arduino Mega 2560 and a Raspberry Pi running Stanford University’s Satellite CCRMA operating system. I designed a patch in Pd-extended interfacing with the Arduino using Firmata, which was then run on a headless Raspberry Pi handling the sound. It was very much a prototype: a bespoke installation for the park.
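The trigger logic described above is simple enough to sketch. The sound names and the event labels here are illustrative; the real implementation was a Pd-extended patch talking to the Arduino over Firmata:

```python
import random

BUFFER_SOUNDS = ["wood", "bell", "crowd", "whistle"]   # assumed panel options
GOAL_SOUNDS = ["goal_cheer_1", "goal_cheer_2", "goal_cheer_3"]  # assumed pool

def handle_event(event, selected_sound):
    """Map a sensor event to the sound that should be played.

    A buffer hit plays whatever the side panel has selected;
    a goal plays a randomly chosen 'Goal!' sound; anything else is silent.
    """
    if event == "goal":
        return random.choice(GOAL_SOUNDS)
    if event == "buffer":
        return selected_sound
    return None
```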

The installation was very popular with the local children and thanks to some fantastic refereeing by one of the South African artists there were quite a few hotly contested games!


Aether – ‘the upper pure, bright air’

This video serves as documentation of a composition I wrote with Dario Lozano-Thornton for sine waves and computer-controlled strobe. Aether is a nine-minute audiovisual composition: a score for an Arduino-based variable-rate, variable-intensity strobe, composed to accompany a composition for pure tones by Dario.

The piece reflects the demanding perceptual data extracted from extremely simple conditions: pure frequencies and binary visual states. Due to their nature and spectral placement, the audio frequencies appear at certain points to be distorting, or coming from multiple sources; this is not the case, it is simply our ears’ inability to accurately represent that data. Similarly with the strobe light, it is difficult for our eyes (and cameras) to accurately represent binary states past certain frequencies, so visual hallucination and distortion can occur. Unfortunately, this composition is difficult to represent accurately on camera due to the extremely fast strobes.

The video is from a showing of Aether at Broadacre House, at the Function event. The composition was adapted for two flashing laptop screens for this event.