Sonic Pattern Residency, Access Space

From 18–20 November 2015 I took part in a residency at Access Space, organised by Alex McLean as part of the Sonic Pattern and the Textility of Code series of events from Inhabiting the Hack.

https://inhabitingthehack.github.io/

As part of this residency I worked with sonic, digital and textile artists Magdalena Halay, Nora O Murchú and Toni Buckby. The residency was open-ended, with no set goals other than to investigate ideas surrounding our disciplines and relate them back to the broader themes of Sonic Pattern.

The residency began with a workshop in tablet weaving, a technique that uses rotations of square cards to create patterned weaves. By following simple procedures, tablet weaving can produce striking geometric patterns with immediate results (my own weave is pictured). I plan to extend my openFrameworks-based live coding visual software with a programmable emulation of tablet weaving controlled by musical data.
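The logic of tablet weaving maps naturally onto code. Below is a minimal, hypothetical C++ sketch of the idea (plain C++, not my actual openFrameworks implementation): each card carries four threads and receives a quarter-turn per pick, and the thread left on top of each card is what shows in the weave.

```cpp
// A minimal, illustrative tablet-weaving simulation. Each card is
// threaded with four threads (here two dark, two light) and turns a
// quarter-rotation per pick; the thread on top after the turn forms
// the visible pattern.
#include <iostream>

int main() {
    const int numCards = 8;           // width of the band, in cards
    const int numPicks = 12;          // number of woven rows
    const char threads[4] = {'#', '#', '.', '.'}; // threading per card

    for (int pick = 0; pick < numPicks; ++pick) {
        for (int card = 0; card < numCards; ++card) {
            // stagger each card's starting rotation by one quarter-turn
            // and turn every card forward once per pick: this threading
            // and turning sequence yields a classic diagonal pattern
            int top = (card + pick) % 4;
            std::cout << threads[top];
        }
        std::cout << '\n';            // one complete pick (row of weft)
    }
}
```

In the openFrameworks version, the turning sequence would be driven by incoming musical data rather than a fixed rule.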

The rest of the residency was used to explore particular projects. I collaborated with interdisciplinary textile artist and weaver Toni Buckby to create a gesture-tracking glove using Arduino (pictured), which graphed Toni's weaving hand gestures over the course of a live performance; the performance itself was directed by a second gesture sensor working in tandem with a program written by Toni and Alex McLean. During the performance the glove also fed data into SuperCollider, and I live-coded the sounds produced in response to Toni's hand gestures in the context of a larger group improvisation. The graph produced during the performance will be laser-cut to form a performance artefact (a test pressing is pictured), and possibly source material for further performances.
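The data path from glove to graph and on to SuperCollider is simple in outline. As a rough illustration, and assuming an analog three-axis accelerometer (the real glove's sensor and wiring may differ), the Arduino side can be as small as this:

```cpp
// Hypothetical minimal glove firmware: stream three-axis accelerometer
// readings over serial as comma-separated lines, which a graphing
// program or SuperCollider can then read. Pin choices are illustrative.
const int xPin = A0, yPin = A1, zPin = A2;

void setup() {
  Serial.begin(115200);
}

void loop() {
  // one frame per line, e.g. "512,498,630"
  Serial.print(analogRead(xPin)); Serial.print(',');
  Serial.print(analogRead(yPin)); Serial.print(',');
  Serial.println(analogRead(zPin));
  delay(20); // roughly 50 frames per second
}
```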

Through the residency I experimented with producing performance artefacts and sonic objects through laser cutting, and with accurately representing gestures in space using Euler angles derived from accelerometer and gyroscope data (previously I had only used raw sensor values for comparatively rudimentary control of sound). I also gained experience in weaving techniques and knowledge of textile skills through collaboration with Toni.
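The usual way of deriving a stable angle from these two sensors is a complementary filter: the gyroscope is accurate over short timescales but drifts, while the accelerometer's gravity vector gives an absolute but noisy reference. Here is a one-axis sketch of the general technique, not my exact glove code:

```cpp
// One-axis complementary filter: fuse gyroscope and accelerometer data
// into a stable pitch estimate. A general sketch of the technique.
#include <math.h>

float pitch = 0.0f;                    // estimated angle, in degrees

void updatePitch(float gyroRate,       // gyro reading, degrees/second
                 float ay, float az,   // accelerometer Y and Z axes
                 float dt) {           // seconds since the last update
  // gravity gives an absolute, but noisy, angle estimate
  float accelPitch = atan2f(ay, az) * 180.0f / M_PI;
  // integrate the gyro for smooth short-term motion, leaning slightly
  // on the accelerometer to cancel the gyro's long-term drift
  pitch = 0.98f * (pitch + gyroRate * dt) + 0.02f * accelPitch;
}
```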

Thanks to Alex McLean and everyone at Access Space.

Live Coding, Algorave & 3D Visuals

In the months since ICLC I have been reconfiguring the way I perform my live coding sets. As live coding forms an integral part of the research and work I am doing for my MA, I have chosen to improve my ability to improvise from a greater pool of musical materials during performance, and to incorporate a more cohesive and considered visual element into my performances.

I have recently played for OXJAM Newcastle/Gateshead takeover, NARC. Magazine Presents… and Manchester Algorave, and I am due to perform at Jam Jah Minifest, London Algorave and others.

I am currently developing a program in openFrameworks to create a visual backdrop to my live coding sets. The impetus for this development was the NARC. Magazine night, which focused on audiovisual performance (featuring analogue film collective Filmbee). The program currently displays a number of sound-responsive elements, and can be sequenced easily using tasks within SuperCollider. As I have recently switched to Ubuntu as my main system, I can display the visuals behind my sets using the transparency option in Compiz.
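The link between SuperCollider and the visuals is, in outline, just message-passing: a SuperCollider task sends values (amplitudes, triggers, scene numbers) to the openFrameworks app, which maps them onto drawing parameters. A hedged sketch of what the receiving side can look like, assuming OSC on an arbitrary port and a made-up /amp address:

```cpp
// Hypothetical openFrameworks receiver: listen for OSC messages from
// SuperCollider and scale a circle by the received amplitude.
// The port number and /amp address are illustrative assumptions.
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
    ofxOscReceiver receiver;
    float amp = 0.0f;

public:
    void setup() override {
        receiver.setup(12000);                 // port SuperCollider sends to
    }

    void update() override {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(m);
            if (m.getAddress() == "/amp") {    // sound-responsive element
                amp = m.getArgAsFloat(0);
            }
        }
    }

    void draw() override {
        ofBackground(0);
        ofDrawCircle(ofGetWidth() * 0.5f, ofGetHeight() * 0.5f,
                     50 + amp * 300);          // circle follows amplitude
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```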

I will be uploading a full sound and screen capture of a live coding set to my Vimeo account soon.

I have also been redeveloping the sounds I use when performing. I am experimenting with tonal ideas based on the harmonic series, with synthesis using Nicholas Collins's chaos theory UGens, and with a generally more flexible approach to tempo and rhythm, inspired by drum and bass and hip-hop rather than the rigid techno tempos I have used in my music for the past few years. A couple of my most recent sets are uploaded to Bandcamp here; check co34pt.bandcamp.com for a rolling archive of recorded live sets.

I have also been using live coding as a tool for improvising with other musicians, including in the techno duo Motmot with Tim Shaw, as well as using Tidal in a number of free improvisation sessions featuring Elvin Brandhi, John Bowers and Charlie Bramley.

A Third Exhibit, NICAP Commission

http://www.thelateshows.org.uk/home.html

In March 2015 I was part of a collaborative commission for NICAP, alongside Helen Shaddock, Ed Wainwright, Corbin Wood, Jack Lines and Mags Margetts, to produce a piece of work responding to Newcastle University's Hatton Gallery collection of Victor Pasmore's work, focusing in particular on his seminal spatial work 'An Exhibit'.

http://moussemagazine.it/taac1-b/

This group commission produced an interactive audiovisual response to Pasmore’s work, informed by the disciplines of each person involved, including music, digital interactivity, architecture, visual art, creative coding and fabrication.

‘An Exhibit’ comprised a number of coloured plexiglass sheets hung in the Hatton Gallery, creating the effect of coloured windows that filtered observers' vision, much like a filter on a photograph. Because there were a number of these sheets, visitors could move around the gallery space and experiment with how the sheets' physical placement affected their view of the world through these filters. Although the work was produced in 1957, it had a strong focus on interactivity, and this became the driving force for our response, ‘A Third Exhibit’.

‘A Third Exhibit’ took inspiration from these plexiglass sheets: weighted cloth sheets were suspended in the air of the main Hatton Gallery space using helium balloons, free to move around the space on the air turbulence created by foot traffic throughout the exhibition. The cloth sheets were lit by eight DMX lights, controlled from my laptop by two distance-sensing ‘nodes’ placed at central points of the gallery space and triggered by the actions of visitors. A musical piece composed by Corbin Wood, using sound recordings taken throughout the collaborative process, was also triggered by the activation of these nodes.

My responsibility for the work was designing the interactive system that turned the movement of visitors into changes in lighting through the space. The distance-sensing ‘nodes’ took the form of camera tripods with Arduino Mega boards mounted on them, each board hosting a ring of eight HC-SR04 ultrasonic distance sensors. The data generated by these nodes was fed back to a laptop over a serial connection and read in Pd-extended, which processed the distance readings into DMX values; these were sent to the eight lights in the room via Q Light Controller and a USB-DMX interface, and also controlled the volume of a stereo sound piece. I developed this interactive system on Linux.
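As a rough illustration of one node's firmware (a simplified, hypothetical version; pin numbers and data format are assumptions), each HC-SR04 is triggered in turn and the eight distances are sent to the laptop as one comma-separated line per sweep, which is straightforward to parse on the Pd-extended side:

```cpp
// Hypothetical single-node firmware: poll a ring of eight HC-SR04
// sensors on an Arduino Mega and send the distances to the laptop as
// one comma-separated line per sweep. Pin numbers are illustrative.
const int numSensors = 8;
const int trigPins[numSensors] = {22, 24, 26, 28, 30, 32, 34, 36};
const int echoPins[numSensors] = {23, 25, 27, 29, 31, 33, 35, 37};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < numSensors; ++i) {
    pinMode(trigPins[i], OUTPUT);
    pinMode(echoPins[i], INPUT);
  }
}

void loop() {
  for (int i = 0; i < numSensors; ++i) {
    // fire a 10-microsecond trigger pulse, then time the echo
    digitalWrite(trigPins[i], LOW);  delayMicroseconds(2);
    digitalWrite(trigPins[i], HIGH); delayMicroseconds(10);
    digitalWrite(trigPins[i], LOW);
    long duration = pulseIn(echoPins[i], HIGH, 30000); // timeout ~5 m
    long cm = duration / 58;         // convert echo time to centimetres
    Serial.print(cm);
    Serial.print(i < numSensors - 1 ? ',' : '\n');
  }
}
```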

The combination of the interactive system and the shifting cloth sheets created an evolving, generative artwork which was controlled by gallery visitors both directly (distance to the nodes changing the colour of the lights) and indirectly (air turbulence gradually changing the position of the sheets). Eight lights in different colours, shining at different angles onto different cloth sheets, produced a range of subtle colour changes throughout the room, with colours regularly intersected and altered by the sheets.

Visitors responded very well to the interactivity in the piece, keen to see how their movements would change the colours in the space and to experience the work through discovery and participation.

‘A Third Exhibit’ was shown as part of The Late Shows 2015.
Photography by Corbin Wood and Helen Shaddock.

co34pt, SuperCollider, Vim, iTerm2, Jitter

Inspired by a blog post by Cole Ingraham, I recently decided to revamp my live coding setup.

I had previously been using the SuperCollider IDE for my live coding, but it had the limitation of not being very extensible, and it lacked the flexibility of other editors. Using a combination of SuperCollider's Vim mode, a number of Vim extensions, iTerm2 with its transparency options and (in this case) Max/MSP/Jitter fed by Soundflower, I can not only code more efficiently using multiple windows simultaneously, but also easily display visuals behind my code, adding to the potentially quite dry ‘pure code’ projection I have previously used during gigs.

Vim is quite a step for me, and at the time of this performance I had only been using it for a few days, so my performance is a little loose. I am also working on developing my own sound-responsive visual setup, so the Jitter patch I used for visuals was pulled from here and altered so that it could be changed live (using the code window on the right of the display).

For the next while I will be tuning this setup, as well as writing some code to produce and live-modify visuals. I will also be performing with this technique at ICLC 2015.

co34pt – Live @ Culture Lab 8/5/15

For the major project portion of my BA Music degree I gave a 40-minute live coding performance at Culture Lab in May. This performance was effectively a summary of my progress in learning live coding over the course of the past academic year.

As always I used SuperCollider, and I integrated a simple lighting setup controlled on the fly using code.

Here are screen, video and audio recordings of the set.

State of Grace – Core Company Member

http://www.stateofgracenortheast.co.uk/artists/company-profiles-sean-cotterill

http://www.stateofgracenortheast.co.uk/about.html

I am pleased to announce that I am now a core company member of State of Grace, a North-East based performance collective, specialising in performance practice and research across disciplines. I will be working with State of Grace through training and production, with the aim of producing performance works encompassing a wide range of creative disciplines, including Dance, Theatre, Spoken Word, Music, Visual Art and others.

I have been lucky enough to be a part of two previous State of Grace training periods, which culminated in performances.

#BABBLEBALTIC 27/04/15

Here is the video documentation of the set I performed as part of Babble at BALTIC Centre for Contemporary Art, alongside Charlie Dearnley, Lauren Vevers and TURC. The performance took the form of a text reading by Charlie and Lauren which I live-processed using SuperCollider, recording snippets of their readings throughout and morphing them into rhythms, tones, repetitions, groans and stutters. Behind this, TURC played a musical accompaniment inspired by their work with dark techno and electronic music.

The performance evolved as Charlie and Lauren's dialogue grew more intense along with TURC's music, and my live-recorded material became more diverse. It culminated in an instrumental section where TURC let rip with a strong rhythmic flow and I performed multiple processes simultaneously on all of the voice samples I had recorded, before Charlie and Lauren brought the performance to a close with a redux of some of the themes presented throughout their text.

co34pt live releases

Here are a few ‘live albums’ of live coding sets I've been performing. I'm hoping this will form a rolling archive of the sets I perform, so keep an eye out for future releases.

My final degree recital will take place on the 8th of May at Culture Lab, as part of a gig alongside SNAILS WITH NAILS and COOKING WITH FAYE, starting at 7pm.

I’m going to be performing a 40-minute set of live coded dance music and live coded lighting.

BABBLE – 27th April 2015


I’ll be performing at BALTIC on the 27th of April for BABBLE, an event focusing on the interconnection between sound and text. I’ll be live-coding vocal processing of poems read by Charlie Dearnley and Lauren Vevers using SuperCollider, accompanied by music from TURC.

SWINGME @ Square One, 12th March 2015

‘This work builds on a number of previous pieces using spatial and environmental sensing to re-present collected environmental data in a dynamic and immersive way.

This piece is centred around a pendulum with a DIY electronic weight containing a three-axis accelerometer and a digital light sensor, tracking directional motion and ambient light levels. When the pendulum is swung, three streams of audio, video and light are activated according to the speed, angle and orientation of the pendulum. The audio channels are three sound recordings (indoor, outdoor and electromagnetic) and the video channels three video recordings (timelapse footage of transport, foot traffic and cloud movement in the sky), all taken from around Newcastle, and the LED lighting responds directly to the X, Y and Z axes by mapping them to corresponding red, green and blue values.

By moving the pendulum weight these situations can be interacted with in a fleeting, exploratory manner. Without any interaction the installation will lie dormant, waiting for energy to be added by a passer by.

There is a fourth channel of video, audio and light which can be activated by shining a strong light directly onto the light-sensitive part of the weight (Phone LED torches work particularly well).’

This interactive piece was produced for Square One, and the video shows both raw footage from the installation and footage of myself and others interacting with the ‘pendulum’. It proved a popular installation, with people really enjoying interacting with it in a very direct way. I purposely designed the installation to be as responsive and durable as possible, which led to bursts of colour coming from the installation space throughout the event as people shook, swung and spun the pendulum.

The pendulum was composed of an Arduino Nano interfacing with an MPU-6050 accelerometer and a BH1750 light sensor, transmitting its data wirelessly via an nRF24L01 radio transceiver. The signal was picked up by another nRF24L01 on a custom-built Arduino ‘radio shield’, and the data was read and parsed in Max/MSP, which handled the sound, light and video.
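For a sense of how little code such a transmitter needs, here is a skeletal, hypothetical version of the pendulum sketch using the widely used RF24 Arduino library; the pins, pipe address and packet layout are illustrative assumptions, not my exact firmware.

```cpp
// Hypothetical skeleton of the pendulum's transmitter: read raw
// MPU-6050 acceleration and a BH1750 light level over I2C, pack them
// into a struct and broadcast with the RF24 library.
#include <SPI.h>
#include <Wire.h>
#include <RF24.h>

RF24 radio(9, 10);                    // CE, CSN (illustrative wiring)
const byte address[6] = "PENDU";      // arbitrary five-character pipe

struct Packet {
  int16_t ax, ay, az;                 // raw three-axis acceleration
  uint16_t lux;                       // ambient light level
};

uint16_t read16() {                   // read one big-endian 16-bit value
  uint16_t high = Wire.read();
  return (high << 8) | Wire.read();
}

void setup() {
  Wire.begin();
  Wire.beginTransmission(0x68);       // wake the MPU-6050
  Wire.write(0x6B); Wire.write(0);
  Wire.endTransmission();
  Wire.beginTransmission(0x23);       // BH1750: continuous hi-res mode
  Wire.write(0x10);
  Wire.endTransmission();
  radio.begin();
  radio.openWritingPipe(address);
  radio.stopListening();              // this node only transmits
}

void loop() {
  Packet p;
  Wire.beginTransmission(0x68);       // point at the accelerometer block
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(0x68, 6);          // six bytes: X, Y, Z high/low
  p.ax = (int16_t)read16();
  p.ay = (int16_t)read16();
  p.az = (int16_t)read16();
  Wire.requestFrom(0x23, 2);          // latest BH1750 reading
  p.lux = read16() / 1.2;             // convert raw count to lux
  radio.write(&p, sizeof(p));         // fire-and-forget broadcast
  delay(20);
}
```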


Photos and event by Josh Borom and Matt Pickering
