Category: livecoding

Algorithmic Yorkshire

My new year’s resolution was not to start any new collaborations.

Here’s a new collaboration with Ash Sagar aka section_9 (among others):

First live date is at the Newcastle Gateshead Algorave on the 26th April. Judging by this first jam session, it should be a blinder…

Cycle 22 live extension

Here’s a new work in progress; I’m happy with how things are going with Tidal at the moment.

(redone, less quiet..)

Videos from recent live dates

Some recent activity has turned up some video clips. Here’s one giving an impression of the first Dutch algorave, organised by Fiber and STEIM. It features a few seconds of Yee-King and I playing drums and code as Canute, although the music on top is from Luuma’s set:

And here’s a longer video of a performance with Leafcutter John, featuring some audience participation:

Test transmission 20140210

Algorave on Quietus

A mention somewhere between the legendary Holly Herndon and Goodiepal in this article on The Quietus, and my day is made.

Experimentallabor residency

I’m on the way to take part in a short residency in Düsseldorf, hosted by Julian Rohrhuber at the Robert Schumann School:

Fifth Experimentallabor Residency: Penelope’s Loom – Coding threads in antiquity, live notation and textile inspired programming languages
Structure can be result and origin of a dynamic process at the same time – a thought that is common to weaving, mathematics and music. Today, as programming has become a practice that is closer to improvisation than to machine control, this commonality becomes increasingly interesting for the arts. It is along these lines, in the fifth Experimentallabor Residency, that Ellen Harlizius-Klück, Alex McLean, and Dave Griffiths will rethink programming languages in the arts in conjunction with the history of weaving.
Introduction: Wed Feb 5 2014, 17:30, IMM Experimentallabor

Lots more events coming up, full list here.

Arte Tracks feature

Here’s a feature on live coding and algorave on Arte Tracks, which was aired in Germany and France on 31st Jan 2014. It features interviews with Alexandra Cardenas and myself, and some nice live footage, including from the live.code.fest and a recent solo gig I did at The White Building in Hackney.

They put up some bonus videos here, with more from Alexandra plus an interview with Benoit and the Mandelbrots.



Here’s Broken, a new two-sided single out on Chordpunch.

Creative Commons Licence
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.


Tidal cycles continued

I’ve continued with the Tidal cycles project, pushing forward with at least one cycle per weekday, apart from one day when I made a longer recording (to appear on Chordpunch soon). All the audio is downloadable and Creative Commons licensed (CC-BY); check the descriptions for the tweet-sized Tidal code for each cycle, and follow on Twitter or SoundCloud for updates.

I should note that this is of course inspired by the long-lived sctweets tradition in the SuperCollider community.

Remote performance via zeromq

I did a remote performance streamed to Barcelona last week as part of a “Perspectives on multichannel live coding” concert, which involved me sitting on my studio floor in Sheffield, live coding broken techno for 16 speakers. The music was beamed over to an audience of 30-40 people at Universitat Pompeu Fabra, who were surrounded by 16 speakers, while I created the music locally, monitoring in quadraphonic surround sound (sadly I didn’t have 16 speakers to hand). I really enjoyed the challenge of making a coherent multi-channel performance, and got some positive feedback on the music, but thought I’d share the more technical side…

The organiser/curator Gerard Roma and I discussed the possibility of streaming audio, compressed with Ogg Vorbis and streamed over Icecast. Encoding, decoding and streaming 16 channels of audio is a bit problematic, though; we probably had the bandwidth, but the libraries just aren’t there with 16-channel support. It’s straightforward to stream four channels, or 5.1, but for some reason every channel has to be labelled with a location, and I couldn’t get sixteen channels working with GStreamer.

In any case, streaming synth control messages rather than audio output is a better approach, and that’s what we went with. I simply ran my synthesiser Dirt in both places, and sent trigger messages to both over Open Sound Control (OSC). Unfortunately it wasn’t quite that simple, due to the various institutional firewalls between us, so I sent the OSC over ZeroMQ. This involved running a simple daemon on my (unfirewalled) server, which received OSC over plain UDP and forwarded it to any ZeroMQ subscribers. It was then easy to add some code to Dirt which subscribed to the ZeroMQ server and piped the OSC messages into liblo for processing. Using ZeroMQ made for really easy-to-write, fault-tolerant code.
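What makes this relay trick easy is that an OSC message is just an opaque, 4-byte-aligned blob of bytes, so the daemon can pass it along verbatim without understanding it. As a rough illustration, here’s a minimal stdlib-only Python sketch of OSC 1.0 message encoding — the `/play` address and arguments are made up for the example, not Dirt’s actual message format:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Pad with NULs to a multiple of 4 bytes, as OSC requires
    (strings always get at least one terminating NUL)."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a simple OSC message with int32, float32 and string
    arguments -- enough to stand in for a synth trigger message."""
    typetags = ","
    body = b""
    for arg in args:
        if isinstance(arg, bool):
            raise TypeError("booleans not handled in this sketch")
        elif isinstance(arg, int):
            typetags += "i"
            body += struct.pack(">i", arg)       # big-endian int32
        elif isinstance(arg, float):
            typetags += "f"
            body += struct.pack(">f", arg)       # big-endian float32
        elif isinstance(arg, str):
            typetags += "s"
            body += osc_pad(arg.encode())
        else:
            raise TypeError(f"unsupported argument: {arg!r}")
    return osc_pad(address.encode()) + osc_pad(typetags.encode()) + body

# A hypothetical trigger: play sample "bd" at full gain.
blob = osc_message("/play", "bd", 1.0)
```

Because `blob` is already wire-ready, the daemon in the setup above just reads datagrams off a UDP socket and republishes them on a ZeroMQ PUB socket; each Dirt instance connects a SUB socket and hands the bytes straight to liblo.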

A slightly amusing side effect is that anyone running a recent git checkout of Dirt during my various tests and the performance itself would have received my OSC messages, and heard me mess around and play… Something that could be made more of in the future.

I’d love to do more multichannel performances, streamed or in person; let me know if you’d like me to propose something for your system!