Ojack in the lab

I was happy to host Olivia Jack in the slaboratory (my studio in Sheffield) a couple of weeks ago, between the Live Code Summerschool and workshops at Diversityfest in Rotherham. Olivia is the creator of Hydra, a web-based system taking over a good portion of the live coding VJ world with feedback patterns heavily inspired by analogue video synthesis techniques. We had some time for collaboration, and recorded a couple of the things we made. Olivia worked some of her experiments in reading pixel data into pattern-generating machines of a sort, which she sent to my laptop over the OSC protocol so I could sonify them:

There’s also some subtle audio input into the above patch, creating a feedback loop. This felt like a really nice a/v collaboration. I’ve worked with a lot of VJs and other visual artists, and this was the first time I was really looking at the visual work while still focused on the code and sound. We tried a range of things but found the above minimalist approach worked great, getting to the point where it was hardly possible to edit code any more because it was so trance-inducing.
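Out of interest, here’s a rough sketch of what the sonification side can look like in Tidal. It assumes the pixel values arrive as messages on Tidal’s OSC control bus (of the form /ctrl "bright" 0.5), with "bright" a made-up control name standing in for whatever the patch was actually sending:

```haskell
-- cF reads a named floating-point control from incoming OSC,
-- with a default value (here 0) until the first message arrives
d1 $ sound "bd*8"
  # cutoff (range 200 4000 $ cF 0 "bright") -- brightness -> filter cutoff
  # resonance 0.2
```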

We used an HDMI capture card to combine the a/v on my laptop for the recording, which meant I had a window showing Olivia’s desktop on my screen, next to my code. This simple technological tweak helped a surprising amount. Before this I didn’t know it was possible to collaborate with a visualist on this level, because my usual focused code+sound feedback loop is so damn engrossing. Lots to think about.

The above collab was a bit less minimal, but we tried a lot of things along the way, including sampling pixels in circles, and came back to good ol’ sixteen-step sequencing on three levels, mapping from brightness to audio filtering in quite a direct way. A lot of fun, and the possibilities really open up when Olivia starts mixing in some Hydra patterns to mess with the video.
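In that spirit, a hedged sketch of what a three-level, sixteen-step arrangement could look like in Tidal, with hypothetical control names bright0, bright1 and bright2 standing in for brightness values sampled from three regions of the image:

```haskell
-- three sixteen-step layers, each with its filter cutoff
-- driven by the sampled brightness of a different region
d1 $ sound "bd*16"   # cutoff (range 100 2000 $ cF 0 "bright0")
d2 $ sound "hh*16"   # cutoff (range 500 8000 $ cF 0 "bright1")
d3 $ sound "arpy*16" # cutoff (range 200 4000 $ cF 0 "bright2")
```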

One experiment we didn’t manage to film was using tidal.pegjs by Charlie Roberts and Mariana Pachon Puentes. It’s an implementation of Tidal’s mini-notation for polyrhythmic sequences, and although it’s still a little bit buggy (if not matching Tidal’s behaviour counts as a bug), I think it has loads of potential for enabling collaboration through shared metre.
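For a flavour of the notation it parses, here are a couple of polyrhythmic mini-notation patterns as you’d play them in Tidal itself:

```haskell
-- comma-separated subsequences of different lengths share
-- the same cycle: polyrhythm
d1 $ sound "[bd sd cp, hh hh hh hh]"
-- *2 doubles up a step, ~ is a rest, <sd cp> alternates per cycle
d2 $ sound "bd*2 ~ <sd cp> mt"
```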

By this I mean sharing a single underlying pattern or sequence between live coders, where anyone can edit it at any time. Rather than playing the sequence directly, each person could use it as a base for pattern transformation, shifting it, doubling it up, making it interfere with other sequences/patterns and so on. But underlying everything would be this metrical structure. The magic would then come when someone changes that underlying pattern – everything would change at once, hopefully with a perceivable relationship in terms of changing complexity, tactus/tatum and so on. A super simple idea but I think it’d be a lot of fun.
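As a rough sketch of how that could look in Tidal terms (the shared base here is hypothetical; in the collaborative version that one definition would be editable by anyone):

```haskell
-- the shared base sequence everyone works from
let base = n "0 [1 2] 3 ~"

-- one player uses it fairly directly
d1 $ base # sound "drum"
-- another doubles it up and reverses it
d2 $ (rev . fast 2) base # sound "arpy"
-- a third shifts it a quarter-cycle, interfering with the others
d3 $ (0.25 ~>) base # sound "feel"
```

Edit base and all three parts change at once, while keeping their metrical relationship.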

Anyway it was great to find some time to collaborate with Olivia on this, and hopefully we’ll have some outcomes in performance when we’re next in the same city.

This is also something I want to do a lot more of. In the past I’ve organised a lot of algoraves etc. where friends have travelled to perform, but where I’ve hardly had a chance to talk to them, let alone collaborate. So let me know if you’re passing through!
