Category: livecoding

TEDx Hull

Looking forward to talking about Algorave, live coding, TidalCycles and a cultural grounding for it all in pattern at TEDx Hull tomorrow. I have been a bit unsure whether the showbiz 15-minute talk format was for me, but preparing for it has been a nice exercise in organising my thoughts, and I am now really looking forward to it. I’ll do some semi-improvised live coding, and hopefully won’t crash and burn.. The rest of the line-up is really interesting too.

PENELOPE

I’ve just realised that I haven’t posted here about my new job. I have left my post as research/teaching fellow at the University of Leeds, and since February 2017 have been working for a museum, in particular the Research Institute of the Deutsches Museum, an incredible science museum in Munich — although I am still based in Sheffield, UK. I’ll be working part-time over the next five years on the PENELOPE research project led by Ellen Harlizius-Klück, following our previous project Weaving Codes, Coding Weaves.

“Our aim is to integrate ancient weaving into the history of science and technology, especially digital technology. The project encompasses the investigation of ancient sources as well as practices and technological principles of ancient weaving. We set up a PENELOPEan laboratory where we detect the models and topologies of weaves and develop codes to make them virtually explorable.”
It’s a great privilege to have this huge chunk of time to really get to the bottom of something, an experience I haven’t had since my PhD. The project has a much deeper connection to the world of live coding than might first appear, being all about computational pattern and the sharing of thought — but taking a much longer view of live coding than is usual. You can read more on the project website, including a brief exploration of making music from ‘tabby’ weaves.

Algorave article on MixMag.net w/ yaxu mix

Here’s a thing, a lovely article on Algorave on mixmag.net, by Steph Kretowicz..

Among interviews with a range of nice folks it includes some words by me as well as this mix that I mentioned in an earlier post:

I really enjoyed making the mix – a real pleasure to get close to the music, and although I am very rusty (last time I was mixing vinyl), it still felt like a different way of listening; I’ve missed it. It was also good to bring such nice music together, and I’m looking forward to doing more of these.

Read the full article here: http://mixmag.net/feature/algorave/

Yaxu + Rituals @ Idle Chatter

Another video I forgot to post here, this one from Idle Chatter back in October 2016, with RITUALS aka Dan Hett on visuals.

Live coding

I’m starting to blog on Medium, here’s a crosspost of an introductory post:

Live coding

From around the year 2000, musicians, visual artists and choreographers have been popping up around the world to form a community of live coders. This community uses programming languages to create live work, predominantly in the performing arts. The idea appeared in different places in various flavours, such as just-in-time programming and on-the-fly programming, although the term live coding has become standard.

Patchwork portrait of seminal live coding band Powerbooks Unplugged (2004-),
film by Jonas Hummel (2010)

But liveness and code form an unlikely friendship. On one side, liveness is about direct, unmediated connection, in the moment. On the other, code is about abstraction, generalisation, procedures to be replayed across different timespans and media. From this perspective live coding is almost oxymoronic — liveness is about now, code is about whenever. It is no wonder that many live coders purposefully embrace error and failure — their practice runs against our understanding of what code is for.

ALGOBABEZ at Algorave Leeds, 2016

But when we write code not to make reusable software, but to create in the moment, it takes on a very different quality, something closer to the embodied experience of speech. Live coders can work across networks or across disciplinary boundaries, pushing against the distinction between natural and computer languages.

Shared Buffer (Alexandra Cardenas, David Ogborn, Eldad Tsabary, Alex McLean)
@ Pikselfest Norway, 2014

Live coding has developed and grown over the past 17 years into a thriving, international community, meeting to create symposia [1,2], festivals [1,2], conferences, concerts and long nights of techno. All of these performances involve the act of computer programming as performance: instructions are written and modified by a human while a computer executes them. Proclaiming “show us your screens”, live coders open up the developing structure and movement behind their work by projecting their screens, so the audience can experience the code growing alongside the development of its output.

Study in Keith, Andrew Sorensen

The experience of live coding is a strange one: locked in a state of creative flow, working in a world made entirely of symbols, words and text, while simultaneously hyper-aware of the passing of time, and of the sound generated from the composition of those symbols. Hearing becomes a sense of touch, a way to feel the code. This is amplified further by the presence of others in the room, whose expectations you play with and respond to.

Kindohm @ ICLC 2016, Hamilton

Live coding isn’t a genre, or a set of tools, but a community of diverse practices, rolling back history to look for paths not taken — stripping back the graphical user interface to find the language machine underneath. Then, not using the language to describe already-made designs but to explore live thought, externalised through language.

REPL Electric — The Stars

Tidalbot

On xmas eve I made a bot

The source is here. It’s been a little unstable (any tips for running the JACK audio system on a virtual server?) but generally works well. There have been some nice patterns coming up; a random selection:

Then some mystery person (or bot) made a bot called tidalbotbot that is generating patterns for tidalbot, e.g.:

We’ve been meaning to include examples in the TidalCycles docs for a while, and should get around to this.. Also planning a Tidal pattern-sharing website, which could interface nicely with tidalbot. More soon!

Musicbox controller

For upcoming collaborations with musicbox maestro David Littler, and to explore data input to Tidal as part of my ODI residency, I wanted to use one of these paper tape-driven mechanical music boxes as a controller interface:

You can see from the photo that I have quite a messy kitchen, and also that I’ve screwed the musicbox onto a handmade box (laser cut at the ever-wondrous Access Space). The cable coming out of it leads to a webcam mounted inside the box, peeking up through a hole underneath the paper as it emerges from the music box. With a spot of hacked-together Python OpenCV code, here is the view from the webcam and the notes it sees:

Now I just need to feed those notes into Tidal, to drive live coded patterns. That should be good enough for upcoming performances with David: one tonight at a semi-private “Digital Folk” event at Access Space, and another tomorrow in London at the ODI lunchtime lecture.
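As a rough sketch of that last step (purely illustrative: the note list and the sample name are invented here, and the real glue code may well work differently), a list of detected semitone offsets could be turned into a Tidal pattern with listToPat and used to pitch a sample:

-- hypothetical output from the webcam reader: semitone offsets of the detected notes
let musicboxNotes = [0, 4, 7, 12, 7, 4]

-- one note per step across the cycle, pitching a sample up by each offset
d1 $ up (listToPat musicboxNotes) # sound "arpy"

From there the notes can be transformed like any other pattern: reversed, layered, or slowed against the music box itself.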

By the way, the music in the video above was made by my son and me, clipping out holes more or less at random. The resulting tune has really grown on me, though!

UPDATE – first live coding experiment:

Making Spicule

Algorithmic approaches to music involve working with music as language, and vice versa; in fact, music and language become inseparable. This allows a musician to describe many layers of patterns as text, in an explicit way that is not possible by other means. By this I mean that musical behaviours are given names, allowing them to be combined with other musical behaviours to create new behaviours. This process of making language for music is not one of cold specification, but of creative exploration. People make new language to describe things all the time, but there’s something astonishing about making languages for computers to make music, and it’s something I want to share.
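To make that concrete, here’s a tiny TidalCycles sketch of the idea (the names wobble, smudge and wobblesmudge are invented for this example): two behaviours are given names, then combined into a new behaviour and applied to a sound pattern.

-- give one musical behaviour a name..
let wobble = jux rev . every 3 (fast 2)

-- ..and another..
let smudge = chunk 4 (# coarse 8)

-- ..then compose the two named behaviours into a new one
let wobblesmudge = wobble . smudge

d1 $ wobblesmudge $ sound "bd [arpy arpy:3] sn:2 arpy:1"

Once named, wobble and smudge become part of a vocabulary that can be reused, transformed and combined further; that’s the kind of language-making I mean.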

Here’s a recording of one of the live streams I’ve been doing while working on my solo album Spicule from my home studio:

I start with nothing, but in the last few minutes everything comes together and I have a couple of different parts that start feeling like a whole track. There isn’t really a musical structure to the session apart from the slow building of parts, and a sudden cut when everything comes together. The macro structure of the track will come later, but by a process of trying rough ideas, and listening to see where they go, the music emerges from the words.

I generally go through much the same process when I’m doing improvised performances, making music from nothing, but this feels very different.. Instead of being tied to the structure of a performance, making continual changes to work with the audience’s expectations, I’m dealing with repetitions even more than usual. I’ve started experimenting with lights, at first to try to accentuate the sound, but I think now more to help focus, to get inside the repetition and maintain flow. Unfortunately this doesn’t quite work in the video because the sound and video are slightly out of sync.. But the left/right light channels map to the left/right speakers, and each sound has a different colour.

As live coding develops, I still really enjoy improvisation, but I find myself doing polished performances more often, involving prepared tracks, with low risk, and the original making processes behind them hidden. This is probably for the best, but then it feels important to share the behind-the-scenes improvisation and development that goes on.. My PledgeMusic crowdfund is a great way to do this, thanks to the generous critical feedback, encouragement and (gulp) hard deadline.. If you haven’t joined it yet, you can do so here!

Sound to light for light to sound

xynaaxmue

I collaborated with xname on a performance as xynaaxmue on Saturday; audio and video up soon I hope.. xname performs with circuits that turn light into sound, improvising noise using stroboscopic lights. I was live coding with TidalCycles, as ever.

In the past I’ve created flashing patterns on an external monitor for xname’s circuits to feed off; check here for a recording of that one. This time I wanted to control a pair of RGB flash panels over DMX.. I used a tinkerit DMX shield for the Arduino, officially retired, but you can still find them online and the library is downloadable on GitHub.

I hacked together a Tidal interface the night + morning before the conference, and it worked pretty well.. The Haskell and Arduino code is here.
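For reference, in current TidalCycles a custom control like the dur used below can be declared in a single line with pF (just a sketch; the linked Haskell code may set this up differently):

-- a sketch, not the linked code: declare a floating-point control named "dur",
-- which the light interface can then read and turn into a flash length
let dur = pF "dur"

Anything declared this way can be patterned just like pan or speed.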

With everything loaded up, Tidal code like this triggers flashes of light as well as sound:

x2 $ every 2 (slow 2) $ (jux (rev) $ foldEvery [5,7] (slow 2) 
   $ (slowspread (chop) [64,128,32] 
   $ sound "bd*2 [arpy:2 arpy] [mt claus*3] [voodoo ind]"))
  # dur "0.02"
  # nudge (slow 4 sine1)

The basic features:
  • sound – the sample name is translated into colour in a semi-arbitrary way (a mapping which falls back on some cryptographic hashing; see the sketch below)
  • pan – (kind of) pans between the two lights
  • dur – controls the duration of the flash
  • the flashes have a linear fade, which works across chop and striate
  • it is kind of polyphonic, but the colour mixing could be improved.. mixing coloured light seems to get into the realm of philosophy though!
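To give a flavour of that first point, here’s a minimal Haskell sketch of deriving a stable colour from a sample name by hashing its characters. The function nameToColour and the hash are invented for illustration; the linked code uses a cryptographic hash and its own mapping.

import Data.Char (ord)

-- hash a sample name into a repeatable (red, green, blue) triple, 0–255 each
nameToColour :: String -> (Int, Int, Int)
nameToColour name = (h `mod` 256, (h `div` 256) `mod` 256, (h `div` 65536) `mod` 256)
  where
    -- simple polynomial string hash, enough to spread names across the colour space
    h = foldl (\acc c -> acc * 31 + ord c) 5381 name `mod` (256 * 256 * 256)

So "bd" and "arpy" each always flash up in their own colour, without anyone choosing colours by hand.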

Will update with documentation of the performance itself when it’s up.

Project stock check

Not much time to reflect right now, but taking some time to think about ongoing and upcoming activities at least..

Making the Spicule LP is going pretty well: the crowdfund is past the halfway mark, the graphic and hardware design are coming together with ace collaborators I’m hardly worthy of working with, and I’m looking forward to spending a lot more time in my studio over the summer.

My Open Data Institute sound art residency isn’t going too badly either. I’ve been working on an exhibition there called Thinking Out Loud with curator-in-residence Hannah Redler, which opens soon. It’ll include great work by Felicity Ford, David Griffiths and Julian Rohrhuber, Ellen Harlizius-Klück, Dan Hett, David Littler, Antonio Roberts, Sam Meech, and Amy Twigger-Holroyd, and a ‘looking screen’ where I’ll be able to make my activities during the residency public, as I move from a research phase to making some strange things. I’ve also brought my 2002 “forkbomb.pl” software artwork out of retirement.

A few writing projects are wrapping up – the Oxford Handbook of Algorithmic Music is coming out of its formal review stage, a special issue of the Textile journal is coming together, and I’m polishing off an article with Kate Sicchio for a special issue of Contemporary Theatre Review about our Sound Choreographer <> Body Code collaboration (deadline tonight, erp).. Plus a collaborative book project on live coding is emerging nicely.

Quite a few events coming up, including organising an Eulerroom event, an Algorave tent at EMF Camp, and looming on the horizon — a new festival of Algorithmic and Mechanical Movement (AlgoMech for short) in November. AlgoMech will be a big focus really, but along the way I’m looking forward to some collaborative performances: an audio/visual noise performance with xname (interleaved as xynaaxmue) at the third iteration of Live Interfaces, and a performance at computer club in Sheffield with Alexandra Cardenas. Hoping to play again with Matthew Yee-King as Canute soon, and maybe Slub will burst onto the scene again as well.

I’m also finding more time to contribute to TidalCycles, which is starting to feel like a proper free/open source project now, with quite a few exciting developments and side-projects spinning off it.

I’ve had a great time there, but am wrapping up my research and teaching work at the University of Leeds; just a spot of supervision to do now and I’m done. All being well, I’ll be joining a new five-year project at a research institution, starting in a couple of months’ time, led by Ellen Harlizius-Klück and also working with FoAM Kernow.

That’s about it I think.. It seems like a lot, but it actually feels like everything is coming together and becoming easier to think about.. Especially the AlgoMech festival, which brings together just about everything I’ve been doing and been interested in since.. forever, really.. and I can’t wait to get stuck into a new strand of research.