Category: events

Recording, interview + Off Me Nut

A couple of things to share:

1. Happy to be introducing live coding to the Off Me Nut Records Halloween special, a proper Sheffield warehouse party on 27th Oct 2017. They made me this month's “five star spooky recommendation”, putting the pressure on..

2. I had a great time playing the Haptic Somatic night at Unconscious Archives festival, and the following morning was interviewed by Elsa Ferreira for the French edition of Vice's Noisey. You can read the results here if you know French, or otherwise enjoy the Google translation.

3. Lastly, had fun times in a live code duet with Joanne at the No Bounds Algorave last weekend, here’s the video:


Things coming up this Autumn

After a busy summer I'm looking forward to these autumnal live dates as I gear up for AlgoMech:

  • 28th Sept, Haptic Somatic @ Unconscious Archives festival, London – playing solo at Corsica Studios. An awesome festival to be part of, and one of my favourite venues.
  • 14th Oct, Algorave @ No Bounds Festival Sheffield – teaming up with Dr Joanne in Access Space as part of a stellar No Bounds line-up, will get heavy
  • 20th Oct, Picture House Social Sheffield – live coding solo, joining the support for Marie Davidson
  • 9-12th Nov, then the big one, AlgoMech festival Sheffield – I’m organising four days of algorithmic and mechanical music + art, with 65daysofstatic headlining an incredible line-up, if I say so myself.

Thoughts on AlgoMech 2017


AlgoMech, the festival of Algorithmic and Mechanical Movement, is back for its second year. At one point I had strong doubts about doing a second edition (would it be AlgoMeh?), but it's come together into something I'm really excited about.

It will have an exhibition, with a nice mixture of machinery, textiles, projections and software art. Putting an exhibition together is way out of my comfort zone, but with the artists involved I'm not worried. There'll also be an Open Platform performance art event within the exhibition – always revelatory events, with performances about technology but without technology. More to be announced, including work from Ellen Harlizius-Klück and FoAM Kernow.

The least likely performances will be from two bands bridging the divide between guitar + drums and techno. Amazingly, 65daysofstatic (a band from South Yorkshire who want you to be happy) are going to headline, performing their brand new work Decomposition Theory three times. It's unclear exactly what they're up to, but it looks like it's going to involve algorithms and maybe live coding (they've been known to dabble with Gibber and Tidal already).

Two of the 65dos shows will have the strongest support I could imagine in this context – aggrobeat band Blood Sport teaming up with live coder Heavy Lifting aka Lucy Cheesman. Blood Sport already make a kind of repetitive post-punk techno; with Lucy involved (as Heavy Bleeding) it's going to be intense.

Then there'll be the Algorave. It shows how far this scene has come that last year there were 12 top-notch acts, and there'll be around the same again this year (more TBA) without repeats. Graham Dunning's mechanical techno went down really well last year, so I've mixed in some more mechanisms this year. Firstly Faubel and Schreiber, making minimal techno-generating robots, projected using an overhead projector. Also goto80 + Remin: goto80 will do live tracking on a Commodore 64, and Remin will provide a robotic hand, typing music on a Commodore 64. The live coders I've booked have been doing amazing stuff lately. If last year is anything to go by, this is going to go off.. As a resident I'm happy to be collaborating with Dave Griffiths and Alexandra Cardenas as Slub as well..

The final day will be more relaxed and reflective: a longer-form kinetic sound art performance from Ryoko Akama and Anne F – I'm hoping to find a special venue for that.. Then in the evening a Sonic Pattern event with five amazing mechanical music acts packed in – Leafcutter John, Sarah Kenchington, Naomi Kashiwagi, Camilla Barratt-Due and Alexandra Cardenas, and Peter K. Rollings. I'm trying to put my finger on the feeling I get from this group of people. It reminds me of my days organising dorkbot: it's not a case of artists being happy to step out of their comfort zone. They are totally comfortable; they just cheerfully disregard all technological boundaries in their search for sounds and ideas, and make amazing stuff.

A really nice symposium line-up is starting to emerge too, but won't be announced for a few days. Plus some hands-on workshops.. and probably more to come..

Anyway, my hope is that by bringing these human artists together, working with algorithms and mechanisms, we'll have the opportunity to really feel the connections between physical and abstract systems, and get a richer, longer (into the past and future) and more human-centric view of what technology can be.

Tanglebots

I had the whole day at Wybourn Community Primary last Monday, working with a class of year 4s to make tanglebots. Some other adults joined in too, Joanne Armitage (UofLeeds/ALGOBABEZ), Tanya Fish (Pimoroni, who also contributed tech), Anna Woolman (British Science Association), Ellie Lockley (Sheffield Hallam University) plus teachers including the excellent Julian Wood. It was funded by RCUK via a British Science Association project on reaching young people with science.

It went really well; the kids were really engaged and had no lack of confidence in taking toys apart and rebuilding them into tangling monsters. The results are installed in Playground, a free digital arts exhibition for children in the Sheffield Institute of Arts gallery, open next week during the Children's Media Conference. Here's what they look like in-situ:

It’ll be interesting to see how long they stay ‘working’ in the exhibition with streams of children coming to see them. I couldn’t really leave them actually tangling things as they of course stop working very quickly when they get in a tangle. But hopefully the twitching and spinning will give the idea.

The tanglebots workshop was developed with FoAM Kernow as part of the Weaving Codes project, in partnership with lovebytes. You can read more about it over here.

Coding with knots

My first Quipu attempt

Inspired by Quipu, and by Dave and Julian's work on visualising and sonifying Quipu data (their “coding with knots” paper is here, will link to preprint soon..), I thought I'd have a go at making some. Quipu seem to be ancient databases, used to archive and communicate information, and were in use by Andean people over hundreds of years. The goal is to encode Tidal patterns in a similar manner, which I think was also Dave's idea. So far, I think this should work very well.. Some initial (and of course naive) thoughts on trying to make Quipu are below.

Representation of a real Quipu (Meyers Konversations-Lexikon, 1888)

I started with four-ply 100% cotton hand-knit yarn, which was less fluffy and easier to work with than sheep's wool. It did cross my mind to find alpaca wool for authenticity, but it was not easy to come by, and I read that cotton was used in Quipu too. After a bit of knotting, things weren't going too well: due to the stretchiness of the yarn it was difficult to get the knots close together, and it was all a bit fiddly.. Especially for the 'end knot' in a number, which requires you to pass the yarn through its own loop up to nine times, securing itself in a neat spiral. I wondered whether this yarn had the right structure for the task..

Looking closer at the diagrams and photographs of Quipu online, though, it became clear that the Quipu yarn was 'doubled': folded in half and twisted together to create a loop at one end, which could then be used to attach it to another piece of yarn with a 'cow hitch'. I found I could double small pieces of wool quite easily by hand, by twisting with the direction of the existing twist until I met resistance, folding the yarn in half, and letting it twist together. Happily, the result was a yarn which was not only easy to hitch (and un-hitch) from another piece, but also much easier to work with, in terms of tying the knots and getting them to sit in the right place, with a little help from a 20mm crochet hook.

It seemed important to start knotting at the bottom of the doubled yarn, otherwise it would start to unravel back to its original four-ply. In terms of the Quipu representation of natural numbers (which, handily, is base ten), this meant knotting the units first, then the tens, and so on. Once attached, the threads were easy to move along the thread they were hitched to, and to un-hitch if necessary.
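As a sketch of that base-ten scheme (hypothetical code, nothing I actually used while knotting): each decimal digit becomes a cluster of knots on the cord, and knotting from the bottom up means producing the digits units-first.

```python
def quipu_digits(n):
    """Split a natural number into base-ten digits, units first,
    matching the order of knotting from the bottom of the cord up."""
    if n == 0:
        return [0]  # zero shows up as an absence of knots at that position
    digits = []
    while n > 0:
        digits.append(n % 10)  # units first (the bottom 'end knot' cluster)
        n //= 10
    return digits

# 237 would be knotted as 7 (units, as an 'end knot'), then 3, then 2:
print(quipu_digits(237))  # [7, 3, 2]
```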

I'll need a lot more practice to make the knots more 'readable'. Yarn is twisted in a particular direction – in this case 'Z' rather than 'S' – so perhaps the knots have to match in terms of whether they're right- or left-handed. There is a suggestion that the direction of twist and the handedness of the knot are both significant in terms of information storage; I'm a bit sceptical about this, as at least they appear to be dependent on one another. For now I'm putting a little bit of twist into the yarn as I tie each knot, to make sure the ply doesn't 'open up'.

Anyway, I think I have some grasp of how this can work now. The next task is to try to notate Tidal patterns. The ability to hitch threads to the side of others should make knotting together the parse tree of a TidalCycles pattern fairly straightforward. I’d also like to use beads to represent things which aren’t directly numerical, such as sound samples — Hama beads look perfect for this, the right size, available in bulk in a wide range of colours, and cheap! Hopefully I’ll have something to show at my lunchtime talk at the ODI this Friday.

*Update* I did indeed manage to transliterate some Tidal code into knots and beads, in particular taking this by Kindohm:

jux (|*| speed "1.5") $ foldEvery [3,4] (0.25 <~) $ stack [
  s "less:2([3 5]/2,8)" # unit "c" # speed "4",
  s "less(3,8)" # cut "1" # up "{0 -3 1}%32",
  s "{~ [~ cp] ~ less:3/2 ~}%4",
  s "[less:1*2 less:1]*2"]

I created this:

I’ll post again with more explanatory detail, but from my notes..

  • Knots are numbers, including a slightly awkward attempt at representing rational numbers
  • Values which are not directly numerical are represented with beads. Samples are yellow for cp and purple for feel (the sample number is a number, so is a knot). As an exception, single letters of the alphabet are treated as ordinal numbers and so represented with knots, e.g. c = 3.
  • Function names are a bead followed by a brown bead. In particular, red brown for jux, pink brown for foldEvery, blue brown for <~
  • Subpatterns and functions-as-parameters are separate strings tied on to the side (currying works out fine)
  • Parameter names are a bead followed by a black bead: red black for sound, green black for cut, blue black for up, pink black for speed.
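The colour scheme in my notes could be written down as a small lookup table. A hypothetical sketch (the names and structure here are mine, just to pin down the rules above):

```python
# Bead colour codes from my notes; the second bead marks the category.
FUNCTIONS = {            # <colour> + brown = function name
    "jux": ("red", "brown"),
    "foldEvery": ("pink", "brown"),
    "<~": ("blue", "brown"),
}
PARAMS = {               # <colour> + black = parameter name
    "sound": ("red", "black"),
    "cut": ("green", "black"),
    "up": ("blue", "black"),
    "speed": ("pink", "black"),
}

def encode_token(token):
    """Map one Tidal token to knots (numbers, letters) or beads (names)."""
    if token.isdigit():
        return ("knot", int(token))
    if len(token) == 1 and token.isalpha():
        # single letters become ordinal numbers, e.g. c = 3
        return ("knot", ord(token.lower()) - ord("a") + 1)
    return ("beads", FUNCTIONS.get(token) or PARAMS.get(token))

print(encode_token("c"))    # ('knot', 3)
print(encode_token("jux"))  # ('beads', ('red', 'brown'))
```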

The parse tree comes out rather nicely, I think, and is almost a complete representation of the original pattern (I decided to overlook the difference between # and |*| for this first attempt). I'm trying to minimise the use of beads; treating colours as logograms feels like cheating, simply making a text out of an alphabet of colour. The branching structure is what takes it away from one-dimensional text, though, and into something much closer to the structure of a running computer program (particularly a pure functional one, like a Tidal pattern).

The next step is to try to compose a new pattern directly as a string, to explore its affordances.. Maybe this could inspire a breakthrough in usability for my Texture interface for Tidal..

Stream to Algorave Montréal

A recording of a stream I did to Algorave Montréal this morning

TEDx Hull

Looking forward to talking about Algorave, live coding, TidalCycles and a cultural grounding for it all in pattern at TEDx Hull tomorrow. I've been a bit unsure whether the showbiz 15-minute talk format was for me, but preparing for it has been a nice exercise in organising my thoughts, and I'm now really looking forward to it. I'll do some semi-improvised live coding, hopefully won't crash and burn.. The rest of the line-up is really interesting too.

Spring things

Things coming up in 2017..

  • Running Tidal Club Sheffield with Lucy Cheesman – every third Thursday of the month; the third edition is this 16th March
  • 17th Mar – Algorave wearefive – celebratory online stream with 48 performances beaming from around the world, around the clock
  • 23 Mar – Algorave Berlin – a really nice line-up, I’ll be playing solo
  • 31 Mar – TEDx Hull – a talk about algorave, and algorithmic dance culture
  • 22 Apr – Eulerroom 6 – haven’t done one of these for a while.. Hosted by Tidal Club, I’ll be organising with Lucy but not actually playing.
  • 28 Apr – Algorave Leeds – another huge line-up, live coding solo again
  • 16 May – Taking part in a panel session at Thinking Digital Arts, Gateshead
  • 26/27 May – Running a two-day TidalCycles workshop with the multi-channel system at Call&Response in South London
  • 2 Jun – Back to the Open Data Institute in London, launching the outcomes of my residency there
  • 7 Jun – An evening TidalCycles workshop at London Music Hackspace, Somerset House
  • 9 Jun – Algorave activity as part of No Bounds festival, Hope Works Sheffield, I’ll be performing with Joanne
  • 23 Jun – Talk and probable algorave activity at Bump festival, Brussels
  • 9 Jul – Canute performance at the Bluedot festival algorave, Jodrell Bank
  • 18-20 Aug – Another collab with Joanne at the Green Man festival algorave in Einstein’s Garden
  • 9 Sep – Organising evening performances at FARM Workshop 2017, Oxford
  • 8-12 Nov – Organising Algomech festival in Sheffield again

More TBA

Musicbox controller

For upcoming collaborations with musicbox maestro David Littler, and to explore data input to Tidal as part of my ODI residency, I wanted to use one of these paper tape-driven mechanical music boxes as a controller interface:

You can see from the photo that I have quite a messy kitchen, and also that I've screwed the musicbox onto a handmade box (laser cut at the ever-wondrous Access Space). The cable coming out of it leads to a webcam mounted inside the box, peeking up through a hole underneath the paper as it emerges from the music box. With a spot of hacked-together Python OpenCV code, here is the view from the webcam and the notes it sees:

Now I just need to feed the notes into Tidal, and use them to drive live coded patterns. That should be good enough for upcoming performances with David: tonight at a semi-private “Digital Folk” event at Access Space, and another tomorrow in London at the ODI lunchtime lecture.
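The OpenCV script itself isn't shown here, but the step after hole detection – quantising each detected hole's x-position in the webcam frame to one of the music box's note lanes – can be sketched like this (the lane count, calibration values and function name are all assumptions, not taken from the actual script):

```python
def holes_to_lanes(xs, left_edge, right_edge, n_lanes=30):
    """Quantise detected hole centre x-positions (in pixels) to note lanes.
    left_edge/right_edge are the pixel x-positions of the first and last
    lane, calibrated by hand; n_lanes is the box's note count (often 20
    or 30 for these paper-tape music boxes)."""
    lane_width = (right_edge - left_edge) / (n_lanes - 1)
    return [round((x - left_edge) / lane_width) for x in xs]

# e.g. with lanes calibrated between x=40 and x=620 in the frame:
print(holes_to_lanes([40, 340, 620], 40, 620))  # [0, 15, 29]
```

Each lane index could then be looked up against the box's note scale and sent on to Tidal.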

By the way, the music in the above was made by my son and me, clipping out holes more or less at random. The resulting tune has really grown on me, though!

UPDATE – first live coding experiment:

Algorithmic and Mechanical Movement

I've been working on the Festival of Algorithmic and Mechanical Movement (AlgoMech for short) lately, curating it with Lovebytes, funded by Sheffield Year of Making and Arts Council England. It's going to be a big week for me, bringing together lots of strands into one festival featuring concerts, a day symposium, hands-on workshops and an algorave. It's diverse enough to be a bit hard to sell, but it's exploring a different take on technology in performance: a long view on algorithms and machines, with a focus on the people involved. If it gets good audience support then we'll do our best to make it an annual event, so I'd absolutely love it if you came along, and/or helped spread the word via Twitter, Facebook, or by sharing the algomech.com website, this nice write-up, or the video below – or by telling someone nice about it who you think might be interested. Thanks a lot!