Category Archives: haskell

Patterns again

Back to patterns in Haskell, an unruly puzzle that’s run through the last few years of my life, trying to work out how I want to represent my music.  Here’s the current state of my types:

  data Pattern a = Sequence {arc :: Range -> [Event a]}
                 | Signal {at :: Rational -> [a]}
  type Event a = (Range, a)
  type Range = (Rational, Rational)

A Range is a time range, with a start (onset) and duration.  An Event is a value of some type a that occurs over a Range.  A Pattern can be instantiated either as a Sequence or a Signal.  These are directly equivalent to the distinction between digital and analogue, or discrete and continuous.  A Sequence is a set of discrete events (with start and duration) occurring within a given range, and a Signal is a set of values for a given position in time.  In other words, both are represented as functions from time to values, but a Sequence represents a set of events which have beginnings and ends, while a Signal represents a continuously varying set of values.
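As a quick standalone sketch of the two cases (the example patterns `tick` and `saw` are mine, for illustration, not from the library): a Sequence answers a range query with events, while a Signal answers a point query with values.

```haskell
import Data.Ratio ((%))

type Range   = (Rational, Rational)
type Event a = (Range, a)

data Pattern a = Sequence { arc :: Range -> [Event a] }
               | Signal   { at  :: Rational -> [a] }

-- A discrete pattern: a single "tick" event filling each queried range.
-- (Illustrative only; a real pattern would cut events to cycle boundaries.)
tick :: Pattern String
tick = Sequence $ \(o, d) -> [((o, d), "tick")]

-- A continuous pattern: a sawtooth rising from 0 to 1 over each cycle.
saw :: Pattern Rational
saw = Signal $ \t -> [t - fromIntegral (floor t)]

main :: IO ()
main = do
  print (arc tick (0, 1))  -- [((0 % 1,1 % 1),"tick")]
  print (at saw (5 % 2))   -- [1 % 2]
```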

This is a major improvement on my previous version, simply because the types are significantly simpler, which makes the code much easier to work with.  This simplicity comes from the structure of patterns being represented entirely with functional composition, so it is closer to my (loose) understanding of functional reactive programming.

The Functor definition is straightforward enough:

  mapSnd f (x,y) = (x,f y)
  instance Functor Pattern where
    fmap f (Sequence a) = Sequence $ fmap (fmap (mapSnd f)) a
    fmap f (Signal a) = Signal $ fmap (fmap f) a
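With that instance, transforming every value in a pattern is just `fmap`, leaving the event structure untouched. A standalone sketch (the example pattern `notes` is made up for illustration):

```haskell
type Range   = (Rational, Rational)
type Event a = (Range, a)

data Pattern a = Sequence { arc :: Range -> [Event a] }
               | Signal   { at  :: Rational -> [a] }

mapSnd :: (b -> c) -> (a, b) -> (a, c)
mapSnd f (x, y) = (x, f y)

instance Functor Pattern where
  fmap f (Sequence a) = Sequence $ fmap (fmap (mapSnd f)) a
  fmap f (Signal a)   = Signal $ fmap (fmap f) a

-- Two events per queried range, carrying the values 1 and 2.
notes :: Pattern Int
notes = Sequence $ \r -> [(r, 1), (r, 2)]

main :: IO ()
main = print (arc (fmap (* 2) notes) (0, 1))  -- values become 2 and 4
```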

The Applicative definition allows signals and patterns to be combined in a fairly reasonable manner too, although I imagine this could be tidied up a fair bit:

  instance Applicative Pattern where
    pure x = Signal $ const [x]
    (Sequence fs) <*> (Sequence xs) =
      Sequence $ \r -> concatMap
                       (\((o,d),x) -> map
                                      (\(r', f) -> (r', f x))
                                      (filter
                                       (\((o',d'),_) -> (o' >= o) && (o' < (o+d)))
                                       (fs r)))
                       (xs r)
    (Signal fs) <*> (Signal xs) = Signal $ \t -> (fs t) <*> (xs t)
    (Signal fs) <*> px@(Sequence _) =
      Signal $ \t -> concatMap (\(_, x) -> map (\f -> f x) (fs t)) (at' px t)
    (Sequence fs) <*> (Signal xs) =
      Sequence $ \r -> concatMap (\((o,d), f) ->
                                  map (\x -> ((o,d), f x)) (xs o)) (fs r)
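To illustrate the last case, here is a standalone sketch; `applySignal` just restates that one equation as its own function so the file compiles without the full instance, and the example patterns `fns` and `ramp` are made up. Each event's function gets applied to the signal sampled at that event's onset.

```haskell
import Data.Ratio ((%))

type Range   = (Rational, Rational)
type Event a = (Range, a)

data Pattern a = Sequence { arc :: Range -> [Event a] }
               | Signal   { at  :: Rational -> [a] }

-- The (Sequence fs) <*> (Signal xs) case, as a standalone function.
applySignal :: Pattern (a -> b) -> Pattern a -> Pattern b
applySignal (Sequence fs) (Signal xs) =
  Sequence $ \r -> concatMap (\((o, d), f) ->
                              map (\x -> ((o, d), f x)) (xs o)) (fs r)

-- One event per queried range, carrying the function (+ 10).
fns :: Pattern (Rational -> Rational)
fns = Sequence $ \r -> [(r, (+ 10))]

-- A continuously rising ramp.
ramp :: Pattern Rational
ramp = Signal $ \t -> [t]

main :: IO ()
main = print (arc (applySignal fns ramp) (1 % 2, 1))
-- the ramp is sampled at onset 1/2, giving the event value 10 + 1/2
```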

In the Pattern datatype, time values are represented using Rational numbers, where each whole number represents the start of a metrical cycle, i.e. something like a bar.  Therefore, concatenating patterns involves ‘playing’ one cycle from each pattern within every cycle:

  cat :: [Pattern a] -> Pattern a
  cat ps = combine $ map (squash l) (zip [0..] ps)
    where l = length ps

  squash :: Int -> (Int, Pattern a) -> Pattern a
  squash n (i, p) = Sequence $ \r -> concatMap doBit (bits r)
    where o' = (fromIntegral i) % (fromIntegral n)
          d' = 1 % (fromIntegral n)
          cycle o = fromIntegral $ floor o
          subR o = ((cycle o) + o', d')
          doBit (o,d) = mapFsts scaleOut $ maybe [] ((arc p) . scaleIn) (subRange (o,d) (subR o))
          scaleIn (o,d) = (o - o', d * (fromIntegral n))
          scaleOut (o,d) = ((cycle o) + o' + ((o - (cycle o)) / (fromIntegral n)), d / (fromIntegral n))

  subRange :: Range -> Range -> Maybe Range
  subRange (o,d) (o',d') | d'' > 0   = Just (o'', d'')
                         | otherwise = Nothing
    where o'' = max o o'
          d'' = (min (o+d) (o'+d')) - o''

  -- chop a range into ranges of unit cycles
  bits :: Range -> [Range]
  bits (_, 0) = []
  bits (o, d) = (o, d') : bits (o+d', d-d')
    where d' = min ((fromIntegral $ (floor o) + 1) - o) d
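As a sanity check on the cycle-chopping, here is `bits` on its own (the query range is a made-up example): a range starting mid-cycle and lasting two cycles splits into three pieces, each within a single cycle.

```haskell
import Data.Ratio ((%))

type Range = (Rational, Rational)

-- chop a range into ranges of unit cycles, as defined above
bits :: Range -> [Range]
bits (_, 0) = []
bits (o, d) = (o, d') : bits (o + d', d - d')
  where d' = min ((fromIntegral $ floor o + 1) - o) d

main :: IO ()
main = print (bits (1 % 2, 2))
-- [(1 % 2,1 % 2),(1 % 1,1 % 1),(2 % 1,1 % 2)]
```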

Well this code could definitely be improved..

If anyone is interested the code is on github, but is not really ready for public consumption yet.  Now I can get back to making music with it though, more on that elsewhere, soon, maybe under a new pseudonym..


I’ve got some sounds out of my new live coding system, codenamed “smoothdirt”.  Here’s an mp3 for you.  The sounds are triggered with some C and structured and scheduled with some Haskell.  Plenty more to do, but already really happy hearing embedded juxtaposition of timescales, smooth multichannel panning (2 channels in this test, but I’m playing on a quadraphonic cinema soundsystem at lovebytes) and sample accuracy, which I test at the end by playing a kick drum sample a lot.

My new representation also allows me to treat musical structure as both a discrete pattern and a continuous signal, which I’m very happy about, but haven’t explored the depths of yet..

Anyway with a few tweaks and effects it’ll be ready for the algorave in London this weekend.

Patterns in Haskell revisited

A while back I came up with this way of representing musical patterns as pure functions in Haskell:

  data Pattern a = Pattern {at :: Int -> [a], period :: Int}

These patterns can be composed nicely with pattern combinators, creating strange polyrhythmic structures; see my earlier post for more info.
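For reference, a minimal standalone sketch of that representation (the pattern `beat` and the `overlay` combinator are made-up examples, not from Tidal): a pattern is a function from a discrete time step to values, plus a period.

```haskell
data Pattern a = Pattern { at :: Int -> [a], period :: Int }

-- A two-step pattern: kick on even steps, snare on odd ones.
beat :: Pattern String
beat = Pattern (\t -> if even t then ["kick"] else ["snare"]) 2

-- One simple combinator: play two patterns at once, the combined
-- period being the lcm of the two (hence the polyrhythms).
overlay :: Pattern a -> Pattern a -> Pattern a
overlay a b = Pattern (\t -> at a t ++ at b t) (lcm (period a) (period b))

main :: IO ()
main = print (map (at beat) [0 .. 3])
-- [["kick"],["snare"],["kick"],["snare"]]
```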
This turned out just great for representing acid techno; see for example this video of people dancing to Dave and me.  I was using Tidal, which uses a representation similar to the above (and Dave was using his lovely SchemeBricks software).

However, lately I’ve been wanting to make music other than acid techno, in particular in preparation for a performance with Hester Reeve, a Live Artist.

After a lot of fiddling about, I seem to be settling on this:

Continue reading Patterns in Haskell revisited

PhD Thesis: Artist-Programmers and Programming Languages for the Arts

With some minor corrections done, my thesis is finally off to the printers.  I’ve made a PDF available, and here’s the abstract:

We consider the artist-programmer, who creates work through its description as source code. The artist-programmer grandstands computer language, giving unique vantage over human-computer interaction in a creative context. We focus on the human in this relationship, noting that humans use an amalgam of language and gesture to express themselves. Accordingly we expose the deep relationship between computer languages and continuous expression, examining how these realms may support one another, and how the artist-programmer may fully engage with both.

Our argument takes us up through layers of representation, starting with symbols, then words, language and notation, to consider the role that these representations may play in human creativity. We form a cross-disciplinary perspective from psychology, computer science, linguistics, human-computer interaction, computational creativity, music technology and the arts.

We develop and demonstrate the potential of this view to inform arts practice, through the practical introduction of software prototypes, artworks, programming languages and improvised performances. In particular, we introduce works which demonstrate the role of perception in symbolic semantics, embed the representation of time in programming language, include visuospatial arrangement in syntax, and embed the activity of programming in the improvisation and experience of art.

Feedback is very welcome!

BibTeX record:

    @phdthesis{McLean2011,
      title = {{Artist-Programmers} and Programming Languages for the Arts},
      author = {McLean, Alex},
      month = {October},
      year = {2011},
      school = {Department of Computing, Goldsmiths, University of London}
    }

RIS record:

TY  - THES
ID  - McLean2011
TI  - Artist-Programmers and Programming Languages for the Arts
PB  - Department of Computing, Goldsmiths, University of London
AU  - McLean, Alex
PY  - 2011/10/01
ER  -

Workshop output

The Text live coding workshop went really well, surprisingly so considering it was the first time anyone apart from me had used it and (as I found out afterwards) most of the participants didn’t have any programming experience. The six participants took to the various combinators surprisingly quickly, the main stumbling block being getting the functions to connect in the right way… Some UI work to do there, and I got some valuable feedback on it.

Once the participants had got the hang of things on headphones, we all switched to speakers and the seven of us played acid techno for an hour or so together, in perfect time sync thanks to netclock. Here’s a mobile phone snippet:

The sound quality doesn’t capture it there, but for me things got really interesting musically, and it was fun walking around the room panning between the seven players…

Text update and source

I’ve updated Text a bit to improve the visual representation of higher order types (you’d probably need to full screen to view):

I won’t be touching this until after the workshop on Saturday.

I’ve also made the source for the visual interface available here under the GPLv3 free license. To get it actually working as above you’d also need to install my tidal library, Jamie Forth’s network sync, my sampler, the nekobee synth, and somehow get it all working together. In short, it’s a bit tricky, I’ll be working on packaging soonish though.

Test run of Text

I’ve been rather busy writing lately, my PhD funding runs out in April, and I hope by then I’ll have finished and will be looking for things to do next.

I have had a bit of time to make Text, a visual language I mentioned earlier, a bit more stable, here’s a test run:

A bit of a struggle, partly due to the small screen area I gave myself for the grab, but also due to some UI design issues I need to sort out before my workshop at Access Space in Sheffield next week, on the 5th February. Access Space is a really nice free media lab, but will turn nasty unless I free the workshop software, so expect a release soon.

In case someone is interested, here’s the linux commandline I use to record a screencast with audio from jackd:

gst-launch-0.10 avimux name=mux \
! filesink location=cast.avi \
ximagesrc name=videosource use-damage=false endx=640 endy=480 \
! video/x-raw-rgb,framerate=10/1 \
! videorate \
! ffmpegcolorspace \
! videoscale method=1 \
! video/x-raw-yuv,width=640,height=480,framerate=10/1 \
! queue \
! mux. \
jackaudiosrc connect=0 name=audiosource \
! audio/x-raw-float,rate=44100,channels=2,depth=16 \
! audioconvert \
! queue \
! mux.


Text is an experimental visual language under development.  Code and docs will appear here at some point, but all I have for now is this video of a proof of concept.

It’s basically Haskell, but with syntax based on proximity in 2D space rather than adjacency.  Type-compatible things connect automatically, made possible through Haskell’s strong types and currying.  I implemented the interface in C, using clutter, and ended up implementing a lot of Haskell’s type system.  Whenever something changes it compiles the graph into Haskell code, which gets piped to ghci.  The different colours are the different types.  Stripes are curried function parameters.  Lots more to do, but I think this could be a really useful system for live performance.

hackpact week 4

Ok, the fourth week of hackpact actually started yesterday, but I didn’t think my contribution then warranted a new entry.


Bit of an error with the screencast, see if you can spot the problem. Pretty happy with the sound though. (will take a while to appear due to vimeo’s encoding queue)

I’m musically humbled by my son who has taught himself how to play the guitar and sing the blues. He’s two today (24th), happy birthday Harvey.


Well, I made a blueberry birthday cake for young Harvey on the 24th, which counts as a hacklet at a push; photo when I find the transfer cable. No creations at all on the 25th or 26th; I was away for the weekend and thankfully didn’t get a moment with my laptop. I did record a session last night though, which I’ll upload at some point…

The hackpact has been really good for making me get bored with my software and develop it further. I need some longer-term development time now though, to work through some of the frustrations I’ve been feeling over the last month. In particular, using a command line interface is feeling like a big limitation; I need to express relationships over more than one line. Maybe I can adapt the yi Haskell editor for my needs.

Next month I think I’ll switch to making a fixed recording every week. I’ve never really made fixed recordings so should be interesting.

Now I’ve fallen off the hackpact wagon I’m not sure if I’ll be able to totally get back on, particularly as I need to finish my PhD upgrade report by the end of this month. We’ll see..