I turned my protein sonification plugin into an interface to my mad Pd patch for the Algorave on Saturday. Here is a little preview:
I’m giving an introductory workshop on live coding with ixilang and SuperCollider next weekend in Middlesbrough. Check it out if you’re interested!
I did a live set as part of the 24h 5th birthday celebrations. The stream was immense fun, with lots of great performances, really worth checking out. Unfortunately YouTube messed up, and a couple of streams, including mine, were not archived. After hoping for some time that YouTube would pull through, I redid my part today (trying to retrace every mouse movement from last week):
Technically, I’m using iemguts to move boxes around in Pure Data and make automatic connections based on proximity, while playing samples included with Tidal. Note that there are no signal connections between the boxes: I’m only using control messages that set [receive~] targets, so instead of sending audio I’m sending where to read audio from.
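To illustrate the indirection (in plain Python rather than Pd; the bus names and the `Receiver` class are my own stand-ins, not anything from the patch): senders write into named buses, and a receiver is retargeted by control messages that say *which* bus to read, rather than being sent the audio itself.

```python
# Fake "audio blocks" living on named buses, like [send~ voiceA] etc.
buses = {"voiceA": [0.1, 0.2], "voiceB": [0.9, 0.8]}

class Receiver:
    """Stand-in for [receive~ <bus>] in Pd."""
    def __init__(self, bus):
        self.bus = bus        # initial bus, like [receive~ voiceA]

    def set(self, bus):
        self.bus = bus        # control message: retarget, no audio sent

    def read(self):
        return buses[self.bus]

r = Receiver("voiceA")
r.set("voiceB")               # only a name travels between the boxes
r.read()                      # -> [0.9, 0.8]
```

The point of the design is that rewiring the patch is a cheap control operation, not a signal-graph change.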
Two amazing remixes of Friday by Dunard and rdlk, as well as a whole algorave live set from 2015, available as pay-what-you-want:
I participated in the Datamorphosis Hack Event at Newcastle City Library. I used the amazing Newcastle Urban Observatory, which gives access to a whole host of sensors spread around Newcastle, measuring everything from temperature to the number of cars in various car parks. It also provides an API for querying sensors in a number of ways. I used some inner-city sensor values for humidity, pressure, temperature, and wind speed to implement a cheap version of Till Bovermann’s Wetterreim (see Bovermann et al.: Auditory Augmentation). I’ll document that in the future. For now, a bit of code to access the API:
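The original snippet isn’t reproduced here, so as a rough illustration, here is a Python sketch of the idea: pull the latest reading per variable out of a sensor feed. The JSON shape below is an assumption for illustration only, not the Urban Observatory’s actual response format — consult their API documentation for the real thing.

```python
import json

# Hypothetical response shape -- the real Urban Observatory API
# may differ; this only illustrates extracting the most recent
# value per variable from a sensor feed.
SAMPLE_RESPONSE = json.dumps({
    "sensors": [
        {"variable": "Temperature", "data": [{"time": 1, "value": 9.5},
                                             {"time": 2, "value": 10.1}]},
        {"variable": "Humidity",    "data": [{"time": 1, "value": 81.0},
                                             {"time": 2, "value": 78.5}]},
    ]
})

def latest_readings(raw):
    """Map each variable name to its most recent value."""
    payload = json.loads(raw)
    out = {}
    for sensor in payload["sensors"]:
        newest = max(sensor["data"], key=lambda d: d["time"])
        out[sensor["variable"]] = newest["value"]
    return out

print(latest_readings(SAMPLE_RESPONSE))
# e.g. {'Temperature': 10.1, 'Humidity': 78.5}
```

Values like these can then be mapped onto synthesis parameters, which is essentially what the Wetterreim-style sonification does with the weather data.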
In June I’m giving a workshop on collaborative live coding in SuperCollider in London. I’ll share all the dirty Mandelbrots secrets nobody ever wanted to know.
In February I was invited to participate in a hackpact where participants were supposed to do some SuperCollider programming every day and document it (more details). Unfortunately I didn’t make it through the full 15 days, but I managed nearly half of it, also documented on the website.
In this post I’m just going to go over the seven days and explain what I did there, as the official website just has the code and recordings.
Alex McLean invited me to participate in the Kairotic intervention at the Heritage & Culture Hack in Sheffield on 10/01/2015. Amy Twigger Holroyd was also part of the group, and we decided to try to turn knitting patterns into sound, for great success. I wrote a small parser in SuperCollider that currently only parses the horseshoe lace from Knitting Bee correctly, which looks like this (manually split into an array for each row):
"k1, *yo, k3, sl1, k2tog, psso, k3, yo, k1*",
"k1, *k1, yo, k2, sl1, k2tog, psso, k2, yo, k2*",
"k1, *k2, yo, k1, sl1, k2tog, psso, k1, yo, k3*",
"k1, *k3, yo, sl1, k2tog, psso, yo, k4*",
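For illustration (the actual parser is in SuperCollider, in the gist linked below), here’s a minimal Python sketch of the parsing idea: split off the starred repeat section and expand counted knit/purl commands into individual stitches. The function name and the `repeats` parameter are my own simplification, not the original code.

```python
import re

def parse_row(row, repeats=2):
    """Turn 'k1, *yo, k3, ...*' into a flat list of stitch tokens."""
    pre, _, rest = row.partition("*")
    body, _, post = rest.partition("*")

    def tokens(chunk):
        out = []
        for cmd in filter(None, (c.strip() for c in chunk.split(","))):
            # 'k3' means knit three stitches, so expand a bare k/p
            # count; compounds like 'k2tog' or 'sl1' stay as single ops.
            m = re.fullmatch(r"([kp])(\d+)", cmd)
            if m:
                out.extend([m.group(1)] * int(m.group(2)))
            else:
                out.append(cmd)
        return out

    return tokens(pre) + tokens(body) * repeats + tokens(post)

parse_row("k1, *k3, yo, sl1, k2tog, psso, yo, k4*", repeats=1)
# -> ['k', 'k', 'k', 'k', 'yo', 'sl1', 'k2tog', 'psso', 'yo',
#     'k', 'k', 'k', 'k']
```

The flat token list is what then gets mapped to sound, one event per stitch.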
Amy successfully completed a round of the horseshoe lace as you can see in her tweet:
— Amy Twigger Holroyd (@amykeepandshare) January 10, 2015
Here is the horseshoe lace repeated twice in each row, with text output of the knitting commands for illustration:
And here the whole thing sped up:
I put the source code in a gist.
A recording of a live coding duo performance I did last year with Shelly Knotts, using SuperCollider and a lot of feedback…