Alex McLean invited me to participate in the Kairotic intervention at the Heritage & Culture Hack in Sheffield on 10/01/2015. Amy Twigger Holroyd was also part of the group, and we decided to try turning knitting patterns into sound. I wrote a small parser in SuperCollider that currently only parses the horseshoe lace pattern from Knitting Bee correctly; split up manually into one array per row, it looks like this: [
"k1, *yo, k3, sl1, k2tog, psso, k3, yo, k1*",
"k1, *k1, yo, k2, sl1, k2tog, psso, k2, yo, k2*",
"k1, *k2, yo, k1, sl1, k2tog, psso, k1, yo, k3*",
"k1, *k3, yo, sl1, k2tog, psso, yo, k4*",
Amy successfully completed a round of the horseshoe lace as you can see in her tweet:
The other day I found, on my hard drive, a SuperCollider version of Alex McLean’s vocable synthesiser that I made a couple of years ago and never got around to cleaning up. I originally wanted to turn it into a class, but before it rots any further I decided to post it on sccode. The original was written in Haskell and uses hsc3.
As part of my master’s I built an instrument into a duck. Not a real duck, of course. My plan was to create a standalone instrument based on an Arduino board (rather than using the Arduino to send, e.g., sensor data to a computer that does the sound synthesis). Amazingly enough, I found a nice way to do sound synthesis directly on the Arduino: Mozzi.
The next question was how to smooth the PWM output of the Arduino. The solution was to build a low-pass filter as described here. The last step was to attenuate the output a bit; for that I used the simple amplifier described in Nicolas Collins’ book “Handmade Electronic Music”. Now just shove all of that into a cheap duck-shaped bathroom radio, add some sensors and a mini-jack output, and I’m all set.
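For reference, the cutoff frequency of a first-order RC low-pass filter like the one used to smooth the PWM is f_c = 1/(2πRC). The component values below are illustrative, not the ones from my build:

```python
import math

def rc_cutoff(r_ohms, c_farads):
    """Cutoff (-3 dB) frequency of a first-order RC low-pass filter."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# Illustrative values: a 270 ohm resistor and a 100 nF capacitor give a
# cutoff just under 6 kHz, which passes the audio but sits well below the
# Arduino's PWM carrier frequency.
fc = rc_cutoff(270, 100e-9)
print(round(fc))  # ~5895 Hz
```

Picking the values is a trade-off: the cutoff has to sit far enough below the PWM carrier to suppress it, but high enough to keep the audible range intact.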
For the piece I wanted to use video material as accompaniment. I searched for videos of orchestras I could use and found some nice little news snippets from the Netherlands which are CC-BY-SA licensed on Open Beelden. Since I wanted to play the videos in a granular fashion, I needed something that could skip to any point instantaneously. After trying to achieve that with Processing and failing miserably (at least Processing 1.x had pretty slow video performance), I built a simple video player with Cinder that reacted to OSC messages sent from SuperCollider.
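An OSC message is simple enough to build by hand: a NUL-padded address string, a NUL-padded type-tag string, then the arguments in big-endian binary, everything aligned to four bytes. As a sketch of what goes over the wire (the `/seek` address and port here are made up, not what my Cinder player actually used):

```python
import struct

def osc_pad(b):
    """NUL-terminate and pad bytes to a multiple of four, per the OSC spec."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, value):
    """Build a minimal OSC message carrying a single float32 argument."""
    return (osc_pad(address.encode()) +   # address pattern, e.g. b'/seek\0\0\0'
            osc_pad(b",f") +              # type tags: one float argument
            struct.pack(">f", value))     # the argument, big-endian float32

# Hypothetical example: jump the video to 12.5 seconds.
msg = osc_message("/seek", 12.5)
# To actually send it over UDP:
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 9000))
```

In practice SuperCollider's `NetAddr` does this for you on the sending side; the sketch just shows why OSC is such a convenient glue protocol between the two programs.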
I used SuperCollider for the actual sequencing of events, with some random elements in it, and for the reverb, filtering, compression and spatialisation of the duck’s sound. The finished piece was premiered at the Studiokonzert 2013 in the ZKM Kubus in Karlsruhe, and I also performed it at the second concert of the MuSA Symposium.
I made some simple code visualisation for it using Quil. I still haven’t had time to figure out how to get it into full-screen mode, so I put some tape on the projector instead (you can still see the title bar a bit). The whole thing looked a little like this: