Between 8th and 12th May 2012, we, along with Peter Gregson, were given a residency in the wonderful Snape Maltings Concert Hall in Aldeburgh, Suffolk. We were there to work on our existing collaboration with Peter, goPlay – a unique system that gives performers musical control over their electronic stage set-ups.

Having recently unveiled the first leg of our work in a concert at the National Portrait Gallery, where Peter performed three short pieces written specially for goPlay, we were eager to extend the concepts we had begun exploring earlier in the year. We had a few specific ideas we wanted to try out, but were also looking forward to seeing where the creative process would take us ‘in the moment’ – some of the most intriguing elements of the original body of work came from chance discoveries…

During the week we made good progress on new ideas for the system, while also revisiting and refining the existing work. Specifically, we experimented with polyphony and rhythm detection – two elements we were keen to add to the system’s armoury.

Polyphony from a single sound source (e.g. Peter playing a double stop on his cello) was particularly interesting, because we felt we were already pushing the accuracy limits of our existing pitch detection even with single-note sources, so identifying two separate pitches on the fly was going to be very difficult. There are, of course, pieces of software out there that do a fantastic job of extracting polyphonic pitch information from a single sound source, such as Melodyne, but their code isn’t open-source or embeddable in Pd, so we had to think creatively with the [fiddle~] and [sigmund~] objects (Pure Data’s bundled pitch-detection objects).

In the end, we were able to achieve something reliable enough for Peter to be able to trigger different pre-defined combinations of cello drones by simply playing different double stops. The result was the very gentle, ambient soundscape that can be heard at the beginning of this YouTube video.
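For the curious, here is a rough sketch of the double-stop matching idea in Python. The real system lives in a Pure Data patch built around [sigmund~], so the note values, tolerance and function names below are illustrative assumptions rather than a description of the actual implementation.

```python
# Illustrative sketch only - the goPlay patch itself is built in Pure Data.
# Pitch pairs, preset names and the tolerance are assumptions for this example.

DOUBLE_STOP_TRIGGERS = {
    frozenset({48, 55}): "drone_C_G",   # C3 + G3
    frozenset({50, 57}): "drone_D_A",   # D3 + A3
    frozenset({52, 59}): "drone_E_B",   # E3 + B3
}

TOLERANCE = 0.5  # allow up to half a semitone of pitch-detection error


def match_double_stop(detected_pitches):
    """Return the drone preset whose pitch pair matches the two detected
    pitches within TOLERANCE, or None if nothing matches."""
    if len(detected_pitches) != 2:
        return None
    d_lo, d_hi = sorted(detected_pitches)
    for pair, preset in DOUBLE_STOP_TRIGGERS.items():
        lo, hi = sorted(pair)
        if abs(d_lo - lo) <= TOLERANCE and abs(d_hi - hi) <= TOLERANCE:
            return preset
    return None


# A slightly out-of-tune C/G double stop still triggers the C/G drone:
print(match_double_stop([47.8, 55.1]))  # -> "drone_C_G"
```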

While Ragnar was working on tweaking that further, I looked into extracting tempo from Peter’s playing. Whenever we had tried this before, we had strived for the ultimate scenario, where anything Peter was playing – any combination of single, dotted and tied notes – would return a tempo value. Realistically, without some predictability in the notes being played, this wasn’t going to be feasible. Instead, we gave ourselves a few helpful constraints on the source material and got great results. Effectively, goPlay would listen out for occurrences of one specific trigger note and derive the tempo from the length of the gaps between successive occurrences. This was very reliable, but it only gave us half the picture. To work out the time signature, the system also counted the number of notes played between trigger notes.
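To make that concrete, here is a minimal Python sketch of the logic, assuming the trigger note marks the downbeat of each bar and that the notes in between fall one per beat – both are assumptions of this illustration, not a description of the actual Pd patch.

```python
# Minimal sketch of the tempo/time-signature idea described above.
# Assumptions (not from the original patch): the trigger note marks each
# downbeat, and the notes between two triggers fall one per beat.

TRIGGER_NOTE = 36  # hypothetical trigger pitch, e.g. the cello's open C string


def analyse(onsets):
    """onsets: list of (time_in_seconds, midi_note) tuples, in time order.
    Returns (bpm, beats_per_bar), or None if fewer than two triggers seen."""
    trigger_times = [t for t, note in onsets if note == TRIGGER_NOTE]
    if len(trigger_times) < 2:
        return None

    bar_length = trigger_times[1] - trigger_times[0]   # gap between triggers
    between = [n for t, n in onsets
               if trigger_times[0] < t < trigger_times[1] and n != TRIGGER_NOTE]
    beats_per_bar = len(between) + 1                    # downbeat + the rest
    bpm = 60.0 / (bar_length / beats_per_bar)
    return bpm, beats_per_bar


# Trigger notes 1.5 s apart with two notes in between -> 120 bpm in a 3 feel
events = [(0.0, 36), (0.5, 52), (1.0, 55), (1.5, 36)]
print(analyse(events))  # -> (120.0, 3)
```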

After some tweaking and testing, Peter was able to trigger a delay line that was perfectly in time with what he had been playing immediately prior. We also experimented with pre-sequenced drum beats accompanying his playing, which would follow not only tempo, but time signature as well.
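The delay time itself is just arithmetic on the detected tempo – in Pd the value would typically be sent to a variable delay line, but a hedged Python sketch shows the conversion, with the repeat length as an assumed parameter:

```python
def delay_ms(bpm, beats=1.0):
    """Delay time in milliseconds for a given tempo and number of beats."""
    return beats * 60000.0 / bpm


print(delay_ms(120))      # one beat at 120 bpm   -> 500.0 ms
print(delay_ms(120, 3))   # one 3-beat bar at 120 -> 1500.0 ms
```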

Towards the end of our residency we went about tweaking some of the material we had already developed and gave a small performance/explanation to the team at Snape Maltings. Everyone was very receptive to the potential of the direction we were going in, and the questions raised were all interesting and relevant.

We’ll be posting up a video of the first three movements after the break. In the meantime, the team at Aldeburgh put together a summary of our week there.

R&D of goPlay is supported by the National Lottery, through Arts Council England’s Grants for the Arts fund.