Research Blog – II

Greetings!

It’s been a few months since my last written update, and there are plenty of things to talk about! Since I last updated this blog, I’ve been fortunate to perform with many different groups of instrumentalists and in a variety of settings, and I feel that this practical experience is helping greatly to define my improvisational approach to laptop performance.

Technically speaking, I’ve programmed a collection of synth definitions in SuperCollider that I use to process live inputs, synthesize sounds, and manipulate pre-recorded audio buffers. Until this point, I’ve been performing by sending server messages from the SuperCollider IDE to activate the synths and manipulate their arguments. This approach can be a bit cumbersome and doesn’t really give me the level of reactivity I’d like to have; it does, however, provide me with the flexibility to alter every synth argument down to the most minute detail.
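As a much-simplified sketch of what this workflow looks like (the SynthDef, its arguments, and the node ID here are illustrative placeholders, not my actual performance code):

```supercollider
// Illustrative SynthDef: ring-modulate a live input.
// Names and argument values are placeholders.
(
SynthDef(\ringMod, { |in = 0, out = 0, freq = 220, amp = 1|
    var sig = SoundIn.ar(in);        // live input from the audio interface
    sig = sig * SinOsc.ar(freq);     // ring modulation
    Out.ar(out, (sig * amp) ! 2);    // duplicate to a stereo pair
}).add;
)

// Activating the synth and manipulating its arguments via server messages:
s.sendMsg("/s_new", \ringMod, 1001, 0, 1);        // start node 1001
s.sendMsg("/n_set", 1001, \freq, 55, \amp, 0.5);  // alter arguments mid-performance
s.sendMsg("/n_free", 1001);                       // free the node
```

The flexibility comes from the fact that any argument can be set to any value in a `/n_set` message — which is exactly what becomes hard to preserve on a physical controller.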

I’ve come to realize that transposing this approach onto a physical interface means I will have to prioritize either spontaneity or the amount of control I have over each argument. I’m currently in the process of mapping these server messages to a Monome 256 controller, which will let me react to musical situations much more quickly than the live-coding approach allows. However, as the Monome is just a grid of toggle buttons, I have to limit myself to preset arguments for each synth definition – perhaps three “versions” of each synth. I see this as a necessary limitation at the moment, but perhaps it will force me to find creative solutions when performing with a restricted degree of control over synth parameters.
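The mapping itself can be sketched roughly like this — assuming serialosc is forwarding key presses as `/monome/grid/key x y z`, and with the preset values and the synth name (`\mySynth`) purely as placeholders:

```supercollider
// Rough sketch of the grid mapping: each row of the 256 is assigned a synth
// definition, and the first three columns choose one of three preset
// argument sets ("versions"). All names and values are illustrative.
(
~presets = [
    [\freq, 110, \amp, 0.4],    // column 0: version 1
    [\freq, 330, \amp, 0.3],    // column 1: version 2
    [\freq, 990, \amp, 0.2]     // column 2: version 3
];
~rowSynths = Array.newClear(16);    // one active synth per grid row

OSCdef(\gridKey, { |msg|
    var x = msg[1], y = msg[2], z = msg[3];
    if ((z == 1) and: { x < 3 }) {
        ~rowSynths[y].free;                           // replace this row's synth
        ~rowSynths[y] = Synth(\mySynth, ~presets[x]); // \mySynth: any SynthDef
    };
}, '/monome/grid/key');
)
```

Here the trade-off is explicit: a key press can only ever select from the preset argument sets, never dial in an arbitrary value.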

Though I haven’t finished the mapping process yet, I’ve been performing a fair amount with the server-messages approach. Since I last updated this blog, I have performed in the following settings:

23.3 – MAUM concert in Levinsalen, performed with 4 instrumentalists

30.3-31.3 – duo concerts in Denmark w/ saxophonist Anders Abelseth

1.4-7.4 – duo concerts in Berlin w/ vocalist Thea Soti (herself using analogue electronics)

10.4 – duo concert w/ pedal steel guitarist Emil Brattested

19.4-30.4 – Nord+Mix workshop in Vilnius, Lithuania, where I performed in 3rd Order Ambisonics

4.5 – 6 channel collaborative piece with flute, harp, and 5 dancers

For the coming months, I have quite a bit of work to do, and quite a few things to look forward to! First, I will premiere a performative installation at Sentralen for the Only Connect festival. At the Nord+Mix workshop, I was introduced to the concept of spatial modulation synthesis, which I found very interesting. For this installation, we were asked to work with specific “spaces” in the service hallways of Sentralen in Oslo; I’ll try to fully exploit the idea of space by reading excerpts from the English translation of Georges Perec’s “Espèces d’espaces” while the transient information from the text controls the spatial modulation of my voice.
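A first sketch of how the transients might drive the spatialisation — here reduced to a simple eight-speaker ring with PanAz rather than the full spatial modulation setup, and with the threshold, lag time, and channel count as placeholders:

```supercollider
// Sketch: amplitude transients in the spoken voice step its position around
// a ring of 8 speakers. Threshold, lag, and channel count are placeholders;
// the installation itself will use a proper spatial modulation approach.
(
SynthDef(\speechSpace, { |in = 0, out = 0, thresh = 0.3|
    var sig = SoundIn.ar(in);
    var chain = FFT(LocalBuf(512), sig);
    var onset = Onsets.kr(chain, thresh);   // 1 at each detected transient
    var step = Stepper.kr(onset, 0, 0, 7);  // advance through 8 positions
    var pos = Lag.kr(step.linlin(0, 7, -1.0, 0.75), 0.05); // PanAz position
    Out.ar(out, PanAz.ar(8, sig, pos));
}).add;
)
```

The appeal of this idea is that the rhythm of the text itself — consonants, pauses, phrasing — becomes the choreography of the voice through the space.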

At the end of June, I’m heading to Köln for a series of concerts and workshops with the improvising vocal duo Monsters for Breakfast. Thea Soti, whom I worked with in Berlin in April, makes up one half of the duo, and she has been generous enough to arrange a short tour in Germany along with a few workshops where I will present my approach to using SuperCollider in an improvisational context. In preparation, I’m hoping to develop a few more synth definitions that I can test out over the course of these concerts and workshops.

I have a few other concerts and workshops coming up as well, but I’ll report on those in the next blog update! Until then….

Research Blog – [Documentation]

Here is where I’ll post documentation from the various projects I’m working on as this two-year study progresses:

Abelseth/McCormick: improvising saxophone/laptop duo

Among Us: dance performance involving buffer playback and manipulation, algorithmic synthesis, live processing of flute and harp (6 channels)

Emil Brattested: duo with pedal-steel guitar playing composed and improvised material

Fennel: augmenting the “acoustic” nature of this quartet through modest processing

Monsters For Breakfast: improvising vocal duo augmented by laptop

Nord+Mix Quartet: improvisations with soprano flute, alto flute, and viola; rehearsed in stereo, performed in 3rd Order Ambisonics during Nord+Mix workshop in Vilnius

Quintet: improvising ensemble working with semi-composed material

Thea Soti: improvising voice/laptop duo; Thea is working with hardware electronics

Trio w/ Tove Bagge & Guostė Tamulynaitė: ambient improvisations with prepared piano, synthesizer, and viola

Research Blog – I

Greetings!

This blog is a space for documenting my work during my Master of Music in Performance Technology studies at Norges musikkhøgskole between fall 2017 and spring 2019. I’ll use this space to record both the breakthroughs and challenges I experience during my studies and research as I work towards my Master’s project, to be presented in the spring of 2019.

The original proposal for my master’s project was to create a collection of “improvising” algorithms that could independently interact with improvising instrumentalists. My goal was to use the SuperCollider programming environment to design “instruments” that would use information from analysing the current musical setting to make statistical decisions during performance: when to play, what/how to play, when to stop playing, etc.

I have since decided to go in a different direction; given how important the community aspect of music making is to me, designing an autonomous digital performer would effectively isolate me from rehearsing and performing with other musicians, countering my own values. While I still believe this could be a future direction to explore, I’m now directing my efforts towards designing a digital instrument that I can actively use in performance.

I see the role of the laptop performer as curatorial: not all of the specific musical decisions are made by me in performance, but I choose the frame within which those decisions (or content) are made. Just as a bandleader makes curatorial decisions about which performers, program, or venue to work with, the algorithmic programmer makes curatorial decisions concerning degrees of randomness, density, and so on, without necessarily controlling the specific details of each sound event.
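A toy example of what this division of labour can look like in code — the performer sets only the frame (density, degree of pitch randomness), and the pattern fills in every individual note (all names and ranges here are illustrative):

```supercollider
// Toy example of the curatorial frame: I decide density and pitch spread;
// the computer decides every individual note.
(
~density = 4;    // curatorial decision: events per second
~spread = 12;    // curatorial decision: pitch randomness in semitones

Pdef(\curated,
    Pbind(
        \instrument, \default,
        \dur, Pfunc { 1 / ~density },                           // event rate
        \midinote, Pfunc { rrand(60 - ~spread, 60 + ~spread) }  // random pitch
    )
).play;

// In performance, I would only re-evaluate the frame, never single notes:
// ~density = 10; ~spread = 2;
)
```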

With this approach, the laptop performer is in constant dialogue with the software and hardware, both in the “rehearsal” or prototyping stages and in performance. In a live setting, as the computer is left to decide the details of musical events, the laptop performer (from a curatorial perspective) must decide how to contextualise the music created by the computer; this can be done by modifying software parameters, introducing new processes or removing existing ones, or by simply turning the instrument off.

As I develop this instrument and my curatorial approach to laptop performance, I’ll try my best to update this blog regularly with video and audio documentation of various performances, my thoughts on the process, and also some of the SuperCollider code driving certain elements of my “instrument.” More to come soon!