Saturday, September 19, 2009

Generative Music: an interview with Peter Chilvers

by Matteo Milani, U.S.O. Project, September 2009

Generative music is a term popularized by Brian Eno to describe music that is ever-different and changing, and that is created by a system (Wikipedia). I recently had the chance to interview musician and software designer Peter Chilvers, who created the new iPhone/iPod touch application called Air (© Opal Ltd).
Based on concepts developed by Brian Eno, with whom Chilvers created Bloom, Air assembles vocals (by Sandra O'Neill) and piano samples into a beautiful, still and ever-changing composition that is always familiar, but never the same.

Air features four ‘Conduct’ modes, which let the user control the composition by tapping different areas on the display, and three ‘Listen’ modes, which provide a choice of arrangement. For those fortunate enough to have access to multiple iPhones and speakers, an option has been provided to spread the composition over several players.
"Air is like Music for Airports made endless, which is how I always wanted it to be." - Brian Eno


“About 20 years ago or more I became interested in processes that could produce music which you hadn’t specifically designed. The earliest example of that is wind chimes. If you make a set of wind chimes, you define the envelope within which the music can happen, but you don’t precisely define the way the music works out over time. It’s a way of making music that’s not completely deterministic.” - Brian Eno

[via apple.com]


Matteo Milani: Thanks for your time. Peter Chilvers as a musician first: a few words about the 'A Marble Calm' project.

Peter Chilvers: I happened across the phrase 'A Marble Calm' on holiday a few years ago, thought it sounded like an interesting band name, then started thinking about the type of band that might be. The more I thought about it, the more it seemed to tie up a number of ideas that were interesting to me: drifting textural ambient pieces, improvisation and song. Making it a loose collective has enabled me to bring in other vocalists and musicians I've enjoyed working with on other projects - vocalists Sandra O'Neill (who also worked with me on 'Air' for the iPhone) and Tim Bowness, marimba player Jon Hart and flautist Theo Travis.


MM: When did you start working with generative music?

PC: In the '90s I worked as a software developer on the 'Creatures' series of games. When we started on Creatures 2, I was given the opportunity to take over the whole soundtrack. The game wasn't remotely linear - you spent arbitrary amounts of time in different locations around an artificial world, so I wanted to create a soundtrack that acted more as a landscape. I ended up developing a set of 'virtual improvisers', constantly generating an ambient soundscape in the background - it was quite involved actually, with its own simple programming language, although little of that was visible to the user.

[...] Peter chose to use his background in improvised music to create an array of "virtual musicians" that would play along to the action on screen. Each composition in Creatures contains a set of "players", each with their own set of instructions for responding to the mood of the norns on screen.

Peter was able to generate much more interesting effects using recorded instruments rather than the General MIDI sounds generated by a soundcard, which can often be quite restrictive. This meant that he could take advantage of the many different ways that a note on a "live" instrument can be played - for example, on a guitar the sound changes greatly depending on the part of the finger used to strike a string, and on a piano when one note is played, all the other strings vibrate too. Also, by altering the stereo effects, he could fatten the sound at certain times.

He also made use of feedback loops within the soundtrack. Feedback loops were first experimented with in the 1970s - if any of you can remember Brian Eno, you may be interested to know he composed most of his music then using this method. The idea is that you play a track and record it into RAM (onto a tape back in the 1970s). After a short while (around 8 seconds in Creatures 2), the loop starts and the original sounds are played back, so the composer carries on creating sounds in response to what's gone before.

Behind the scenes, scripts control the music engine and set the volume, panning and interval between notes as the mood and threat change.

[via gamewaredevelopment.co.uk]
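
The Gameware excerpt above describes the system only in outline, but the shape it implies is fairly simple. As a rough illustration - a conceptual Python sketch, not Chilvers's actual engine, which ran on its own scripting language inside the game; the scale, timings and mood values here are invented - a 'virtual player' can derive its volume, panning and note spacing from the current mood and threat, and echo its own notes back after roughly eight seconds:

    import random

    SCALE = [60, 62, 64, 67, 69, 72]   # hypothetical pentatonic pitch set (MIDI note numbers)
    FEEDBACK_DELAY = 8.0               # seconds before a note is echoed back (as in Creatures 2)

    class VirtualPlayer:
        """One improviser: turns mood/threat into note choices, volume, pan and pacing."""
        def __init__(self, pan_home=0.0):
            self.pan_home = pan_home
            self.delay_line = []       # (due_time, note) pairs awaiting playback

        def step(self, now, mood, threat):
            """Emit zero or more events at time 'now'. mood and threat are in 0..1."""
            events = []
            # Replay anything whose feedback delay has elapsed, at a fixed lower volume.
            while self.delay_line and self.delay_line[0][0] <= now:
                _, note = self.delay_line.pop(0)
                events.append(("echo", note, 0.4, self.pan_home))
            # Lower mood -> longer gaps between notes; higher threat adds more spread to the panning.
            interval = 2.0 + 6.0 * (1.0 - mood)
            if random.random() < 1.0 / interval:
                note = random.choice(SCALE)
                volume = 0.3 + 0.5 * mood
                pan = self.pan_home + (random.random() - 0.5) * threat
                events.append(("note", note, volume, pan))
                self.delay_line.append((now + FEEDBACK_DELAY, note))
            return events

    # Simulate a minute of two players "improvising" against a drifting mood.
    players = [VirtualPlayer(-0.5), VirtualPlayer(0.5)]
    for t in range(60):
        mood, threat = 0.5 + 0.4 * random.random(), 0.2 * random.random()
        for p in players:
            for kind, note, vol, pan in p.step(float(t), mood, threat):
                print(f"t={t:02d}s {kind:<4} note={note} vol={vol:.2f} pan={pan:+.2f}")

Run as-is, this prints a minute of note and echo events for two players panned left and right; in the game itself the events would trigger recorded instrument samples rather than console output.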


MM: Why did you choose the Apple platform to develop the applications?

PC: I've been a huge fan of Apple products for a long time, and their timing in releasing the iPhone couldn't have been better. Bloom actually existed in some form before the iPhone SDK was announced - possibly before even the iPhone itself was announced. From the second we tried running the prototype, it was obvious that it really suited a touch screen. And Apple provided one!

The difficulty developers have faced with generative music to date has been the platform. Generative music typically requires a computer, and it's just not that enjoyable to sit at a computer and listen to music. The iPhone changed that - it was portable, powerful and designed to play music.


MM: Who designed the visualizations of Bloom? Eno himself?


PC: It was something of a two-way process. I came up with the effect of circles expanding and disappearing as part of a technology experiment - Brian saw it and stopped me making it more complex! Much of the iPhone development has worked that way - one of us would suggest something and the other would filter it, and this process repeats until we end up with something neither of us imagined. Trope, our new iPhone application, went through a huge number of iterations, both sonically and visually, before we were happy with it.


MM: What kind of algorithms define Bloom's musical structure? Are they specifically based on Brian's requests or just an abstraction based on his previous works?

PC: Again, this is something that went back and forth between us a number of times. As you can see, anything you play is repeated back at you after a delay. But the length of that delay varies in subtle but complex ways, which keeps the music interesting and eccentric. It's actually deliberately 'wrong' - you can't play exactly in time with something you've already played, and a few people have mistaken this for a bug. Actually, it was a bug at one point - but Brian liked the effect, and we ended up emphasising it. "Honour thy error as a hidden intention" is something of a recurring theme in Brian's work.
A forthcoming update to Bloom adds two new 'operation modes', one of which was designed specifically to work with the way Brian prefers playing Bloom.
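
A conceptual sketch of that 'deliberately wrong' repetition might look like the following - illustrative Python rather than the app's actual Objective-C, with the eight-second base delay and the drift amount chosen only for demonstration. Every tap is scheduled to come back after a delay that wanders slightly each time, so the repeats gradually slip out of phase with what was originally played:

    import random

    class BloomLoop:
        """Conceptual sketch: each tap is echoed back after a delay that drifts slightly,
        so repeats never line up exactly with what you originally played."""
        def __init__(self, base_delay=8.0, drift=0.35):
            self.base_delay = base_delay   # nominal loop length in seconds (assumed value)
            self.drift = drift             # how far each repeat may wander from the grid
            self.pending = []              # (due_time, note) events waiting to come back

        def tap(self, now, note):
            self.pending.append((now + self._next_delay(), note))

        def _next_delay(self):
            # The 'wrongness': the delay is never quite the same twice.
            return self.base_delay + random.uniform(-self.drift, self.drift)

        def due(self, now):
            """Return notes whose delay has elapsed, and reschedule them to repeat again."""
            ready = [(t, n) for t, n in self.pending if t <= now]
            self.pending = [(t, n) for t, n in self.pending if t > now]
            for t, note in ready:
                self.pending.append((t + self._next_delay(), note))  # keep echoing, drifting further
            return [n for _, n in ready]

    loop = BloomLoop()
    loop.tap(0.0, "C4")
    for second in range(40):
        for note in loop.due(float(second)):
            print(f"{second:02d}s  repeat {note}")

Running it shows the single tapped note returning roughly every eight seconds, but never on an exact grid - the quality Chilvers describes as keeping the music interesting and eccentric.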


MM: Do the graphics and audio engines use standard audio and video libraries, or did you write your own classes?

PC: I've built up my own sound engine, which I'm constantly refining and use across all the applications. It went through several fairly substantial rewrites before I found something reliable and reusable.


MM: Is all the code in Objective-C, or did you use any external application?

PC: It's all Objective-C. I hadn't used the language before, although I'd worked extensively in C++ in the past. It's an odd language to get used to, but I really like it now.


MM: Is Bloom sample-based? What is the music engine actually controlling (e.g. triggering, volume, panning, effects)? What about the algorithmic side of the music engine?

PC: Bloom is entirely sample based. Brian has a huge library of sounds he's created, which I was curating while we were working on the Spore soundtrack and other projects. It's funny, but the ones I picked were just the first I came across that I thought would suit Bloom. We later went through a large number of alternatives, but those remained the best choices.

The version of Bloom that's currently live uses fixed stereo samples, but an update we're releasing soon applies some panning to the sounds depending on the position of each 'bloom' on screen. It's a subtle effect, but it works rather well.
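
The position-to-pan mapping could be as simple as the sketch below - again illustrative Python rather than the actual implementation, with the screen width, the scaling 'amount' and the constant-power curve all assumed for the example. The bloom's horizontal position is scaled to a gentle pan value, which is then turned into left and right channel gains:

    import math

    SCREEN_WIDTH = 320.0   # points on the original iPhone display (assumed)

    def pan_for_position(x, width=SCREEN_WIDTH, amount=0.5):
        """Map a bloom's horizontal position to a pan value in -1..+1.
        'amount' keeps the effect subtle rather than hard left/right."""
        centred = (x / width) * 2.0 - 1.0          # 0..width -> -1..+1
        return max(-1.0, min(1.0, centred * amount))

    def stereo_gains(pan):
        """Constant-power panning: convert pan (-1..+1) to left/right gains."""
        angle = (pan + 1.0) * math.pi / 4.0        # -1..+1 -> 0..pi/2
        return math.cos(angle), math.sin(angle)

    for x in (0.0, 80.0, 160.0, 240.0, 320.0):
        pan = pan_for_position(x)
        left, right = stereo_gains(pan)
        print(f"x={x:5.1f}  pan={pan:+.2f}  L={left:.2f}  R={right:.2f}")

Keeping 'amount' well below 1.0 is what makes the effect subtle: blooms near the edges of the screen lean left or right without ever collapsing into a single channel.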


MM: Would you like to describe your current and upcoming projects?

PC: I've been involved in two new applications for the iPhone: Trope and Air. Both apps were intended to be released simultaneously. Trope is my second collaboration with Brian Eno, and takes some of the ideas from Bloom in a slightly different, slightly darker direction. Instead of tapping on the screen, you trace shapes and produce constantly evolving abstract soundscapes.

Air is a collaboration with Irish vocalist Sandra O'Neill, and is quite different to Bloom. It's a generative work centred around Sandra's vocal textures and a slowly changing image. It draws heavily on techniques that Brian has evolved over his many years working on ambient music and installations, as well as a number of the generative ideas we've developed more recently.

I have just had some interesting news: Trope has been approved, it's now available in the App Store!

More information can be found at www.generativemusic.com.

"Trope is a different emotional experience - more introspective, more atmospheric. It shows that generative music, as one of the newest forms of sonema, can draw on a broad palette of moods." Brian Eno
[Brian Eno discussing Generative Music at the Imagination Conference, 1996]

UPDATE: Trope in action!


"[...] I had realised three or four years ago that I wasn't going to be able to do generative music properly – in the sense of giving people generative music systems that they could use themselves – without involving computers. And it kind of stymied me: I hate things on computers and I hate the idea that people have to sit there with a mouse to get a piece of music to work. So then when the iPhone came out I thought: oh good, it's a computer that people carry in their pockets and use their fingers on, so suddenly that was interesting again." - Brian Eno
[via timeoutsydney.com.au]

Related Post: Deep Green: sound design for iPhone App

1 comment:

  1. Anonymous, 10/28/2011

    Peter's music for Creatures 2 and 3 was the subject of my final-year project and dissertation. I made software (MNGEdit) to decrypt, edit and play it - pretty fun compared to some of the other projects people got stuck with, and it actually got used by third-party composers.
