Friday, September 25, 2009

'hist whist' by Marco Stroppa

A composer and computer scientist, Marco Stroppa is an artist for whom musical invention is inseparable from the exploration of new scientific and technological arenas. His series of works for solo instrument and chamber electronics offers the opportunity to explore novel methods for projecting sound in the concert hall and to renew the computer paradigms that govern the relationship between the worlds of instrumental and synthesized sound.

The seventh short film in the "Images of a Work" series endeavors to understand his work through excerpts from rehearsals and interviews with the artists in the IRCAM studios.

[Click on the image above to see the video (French only) - via IRCAM]

Saturday, September 19, 2009

Generative Music: an interview with Peter Chilvers

by Matteo Milani, U.S.O. Project, September 2009

Generative music is a term popularized by Brian Eno to describe music that is ever-different and changing, and that is created by a system (Wikipedia). I recently had the chance to interview musician and software designer Peter Chilvers, who created the new iPhone/iPod touch application called Air (© Opal Ltd).
Based on concepts developed by Brian Eno, with whom Chilvers created Bloom, Air assembles vocal samples (by Sandra O'Neill) and piano samples into a beautiful, still and ever-changing composition that is always familiar but never the same.

Air features four ‘Conduct’ modes, which let the user control the composition by tapping different areas on the display, and three ‘Listen’ modes, which provide a choice of arrangement. For those fortunate enough to have access to multiple iPhones and speakers, an option has been provided to spread the composition over several players.
"Air is like Music for Airports made endless, which is how I always wanted it to be." - Brian Eno


“About 20 years ago or more I became interested in processes that could produce music which you hadn’t specifically designed. The earliest example of that is wind chimes. If you make a set of wind chimes, you define the envelope within which the music can happen, but you don’t precisely define the way the music works out over time. It’s a way of making music that’s not completely deterministic.” - Brian Eno

[via apple.com]
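As a concrete illustration of the wind-chime idea in the quote above, a few lines of Python are enough; everything in this sketch (the pitch set, the density, the timing) is an invented example rather than anything from the applications discussed here. The maker fixes the "envelope" - which notes can sound and roughly how often - and chance decides the rest.

```python
import random
import time

# The "envelope" the maker defines: a fixed pentatonic pitch set (MIDI note numbers)
# and a rough density. When and which chimes sound is left entirely to chance.
CHIME_NOTES = [60, 62, 65, 67, 72]

def wind_chimes(gust_strength=0.3, duration_s=20):
    """Trigger random chimes for duration_s seconds."""
    end = time.time() + duration_s
    while time.time() < end:
        if random.random() < gust_strength:   # the "wind" decides when a chime sounds
            print(f"chime: MIDI note {random.choice(CHIME_NOTES)}")
        time.sleep(0.25)

if __name__ == "__main__":
    wind_chimes()
```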


Matteo Milani: Thanks for your time. Peter Chilvers as a musician first: a few words about the 'A Marble Calm' project.

Peter Chilvers: I happened across the phrase 'A Marble Calm' on holiday a few years ago, thought it sounded like an interesting band name, then started thinking about the type of band that might be. The more I thought about it, the more it seemed to tie up a number of ideas that were interesting to me: drifting textural ambient pieces, improvisation and song. By making it a loose collective, it's enabled me to bring in other vocalists and musicians I've enjoyed working with on other projects - vocalists Sandra O'Neill (who also worked with me on 'Air' for the iPhone) and Tim Bowness, marimba player Jon Hart and flautist Theo Travis.


MM: When did you start working with generative music?

PC: In the '90s I worked as a software developer on the 'Creatures' series of games. When we started on Creatures 2, I was given the opportunity to take over the whole soundtrack. The game wasn't remotely linear - you spent arbitrary amounts of time in different locations around an artificial world, so I wanted to create a soundtrack that acted more as a landscape. I ended up developing a set of 'virtual improvisers', constantly generating an ambient soundscape in the background - it was quite involved actually, with its own simple programming language, although little of that was visible to the user.
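The Creatures 2 engine itself is not publicly documented, but the basic 'virtual improviser' idea can be sketched in a few lines of Python. All of the names, note palettes and timings below are illustrative assumptions, not anything taken from the game.

```python
import random
import time

class VirtualImproviser:
    """One background 'player': picks notes from its own palette at its own pace."""

    def __init__(self, name, palette, min_gap=1.0, max_gap=4.0):
        self.name = name
        self.palette = palette      # allowed MIDI notes for this improviser
        self.min_gap = min_gap      # shortest pause between notes, in seconds
        self.max_gap = max_gap      # longest pause between notes, in seconds
        self.next_time = 0.0

    def tick(self, now):
        """If this player's pause has elapsed, emit a note and schedule the next one."""
        if now >= self.next_time:
            print(f"{self.name}: note {random.choice(self.palette)}")
            self.next_time = now + random.uniform(self.min_gap, self.max_gap)

# Several improvisers running together produce a slowly shifting ambient texture.
players = [
    VirtualImproviser("pad",   [48, 53, 55, 60]),
    VirtualImproviser("bells", [72, 74, 79], min_gap=2.0, max_gap=8.0),
]

start = time.time()
while time.time() - start < 15:     # run the texture for 15 seconds
    for p in players:
        p.tick(time.time())
    time.sleep(0.05)
```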

[...] Peter chose to use his background in improvised music to create an array of "virtual musicians" that would play along to the action on screen. Each composition in Creatures contains a set of "players", each with their own set of instructions for responding to the mood of the norns on screen.

Peter was able to generate much more interesting effects using recorded instruments rather than the General MIDI sounds generated by a soundcard, which can often be quite restrictive. This meant that he could take advantage of the many different ways that a note on a "live" instrument can be played - for example, on a guitar the sound changes greatly depending on the part of the finger used to strike a string, and on a piano, when one note is played, all the other strings vibrate too. Also, by altering the stereo effects, he could fatten the sound at certain times.

He also made use of feedback loops within the soundtrack. Feedback loops were first experimented with in the 1970s - if you remember Brian Eno, you may be interested to know that he composed much of his music at the time using this method. The idea is that you play a track and record it into RAM (onto tape back in the 1970s). After a short while (around 8 seconds in Creatures 2), the loop starts and the original sounds are played back, so the composer carries on creating sounds in response to what's gone before.

Behind the scenes, scripts control the music engine and set the volume, panning and interval between notes as the mood and threat change.

[via gamewaredevelopment.co.uk]
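The eight-second loop and the mood/threat-driven parameters described in the excerpt above can be illustrated with a short Python sketch. To be clear, this is only a toy reconstruction of the general idea: the class, the parameter mappings and all the numbers other than the quoted eight-second delay are assumptions, not the Creatures 2 implementation.

```python
from collections import deque
import random

LOOP_DELAY_S = 8.0   # the roughly eight-second loop length quoted above for Creatures 2

class AmbientEngine:
    """Toy illustration: a feedback loop held 'in RAM' that replays material after
    a fixed delay, plus mood/threat-driven settings for the music engine."""

    def __init__(self):
        self.buffer = deque()   # (timestamp, note) pairs recorded "in RAM"

    def settings(self, mood, threat):
        """Hypothetical mapping from game state (both 0.0-1.0) to engine parameters."""
        return {
            "volume":        0.3 + 0.7 * threat,   # louder as danger rises
            "pan_spread":    0.2 + 0.8 * threat,   # wider stereo image under threat
            "note_interval": 4.0 - 3.0 * mood,     # happier mood -> notes closer together
        }

    def play(self, note, now, params):
        print(f"{now:5.1f}s  note {note}  vol {params['volume']:.2f}")
        self.buffer.append((now, note))            # record what was just played

    def feedback(self, now, params):
        """Replay anything recorded LOOP_DELAY_S ago; each echo is re-recorded,
        so new material keeps responding to what has gone before."""
        while self.buffer and now - self.buffer[0][0] >= LOOP_DELAY_S:
            _, note = self.buffer.popleft()
            self.play(note, now, params)

engine = AmbientEngine()
params = engine.settings(mood=0.6, threat=0.2)
for t in range(0, 24, 2):                          # a new note every two "seconds"
    engine.feedback(t, params)
    engine.play(random.choice([60, 62, 65, 67]), t, params)
```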


MM: Why did you choose the Apple platform to develop the applications?

PC: I've been a huge fan of Apple products for a long time, and their timing in releasing the iPhone couldn't have been better. Bloom actually existed in some form before the iPhone SDK was announced - possibly before even the iPhone itself was announced. From the second we tried running the prototype, it was obvious that it really suited a touch screen. And Apple provided one!

The difficulty developers have faced with generative music to date has been the platform. Generative music typically requires a computer, and it's just not that enjoyable to sit at a computer and listen to music. The iPhone changed that - it was portable, powerful and designed to play music.


MM: Who designed the visualizations of Bloom? Eno himself?


PC: It was something of a two-way process. I came up with the effect of circles expanding and disappearing as part of a technology experiment - Brian saw it and stopped me making it more complex! Much of the iPhone development has worked that way - one of us would suggest something and the other would filter it, and this process repeats until we end up with something neither of us imagined. Trope, our new iPhone application, went through a huge number of iterations, both sonically and visually, before we were happy with it.


MM: What kind of algorithms define Bloom's musical structure? Are they specifically based on Brian's requests or just an abstraction based on his previous works?

PC: Again, this is something that went back and forth between us a number of times. As you can see, anything you play is repeated back at you after a delay. But the length of that delay varies in subtle but complex ways, which keeps the music interesting and eccentric. It's actually deliberately 'wrong' - you can't play exactly in time with something you've already played, and a few people have mistaken this for a bug. Actually, it was a bug at one point - but Brian liked the effect, and we ended up emphasising it. "Honour thy error as a hidden intention" is something of a recurring theme in Brian's work.
A forthcoming update to Bloom adds two new 'operation modes', one of which was designed specifically to work with the way Brian prefers playing Bloom.
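Bloom's code isn't public, so the following Python sketch only illustrates the behaviour Peter describes: each note is echoed back after a delay that drifts slightly on every repeat, so repeats never line up exactly with what was played before. The base delay and the amount of drift here are guesses, not Bloom's actual values.

```python
import random

class BloomLikeDelay:
    """Echo each tap back after a delay that drifts slightly on every repeat."""

    def __init__(self, base_delay=4.0, drift=0.15):
        self.base_delay = base_delay    # nominal repeat time in seconds (assumed)
        self.drift = drift              # maximum fractional wobble per repeat (assumed)
        self.events = []                # (next_play_time, note)

    def tap(self, now, note):
        """The user plays a note; schedule its first repeat."""
        self.events.append((now + self.base_delay, note))

    def update(self, now):
        """Replay any due notes and reschedule each with a slightly different delay."""
        due = [e for e in self.events if e[0] <= now]
        self.events = [e for e in self.events if e[0] > now]
        for _, note in due:
            print(f"repeat note {note} at {now:.1f}s")
            wobble = 1.0 + random.uniform(-self.drift, self.drift)
            self.events.append((now + self.base_delay * wobble, note))

# One tap at t = 0, then advance time and let the echoes slowly drift out of step.
delay = BloomLikeDelay()
delay.tap(0.0, 60)
for t in range(1, 30):
    delay.update(float(t))
```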


MM: Do the graphics and audio engines use standard audio and video libraries, or did you write your own classes?

PC: I've built up my own sound engine, which I'm constantly refining and use across all the applications. It went through several fairly substantial rewrites before I found something reliable and reusable.


MM: Is all the code in 'Objective C' or did you use any external application?

PC: It's all Objective-C. I hadn't used the language before, although I'd worked extensively in C++ in the past. It's an odd language to get used to, but I really like it now.


MM: Is Bloom sample based? What is the music engine actually controlling (e.g. triggering, volume, panning, effects)? What about the algorithmic side of the music engine?

PC: Bloom is entirely sample based. Brian has a huge library of sounds he's created, which I was curating while we were working on the Spore soundtrack and other projects. It's funny, but the ones I picked were just the first I came across that I thought would suit Bloom. We later went through a large number of alternatives, but those remained the best choices.

The version of Bloom that's currently live uses fixed stereo samples, but an update we're releasing soon applies some panning to the sounds depending on the position of each 'bloom' on screen. It's a subtle effect, but it works rather well.
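A plausible version of the position-based panning Peter mentions is simply to map the horizontal coordinate of each 'bloom' to a stereo pan value, scaled down so the effect stays subtle. The exact curve Bloom uses isn't stated, so the function below is only an assumed illustration.

```python
def pan_for_position(x, screen_width, amount=0.5):
    """Map a bloom's horizontal position to a pan value in [-1.0, 1.0].

    x == 0 gives hard left, x == screen_width gives hard right; 'amount'
    scales the effect down so the panning stays subtle, as described above.
    """
    centred = (x / screen_width) * 2.0 - 1.0   # -1.0 (left) .. +1.0 (right)
    return centred * amount

print(pan_for_position(80, 320))    # a bloom near the left edge -> slight left pan
print(pan_for_position(300, 320))   # a bloom near the right edge -> slight right pan
```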


MM: Would you like to describe your current and upcoming projects?

PC: I've been involved in two new applications for the iPhone: Trope and Air. Both Apps were intended to be released simultaneously. Trope is my second collaboration with Brian Eno, and takes some of the ideas from Bloom in a slightly different, slightly darker direction. Instead of tapping on the screen, you trace shapes and produce constantly evolving abstract soundscapes.

Air is a collaboration with Irish vocalist Sandra O'Neill, and is quite different to Bloom. It's a generative work centred around Sandra's vocal textures and a slowly changing image. It draws heavily on techniques that Brian has evolved over his many years working on ambient music and installations, as well as a number of the generative ideas we've developed more recently.

I have just had some interesting news: Trope has been approved, it's now available in the App Store!

More information can be found at www.generativemusic.com.

"Trope is a different emotional experience - more introspective, more atmospheric. It shows that generative music, as one of the newest forms of sonema, can draw on a broad palette of moods." Brian Eno
[Brian Eno discussing Generative Music at the Imagination Conference, 1996]

UPDATE: Trope in action!


"[...] I had realised three or four years ago that I wasn't going to be able to do generative music properly – in the sense of giving people generative music systems that they could use themselves – without involving computers. And it kind of stymied me: I hate things on computers and I hate the idea that people have to sit there with a mouse to get a piece of music to work. So then when the iPhone came out I thought: oh good, it's a computer that people carry in their pockets and use their fingers on, so suddenly that was interesting again." - Brian Eno
[via timeoutsydney.com.au]

Related Post: Deep Green: sound design for iPhone App

Thursday, September 17, 2009

Ben Burtt to Receive the 'Charles S. Swartz Award'

[The winners of the Academy Award for Best Sound Effects Editing, Ben Burtt and Charles L. Campbell, with Jamie Lee Curtis and Carl Weathers - 1983]

The Hollywood Post Alliance has announced that Ben Burtt will receive the organization’s Charles S. Swartz Award for Outstanding Contribution in the Field of Post Production, recognizing his powerful artistic impact on the industry. The award will be bestowed on Mr. Burtt on November 12th during the Hollywood Post Alliance Awards gala at the Skirball Center in Los Angeles.

HPA Awards co-founder and committee chair Carolyn Giardina said, “We are thrilled to recognize Ben Burtt with the Charles S. Swartz Award, an honor that represents everything that the HPA stands for: creativity, technical excellence, and limitless thinking. From R2-D2 to WALL-E, Ben Burtt has helped create some of the most unforgettable characters of our generation. We are honored to present this award to him.”

The Charles S. Swartz Award was created to honor individuals who have made outstanding contributions to the field of post production, an industry with an expanding creative palette that is in dynamic transition as a result of digital technologies and societal changes. The award was named in honor of the late Charles Swartz, who led the Entertainment Technology Center at the University of Southern California from 2002 until 2006 and helped build it into the industry’s premier test bed for new digital cinema technologies. In addition to a long and successful career as a producer, educator and consultant, Mr. Swartz served on the Board of Directors of the HPA. Leon Silverman, President of the HPA, noted that “Ben Burtt’s career and accomplishments speak to the true spirit of this award, which recognizes impactful contributions. Ben Burtt’s impact on the art and craft of post production and on our cultural legacy should be celebrated.”

Saturday, September 05, 2009

Kyma Symposium 2009 Preliminary Program

The First International Kyma Symposium is scheduled for 8-10 October 2009 in the vibrant Poble Nou neighborhood of Barcelona during the annual LEM festival. The preliminary program includes master classes presented by the creators of Kyma, papers and demos presented by Kyma practitioners, and a program of concerts, including live improvised silent-film scores.

Symbolic Sound and Station 55 invite you to share your ideas, experiences and art by attending the First Kyma Symposium and interacting with your fellow Kyma practitioners!

We'll be live microblogging from Barcelona (via Twitter) and we'll write a daily report here on Unidentified Sound Object so that even those who cannot attend can still benefit from the symposium. Photos of the event will be available via Flickr.

If you cannot attend...

Not everyone can make it to Barcelona, but your Kyma Sounds can! Cristian Vogel invites you to submit AIFF or WAV files created exclusively with Kyma for an automated "DJ Pacarana" set, which will provide background ambience for the 'Meet and Greet' reception on Thursday evening. Textures, sound design sketches, loops, and ambiences ranging in duration from 1 second to 5 minutes can be submitted as 16-bit, 44.1 kHz WAV or AIFF files and will be combined, spatialized, and layered according to a random number generator seeded with "0.8102009". In the spirit of 'Meet and Greet', please include your name and your city as part of the sound (spoken, sung, or otherwise encrypted) so we can meet and greet you virtually! Please send your files to his drop box.
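The Kyma Sound that will actually do the combining isn't described beyond its seed, but the general idea of a reproducible, seeded random arrangement can be sketched in Python. The file names, layer count and offsets below are made up; only the seed value comes from the announcement above.

```python
import random

random.seed("0.8102009")   # the announced seed: the same "random" set every time

# Hypothetical example submissions (real files would come from the drop box).
submissions = ["texture_ana.wav", "loop_bruno.aiff", "sketch_carla.wav"]

# Build a simple playlist of overlapping layers: each entry picks a file,
# a start offset and a stereo position at random, but reproducibly.
playlist = []
for _ in range(10):
    playlist.append({
        "file":  random.choice(submissions),
        "start": round(random.uniform(0, 300), 1),   # seconds into the set
        "pan":   round(random.uniform(-1, 1), 2),    # -1 = left, +1 = right
    })

for entry in playlist:
    print(entry)
```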

---------------------------------------------
Kyma Symposium in Barcelona 8-10 October 2009
Preliminary Program (as of 31 August 2009)
---------------------------------------------

THURS (8 October 2009)
18:00 meet & greet
w/ DJ Pacarana: installation created by Cristian Vogel with participation of the international Kyma user community
21:00 Dinner at nearby restaurant (no host)


FRI TALKS
11:00 Welcome
11:30 Carla Scaletti & Kurt Hebel: "Recombinance Makes Us Human": Philosophy of Kyma
Demo of Recently Added Features
14:00 ---Lunch at Bharma
16:00 Camille Troillard: New Developments in OSCulator (1 hour)
17:00 Hector Bravo-Benard: Using the AC Toolbox with Kyma (30' lecture + 10' piece)
17:45 ---Coffee Break---
18:00 Cristian Vogel: The Black Swan: Composing with Kyma for Dance (1 hour)
19:00 Eckard Vossas: Improvising with Kyma: Miniaturen X, Cecil Variations B, Lehnade 2: (30')
19:45 ---Break for concert setup---

FRI CONCERT at Niu
20:00 Franz Danksagmüller: Kökarlen: Silent film + voice + live Kyma (~2 hrs)
22:00 tear down, Dinner together nearby (no host)

SAT TALKS
11:00 Carla Scaletti & Kurt Hebel: Composing with Kyma: Vector spaces, Timeless nonlinear timelines, sequencing
CapyTalk, Smalltalk, Tools

14:00 ---Lunch at Bharma
16:00 Bruno Liberda: siebenmal gefärbt: Addicted to Kyma (1 hour)
17:00 Cristian Vogel: Kyma in the Club: The Never Engine (1 hour)
18:00 ---break--- (set up and sound checks for concert)

19:00 SAT CONCERT at Niu
* Carla Scaletti: SlipStick for Continuum and Kyma (~10')
* Marin Vrbica: Tiger in the Jungle: Video + Live Performer (~10')
* Hector Bravo-Benard: Styrotron: Live Kyma performance (~10')

---INTERMISSION---

* Franz Danksagmüller: Fritz Lang's Metropolis: Silent film + live Kyma accompaniment (~30')
21:00 Tear down
22:00 Celebratory Dinner together
---------
1:30 AM * Cristian Vogel: Never Engine (Live Kyma!) at The Moog

SUN
Informal breakfast and lunch, consulting, meetings, demos, sightseeing, travel day

For up-to-the-minute information on schedules, maps, accommodations, travel, and discussions, please join the erutufon forum (there is no cost to join the forum).

Venue: Niu Espai Artistic Contemporani
Cost: € 80 (includes workshops, concerts and 2 lunches)

Related Post: The First International Kyma User Symposium - Barcelona 2009