I started composing the work in 2003. It is dedicated to Rachel Corrie, who died a few days before I began. After a long process of composition, the work was completed in 2006. In March 2008 the tuba part was recorded at LIEM and the studio version was finished. The work was finally premiered at the Reina Sofía Museum in July 2009.
Fragment of the score with special instructions for spatial diffusion at the JIEM concert.
Explanation of the work in a little more detail (project presentation at LIEM).
Jesús Jara, to whom the work is dedicated, played the tuba at the recording sessions. The work interprets electronics as an extended technique. In this sense, it follows works (Ashley, Reich) that used feedback as a compositional element: the electronic medium is used in a way different from the one it was designed for.
The electronic part was generated by morphing, using phase-vocoding resynthesis in Csound. The envelopes used to generate harmonic interferences with the phase vocoder also serve as control signals shaping the evolution of various tuba performance techniques.
The way these envelopes are deployed over time is determined by generative grammars.
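The text does not give the actual grammar rules, but the general idea of a generative grammar deciding the succession of envelopes can be sketched as a simple rewriting system. The symbols and rules below are hypothetical, purely for illustration:

```python
# A minimal generative-grammar sketch: rewrite rules expand a start
# symbol into a sequence of envelope types over successive passes.
# The symbols (A = attack, B = sustain, C = decay) and the rules are
# hypothetical, not those of the piece.

RULES = {
    "S": ["A", "B"],   # start: an attack envelope followed by a sustain
    "A": ["A", "C"],   # an attack spawns a decay after it
    "B": ["B"],        # sustain envelopes persist unchanged
    "C": ["B", "A"],   # a decay leads to a new sustain and a new attack
}

def expand(symbols, depth):
    """Apply the rewrite rules `depth` times to a list of symbols."""
    for _ in range(depth):
        symbols = [out for s in symbols for out in RULES.get(s, [s])]
    return symbols

print(expand(["S"], 3))  # a sequence of envelope choices
```

Each pass deterministically rewrites every symbol, so a single start symbol grows into an ordered plan of envelope types that can then be mapped onto the piece's timeline.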
As a mixed work (acoustic and electronic instruments) combining two heterogeneous sound sources, its main issue is the interaction between the two media. The work explores the different ways in which both sources can relate, making it a metaphorical paraphrase of the historical evolution of the relationship between these media, and of the influence each has had on the other. The way composers and performers approached electronic music opened new ways of using traditional acoustic instruments, expanding the repertoire of performance techniques. This work takes the opposite step: it borrows ideas from the field of extended techniques to expand the language of the electronic medium.
This idea is realized by attempting to apply the concept of the multiphonic (and of extended techniques in general) to electronic music.
For Rachel Corrie recording
Development of the work
The multiphonic idea is translated to electronics by morphing with phase-vocoding resynthesis in Csound.
It can be observed that morphing between sounds of different pitch creates a series of harmonic interferences that produce an effect similar to a multiphonic. Thus, phase vocoding is used to achieve results other than those it was designed for.
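As an illustrative sketch (not the piece's actual Csound patch), interpolating the magnitude spectra of two harmonic tones of different pitch in Python shows why both pitches survive the morph and interfere, multiphonic-like. The frequencies 220 Hz and 277.2 Hz are arbitrary examples:

```python
# Sketch of why morphing two sounds of different pitch yields a
# multiphonic-like result: interpolating their magnitude spectra keeps
# partials of BOTH harmonic series, which then beat against each other.
import numpy as np

SR = 44100
N = 4096
t = np.arange(N) / SR

def harmonic_tone(f0, npartials=5):
    """A crude harmonic tone: equal-weight partials at f0, 2*f0, ..."""
    return sum(np.sin(2 * np.pi * f0 * k * t) for k in range(1, npartials + 1))

spec_a = np.abs(np.fft.rfft(harmonic_tone(220.0)))   # tone at 220 Hz (A3)
spec_b = np.abs(np.fft.rfft(harmonic_tone(277.2)))   # tone at ~277 Hz (C#4)
morph = 0.5 * spec_a + 0.5 * spec_b                  # 50/50 spectral morph

freqs = np.fft.rfftfreq(N, 1 / SR)
peaks = freqs[morph > 0.25 * morph.max()]            # bins with strong energy
# Energy remains near BOTH fundamentals (220 Hz and 277.2 Hz):
print(np.any(np.abs(peaks - 220.0) < 11), np.any(np.abs(peaks - 277.2) < 11))
```

A phase vocoder interpolates analysis frames rather than raw FFT magnitudes, but the mechanism is the same: the intermediate spectrum contains both inharmonically related partial sets at once.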
Using this technique I created several series of wave files. To do this, I first made a phase-vocoding analysis of several wave files. An orc file then determines the length of the resulting file and defines an instrument that performs the morphing resynthesis.
In all cases but one, the first sound was a brass sound (trumpet, horn, Tibetan horn) and the second a woodwind sound (flute, oboe or clarinet multiphonics) or the composer's voice. The exceptions are patches 10 and 11, in which the first sound is a flute or clarinet multiphonic.
Here I transcribe the instrument definition file for patch 6, the morphing between trumpet and oboe.
The orc file is:
; Prepared by busevín for resynthesis in For Rachel Corrie, 2003
sr     = 44100
kr     = 4410
ksmps  = 10
nchnls = 1

instr 1
ktiempo2       line   0, p3, p4        ; time pointer into trompeta.pv
ktiempo1       line   0, p3, p5        ; time pointer into oboe.pv
kinterpolacion linseg 1, p3*0.1, 1, p3*0.3, 0.5, p3*0.3, 0, p3*0.2, 0
               pvbufread ktiempo1, "oboe.pv"
apv            pvinterp  ktiempo2, 1, "trompeta.pv", 1, 1, 1, 1, 1-kinterpolacion, 1-kinterpolacion
               out apv
endin
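For readers without Csound, the control logic driving this instrument can be sketched in Python. The `line` and `linseg` functions below are minimal stand-ins for the Csound opcodes of the same name, and the numeric values (note length p3, analysis-file lengths p4 and p5, and the envelope breakpoints) are illustrative:

```python
# Minimal Python stand-ins for the Csound `line` and `linseg` opcodes,
# showing how the two time pointers and the interpolation envelope
# evolve over the note duration p3.  All values are illustrative.

def line(a, dur, b, t):
    """Csound `line`: ramp from a to b over dur seconds, sampled at time t."""
    if t >= dur:
        return b
    return a + (b - a) * t / dur

def linseg(breakpoints, t):
    """Csound `linseg`: piecewise-linear envelope.
    breakpoints = [v0, d1, v1, d2, v2, ...] (value, duration, value, ...)."""
    vals = breakpoints[0::2]
    durs = breakpoints[1::2]
    elapsed = 0.0
    for i, d in enumerate(durs):
        if t < elapsed + d:
            return line(vals[i], d, vals[i + 1], t - elapsed)
        elapsed += d
    return vals[-1]  # hold the final value after the last segment

p3, p4, p5 = 10.0, 8.0, 12.0   # note length and the two analysis lengths
t = 5.0                        # sample the controls halfway through the note
ktiempo2 = line(0, p3, p4, t)  # time pointer into the trumpet analysis
ktiempo1 = line(0, p3, p5, t)  # time pointer into the oboe analysis
kinterp = linseg([1, p3*0.1, 1, p3*0.3, 0.5, p3*0.3, 0, p3*0.2, 0], t)
print(ktiempo2, ktiempo1, kinterp)
```

Each time pointer sweeps its own analysis file at its own rate, so the two sources stay time-aligned with the note regardless of their original durations, while the envelope crossfades the spectral weighting between them.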