Results

We are pleased to announce the results of the fourth edition of the artistic research residency program at IRCAM.

Choosing from such a rich and diverse spectrum of proposals was a difficult task for the 36 expert readers. The committee was made up of independent composers and artists, and of researchers and computer musicians affiliated with major international artistic and scientific institutions. Every submission was examined by at least two expert readers, representing both research and artistic expertise. The evaluation took into account criteria such as musical quality, novelty, the feasibility of the project, and the candidate's previous experience.

For the 2013-2014 residency year, the committee selected four candidates proposing four innovative projects. We thank all the candidates for their innovative ideas in the field of artistic research, and we congratulate the laureates.

Winners and Project Proposals

Pavlos Antoniadis [website]

Project Title: Gesture cutting through textual complexity: Towards a tool for online gesture analysis and control of complex piano notation processing
Abstract: The proposed project will materialize a general framework for online gesture analysis and implement it in the form of a tool for the processing of complex piano notation: a real-time "overwriting" of the score as a personalized, malleable, multi-layered "tablature" (visible) and/or interface (graspable). The general methodology is designed alongside the development of the gesture follower by the Real-Time Musical Interactions team: the proposed gestural analysis is characterized by machine learning techniques and temporal mapping, comparing the incoming data flow with stored templates. During an initial step (the learning procedure), the piece is scanned and templates of temporal profiles are created for three types of piano-specific gestural events: grasps, neumes, and edges (or potentially others, reflecting each player's personal responses to notation). During the second step (online processing), the templates are processed down to the finest temporal grain (unfolding) and simultaneously arranged sequentially and/or merged into new ones, propelling the piece forward (re-folding). The suggested tool aims at maximizing learning efficiency through data storage and simplification, and at applications in piano pedagogy, improvisation, and the re-evaluation of the relationship between notation and action in complex music.
In collaboration with the Real-Time Musical Interactions Team (IMTR) at IRCAM.
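The template-comparison idea at the core of this abstract can be illustrated with a classic dynamic-time-warping (DTW) distance between a stored gestural profile and an incoming feature stream. This is only an offline stand-in sketch: IRCAM's actual gesture follower performs real-time alignment with probabilistic models rather than offline DTW, and the template names used here are invented for illustration.

```python
import math

def dtw_distance(template, incoming):
    """Dynamic-time-warping distance between two 1-D feature streams.

    Allows either stream to be locally stretched or compressed in time,
    which is what makes template comparison robust to tempo variation.
    """
    n, m = len(template), len(incoming)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(template[i - 1] - incoming[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch the template
                                 cost[i][j - 1],      # stretch the incoming stream
                                 cost[i - 1][j - 1])  # advance both together
    return cost[n][m]

def best_template(templates, incoming):
    """Pick the stored gesture template closest to the incoming data."""
    return min(templates, key=lambda name: dtw_distance(templates[name], incoming))

# Hypothetical temporal profiles for two gestural event types.
templates = {"grasp": [0, 1, 2, 1, 0], "edge": [0, 0, 5, 0, 0]}
```

With these toy profiles, `best_template(templates, [0, 1, 1, 2, 1, 0])` identifies the incoming stream as a "grasp", since warping absorbs the extra sample without any cost.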

Aaron Einbond [website]

Project Title: A Factor Oracle for Timbre
Abstract: One of the most important recent developments in real-time musical interaction is corpus-based concatenative synthesis (CBCS), which permits high-level control of synthesis by audio features while retaining the full detail of time-domain audio. Still missing, however, is a better temporal logic for organizing synthesis based on coherent gestures or trajectories as they unfold in time. For example, with the CataRT package for Max, individual samples may be selected by targeting a list of associated audio features, or descriptors; however, there is not necessarily a connection between the descriptors of one sample and those of the successive sample to be concatenated.
At the same time, the Factor Oracle algorithm has proven a successful approach for real-time analysis of musical data, with applications in improvisation with OMax and score following with Antescofo. Could a factor-oracle-based system be used to augment real-time CBCS, permitting a predictive logic for synthesis? Building on initial explorations of timbral description in OMax, the goal is to expand the list of available timbral descriptors and integrate them with CBCS to create a flexible tool for real-time timbral exploration, computer-assisted composition, and improvisation. A further goal is to enhance the connection between both platforms and score-based notation, presenting an invaluable resource for composers and a promising link for future applications to score and gesture following.
In collaboration between the IMTR and Musical Representations teams.
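The Factor Oracle named in this proposal is a compact automaton built incrementally in linear time (Allauzen, Crochemore, and Raffinot). As a purely illustrative sketch, the following Python builds the oracle over a sequence of symbols; in a timbral setting the symbols would stand in for quantized audio-descriptor classes (e.g. cluster indices of feature vectors), which is an assumption of this sketch, not something specified in the proposal.

```python
def build_factor_oracle(seq):
    """Incremental Factor Oracle construction.

    Returns (trans, sfx): trans[i] maps a symbol to a target state,
    and sfx[i] is the suffix link of state i (sfx[0] = -1 by convention).
    Every factor (substring) of seq is recognized by some path from state 0.
    """
    n = len(seq)
    trans = [dict() for _ in range(n + 1)]
    sfx = [-1] * (n + 1)
    for i, sym in enumerate(seq, start=1):
        trans[i - 1][sym] = i              # factor transition to the new state
        k = sfx[i - 1]
        while k > -1 and sym not in trans[k]:
            trans[k][sym] = i              # forward "jump" transition
            k = sfx[k]
        sfx[i] = 0 if k == -1 else trans[k][sym]
    return trans, sfx
```

Improvisation systems in the OMax family navigate the resulting transitions and suffix links to recombine the original material while preserving its local continuity; a timbral variant would do the same over descriptor-labeled sound units.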

Jason Freeman [website]

Project Title: Shadows: New Techniques in Real-time Music Notation and Audience Participation
Abstract: Over the past dozen years, my artistic practice and research agenda have focused on rethinking the relationships among composers, performers, and audiences in the live performance of contemporary classical music.   
This exploration has been motivated by three overarching questions:

  1. How can live musical performance reflect, and reflect upon, a cultural landscape increasingly dominated by user-generated content and social media?
  2. How can live performance reject the goal of replicating the perfection of a studio recording (Auslander 1999) and rediscover the risk, spontaneity, uniqueness, and community of being live?
  3. As a composer, how can I address the challenges of designing open scores and systems that invite others to be creative within them, creating musical results I alone could not have envisioned?

In my work, I seek to create environments in which composers, performers, and audience members are linked together through novel participatory interfaces that integrate visual feedback (dynamically rendered music notation and data visualization) and expressive sound (both electroacoustic and acoustic), creating complex feedback networks in which each constituency continuously influences the others in real time during each performance. The concert then becomes a unique product of the shared experience that transpires in the moment of performance.
In collaboration with the MuTant team-project as part of the INEDIT ANR Project.

Nicolas Mondon [website]

Project Title: Acoustical diffusion of electroacoustic material: experiments on the clarinet
Abstract: This project deals with diffusing electronic sounds inside the body of the clarinet, as opposed to through speakers outside the sphere of the instrument itself. It aims at creating the illusion that the instrument itself produces the electronic sounds. The electronics thus blend into the instrumental sound, approaching the natural directivity of the instrument, instead of enveloping the instrumental gestures spatially.
In collaboration with the Instrumental Acoustics Team.
