Diemo Schwarz is a researcher and developer in real-time applications of computers to music, with the aim of improving musical interaction, notably through sound analysis–synthesis and interactive corpus-based concatenative synthesis.
He holds a PhD in computer science applied to music, in which he developed a new method of concatenative musical sound synthesis by unit selection from a large database. This work is continued in the CataRT application for real-time interactive corpus-based concatenative synthesis within Ircam's Sound Music Movement Interaction team (ISMM).
His current research comprises uses of tangible interfaces for multi-modal interaction, generative audio for video games, virtual and augmented reality, and the creative industries.
As an artist, he composes for dance, video, and installations, and interprets and performs improvised electronic music with his solo project Mean Time Between Failure, in various duos and small ensembles, and as a member of the 30-piece ONCEIM improvisers orchestra.
Amaury holds an MSc from IRCAM. He first worked as an assistant researcher at Sony Computer Science Laboratory and then as lead audio designer for the video game company Ubisoft in Montreal. In 2009 he founded the start-up AudioGaming, focused on creating innovative audio technologies. AudioGaming expanded its activities in 2013 through its brand Novelab, which creates immersive and interactive experiences (video games, VR, installations, …). Amaury recently worked on projects such as Type:Rider with Arte and Kinoscope with Google as executive producer, and on Notes on Blindness VR as creative director and audio director.
Joseph Larralde is a programmer in IRCAM's ISMM team. He is also a composer/performer using new interfaces to play live electroacoustic music, focusing on the gestural expressiveness of sound synthesis control. His role in the RAPID-MIX project is to develop a collection of prototypes that demonstrate combined uses of all the partners' technologies, to bring existing machine-learning algorithms to the web, and more broadly to merge IRCAM's software libraries with those from UPF and Goldsmiths in the RAPID-API.
Frédéric Bevilacqua is the head of the Sound Music Movement Interaction team at IRCAM in Paris (part of the joint research lab Science & Technology for Music and Sound – IRCAM – CNRS – Université Pierre et Marie Curie). His research concerns the development of gesture-based musical interactive systems, movement computing, and sensorimotor learning with auditory feedback. He holds a master's degree in physics and a PhD in biomedical optics from EPFL in Lausanne. He also studied music at the Berklee College of Music in Boston and has participated in various music and media arts projects. From 1999 to 2003 he was a researcher at the Beckman Laser Institute at the University of California, Irvine. In 2003 he joined IRCAM as a researcher on gesture analysis for music and the performing arts.