Orbe is in the running for the 2018 edition of the Innovation Radar Prize!

The Innovation Radar Prize will select the top innovators supported by @eu_h2020 during the past year. Orbe has been selected thanks to nodal.studio, our web platform for digital creatives. Would you help us by voting for us at https://ec.europa.eu/futurium/en/industrial-enabling-tech-2018/orbe?

Orbe created nodal.studio with the help of the research projects CoSiMa (ANR) and RAPID-MIX (H2020), in collaboration with the ISMM team of IRCAM (STMS Lab). With Nodal.studio, you can make your ideas concrete and build your project with an accessible authoring tool. There is no need to install any application or plugin; just access Nodal.studio with your web browser.

To learn more about Nodal.studio, follow these links:

https://nodal.studio/

https://www.facebook.com/orbemobi/videos/325346658267847/

RAPID-MIX API WORKSHOP @ eNTERFACE’17

For this eNTERFACE’17 workshop, we are looking for participants who want to learn and explore how to employ the RAPID-MIX API in their creative projects, integrating sensing technologies, digital signal processing (DSP), and interactive machine learning (IML) for embodied interaction and audiovisual synthesis. It is helpful to think of RAPID-MIX-style projects as combining sensor inputs (LeapMotion, IMU, Kinect, BITalino, etc.) and media outputs with an intermediate layer of software logic, often including machine learning, as sketched below.
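
As a rough illustration of that pattern, here is a minimal sketch in Python. Every name in it is an invented placeholder, not part of the RAPID-MIX API; it only shows the shape such a project takes.

```python
# A minimal sketch of the sensor -> logic -> media-output pattern described
# above. Everything here is an invented placeholder, not the RAPID-MIX API:
# any sensor (LeapMotion, IMU, BITalino, ...) that yields a feature vector
# and any output that accepts parameters would fit this shape.
from typing import Callable, Sequence

def make_pipeline(
    read_sensor: Callable[[], Sequence[float]],           # e.g. one IMU frame
    model: Callable[[Sequence[float]], Sequence[float]],  # intermediate ML layer
    render: Callable[[Sequence[float]], None],            # e.g. set synth params
) -> Callable[[], None]:
    """Return a function that runs one frame of the interaction loop."""
    def step() -> None:
        features = read_sensor()   # 1. acquire a sensor frame
        params = model(features)   # 2. map features to output parameters
        render(params)             # 3. drive the audiovisual output
    return step

# Toy usage: a fixed "sensor", a scaling "model", and a print "renderer".
step = make_pipeline(
    read_sensor=lambda: [0.2, 0.7],
    model=lambda x: [v * 2 for v in x],
    render=lambda p: print("synth params:", p),
)
step()  # -> synth params: [0.4, 1.4]
```

In a real project, read_sensor would poll an actual device, model would be a trained IML mapping, and render would drive a sound or visual engine.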

Participants will gain practical experience with elements of the toolkit and with general concepts in ML, DSP, and sensor-based interaction. We are adopting Design Sprints, an Agile UX approach, to deliver our workshops at eNTERFACE’17. This will benefit both the participants, who will learn how to use the RAPID-MIX API in creative projects, and us, as we learn about their experiences and improve the toolkit.

We also intend to kickstart the online community around this toolkit and model it after the communities of other creative toolkits (e.g., Processing, openFrameworks, Cinder, etc.). eNTERFACE’17 participants will become this community’s power users and core members, and their resulting projects will be integrated as demonstrators for the toolkit.

Work plan and Schedule

Our work plan for July 3-7 dedicates each day to one specific subset of the RAPID-MIX API, with the possibility of a mentored project work extension period (a sketch of the interactive machine learning workflow from days 3 and 4 follows the list):

1. Sensor input, biosensors, training data
2. Signal processing and Feature Extraction
3. Machine Learning I: Classification and Regression
4. Machine Learning II: Temporal-based and gesture recognition
5. Multimodal data repositories and collaborative sound databases
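
As a preview of the workflow behind days 3 and 4, here is a minimal sketch of the record/train/run loop typical of interactive machine learning. scikit-learn's k-nearest-neighbours classifier stands in here for the toolkit's own models, and the gesture data is invented for illustration.

```python
# A sketch of the record -> train -> run loop behind days 3 and 4.
# scikit-learn's k-NN classifier stands in for the toolkit's own models,
# and the "gesture" data is invented for illustration.
from sklearn.neighbors import KNeighborsClassifier

# 1. Record: a few labelled sensor frames per gesture (e.g. x, y from a pad).
examples = [([0.1, 0.9], "wave"), ([0.2, 0.8], "wave"),
            ([0.9, 0.1], "tap"),  ([0.8, 0.2], "tap")]
X = [features for features, _ in examples]
y = [label for _, label in examples]

# 2. Train: fit a classifier on the recorded examples.
model = KNeighborsClassifier(n_neighbors=1).fit(X, y)

# 3. Run: classify incoming frames inside the interaction loop.
print(model.predict([[0.15, 0.85]])[0])  # -> wave
```

Swapping the classifier for a regressor gives continuous control mappings instead of discrete labels, which is the regression half of day 3.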

Check out our full proposal: https://goo.gl/Q2DP1F
and our public Google Calendar @ RAPID-MIX@eNTERFACE17 – Calendar ID: 39qpvg0ubciiou0medr73qnh6k@group.calendar.google.com

and the flyer for the event:  https://goo.gl/9DsDp5

If you would like to learn how to rapidly use machine learning technology for creative and expressive interaction, send a short CV or bio to:

rapidmix.enterface17@gmail.com

Diemo Schwarz

Diemo Schwarz is a researcher and developer of real-time applications of computers to music, aimed at improving musical interaction, notably through sound analysis–synthesis and interactive corpus-based concatenative synthesis.

He holds a PhD in computer science applied to music, for which he developed a new method of concatenative musical sound synthesis by unit selection from a large database. This work is continued in the CataRT application for real-time interactive corpus-based concatenative synthesis within Ircam’s Sound Music Movement Interaction (ISMM) team.
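
To give an idea of the core principle, unit selection can be reduced to a nearest-neighbour lookup in descriptor space. The following is a toy illustration of that idea, not CataRT's actual implementation; the corpus, unit names, and descriptors are invented.

```python
# The selection step at the heart of corpus-based concatenative synthesis,
# reduced to a toy illustration (this shows the idea, not CataRT itself):
# given a target point in descriptor space, pick the closest corpus unit.
import math

# Corpus: sound units described by (pitch as MIDI note, loudness in dB).
corpus = {"unit_a": (60.0, -12.0), "unit_b": (64.0, -6.0), "unit_c": (67.0, -20.0)}

def select_unit(target: tuple[float, float]) -> str:
    """Return the name of the corpus unit nearest to the target descriptors."""
    return min(corpus, key=lambda name: math.dist(corpus[name], target))

print(select_unit((63.0, -8.0)))  # -> unit_b; the synthesizer then plays it
```

A real corpus holds thousands of units segmented from recordings and described by many more audio descriptors, but the nearest-neighbour principle is the same.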

His current research includes tangible interfaces for multi-modal interaction and generative audio for video games, virtual and augmented reality, and the creative industries.

As an artist, he composes for dance, video, and installations, and interprets and performs improvised electronic music with his solo project Mean Time Between Failure, in various duos and small ensembles, and as member of the 30-piece ONCEIM improvisers orchestra.

http://imtr.ircam.fr/imtr/Diemo_Schwarz

http://diemo.concatenative.net

Amaury La Burthe

Amaury holds an MSc from IRCAM. He first worked as an assistant researcher at Sony Computer Science Laboratory, and then as lead audio designer for the video game company Ubisoft in Montreal. In 2009 he founded the start-up AudioGaming, focused on creating innovative audio technologies. AudioGaming expanded its activities in 2013 through its brand Novelab, which creates immersive and interactive experiences (video games, VR, installations, …). Amaury recently worked on projects such as Type:Rider with Arte and Kinoscope with Google as executive producer, and on Notes on Blindness VR as creative director and audio director.

Joseph Larralde

Joseph Larralde is a programmer in IRCAM’s ISMM team. He is also a composer and performer who uses new interfaces to play live electroacoustic music, focusing on the gestural expressiveness of sound synthesis control. His role in the RAPID-MIX project is to develop a collection of prototypes that demonstrate combined uses of all the partners’ technologies, to bring existing machine-learning algorithms to the web, and more broadly to merge IRCAM’s software libraries with those from UPF and Goldsmiths in the RAPID-MIX API.

RAPID-MIX tools in Google ATAP Project Soli Developers workshop

Francisco Bernardo from the Goldsmiths EAVI team was recently invited by the Google ATAP team to Mountain View, California, to participate in the Project Soli developers workshop.

As part of the Soli Alpha Developers program, Francisco demonstrated the successful integration of the Soli sensor with RAPID-MIX tools such as Wekinator and Maximilian, using gesture recognition to control musical processes.
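
For context on how such an integration typically works: Wekinator is controlled over OSC, receiving input feature vectors at the address /wek/inputs on port 6448 by default. Here is a minimal sketch using the python-osc package; the feature values are placeholders, since the actual Soli feature extraction is not shown in this post.

```python
# A sketch of feeding a sensor such as Soli into Wekinator: Wekinator
# listens by default for OSC messages on port 6448 at address /wek/inputs.
# The feature values below are placeholders for a real Soli feature frame.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 6448)   # Wekinator's default input port
features = [0.42, 0.13, 0.87]                 # placeholder sensor features
client.send_message("/wek/inputs", features)  # one input frame to Wekinator
```

Wekinator's trained model then emits its outputs over OSC as well (by default at /wek/outputs on port 12000), which can drive a synthesis engine such as one built with Maximilian.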