For this eNTERFACE’17 workshop, we are looking for participants who want to learn and explore how to employ the RAPID-MIX API in their creative projects, integrating sensing technologies, digital signal processing (DSP), interactive machine learning (IML) for embodied interaction, and audiovisual synthesis. It is helpful to think of RAPID-MIX-style projects as combining sensor inputs (LeapMotion, IMU, Kinect, BITalino, etc.) and media outputs with an intermediate layer of software logic, often including machine learning.
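
As a rough illustration of that three-layer structure, here is a minimal plain-Python sketch: the sensor read and the synth call are hypothetical stand-ins, not RAPID-MIX components, and the mapping layer is a hand-written function where a trained model would normally sit.

```python
def read_sensor():
    """Stand-in for polling a real sensor (LeapMotion, IMU, BITalino, ...)."""
    return 0.25  # normalised reading in [0, 1]

def mapping(x):
    """Intermediate logic layer: scale a sensor value to a synth parameter."""
    low, high = 220.0, 880.0  # target frequency range in Hz
    return low + x * (high - low)

def set_synth_frequency(freq_hz):
    """Stand-in for a media output (e.g. setting an audio synth parameter)."""
    print(f"synth frequency -> {freq_hz:.1f} Hz")

# Wire the three layers together for one update cycle.
set_synth_frequency(mapping(read_sensor()))  # prints "synth frequency -> 385.0 Hz"
```

In a real project this loop would run continuously, and the mapping layer is exactly where interactive machine learning replaces hand-tuned code.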

Participants will gain practical experience with elements of the toolkit and with general concepts in ML, DSP, and sensor-based interaction. We are adopting Design Sprints, an Agile UX approach, to deliver our workshops at eNTERFACE’17. This will be mutually beneficial: participants will learn how to use the RAPID-MIX API in creative projects, and we will learn from their experiences and improve the toolkit.

We also intend to kickstart an online community around this toolkit, modelled after those of other creative toolkits (e.g., Processing, openFrameworks, Cinder). eNTERFACE’17 participants will become this community’s power users and core members, and their resulting projects will be integrated as demonstrators for the toolkit.

Work plan and Schedule

Our work plan for July 3-7 devotes each day to a specific subset of the RAPID-MIX API, with the possibility of a mentored project-work extension period.

1. Sensor input, biosensors, and training data
2. Signal processing and feature extraction
3. Machine learning I: classification and regression
4. Machine learning II: temporal modelling and gesture recognition
5. Multimodal data repositories and collaborative sound databases
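
To give a flavour of the IML workflow covered on day 3, here is a sketch of the record-train-run loop in plain Python. A simple nearest-neighbour classifier stands in for the RAPID-MIX API's models, and the gesture data and labels are invented for illustration.

```python
import math

def train(examples):
    """examples: list of (feature_vector, label) pairs; returns a classifier."""
    def classify(x):
        # Nearest-neighbour: pick the label of the closest recorded example.
        _, label = min(examples, key=lambda e: math.dist(e[0], x))
        return label
    return classify

# Two gestures recorded as (x, y) hand positions with labels.
training_set = [
    ((0.1, 0.9), "wave"), ((0.2, 0.8), "wave"),
    ((0.9, 0.1), "push"), ((0.8, 0.2), "push"),
]
model = train(training_set)
print(model((0.15, 0.85)))  # prints "wave"
```

The point of the workflow, as in the toolkit itself, is that the mapping from sensor features to outputs is defined by demonstration rather than by hand-written rules.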

Check out our full proposal: https://goo.gl/Q2DP1F
our public Google Calendar, RAPID-MIX@eNTERFACE17 (Calendar ID: 39qpvg0ubciiou0medr73qnh6k@group.calendar.google.com),

and the flyer for the event: https://goo.gl/9DsDp5

If you would like to learn how to rapidly use machine learning for creative and expressive interaction, please send a short CV or bio to: