Orbe is in the running for the 2018 edition of the Innovation Radar Prize!

The Innovation Radar Prize will select the top innovators supported by @eu_h2020 during the past year. Orbe has been selected thanks to nodal.studio, our web platform for digital creatives. Would you help us and vote for us at https://ec.europa.eu/futurium/en/industrial-enabling-tech-2018/orbe?

Orbe created nodal.studio with the help of the research projects CoSiMa (ANR) and RAPID-MIX (H2020), in collaboration with the ISMM team at IRCAM (STMS Lab). With Nodal.studio, you can concretize your ideas and build your project with an accessible authoring tool. There is no need to install any application or plugin; just access Nodal.studio in your web browser.

To learn more about Nodal.studio, follow these links:
https://nodal.studio/
https://www.facebook.com/orbemobi/videos/325346658267847/
RAPID-MIX API WORKSHOP @ eNTERFACE’17

For this eNTERFACE’17 workshop, we are looking for participants who want to learn and explore how to employ the RAPID-MIX API in their creative projects, integrating sensing technologies, digital signal processing (DSP), and interactive machine learning (IML) for embodied interaction and audiovisual synthesis. It is helpful to think of RAPID-MIX-style projects as combining sensor inputs (LeapMotion, IMU, Kinect, BITalino, etc.) and media outputs with an intermediate layer of software logic, often including machine learning.
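
As a rough sketch of that pattern (our own illustration, not actual RAPID-MIX API calls), the following program maps a three-axis sensor reading to a synthesis parameter with 1-nearest-neighbour regression trained from a few recorded examples:

    // Sketch of the sensor -> ML -> output pattern described above.
    // Illustrative names only; this is not the RAPID-MIX API itself.
    #include <cmath>
    #include <iostream>
    #include <limits>
    #include <vector>

    struct Example {
        std::vector<double> input;   // e.g. accelerometer x, y, z
        double output;               // e.g. oscillator frequency in Hz
    };

    double distance(const std::vector<double>& a, const std::vector<double>& b) {
        double d = 0.0;
        for (size_t i = 0; i < a.size(); ++i) d += (a[i] - b[i]) * (a[i] - b[i]);
        return std::sqrt(d);
    }

    // 1-nearest-neighbour "regression": return the output of the closest example.
    double run(const std::vector<Example>& model, const std::vector<double>& in) {
        double best = std::numeric_limits<double>::max(), out = 0.0;
        for (const auto& ex : model) {
            double d = distance(ex.input, in);
            if (d < best) { best = d; out = ex.output; }
        }
        return out;
    }

    int main() {
        // Training set: sensor postures paired with target frequencies.
        std::vector<Example> model = {
            {{0.0, 0.0, 1.0}, 220.0},   // flat        -> low tone
            {{1.0, 0.0, 0.0}, 440.0},   // tilted      -> mid tone
            {{0.0, 1.0, 0.0}, 880.0},   // on its side -> high tone
        };
        // A new, unseen sensor reading is mapped to a frequency.
        std::cout << run(model, {0.9, 0.1, 0.1}) << " Hz\n";  // 440 Hz
    }

In a real project the examples would be recorded live from the sensor, and the output would drive a synthesis engine such as Maximilian rather than standard output.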

Participants will gain practical experience with elements of the toolkit and with general concepts in ML, DSP, and sensor-based interaction. We are adopting Design Sprints, an Agile UX approach, to deliver our workshops at eNTERFACE’17. This will be mutually beneficial: participants will learn how to use the RAPID-MIX API in creative projects, and we will learn from their experiences to improve the toolkit.

We also intend to kickstart an online community around the toolkit, modelled after those of other creative toolkits such as Processing, openFrameworks, and Cinder. eNTERFACE’17 participants will become this community’s power users and core members, and their resulting projects will be integrated as demonstrators for the toolkit.

Work Plan and Schedule

Our work plan for July 3-7 dedicates each day to a specific subset of the RAPID-MIX API, with the possibility of a mentored project-work extension period.

1. Sensor input, biosensors, and training data
2. Signal processing and feature extraction (see the sketch after this list)
3. Machine learning I: classification and regression
4. Machine learning II: temporal modelling and gesture recognition
5. Multimodal data repositories and collaborative sound databases
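
As a small taste of day 2 (our own sketch, not RAPID-MIX API code), here is windowed RMS energy, a common first feature extracted from raw sensor or biosignal streams before they are handed to a classifier or regressor:

    // Feature extraction sketch: RMS energy over non-overlapping windows.
    #include <cmath>
    #include <iostream>
    #include <vector>

    // Root-mean-square of one analysis window.
    double rms(const std::vector<double>& window) {
        double sum = 0.0;
        for (double s : window) sum += s * s;
        return std::sqrt(sum / window.size());
    }

    int main() {
        // Stubbed sensor stream, processed in windows of 4 samples.
        std::vector<double> stream = {0.1, -0.2, 0.15, -0.1, 0.8, -0.9, 0.85, -0.7};
        const size_t win = 4;
        for (size_t i = 0; i + win <= stream.size(); i += win) {
            std::vector<double> window(stream.begin() + i, stream.begin() + i + win);
            std::cout << "window " << i / win << " RMS = " << rms(window) << "\n";
        }
    }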

Check out our full proposal: https://goo.gl/Q2DP1F
and our public Google Calendar @ RAPID-MIX@eNTERFACE17 – Calendar ID: 39qpvg0ubciiou0medr73qnh6k@group.calendar.google.com

and the flyer for the event:  https://goo.gl/9DsDp5

If you would like to learn how to rapidly use machine learning technology for creative and expressive interaction, send a short CV or bio to:

rapidmix.enterface17@gmail.com

AGILE and the RAPID-API

We are excited to announce that we are working with AGILE, a European Union Horizon 2020 research project building a modular software-and-hardware gateway for the Internet of Things (IoT).

We will be combining the RAPID-API with the modular hardware of the AGILE project. Their smart hardware provides a multitude of ways to interface with the real world through sensors, and the RAPID-API will give people ways to harness and use that real-world data.
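
As a rough illustration of that pipeline (all names here are our own, not AGILE or RAPID-API calls), a gateway-side reading might be normalised into a 0-1 control signal before being handed to an interactive mapping layer:

    // Illustrative only: rescale raw gateway sensor readings into a 0..1
    // control signal. The readings are stubbed; on an AGILE gateway they
    // would come from the device's own sensor interfaces.
    #include <algorithm>
    #include <iostream>
    #include <vector>

    // Linearly rescale a raw value from [lo, hi] to [0, 1], clamped.
    double normalise(double raw, double lo, double hi) {
        return std::clamp((raw - lo) / (hi - lo), 0.0, 1.0);
    }

    int main() {
        // Stub temperature readings in degrees Celsius.
        std::vector<double> readings = {18.5, 21.0, 24.3, 30.1};
        for (double r : readings)
            std::cout << r << " C -> " << normalise(r, 15.0, 30.0) << "\n";
    }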

You’ll be able to see some of the results of this collaboration at Adaptation, a media arts event organised by AGILE. Artists are invited to submit proposals to Adaptation for works that represent data through audiovisual experiences, and selected artists will work with technologists to realise their ideas. We will be working with AGILE to make the RAPID-API available to support that work. Find out more about Adaptation and how to get involved here.

Find out more about AGILE here.


BITalino at Smartgeometry 2016

BITalino, the board developed by RAPID-MIX partner PLUX that gives quick and easy access to biosignals, featured in an installation by the Atmospheric Delight team at Smartgeometry 2016 in Sweden, a workshop and conference exploring novel technologies. Read about it in more detail in Architect Magazine:

http://www.architectmagazine.com/technology/smartgeometry-2016-a-path-to-interpersonal-and-interplanetary-connections_o

There’s also a video for a deeper look:

Smartgeometry 2016: Atmospheric Delight from Marc Webb on Vimeo.

It’s great to see key RAPID-MIX technologies being used for prototyping within the creative industries!

RAPID-MIX tools in Google ATAP Project Soli Developers workshop

Francisco Bernardo from the Goldsmiths EAVI team was recently invited by the Google ATAP team to Mountain View, California, to participate in the Project Soli developers workshop.

As part of the Soli Alpha Developers program, Francisco demonstrated the successful integration of the Soli sensor with RAPID-MIX tools such as Wekinator and Maximilian, using gesture recognition to control musical processes.

Workshop in “Musical Gesture as Creative Interface” Conference

On March 16th, Atau Tanaka and Francisco Bernardo of the Goldsmiths EAVI team delivered a workshop on “Interactive Applications in Machine Learning” to attendees of the International Conference on “Musical Gesture as Creative Interface” in Porto.

About 15 participants took a hands-on approach to electromyography and machine learning for music making, using the RAPID-MIX tools BITalino and Wekinator alongside Max for biosignal conditioning and sonification, with several synthesis and mapping approaches.
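
For a flavour of the biosignal conditioning involved (a minimal sketch of our own, not the Max patch used in the workshop), rectifying an EMG-like signal and smoothing it with a one-pole low-pass filter yields an envelope that can be mapped to synthesis parameters:

    // Rectify an EMG-like signal and smooth it with a one-pole low-pass
    // filter to obtain a control envelope (e.g. for synth gain).
    #include <cmath>
    #include <iostream>
    #include <vector>

    int main() {
        // Stubbed EMG samples; in the workshop these came from a BITalino board.
        std::vector<double> emg = {0.02, -0.5, 0.6, -0.55, 0.4, -0.1, 0.05, -0.02};
        double env = 0.0;
        const double alpha = 0.3;  // smoothing factor: higher = faster response
        for (double s : emg) {
            env += alpha * (std::fabs(s) - env);  // rectify, then smooth
            std::cout << env << "\n";
        }
    }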

The workshop came to a very successful end, with many participants completing the full setup and joining in experimentation with making music from biosignals.