News

Orbe is in the running for the 2018 edition of the Innovation Radar Prize!

The Innovation Radar Prize will select the top innovators supported by the EU H2020 programme (@eu_h2020) during the past year. Orbe has been selected thanks to nodal.studio, our web platform for digital creatives. Would you help us and vote for us at https://ec.europa.eu/futurium/en/industrial-enabling-tech-2018/orbe?

Orbe created nodal.studio with the help of the research projects CoSiMa (ANR) and RAPID-MIX (H2020), in collaboration with the ISMM team at IRCAM (STMS Lab). With nodal.studio you can concretize your ideas and build your project with an accessible authoring tool. There is no need to install any application or plugin; just open nodal.studio in your web browser.

To learn more about nodal.studio, follow these links:
https://nodal.studio/
https://www.facebook.com/orbemobi/videos/325346658267847/
RAPID-MIX API WORKSHOP @ eNTERFACE’17


For this eNTERFACE’17 workshop, we are looking for participants who want to learn and explore how to employ the RAPID-MIX API in their creative projects, integrating sensing technologies, digital signal processing (DSP), and interactive machine learning (IML) for embodied interaction and audiovisual synthesis. It is helpful to think of RAPID-MIX-style projects as combining sensor inputs (LeapMotion, IMU, Kinect, BITalino, etc.) and media outputs with an intermediate layer of software logic, often including machine learning.
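
As a rough illustration of that intermediate layer, here is a minimal, self-contained C++ sketch. It is not the RAPID-MIX API itself (the type and function names below are purely illustrative): it maps a two-value sensor reading to a sound preset with a hand-rolled nearest-neighbour classifier, which is the kind of job the toolkit's interactive machine learning components handle for you.

// Minimal sketch of the "sensor -> machine learning -> media output" layer.
// Names here are illustrative, not RAPID-MIX API names.
#include <cstdio>
#include <vector>

struct Example {
    std::vector<double> input;   // e.g. accelerometer x/y, EMG features, ...
    int label;                   // which sound preset this pose should trigger
};

// Nearest-neighbour classification: return the label of the closest training example.
int classify(const std::vector<Example>& training, const std::vector<double>& input) {
    int best = -1;
    double bestDist = 1e9;
    for (const auto& ex : training) {
        double d = 0.0;
        for (size_t i = 0; i < input.size(); ++i) {
            double diff = ex.input[i] - input[i];
            d += diff * diff;
        }
        if (d < bestDist) { bestDist = d; best = ex.label; }
    }
    return best;
}

int main() {
    // Training data recorded while the performer holds two different poses.
    std::vector<Example> training = {
        {{0.1, 0.9}, 0},   // pose A -> preset 0
        {{0.8, 0.2}, 1},   // pose B -> preset 1
    };
    std::vector<double> liveSensor = {0.75, 0.3};   // a new reading from the sensor
    int preset = classify(training, liveSensor);
    std::printf("trigger preset %d\n", preset);     // stand-in for the media output stage
    return 0;
}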

Participants will gain practical experience with elements of the toolkit and with general concepts in ML, DSP and sensor-based interaction. We are adopting Design Sprints, an Agile UX approach, to deliver our workshops at eNTERFACE’17. This will be mutually beneficial: participants will learn how to use the RAPID-MIX API in creative projects, and we will learn from their experiences and improve the toolkit.

We also intend to kickstart the online community around this toolkit and model it after other creative communities and toolkits (e.g., Processing, openFrameworks, Cinder). eNTERFACE’17 participants will become this community’s power users and core members, and their resulting projects will be integrated as demonstrators for the toolkit.

Work plan and Schedule

Our work plan for July 3-7 dedicates each day to a specific subset of the RAPID-MIX API, with the possibility of a mentored project work extension period.

1. Sensor input, biosensors, training data
2. Signal processing and Feature Extraction
3. Machine Learning I: Classification and Regression
4. Machine Learning II: Temporal-based and gesture recognition
5. Multimodal data repositories and collaborative sound databases

Check out our full proposal: https://goo.gl/Q2DP1F
and our public Google Calendar, RAPID-MIX@eNTERFACE17 (Calendar ID: 39qpvg0ubciiou0medr73qnh6k@group.calendar.google.com),

and the flyer for the event: https://goo.gl/9DsDp5

If you would like to learn how to rapidly use machine learning technology for creative and expressive interaction, send a short CV or bio to:

rapidmix.enterface17@gmail.com

AGILE and the RAPID-API

We are excited to announce that we are working with AGILE, a European Union Horizon 2020 Research Project developing and building a modular software-and-hardware gateway for the Internet of Things (IoT).

We will be combining the RAPID-API with the modular hardware of the AGILE project. Their smart hardware provides a multitude of ways to interface with the real world through sensors, and the RAPID-API will give people ways to harness and use that real-world data.

You’ll be able to see some of the results of this collaboration at Adaptation, a media arts event organised by AGILE. Artists are invited to submit proposals to Adaptation for works which represent data through audiovisual experiences, and selected artists will work with technologists to realise their ideas. We will be working with AGILE to have the RAPID-API available to help this realisation. Find out more about Adaptation and how to get involved here.

Find out more about AGILE here.


BITalino at Smartgeometry 2016

BITalino, the board developed by RAPID-MIX partner PLUX that gives quick and easy access to biosignals, featured in an installation by the Atmospheric Delight team at Smartgeometry 2016 in Sweden, a workshop and conference exploring novel technologies. Read about it in more detail in Architect Magazine:

http://www.architectmagazine.com/technology/smartgeometry-2016-a-path-to-interpersonal-and-interplanetary-connections_o

There’s also a video for a deeper look:

Smartgeometry 2016: Atmospheric Delight from Marc Webb on Vimeo.

It’s great to see key Rapid Mix technologies being used for prototyping within the creative industries!

RAPID-MIX tools in Google ATAP Project Soli Developers workshop

Francisco Bernardo from the Goldsmiths EAVI team was recently invited by the Google ATAP team to Mountain View, California, to participate in the Project Soli developers workshop.

As part of the Soli Alpha Developers program, Francisco demonstrated the successful integration of the Soli sensor with some of the RAPID-MIX tools, such as Wekinator and Maximilian, using gesture recognition to control musical processes.

Workshop in “Musical Gesture as Creative Interface” Conference

On March 16th, Atau Tanaka and Francisco Bernardo from the Goldsmiths EAVI team delivered a workshop on “Interactive Applications in Machine Learning” to attendees of the International Conference on “Musical Gesture as Creative Interface” in Porto.

About 15 participants took a hands-on approach to electromyography and machine learning for music making, using the RAPID-MIX tools BITalino and Wekinator along with Max for biosignal conditioning and signal sonification, with several synthesis and mapping approaches.

The workshop was a great success, with many of the participants completing the full setup and engaging in joint experimentation with making music from biosignals.

Wekinator & Micro Bit on BBC News

Wekinator & Micro Bit on BBC News

Goldsmiths’ Rebecca Fiebrink demonstrates her work with RAPID-MIX technology Wekinator in one of these “Seven outstanding Micro Bit projects.”

http://www.bbc.com/news/technology-35824446

BITalino in Smart Automobile commercial

BITalino is used to implement a lie detector in this commercial for Smart Automobile:

RAPID MIX at Stanford University

Michael Zbyszyński from the RAPID-MIX team gave a talk at CCRMA (the Center for Computer Research in Music and Acoustics) at Stanford University, a long-time pioneer in music computing and the prior home of contemporary electronic music superstar Holly Herndon, as well as our own Atau Tanaka.

The audience included Thomas Rossing, John Chowning and Matt Wright. It was a chance to show some of the RAPID-MIX technologies to legendary figures in electronic music.

The Wekinator, Ableton Live and the LOOP summit

The Wekinator, a highly usable piece of software that helps you incorporate gestural interaction into artistic projects, got a nice mention on the Ableton blog. The post covers the recent Loop summit, which explored the cutting edge of music tech, and it rightly identifies the Wekinator’s potential for developing new musical instruments. The post also mentions RAPID MIX partners IRCAM and their CoSiMa project. Read more here: https://www.ableton.com/en/blog/new-frontiers-of-music-tech/

The Wekinator is a core RAPID MIX technology, developed by team member Rebecca Fiebrink. Behind the user friendly interface is a wealth of highly sophisticated machine learning technology that will be at the heart of many future RAPID MIX projects.

If you’ve not used Wekinator for your gestural music or art, you should have a play with it. Make sure you’ve got the latest version of Wekinator here, and keep us informed about what you’re doing with it. We’d love to hear about any projects using the Wekinator with Ableton. You can also still sign up for the online course “Machine Learning for Musicians and Artists”, taught by Rebecca, here.


RAPID MIX at Barcelona Music Hackday

We had the pleasure of attending the Barcelona Music Hackday this year. We made this short video about our visit:

 

RAPID MIX First Year Technology Prototypes

Here we have collected videos of the prototypes we’ve been developing over the first year of RAPID MIX. View them as a playlist on YouTube:

New Wekinator and Online Course in Machine Learning

RAPID MIX team member Rebecca Fiebrink has launched a new version of the machine learning software Wekinator, an incredibly powerful yet user-friendly toolkit for bringing expressive gestures into your music, art, making, and interaction design. This is a major new version that includes dynamic time warping alongside new classification and regression algorithms. You can download it here (for Mac/Windows/Linux) along with many new examples for connecting it to real-time music/animation/gaming/sensing environments: www.wekinator.org. The technology that drives Wekinator will feature in many RAPID MIX products that you’ll be hearing about in the future.
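
If you want to try connecting your own code, Wekinator communicates over OSC. By default it listens for input features on port 6448 at the address /wek/inputs (do check the ports shown in the Wekinator GUI for your version). As a hedged sketch, the following C++ snippet uses the oscpack library to stream two input values to a locally running Wekinator; environments such as Processing, Max/MSP or PD can do the same with their own OSC objects.

// Minimal sketch: send two input features to a locally running Wekinator over OSC.
// Assumes the oscpack library; Wekinator's default input port (6448) and input
// address (/wek/inputs) are used here, but verify them in the Wekinator GUI.
#include "osc/OscOutboundPacketStream.h"
#include "ip/UdpSocket.h"

int main() {
    const char* host = "127.0.0.1";
    const int port = 6448;                              // Wekinator's default listening port
    UdpTransmitSocket transmitSocket(IpEndpointName(host, port));

    char buffer[1024];
    osc::OutboundPacketStream packet(buffer, 1024);
    packet << osc::BeginMessage("/wek/inputs")
           << 0.25f << 0.75f                            // two input features, e.g. sensor x/y
           << osc::EndMessage;
    transmitSocket.Send(packet.Data(), packet.Size());  // Wekinator maps these to its trained outputs
    return 0;
}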

The launch coincides with that of a free online class run by Rebecca, Machine Learning for Musicians and Artists (https://www.kadenze.com/courses/machine-learning-for-musicians-and-artists/info). If you’re interested in machine learning for building real-time interactions, sign up! No prior machine learning knowledge or mathematical background is necessary. The course is probably most interesting for people who can already program in some environment (e.g., Processing, Max/MSP) but should still be accessible to people who can’t. The course features two guest lecturers, music technology researcher Baptiste Caramiaux and composer/instrument builder/performer Laetitia Sonami.
Lecture topics will include:
• What is machine learning?
• Common types of machine learning for making sense of human actions and sensor data, with a focus on classification, regression, and segmentation
• The “machine learning pipeline”: understanding how signals, features, algorithms, and models fit together, and how to select and configure each part of this pipeline to get good analysis results
• Off-the-shelf tools for machine learning (e.g., Wekinator, Weka, GestureFollower)
• Feature extraction and analysis techniques that are well-suited for music, dance, gaming, and visual art, especially for human motion analysis and audio analysis (see the short sketch after this list)
• How to connect your machine learning tools to common digital arts tools such as Max/MSP, PD, ChucK, Processing, Unity 3D, SuperCollider, OpenFrameworks
• Introduction to cheap & easy sensing technologies that can be used as inputs to machine learning systems (e.g., Kinect, computer vision, hardware sensors, gaming controllers)
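
As a small taste of that feature-extraction topic, here is a hedged, self-contained C++ sketch of ours (an illustration, not course material): it computes two classic low-level features over a window of samples, root-mean-square energy and zero-crossing rate, which are the kind of values you would typically feed to a classifier or regression model instead of the raw signal.

// Illustrative only: two simple features often computed before machine learning.
#include <cmath>
#include <cstdio>
#include <vector>

// Root-mean-square energy of a window: a rough loudness / effort measure.
double rms(const std::vector<double>& window) {
    double sum = 0.0;
    for (double s : window) sum += s * s;
    return std::sqrt(sum / window.size());
}

// Zero-crossing rate: how often the signal changes sign, a crude brightness measure.
double zeroCrossingRate(const std::vector<double>& window) {
    int crossings = 0;
    for (size_t i = 1; i < window.size(); ++i)
        if ((window[i - 1] < 0.0) != (window[i] < 0.0)) ++crossings;
    return static_cast<double>(crossings) / window.size();
}

int main() {
    // A toy window: one cycle of a sine wave sampled 64 times.
    const double pi = 3.141592653589793;
    std::vector<double> window(64);
    for (size_t i = 0; i < window.size(); ++i)
        window[i] = std::sin(2.0 * pi * i / window.size());

    std::printf("rms = %f, zcr = %f\n", rms(window), zeroCrossingRate(window));
    return 0;
}
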
Kurv Guitar on Sky News

We’re excited to announce more media coverage of the Kurv Guitar, one of the first commercial products to be powered by RAPID MIX technologies, on the Sky News “Swipe” show, as part of a special on new musical instruments. You can watch it here:

http://news.sky.com/story/1627847/virtual-instrument-makes-air-guitar-a-reality

The Kurv Guitar uses our own hardware design and bluetooth firmware, along with RAPID MIX gesture recognition and synthesis technologies, such as team member Mick Grierson’s Maximilian library.

 

ROLI releases NOISE: a free app that turns the iPhone into an expressive musical instrument

It’s certainly a day for exciting news at RAPID MIX!

RAPID MIX partners ROLI, builders of the beautiful Seaboard and owners of JUCE, have released an iPhone app that turns your phone into a truly expressive musical instrument.

Get NOISE from the App Store

NOISE is the most full-bodied instrument to ever hit the glass surface of an iPhone. It turns the iPhone screen into a continuous sonic surface that responds to the subtlest gestures. Taking advantage of the new 3D Touch technology of the iPhone 6s, the surface lets music-makers shape sound through Strike, Press, Glide, Slide, and Lift – the “Five Dimensions of Touch” that customers and critics have celebrated on ROLI’s award-winning Seaboard RISE.

As an instrument NOISE features 25 keywaves, includes 25 sounds, and has five faders for fine-tuning the touch-responsiveness of the surface. It is now available for music-makers of any skill level who want to explore a multidimensionally expressive instrument that fits in their pockets.

NOISE is also the ultimate portable sound engine for the Seaboard RISE and other MIDI controllers. Powered by Equator and using MIDI over Bluetooth, NOISE lets music-makers control sounds wirelessly from their iPhones. It is one of the first apps to enable Multidimensional Polyphonic Expression (MPE) through its MPE Mode, so the NOISE mobile sound engine works with any MPE-compatible controller. ROLI’s sound designers have crafted the app’s 25 preset sounds – which include Breath Flute and Extreme Loop Synth – especially for MPE expressivity. Additional sounds can be purchased in-app.

With a Seaboard RISE, a Flip Case, and NOISE on their iPhones, music-makers now have a connected set of portable tools for making expressive music on the go.

While it works with all models of iPhone from the iPhone 5 to the iPhone 6s, NOISE has been optimized to take full advantage of 3D Touch on the iPhone 6s.

 

Kurv Guitar Kickstarter Success!

We wrote earlier about the Kurv Guitar, a project powered by RAPID MIX technologies (you can read the original post here).

Today we’re very pleased to announce that it’s reached its funding goal, with 36 days still left to go!

Have a look at the Kickstarter here.

At its heart is a powerful combination of RAPID MIX gesture recognition and synthesis technologies (such as Mick Grierson’s Maximilian library) along with our own hardware design and bluetooth firmware.

 

PLUX releases the SnapBITs for BITalino

BITalino makers PLUX are one of the key industry partners working on the RAPID-MIX project, producing “Arduino”-like technologies specialised in body sensing and the prototyping of bio-hacking wearables. The RAPID-API will provide multiple ways to use this sensor data in new, creative ways.
Following the RAPID-MIX presence at the Barcelona Music Hack Day 2015, where hackers highlighted usability concerns about the typical wires, the BITalino team has taken this feedback on board and released the new SnapBITs, “getting rid of the pesky electrode cables to make life easier for BITalino users”.

Check out their website for more information, and stay tuned to the RAPID-MIX blog to find out how we can help you creatively use these exciting technologies.
Michael Zbyszyński joins the RAPID-MIX team!

We’re really excited to announce that Michael Zbyszyński has joined the RAPID-MIX team and relocated to Goldsmiths. He has a background as a musician, maker and coder, working at places inside and outside of academia including Avid, Cycling ’74 and CNMAT.
See his full bio on the Team page.

RAPID-MIX at the JUCE Summit

RAPID-MIX team member and Goldsmiths reader in creative computing Mick Grierson was present at the JUCE summit on 19-20th November, alongside representatives from Google, Native Instruments, Cycling ’74 and other key music industry players. Mick is Innovation Manager for RAPID-MIX, running the whole project as well as being a key developer of MIX technologies.

http://www.juce.com/summit#overview

Even if you haven’t heard of JUCE, chances are you’ve used it without knowing. It is an environment in which you can build plug-ins, music software and more, and is perhaps best known as the environment in which Cycling ’74’s Max/MSP is built and runs. It’s ever present in the background of a lot of audio software. The summit brought together key technologists in the music industry, and it was a pleasure and an honour to have a RAPID MIX presence amongst them.

Mick spoke between David Zicarelli (from Cycling ’74) and Andrew Bell (Cinder), presenting his C++ audio DSP engine Maximilian and describing how it integrates with JUCE, Cinder, openFrameworks and other tools. Maximilian is an incredibly powerful, lightweight audio engine which will form a key part of the RAPID-API we will be launching soon. If you want to dig deeper, check out the Maximilian library on GitHub here:

https://github.com/micknoise/Maximilian
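
To give a feel for what Maximilian code looks like, here is a minimal, hedged sketch based on our reading of the library’s headers (check the examples bundled with the repository for authoritative usage): it fills a buffer with one second of a 440 Hz sine tone, one sample at a time. In a real application a host such as JUCE, openFrameworks or RtAudio would stream those samples to the sound card.

// Hedged sketch: generate one second of a 440 Hz sine tone with Maximilian.
// Assumes maximilian.h/.cpp from the repository above are compiled into the project.
#include "maximilian.h"
#include <vector>

int main() {
    maxiSettings::setup(44100, 1, 512);   // sample rate, channels, buffer size
    maxiOsc osc;                          // one oscillator voice

    std::vector<double> buffer(44100);    // one second of mono audio
    for (double& sample : buffer) {
        sample = osc.sinewave(440.0);     // one output sample per call
    }
    // 'buffer' now holds raw samples ready to be streamed or written to disk.
    return 0;
}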

JUCE is now owned by RAPID-MIX partners ROLI, and we’re excited about future developments involving RAPID-MIX, JUCE and the creative music industry partners present at the summit. Stay tuned for more news!

Kurv Guitar Launch

We are delighted to announce the launch of the Kickstarter for the Kurv Guitar. This is the first of many projects and products that are powered by RAPID MIX technologies.

It’s the RAPID MIX gesture recognition and synthesis technologies, along with our own hardware design and bluetooth firmware, that make projects like the Kurv Guitar possible. We expect this to be the first of many musical interfaces, ranging from games to serious musical instruments, that help turn the human body into an expressive musical tool using RAPID MIX technologies. Check out their Kickstarter below:

Getting ready for the Co-Design workshop!

Just about to start our first Co-Design workshop for the MusicHackDay at Pompeu Fabra!

RAPID-MIX in Music Hack Day, Barcelona!

RAPID-MIX will be participating in Music Hack Day, Barcelona from June 17th to the 19th. We will be conducting a workshop on participatory design and showcasing some of the technologies that we are developing in talks and performances. Come and join us to get hands-on with new cutting-edge technologies for music!

Second Meeting at IRCAM, Paris

The next RAPID MIX meeting will be at IRCAM in Paris from the 20th – 22nd May.

This meeting is to plan and prepare for the workshops and user-centred design sessions we will be running before and during the Barcelona Hackday. We will review the different technologies that the partners can bring for people to work with, think about some early prototypes, and prepare strategies and challenges for the hackers. We’re excited about bringing these different technologies together and starting to put them in the hands of the artists, musicians and programmers who are going to be using them.

Barcelona Kickoff Meeting!

We are in Barcelona at the Music Technology Group, UPF, for our long-awaited kick-off meeting for our new EC Horizon 2020-funded ICT project, RAPID MIX. This is an initial meeting for all the partners to meet each other, discuss the different work packages that constitute the project, and start to get inspired by sharing research, products and technologies.

 

Nice to feel welcome with wayfinding!

Sergi Jorda (Music Technology Group) reminds us of project objectives

ReacTable Systems: Gunter Geiger

Alba Rosado from MTG on Workpackage 1
