Outcomes


Products

The resulting new and augmented products, based on multimodal devices, adaptive apps, and wearables, will be validated in the context of major creative industries, including video games and music technology, as well as the broader quantified-self and e-Health markets. They will be evaluated through first-unit sales to business-to-business and business-to-consumer customers, market valuation and investment reports, market replication, and job creation.

Prototypes

Starting from proof-of-concept technologies developed in our research labs and validated in relevant environments through the partner SMEs' use cases, the new multimodal interfaces will be designed around the needs of end users while also addressing the market needs expressed by the partner SMEs and those envisioned by external market experts.

RAPID-API

The RAPID-API toolkit (hardware & software) for product integration includes:

  • Low-cost, miniaturised hardware solutions for fast prototyping
  • Interactive machine learning algorithms for motion recognition and classification (illustrated in the sketch after this list)
  • Real-time machine learning and feature extraction for biosignal analysis
  • Context-aware machine listening and computer vision techniques
  • Physiological data sonification methods
  • Haptic feedback interfaces
  • Agile coding frameworks for mobile, multi-touch applications
  • Online multimodal data storage and visualisation
  • Social media platforms for collaborative music meta-tagging
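To give a concrete sense of the interactive machine learning component, the sketch below shows the general workflow such a toolkit supports: a user records a handful of labelled motion examples, the model is retrained immediately, and incoming sensor frames are classified in real time. The names used here (MotionClassifier, window_features) and the accelerometer features are illustrative assumptions, not the actual RAPID-API interface.

    """Minimal sketch of an interactive machine learning loop for motion
    recognition: record a few labelled examples, retrain instantly, classify
    new sensor windows in real time. Names and features are assumptions for
    illustration only, not the RAPID-API itself."""
    import math
    from collections import Counter


    class MotionClassifier:
        """Tiny k-nearest-neighbour classifier over fixed-length feature vectors."""

        def __init__(self, k=3):
            self.k = k
            self.examples = []  # list of (feature_vector, label) pairs

        def add_example(self, features, label):
            """Record one demonstration; 'training' is simply storing the example."""
            self.examples.append((tuple(features), label))

        def classify(self, features):
            """Return the majority label among the k closest stored examples."""
            if not self.examples:
                raise ValueError("no training examples recorded yet")
            dists = sorted(
                (math.dist(features, stored), label)
                for stored, label in self.examples
            )
            votes = Counter(label for _, label in dists[: self.k])
            return votes.most_common(1)[0][0]


    def window_features(samples):
        """Summarise a window of (x, y, z) accelerometer samples as per-axis mean and range."""
        features = []
        for axis in zip(*samples):
            features.append(sum(axis) / len(axis))
            features.append(max(axis) - min(axis))
        return features


    if __name__ == "__main__":
        clf = MotionClassifier(k=1)
        # Two demonstrated gestures, each a short window of made-up accelerometer data.
        clf.add_example(window_features([(0.0, 0.0, 1.0), (0.1, 0.0, 1.0)]), "still")
        clf.add_example(window_features([(0.9, 0.2, 0.1), (1.1, -0.3, 0.2)]), "shake")
        # Classify a new incoming window.
        print(clf.classify(window_features([(1.0, 0.1, 0.15), (0.95, -0.2, 0.1)])))

In a full toolkit, logic of this kind would sit behind the real-time sensor pipeline and feed downstream mappings such as sonification or haptic feedback.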

Methodology

We adopt a multidisciplinary approach, combining technological disciplines such as electrical engineering, machine learning, and physiological and wearable computing with human-centric fields such as human-computer interaction, user-centred design, product design, and perception studies. Consortium partners have proven experience transferring research in multimodal interaction to creative industry domains such as music and video games.