Smart Sensors and Machine Learning That Make Devices Respond to Your Body and Gestures
Imagine your game controller, fitness tracker, or musical instrument could read your muscle signals, sense your gestures, and learn how you personally move — then adapt in real time. That's what RAPID-MIX built: an open toolkit combining wearable sensors (brain and muscle signals, motion tracking, touch feedback) with machine learning that adjusts to each user. The consortium packaged it all into a developer-friendly API so companies can plug these capabilities into their own products — whether that's a guitar that responds to how you feel, a rehab device that tracks your recovery, or a game that reads your body language.
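To make that record-then-adapt workflow concrete, here is a minimal sketch in Python of the kind of interactive machine learning such a toolkit enables. The class and function names below are illustrative assumptions, not the actual RAPID-API.

```python
# Hypothetical sketch of a record -> train -> run workflow; GestureModel is
# an illustrative toy, not the real RAPID-API.
import math

class GestureModel:
    """A toy 1-nearest-neighbour classifier over fixed-length feature vectors."""
    def __init__(self):
        self.examples = []  # list of (features, label) pairs

    def record(self, features, label):
        """Training phase: store one demonstrated example of a gesture."""
        self.examples.append((list(features), label))

    def classify(self, features):
        """Live phase: return the label of the closest stored example."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(self.examples, key=lambda ex: dist(ex[0], features))[1]

# The player demonstrates each gesture a few times...
model = GestureModel()
model.record([0.9, 0.1, 0.0], "punch")
model.record([0.1, 0.8, 0.2], "wave")

# ...then each incoming sensor frame is mapped to the nearest gesture.
print(model.classify([0.85, 0.15, 0.05]))  # -> "punch"
```

The point of this style of API is that end users supply the training examples themselves, so the mapping reflects how they personally move.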
What needed solving
Companies building interactive products — games, health devices, musical instruments — struggle to make human-device interaction feel natural. Traditional interfaces rely on buttons and touchscreens, missing the richness of body language, gestures, and physiological signals. Building custom sensor fusion with real-time machine learning from scratch is expensive and slow, putting smaller companies at a disadvantage against tech giants.
What was built
An open-source API (RAPID-API) combining wearable biosensors (EEG, EMG), motion sensors, and haptic actuators with adaptive machine learning middleware. Final commercial products (MIX systems) were delivered at TRL8 for music, gaming, and e-Health applications, along with API demonstrators for other industries and a standardized middleware protocol for multimodal data handling.
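As a rough illustration of what a standardized multimodal middleware frame could look like, here is a hedged Python sketch. The field names and JSON transport are assumptions for illustration, not the RAPID-MIX protocol specification.

```python
# Hypothetical multimodal data frame of the kind a middleware protocol might
# pass between the API and an end product; all field names are assumptions.
import json
import time
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class MultimodalFrame:
    timestamp: float        # seconds since epoch
    emg: List[float]        # muscle-signal channels
    eeg: List[float]        # brain-signal channels
    accel: List[float]      # 3-axis motion data
    haptic_cmd: float = 0.0 # actuator intensity to send back, 0..1

frame = MultimodalFrame(time.time(), [0.12, 0.07], [3.1, 2.8, 4.0], [0.0, -9.8, 0.1])
wire = json.dumps(asdict(frame))  # serialized for transport to the end product
print(wire)
```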
Who can put this to work
If you are a digital health company needing to track patient movement and physiological signals during rehabilitation — this project built multimodal hardware combining EEG and EMG biosensors with motion tracking and haptic feedback. The machine learning layer personalizes to each patient's movement patterns. The open-source RAPID-API lets you integrate these capabilities into your own devices without building sensor fusion from scratch.
If you are a music technology company looking to create instruments that respond to a performer's gestures and physical expression — this project delivered finished MIX products for music applications at TRL8. The consortium included 4 creative industries SMEs who built real products using the RAPID-API, proving the technology works for expressive musical interaction; the project completed 25 deliverables along the way.

If you are a game studio struggling to make interactions feel natural beyond button-mashing — this project developed the RAPID-API with motion sensors, haptic actuators, and machine learning that adapts to each player's body movements. The final products reached TRL8, meaning they were completed and qualified for integration. With 6 SME partners already building products on this API, the technology is proven for interactive entertainment.
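To make the personalization idea behind these use cases concrete, here is a minimal sketch, assuming a per-user calibration step. PatientProfile and the elbow-angle values are hypothetical examples, not part of the RAPID-API.

```python
# Illustrative only: a minimal per-user calibration of the kind a
# personalization layer implies; names and data are hypothetical.
from statistics import mean, stdev

class PatientProfile:
    """Stores one patient's baseline so later sessions are scored against
    their own movement range, not a population average."""
    def __init__(self, calibration_samples):
        self.baseline = mean(calibration_samples)
        self.spread = stdev(calibration_samples)

    def score(self, sample):
        # z-score relative to this patient's own calibration
        return (sample - self.baseline) / self.spread

profile = PatientProfile([42.0, 45.5, 44.1, 43.2])  # elbow angles, day one
print(round(profile.score(51.0), 2))                # progress vs. own baseline
```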
Quick answers
What would it cost to integrate RAPID-MIX technology into our product?
The RAPID-API is open source, so the core software libraries, hardware designs, and middleware are available at no licensing cost. The EU invested EUR 2,330,004 in developing this technology across 9 partners. Your integration costs would depend on hardware components and development effort to adapt the API to your specific product.
Can this scale to mass production or is it still a lab demo?
The final MIX products reached TRL8 — meaning system complete and qualified, one step below commercial deployment. The project specifically used an iterative rapid development cycle moving from TRL6 through TRL7 to TRL8 across three product delivery phases. With 6 SME partners already building commercial products on the platform, the technology is designed for production use.
What is the IP situation — can we freely use this technology?
The RAPID-API is explicitly described as open source, exposing application-level access to software libraries, hardware designs, and middleware layers. This was a deliberate design choice to enable broader industries and DIY communities to use the API in their own products in cost-effective ways. Specific IP terms should be confirmed with the coordinator.
How mature is the machine learning component — does it actually work in real-time?
The project name itself emphasizes 'Realtime Adaptive Prototyping.' The machine learning software was specifically built to adapt to expressive human variation in real time, allowing fluid interaction and personalized experience. The TRL8 qualification of final products confirms this works outside the lab.
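As a sketch of what adapting "in real time" can mean in practice, the toy loop below nudges a stored gesture template toward each incoming sensor frame. The update rule, rate, and names are assumptions for illustration, not the project's actual algorithm.

```python
# Hedged sketch of online adaptation: the stored gesture template drifts
# toward what the user actually does, one frame at a time.
def adapt(template, frame, rate=0.05):
    """Online update: nudge each template dimension toward the live frame."""
    return [t + rate * (f - t) for t, f in zip(template, frame)]

template = [0.9, 0.1, 0.0]  # initial "punch" template
for live_frame in ([0.8, 0.2, 0.1], [0.7, 0.25, 0.1]):  # simulated stream
    template = adapt(template, live_frame)
print([round(t, 3) for t in template])
```

Because each update is a constant-time pass over the feature vector, this style of adaptation fits within the latency budget of a live interaction loop.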
What sensors and signals does this actually support?
The system integrates physiological biosignals including EEG (brain activity) and EMG (muscle signals) in multimodal hardware configurations with motion sensors and haptic actuators. The middleware protocol handles the exchange of multimodal data between the RAPID-API and end products. Based on available project data, WiFi and IoT connectivity are also part of the technology stack.
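To illustrate the kind of multimodal feature extraction this implies, here is a small sketch that fuses an EMG activation feature with motion magnitude. The window contents and the fuse helper are illustrative assumptions, not the project's signal pipeline.

```python
# Sketch of simple multimodal feature extraction; sample values are made up.
import math

def rms(window):
    """Root-mean-square of one EMG window: a standard muscle-activation feature."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def fuse(emg_window, accel_sample):
    """Combine muscle activation with motion magnitude into one feature vector."""
    accel_mag = math.sqrt(sum(a * a for a in accel_sample))
    return [rms(emg_window), accel_mag]

print(fuse([0.1, -0.2, 0.15, -0.12], [0.0, -9.8, 0.3]))
```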
Who built this and can we still reach them?
The coordinator is Universidad Pompeu Fabra in Spain, a leading research university. The consortium of 9 partners across 4 countries (Spain, France, Portugal, UK) included 6 SMEs and 4 creative industries companies who were direct technology users. SciTransfer can facilitate introductions to the relevant partners.
How long would integration take for a working prototype?
The project was specifically designed around rapid prototyping cycles: the consortium delivered early prototypes, then iterated through three versions (TRL6, TRL7, TRL8). The API demonstrators showed the technology applied to different domains, suggesting the API is structured for relatively fast integration. Based on available project data, the full development cycle from prototype to qualified product took approximately 3 years across the consortium.
Who built it
This is a commercially-oriented consortium with 67% industry participation — 6 out of 9 partners are SMEs, which is unusually high for an EU project. The 4 creative industries SMEs and 1 wearable technology SME were not just advisors but active product builders using the RAPID-API. Three leading research institutions (including coordinator Universidad Pompeu Fabra) provided the scientific backbone. The 4-country spread across Spain, France, Portugal, and the UK gives decent European market coverage. The fact that this was funded as an Innovation Action (not a Research Action) confirms the EU considered this close to market from the start. With EUR 2,330,004 invested and products reaching TRL8, the consortium was structured to deliver commercial-grade technology, not academic papers.
- UNIVERSIDAD POMPEU FABRA · Coordinator · ES
- INSTITUT DE RECHERCHE ET DE COORDINATION ACOUSTIQUE MUSIQUE - IRCAM · Participant · FR
- GOLDSMITHS' COLLEGE · Participant · UK
- ROLI Ltd · Participant · UK
- PLUX - WIRELESS BIOSIGNALS S.A. · Participant · PT
- AUDIOGAMING · Participant · FR
Universidad Pompeu Fabra (Barcelona, Spain) — SciTransfer can facilitate a direct introduction to the project coordinator and relevant SME partners
Talk to the team behind this work.
Want to explore how RAPID-MIX sensor fusion and adaptive ML can power your next product? SciTransfer connects you directly with the team that built it — from open-source API access to custom integration support.