SciTransfer
WhoLoDancE · Project

AI-Powered Motion Capture Platform That Teaches Movement Through Immersive Training

Digital · Prototype · TRL 5 · Thin data (2/5)

Imagine having a virtual dance coach that watches your every move, compares it to a master dancer, and tells you exactly what to fix — in real time, inside a 3D environment. That's basically what this project built. They created a smart system that captures dance movements, stores them in a searchable library, and uses AI to give personalized feedback. Think of it like a spell-checker, but for body movement instead of words.

By the numbers
EUR 3,332,585
EU funding for development
10
consortium partners
7
countries represented
46
total deliverables produced
3
SME partners in consortium
The business problem

What needed solving

Companies in fitness, entertainment, and rehabilitation need affordable ways to capture, analyze, and give real-time feedback on human body movement — but professional motion capture systems are expensive and require specialized expertise. There is no widely available platform that lets non-experts build movement training applications with AI-powered assessment and personalized coaching.

The solution

What was built

A proof-of-concept motion capture repository with AI similarity search enabling movement interpolation, extrapolation, and synthesis. Immersive real-time training interfaces with automated assessment and personalized feedback. Hardware evaluation of 3D holographic and non-holographic display methods for body-movement visualization. 46 deliverables in total covering capture, cataloguing, analysis, and interactive learning tools.
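To make the "AI similarity search" idea concrete, here is a minimal sketch of how a movement repository might find the stored capture closest to a query. It is purely illustrative, not the project's actual implementation: it assumes each capture is a time series of joint angles (one value per frame for brevity) and uses dynamic time warping (DTW) to compare sequences of different lengths. All names and data are hypothetical.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    # cost[i][j] = best alignment cost of a[:i] against b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a frame in a
                                 cost[i][j - 1],      # skip a frame in b
                                 cost[i - 1][j - 1])  # match frames
    return cost[n][m]

def most_similar(query, library):
    """Return the name of the stored capture closest to the query."""
    return min(library, key=lambda name: dtw_distance(query, library[name]))

# Tiny illustrative "repository" of captured movements
library = {
    "plie":      [0.0, 0.2, 0.5, 0.9, 0.5, 0.2, 0.0],
    "arabesque": [0.0, 0.4, 0.8, 1.2, 1.4],
}
query = [0.0, 0.1, 0.5, 0.8, 0.5, 0.1]
print(most_similar(query, library))  # prints the closest stored movement
```

A production system would compare full skeletal poses (dozens of joints per frame) rather than a single angle, but the retrieval logic is the same shape: a distance function over time series plus a nearest-neighbour lookup.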

Audience

Who needs this

Sports tech companies building AI-powered athlete coaching platforms
Animation and game studios needing motion capture libraries with AI synthesis
Digital health startups developing remote physiotherapy and rehabilitation apps
EdTech companies creating immersive skills training for physical professions
VR/AR studios building body-tracking interactive experiences
Business applications

Who can put this to work

Fitness & Sports Technology
SME
Target: Sports tech companies building athlete training platforms

If you are a sports tech company struggling to give athletes precise movement feedback without expensive personal coaches — this project developed a motion capture repository with AI-driven similarity search and real-time assessment tools that can analyze body movement and deliver personalized correction. The system was built across 10 partners in 7 countries and produced 46 deliverables covering capture, analysis, and feedback.

Entertainment & Animation
mid-size
Target: Game studios and animation houses needing motion libraries

If you are an animation studio spending weeks manually creating realistic movement sequences — this project built a motion capture repository designed for interpolation, extrapolation, and synthesis of movements through similarity search. The technology enables new ways to remix and generate realistic motion data, reducing the manual effort of creating movement content from scratch.
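The interpolation idea above can be sketched in a few lines. This is a hypothetical illustration, not the project's code: it assumes a pose is a dict of joint angles in radians, and generates in-between frames by linearly blending two captured key poses.

```python
def blend_pose(pose_a, pose_b, t):
    """Linearly interpolate two poses; t=0 gives pose_a, t=1 gives pose_b."""
    return {joint: (1 - t) * pose_a[joint] + t * pose_b[joint]
            for joint in pose_a}

def synthesize(pose_a, pose_b, frames):
    """Generate a transition sequence between two captured key poses."""
    return [blend_pose(pose_a, pose_b, i / (frames - 1)) for i in range(frames)]

# Illustrative key poses (joint angles in radians)
crouch = {"hip": 1.2, "knee": 1.9, "ankle": 0.4}
stand  = {"hip": 0.1, "knee": 0.2, "ankle": 0.0}

sequence = synthesize(crouch, stand, frames=5)
print(sequence[2]["knee"])  # knee angle midway between crouch and stand
```

Real motion synthesis would interpolate rotations properly (e.g. quaternion slerp) and blend whole sequences rather than single poses, but this shows why a searchable repository pays off: new movement content can be remixed from existing captures instead of recorded from scratch.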

Rehabilitation & Physiotherapy
any
Target: Digital health companies building remote rehab platforms

If you are a digital health company trying to deliver movement-based rehabilitation remotely — this project developed multi-modal interaction technologies with real-time assessment and adaptive feedback for body movement. The immersive training interfaces and personalized learning engine could be adapted to guide patients through physical therapy exercises with automated progress tracking.
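The real-time assessment loop described above can be sketched as follows. This is a minimal illustration under assumed names and thresholds, not the project's engine: live poses arrive as dicts of joint angles, each exercise step has a reference pose, and joints deviating beyond a tolerance are flagged with a correction.

```python
def assess(live, reference, tolerance=0.15):
    """Return per-joint corrections where the live pose deviates too far.

    tolerance is an illustrative threshold in radians.
    """
    feedback = {}
    for joint, target in reference.items():
        error = live[joint] - target
        if abs(error) > tolerance:
            feedback[joint] = "lower" if error > 0 else "raise"
    return feedback

# Illustrative physiotherapy step: target pose vs. what the sensor sees
reference = {"elbow": 1.5, "shoulder": 0.8}
live      = {"elbow": 1.9, "shoulder": 0.82}
print(assess(live, reference))  # prints {'elbow': 'lower'}
```

Running this check every frame, and easing the tolerance as a patient improves, is the basic mechanism behind automated progress tracking and adaptive feedback.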

Frequently asked

Quick answers

What would it cost to license or adapt this technology?

The project was funded with EUR 3,332,585 in EU contribution across a 10-partner consortium. Licensing terms would need to be negotiated with the coordinator LYNKEUS (Italy, SME). As an EU-funded project, results may be subject to open access obligations, but commercial licensing of specific tools is typically handled by the IP-holding partners.

Can this scale beyond dance to other movement training applications?

The core technology — motion capture, AI-based similarity search, real-time feedback, and adaptive personalized learning — is movement-generic. The project explicitly built tools for capture, cataloguing, analysis, and synthesis of body motion. Adapting to sports training, physiotherapy, or animation would require new movement libraries but the underlying platform architecture transfers.

Who owns the intellectual property?

IP is distributed among the 10 consortium partners across 7 countries (DE, EL, ES, FR, IT, NL, UK). The coordinator LYNKEUS is an Italian SME. Specific IP ownership depends on which partner developed which component. Contact the coordinator for licensing details on specific modules.

What was actually built and demonstrated?

The project delivered a proof-of-concept motion capture repository with similarity search enabling interpolation, extrapolation, and synthesis of dance movements. They also built immersive real-time training interfaces with assessment and feedback, plus evaluated 3D holographic display methods for immersive body-movement visualization. A total of 46 deliverables were produced.

How mature is this technology for commercial deployment?

The project explicitly describes its output as a proof-of-concept. The hardware assessment deliverable evaluated 3D holographic and non-holographic display methods for their suitability for immersive body-movement visualization. This places the technology at demonstration stage — functional but requiring further engineering for commercial product readiness.

Is there regulatory compliance needed for movement tracking?

Based on available project data, no specific regulatory certifications were mentioned. However, commercial deployment involving body tracking would need to address GDPR for biometric data processing in the EU, particularly if the system captures and stores identifiable movement patterns of individuals.

What technical infrastructure is needed to run this?

The system requires motion capture hardware, processing infrastructure for AI-based movement analysis, and display technology for immersive feedback. The project evaluated multiple 3D holographic and non-holographic volume display methods. The project's EuroSciVoc keywords include "mobile phones", suggesting some components may run on consumer devices.

Consortium

Who built it

The WhoLoDancE consortium brings together 10 partners from 7 European countries (DE, EL, ES, FR, IT, NL, UK), with a balanced mix of 3 universities, 3 industry players, 1 research organization, and 3 other entities. The 30% industry ratio and 3 SMEs signal genuine commercial interest, though the project leans more toward research than market deployment. The coordinator LYNKEUS is an Italian SME, which suggests the technology was developed with eventual commercialization in mind. The geographic spread across major EU markets means the technology has been tested with diverse cultural and technical requirements.

How to reach the team

LYNKEUS is an Italian SME that coordinated the project. Contact through the project website or CORDIS contact form for licensing inquiries.

Next steps

Talk to the team behind this work.

Want to explore how WhoLoDancE's motion capture and AI feedback technology could fit your training or content platform? SciTransfer can connect you with the right consortium partner for your use case.