If you are a surgical robotics manufacturer struggling to differentiate your platform beyond mechanical precision — this project developed reinforcement-learning-based, sensor-driven controllers that deliver functional accuracy in spine procedures such as pedicle screw placement and endoscopic lumbar discectomy. Its real-time spine tracking algorithms, based on hybrid ultrasound sensing, could be integrated into your next-generation surgical robot to reduce complication rates.
Smart Surgical Robots That Understand Tissue, Not Just Geometry, for Safer Spine Surgery
Surgical robots today are incredibly precise at moving to exact positions — but surgery isn't about hitting coordinates on a map. It's more like a chef knowing when dough feels right, not just measuring ingredients. FAROS gave surgical robots a sense of touch and awareness by adding non-visual sensors and AI that learns how a skilled surgeon would react to what the tissue actually feels like. They proved this on two of the trickiest spine procedures: placing screws in vertebrae and removing damaged discs.
What needed solving
Surgical robots are geometrically precise but surgically limited — they move exactly where told but don't understand what they're cutting into. In spine surgery, where every millimeter matters and tissue varies wildly between patients, purely position-based robots cannot adapt the way experienced surgeons do, leading to inconsistent outcomes and avoidable complications.
What was built
The team built sensor-equipped robotic controllers that use reinforcement learning to achieve functional accuracy in spine surgery. Key outputs include real-time spine tracking algorithms based on hybrid ultrasound sensing and public demonstration setups validated across 2 spine surgery procedures, with 19 deliverables in total.
Who can put this to work
If you are a hospital investing in robotic-assisted spine surgery but facing inconsistent outcomes due to patient variability — this project built autonomous robot controllers that adapt to real tissue conditions using non-visual sensors. The system was demonstrated across 2 critical spine surgery use cases and could help standardize surgical quality across your operating teams.
If you are a sensor company looking for new clinical applications — this project validated hybrid ultrasound sensing for real-time spine tracking during robotic surgery. Their work on relating non-conventional sensor signals to functional parameters like tissue type, bone quality, and implant stability opens a direct integration path for your sensor hardware into the growing surgical robotics market.
Quick answers
What would it cost to license or integrate this technology?
The project does not publish licensing terms or pricing. KU Leuven coordinates the IP, and with 6 consortium partners across 4 countries, licensing negotiations would likely involve multiple parties. Contact the coordinator through SciTransfer to discuss terms.
Can this scale to other surgical procedures beyond spine?
The core technology — non-visual sensors combined with reinforcement learning controllers — is built around the general concept of functional accuracy rather than spine-specific geometry. While the 2 demonstrated use cases are spine procedures (pedicle screw placement and endoscopic lumbar discectomy), the sensor-to-function mapping approach could extend to other surgeries where tissue variability matters.
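To make the idea of "sensor-to-function mapping" concrete, here is a minimal illustrative sketch. None of this code comes from FAROS: the signal names, thresholds, and feed rates are invented, and the hand-coded policy stands in for what a trained reinforcement-learning policy would output.

```python
from dataclasses import dataclass

# Hypothetical functional state derived from non-visual sensors.
# The field names and units are illustrative, not FAROS's actual signal set.
@dataclass
class FunctionalState:
    echo_intensity: float   # hybrid-ultrasound echo strength (arbitrary units)
    drill_force: float      # axial force on the tool (N)

def classify_tissue(state: FunctionalState) -> str:
    """Map raw sensor signals to a functional parameter (tissue type).
    Thresholds are made up for illustration."""
    if state.echo_intensity > 0.8 and state.drill_force > 20.0:
        return "cortical_bone"
    if state.echo_intensity > 0.4:
        return "cancellous_bone"
    return "soft_tissue"

def policy(state: FunctionalState) -> float:
    """Toy stand-in for a learned policy: choose a feed rate (mm/s)
    from the functional state rather than from position alone."""
    tissue = classify_tissue(state)
    return {
        "soft_tissue": 0.0,      # stop: possible breach into soft tissue
        "cancellous_bone": 1.5,  # advance normally
        "cortical_bone": 0.5,    # slow down in hard bone
    }[tissue]
```

The point of the sketch is the control-loop structure: the robot acts on what the tissue *is*, inferred from sensing, not only on where the tool *is* in space.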
What is the IP situation and who owns the results?
As a Horizon 2020 RIA project coordinated by KU Leuven with 6 partners, IP is typically shared according to each partner's contribution. The consortium includes 1 industry partner and 1 SME, which suggests some results may already have commercial exploitation plans. Specific patent filings would need to be confirmed directly.
Has this been tested in real surgical settings?
The project produced public demonstration setups for functional accuracy and physical intelligence, with planned demonstrations at DIH-HERO, ERF, and CRAS events. They also developed real-time spine tracking algorithms based on hybrid ultrasound sensing. Based on available project data, clinical validation details are not specified.
How does this integrate with existing surgical robot platforms?
FAROS focused on embedding physical intelligence through non-visual sensors and AI-driven controllers. The 19 deliverables include sensor systems and algorithmic components that could be layered onto existing robotic platforms. Integration specifics would depend on the target robot's architecture and sensor interfaces.
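One plausible integration pattern — sketched here as an assumption, not as FAROS's actual architecture — is a supervisory layer that wraps whatever command interface the existing robot already exposes and scales its motion by a sensor-derived safety factor. The `RobotPlatform` interface and method name below are hypothetical.

```python
from typing import Protocol

class RobotPlatform(Protocol):
    """Minimal command interface an existing robot might expose
    (hypothetical; real platforms differ)."""
    def set_tool_velocity(self, mm_per_s: float) -> None: ...

class FunctionalAccuracyLayer:
    """Supervisory layer that scales the commanded tool velocity by a
    sensor-derived safety factor in [0, 1], without touching the
    underlying platform's motion control."""
    def __init__(self, robot: RobotPlatform) -> None:
        self.robot = robot

    def command(self, desired_mm_per_s: float, safety_factor: float) -> None:
        factor = min(max(safety_factor, 0.0), 1.0)  # clamp to [0, 1]
        self.robot.set_tool_velocity(desired_mm_per_s * factor)
```

The design choice worth noting: by keeping the sensor intelligence in a wrapper rather than inside the platform's controller, the same layer could in principle sit on top of different robot vendors' APIs.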
What regulatory approvals does this have?
Based on available project data, regulatory clearance status is not mentioned. As an RIA research project, the technology would still require medical device certification (e.g., CE marking under MDR) before clinical deployment. The demonstration-stage maturity suggests regulatory submissions are likely a next step.
Is there ongoing support or follow-up development?
The project closed in June 2024 after 3.5 years of development. The consortium of 4 universities and 1 research organization provides a strong academic base for continued development. Whether commercial follow-up or a successor project is planned would need to be confirmed with the coordinator.
Who built it
The FAROS consortium is a compact, research-heavy team of 6 partners across 4 countries (Belgium, Switzerland, France, UK), led by KU Leuven — one of Europe's top research universities. With 4 universities and 1 research organization making up 83% of the partnership, this is clearly a science-driven project. The single industry partner and 1 SME (17% industry ratio) suggest limited commercial pull so far, which means a business partner entering now could shape how this technology reaches the market. The cross-border expertise spanning robotics, AI, and clinical surgery is strong, but commercialization will require additional industrial partnerships.
- KATHOLIEKE UNIVERSITEIT LEUVEN — Coordinator · BE
- UNIVERSITÄT ZÜRICH — Participant · CH
- CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS — Third party · FR
- KING'S COLLEGE LONDON — Participant · UK
- SORBONNE UNIVERSITE — Participant · FR
KU Leuven (Belgium) — contact via SciTransfer for coordinator introduction
Talk to the team behind this work.
Want to explore licensing this surgical robotics technology or partnering with the FAROS team? SciTransfer can arrange a direct introduction to the project coordinator at KU Leuven.