Touch Feedback for VR and AR Without Gloves or Controllers
Imagine reaching into thin air and actually feeling a virtual button, a 3D model, or the texture of a fabric — with your bare hands. H-Reality combined focused ultrasound waves (think invisible sound beams pushing against your skin) with small vibrating rings on your fingers to create the sensation of touching objects that aren't physically there. The result is that VR and AR experiences stop feeling hollow and start feeling real, without bulky gloves or handheld controllers getting in the way.
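How can sound push against skin? Below is a minimal sketch of the generic principle behind mid-air ultrasound haptics, phased-array focusing: drive each transducer with a phase delay proportional to its distance from a target point, so the waves arrive in phase and superpose into a palpable pressure spot. This illustrates the physics only, not the project's firmware; the 16x16 layout, 40 kHz carrier, and all function names are assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at room temperature
CARRIER_HZ = 40_000.0    # 40 kHz carrier, typical for mid-air haptic arrays (assumption)

def transducer_grid(n=16, pitch=0.0105):
    """Positions of an n x n transducer array lying in the z = 0 plane (metres)."""
    xs = (np.arange(n) - (n - 1) / 2) * pitch
    gx, gy = np.meshgrid(xs, xs)
    return np.stack([gx.ravel(), gy.ravel(), np.zeros(n * n)], axis=1)

def focus_phases(positions, focal_point):
    """Phase delay per transducer (radians) so all waves arrive in phase at
    focal_point, creating a localized pressure spot you can feel on bare skin."""
    distances = np.linalg.norm(positions - focal_point, axis=1)
    wavelength = SPEED_OF_SOUND / CARRIER_HZ   # ~8.6 mm at 40 kHz
    return (2 * np.pi * distances / wavelength) % (2 * np.pi)

# Focus a pressure point 20 cm above the centre of the array.
phases = focus_phases(transducer_grid(), np.array([0.0, 0.0, 0.20]))
```

Sweeping that focal point around, or modulating its strength over time, is what turns a raw pressure spot into the feel of buttons, edges, and textures.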
What needed solving
Today's VR and AR experiences are all eyes and ears — users can see and hear virtual objects but cannot feel them. This makes training simulations less effective, product design reviews superficial, and remote operation of machinery imprecise. Without touch feedback, virtual environments remain fundamentally disconnected from how humans naturally interact with the physical world.
What was built
The project built 2 generations of ultrasound-based non-contact haptic prototypes (with firmware upgrades and software libraries for texture rendering) plus 2 demonstrators — one for 2D surface interaction and one for 3D object manipulation. It also developed wearable vibrotactile haptic rings for directional feedback and computational models for rendering specific material surface properties.
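The texture-rendering libraries themselves are not public, so the following is only a plausible sketch of what such a library does: amplitude-modulate the focal point at a frequency the skin senses well and scale the modulation depth by a texture map under the moving fingertip. The 200 Hz figure and every name here are assumptions, not the consortium's actual API.

```python
import numpy as np

def texture_amplitude(height_map, finger_uv, t, modulation_hz=200.0):
    """Drive amplitude in [0, 1] for the ultrasound focal point under the finger.

    Amplitude-modulating the carrier at a low frequency (200 Hz here, in the
    range skin mechanoreceptors sense well) makes the focal point perceptible;
    scaling the modulation depth by a grayscale height map as the fingertip
    moves renders rough regions as stronger vibration than smooth ones.
    """
    h, w = height_map.shape
    u = int(np.clip(finger_uv[0] * (w - 1), 0, w - 1))  # normalized [0, 1] coords
    v = int(np.clip(finger_uv[1] * (h - 1), 0, h - 1))
    depth = float(height_map[v, u])                     # 0 = smooth, 1 = rough
    envelope = 0.5 * (1.0 + np.sin(2.0 * np.pi * modulation_hz * t))
    return depth * envelope

# Example: a checkerboard texture sampled at the centre of the surface.
texture = np.indices((64, 64)).sum(axis=0) % 2 * 1.0
amp = texture_amplitude(texture, (0.5, 0.5), t=0.001)
```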
Who can put this to work
If you are a training simulation company dealing with the gap between visual VR training and real hands-on experience — this project developed ultrasound-based mid-air haptic prototypes and wearable haptic rings that let trainees feel resistance, texture, and object dynamics during virtual exercises. The consortium built 2 generations of non-contact haptic prototypes and 2 demonstrators covering both 2D surface and 3D object manipulation.
If you are a surgical simulation company struggling with the fact that surgeons cannot feel tissue resistance or instrument feedback in virtual practice — this project created haptic technology that renders material-specific surface properties through ultrasound and vibrotactile stimulation. The project explicitly targeted surgical skills training as a use case, with 7 consortium partners across 3 countries contributing psychophysical and tribological expertise.
If you are a product design firm where engineers review 3D models on flat screens without any tactile feedback — this project built demonstrators for intuitive 3D object manipulation using mid-air haptics. Designers could feel the shapes and textures of virtual prototypes with bare hands, replacing mouse-based rotation with natural gestures. The technology was validated through 2 demonstrators within the EUR 2,994,965 research program.
Quick answers
What would it cost to license or integrate this haptic technology?
The project was funded under FET Open (Future and Emerging Technologies) with EUR 2,994,965 in EU contribution across 7 partners. Licensing terms would need to be negotiated directly with the consortium, likely led by the University of Birmingham. The 2 SME partners in the consortium may already offer commercial pathways.
Can this scale to industrial deployment or mass production?
The project produced 2 generations of non-contact haptic prototypes with progressively improved signal processing and multi-board arrangements. However, as a FET Open research project, the technology was demonstrated in lab conditions rather than at production scale. Moving from demonstrator to product would require additional engineering and manufacturing partnerships.
Who owns the intellectual property and how can I access it?
IP is distributed among the 7 consortium partners across the UK, France, and the Netherlands, governed by their EU grant agreement. The University of Birmingham as coordinator would be the first point of contact. The 2 industry partners (including 2 SMEs) likely hold commercialization rights for their specific contributions.
How mature is this technology — is it ready to deploy?
The consortium delivered 2 prototype generations and 2 functional demonstrators — one for 2D surface interaction (month 27) and one for 3D object manipulation (month 42). This places the technology at a tested-prototype stage, suitable for pilot integration but not yet a turnkey product.
What hardware is needed to run this system?
Based on the deliverables, the system combines ultrasound transducer boards (with firmware for signal processing) and wearable vibrotactile rings. The Generation 2 prototype added software libraries for texture rendering and non-planar, multi-board arrangements. Specific hardware specs would need to be discussed with the consortium.
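As a rough picture of how the two actuator types could be driven together, the sketch below packages one update for both channels: mid-air pressure conveys contact at the fingertip, while the ring adds a directional cue. All types and field names are hypothetical, since the consortium's firmware interfaces are not public.

```python
from dataclasses import dataclass

@dataclass
class HapticFrame:
    """One update for the combined system; every field name is hypothetical."""
    focal_point: tuple            # (x, y, z) in metres, in array coordinates
    ultrasound_amplitude: float   # 0..1 drive level for the transducer boards
    ring_angle: float             # radians; direction of the vibrotactile cue
    ring_intensity: float         # 0..1 vibration strength for the worn ring

def render_step(fingertip, in_contact, cue_angle):
    """Compose the two channels: ultrasound renders contact at the fingertip,
    the ring renders a directional cue the mid-air focal point cannot."""
    return HapticFrame(
        focal_point=fingertip,
        ultrasound_amplitude=1.0 if in_contact else 0.0,
        ring_angle=cue_angle,
        ring_intensity=0.6 if in_contact else 0.0,
    )
```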
Is this compatible with existing VR/AR headsets and platforms?
The project focused on mid-air haptic interaction applicable to both VR and AR environments. Based on available project data, the demonstrators were built as standalone systems. Integration with commercial headsets like Meta Quest or HoloLens would likely require adaptation work with the consortium's software libraries.
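In practice, integration would likely amount to a loop like the one sketched below: read a fingertip position from the headset's hand-tracking runtime, test it against virtual geometry, and forward position plus amplitude to the haptics device. Both callables are placeholders; no public project or headset API is assumed.

```python
import numpy as np

def sphere_contact(fingertip, center, radius):
    """True when the tracked fingertip penetrates a virtual sphere."""
    return np.linalg.norm(np.asarray(fingertip) - np.asarray(center)) <= radius

def integration_loop(get_fingertip, send_haptics, frames=1000,
                     center=(0.0, 0.0, 0.25), radius=0.05):
    """Bridge a headset's hand tracking to a mid-air haptics device.

    get_fingertip: callable returning (x, y, z) in metres, e.g. a thin
    wrapper around a headset SDK's hand-tracking output (placeholder here).
    send_haptics: callable taking (position, amplitude) for the device
    (also a placeholder; no public project API is assumed).
    """
    for _ in range(frames):
        tip = get_fingertip()
        amplitude = 1.0 if sphere_contact(tip, center, radius) else 0.0
        send_haptics(tip, amplitude)
```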
Are there regulations or standards I should know about?
Ultrasound-based haptic devices must comply with safety limits for ultrasound exposure to human skin. Based on available project data, the consortium included tribological and psychophysical expertise to ensure safe interaction parameters. Any commercial deployment would need CE marking and potentially medical device certification if used in surgical training.
Who built it
The H-Reality consortium brings together 7 partners from 3 countries (UK, France, Netherlands) with a 29% industry ratio — meaning roughly 2 out of 7 partners come from industry, both of which are SMEs. The University of Birmingham leads the project, backed by 3 universities and 2 research organizations providing the scientific depth in ultrasound physics, skin mechanics, and psychophysics. The 2 industry SMEs are described as "commercial pioneers of ultrasonic non-contact haptics," suggesting they already have market presence in haptic hardware. This mix of academic depth and commercial haptics expertise makes the consortium credible for technology transfer, though the relatively small number of industry partners means commercialization will depend heavily on those 2 SMEs or new licensees stepping in.
- THE UNIVERSITY OF BIRMINGHAM · Coordinator · UK
- INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE · Third party · FR
- ULTRALEAP LIMITED · Participant · UK
- ACTRONIKA · Participant · FR
- INSTITUT NATIONAL DES SCIENCES APPLIQUEES DE RENNES · Third party · FR
- CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS · Participant · FR
- TECHNISCHE UNIVERSITEIT DELFT · Participant · NL
The University of Birmingham (UK) coordinated this project. Contact their research commercialization office or the principal investigator via the project website.
Talk to the team behind this work.
Want to explore licensing this haptic technology or connecting with the H-Reality team? SciTransfer can arrange an introduction and brief you on integration options.