SciTransfer
H-Reality · Project

Touch Feedback for VR and AR Without Gloves or Controllers

Digital · Tested · TRL 4

Imagine reaching into thin air and actually feeling a virtual button, a 3D model, or the texture of a fabric — with your bare hands. H-Reality combined focused ultrasound waves (think invisible sound beams pushing against your skin) with small vibrating rings on your fingers to create the sensation of touching objects that aren't physically there. The result is that VR and AR experiences stop feeling hollow and start feeling real, without bulky gloves or handheld controllers getting in the way.
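The "invisible sound beams" work on a well-established principle: an array of small ultrasound transducers (typically driven at 40 kHz) is phased so that all wavefronts arrive in step at one point in mid-air, concentrating acoustic pressure there. As an illustrative sketch only (not the project's actual code; the array size, pitch, and carrier frequency below are assumptions typical of mid-air haptics hardware), here is how those per-element phase offsets can be computed:

```python
import numpy as np

# Illustrative sketch: steering a phased array of 40 kHz ultrasound
# transducers to a focal point by offsetting each element's phase so
# all wavefronts arrive in phase at the target.

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C
CARRIER_HZ = 40_000.0    # typical mid-air haptics carrier frequency

def focus_phases(transducer_xy, focal_point):
    """Per-transducer phase offsets (radians) focusing at focal_point.

    transducer_xy: (N, 2) element positions in the array plane (m).
    focal_point:   (x, y, z) target above the array (m).
    """
    positions = np.column_stack([transducer_xy,
                                 np.zeros(len(transducer_xy))])
    distances = np.linalg.norm(positions - np.asarray(focal_point), axis=1)
    wavelength = SPEED_OF_SOUND / CARRIER_HZ
    k = 2 * np.pi / wavelength
    # Farther elements fire earlier: phase = -k * distance (wrapped to 2*pi)
    return (-k * distances) % (2 * np.pi)

# Assumed layout: 16x16 grid at 10.5 mm pitch, focus 20 cm above centre
xs = (np.arange(16) - 7.5) * 0.0105
grid = np.array([(x, y) for x in xs for y in xs])
phases = focus_phases(grid, (0.0, 0.0, 0.20))
```

Moving the focal point, or switching it on and off rapidly, is what makes buttons, shapes, and textures feel present under a bare hand.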

By the numbers
EUR 2,994,965
EU contribution for haptic technology research
7
consortium partners across 3 countries
2
generations of non-contact haptic prototypes built
2
functional demonstrators (2D surface + 3D object manipulation)
15
total project deliverables completed
2
SME partners in the consortium
The business problem

What needed solving

Today's VR and AR experiences are all eyes and ears — users can see and hear virtual objects but cannot feel them. This makes training simulations less effective, product design reviews superficial, and remote operation of machinery imprecise. Without touch feedback, virtual environments remain fundamentally disconnected from how humans naturally interact with the physical world.

The solution

What was built

The project built 2 generations of ultrasound-based non-contact haptic prototypes (with firmware upgrades and software libraries for texture rendering) plus 2 demonstrators — one for 2D surface interaction and one for 3D object manipulation. It also developed wearable vibrotactile haptic rings for directional feedback and computational models for rendering specific material surface properties.

Audience

Who needs this

VR/AR training simulation companies for hazardous industries
Surgical simulation and medical training platform developers
3D product design and digital prototyping software firms
Remote equipment operation and telerobotics companies
Interactive museum and retail experience designers
Business applications

Who can put this to work

Industrial Training & Simulation
mid-size
Target: Companies providing VR-based operator training for heavy machinery, oil rigs, or hazardous environments

If you are a training simulation company dealing with the gap between visual VR training and real hands-on experience — this project developed ultrasound-based mid-air haptic prototypes and wearable haptic rings that let trainees feel resistance, texture, and object dynamics during virtual exercises. The consortium built 2 generations of non-contact haptic prototypes and 2 demonstrators covering both 2D surface and 3D object manipulation.

Medical Device & Surgical Simulation
enterprise
Target: Surgical training platform developers and medical device interface designers

If you are a surgical simulation company struggling with the fact that surgeons cannot feel tissue resistance or instrument feedback in virtual practice — this project created haptic technology that renders material-specific surface properties through ultrasound and vibrotactile stimulation. The project explicitly targeted surgical skills training as a use case, with 7 consortium partners across 3 countries contributing psychophysical and tribological expertise.

Product Design & CAD/CAM
any
Target: 3D design software companies and digital prototyping firms

If you are a product design firm where engineers review 3D models on flat screens without any tactile feedback — this project built demonstrators for intuitive 3D object manipulation using mid-air haptics. Designers could feel shapes and textures of virtual prototypes with bare hands, replacing mouse-based rotation with natural gestures. The technology was validated through 2 demonstrators over a EUR 2,994,965 research program.

Frequently asked

Quick answers

What would it cost to license or integrate this haptic technology?

The project was funded under FET Open (Future and Emerging Technologies) with EUR 2,994,965 in EU contribution across 7 partners. Licensing terms would need to be negotiated directly with the consortium, likely led by the University of Birmingham. The 2 SME partners in the consortium may already offer commercial pathways.

Can this scale to industrial deployment or mass production?

The project produced 2 generations of non-contact haptic prototypes with progressively improved signal processing and multi-board arrangements. However, as a FET Open research project, the technology was demonstrated in lab conditions rather than at production scale. Moving from demonstrator to product would require additional engineering and manufacturing partnerships.

Who owns the intellectual property and how can I access it?

IP is distributed among the 7 consortium partners across the UK, France, and the Netherlands, governed by their EU grant agreement. The University of Birmingham as coordinator would be the first point of contact. The 2 industry partners (including 2 SMEs) likely hold commercialization rights for their specific contributions.

How mature is this technology — is it ready to deploy?

The consortium delivered 2 prototype generations and 2 functional demonstrators — one for 2D surface interaction (month 27) and one for 3D object manipulation (month 42). This places the technology at a tested-prototype stage, suitable for pilot integration but not yet a turnkey product.

What hardware is needed to run this system?

Based on the deliverables, the system combines ultrasound transducer boards (with firmware for signal processing) and wearable vibrotactile rings. The Generation 2 prototype added software libraries for texture rendering and non-planar, multi-board arrangements. Specific hardware specs would need to be discussed with the consortium.
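A 40 kHz carrier is far above what skin can feel, so mid-air haptic systems generally make the focal point perceptible by amplitude-modulating the carrier at frequencies the skin's mechanoreceptors respond to (roughly 100-300 Hz); varying the modulation envelope is one common way to suggest different textures. The sketch below illustrates that general principle under assumed parameters; it is not the project's firmware or its texture-rendering libraries:

```python
import numpy as np

# Illustrative sketch: amplitude-modulating an ultrasonic carrier so a
# mid-air focal point is felt as a low-frequency "buzz" on the skin.

def modulated_carrier(duration_s, carrier_hz=40_000.0,
                      mod_hz=200.0, sample_rate=1_000_000):
    """Drive signal for one transducer: carrier with a 0..1 AM envelope."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * mod_hz * t))
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

sig = modulated_carrier(0.01)  # a 10 ms burst, felt as a 200 Hz vibration
```

Changing `mod_hz` or shaping the envelope over time is one way texture-like variation can be rendered at the focal point.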

Is this compatible with existing VR/AR headsets and platforms?

The project focused on mid-air haptic interaction applicable to both VR and AR environments. Based on available project data, the demonstrators were built as standalone systems. Integration with commercial headsets like Meta Quest or HoloLens would likely require adaptation work with the consortium's software libraries.

Are there regulations or standards I should know about?

Ultrasound-based haptic devices must comply with safety limits for ultrasound exposure to human skin. Based on available project data, the consortium included tribological and psychophysical expertise to ensure safe interaction parameters. Any commercial deployment would need CE marking and potentially medical device certification if used in surgical training.

Consortium

Who built it

The H-Reality consortium brought together 7 partners from 3 countries (UK, France, Netherlands), with roughly 29% of the consortium from industry: 2 of the 7 partners are companies, both SMEs. The University of Birmingham led the project, backed by 3 universities and 2 research organizations providing scientific depth in ultrasound physics, skin mechanics, and psychophysics. The 2 industry SMEs are described as "commercial pioneers of ultrasonic non-contact haptics," suggesting they already have a market presence in haptic hardware. This mix of academic depth and commercial haptics expertise makes the consortium credible for technology transfer, though with only 2 industry partners, commercialization will depend heavily on those SMEs or on new licensees stepping in.

How to reach the team

The University of Birmingham (UK) coordinated this project. Contact their research commercialization office or the principal investigator via the project website.

Next steps

Talk to the team behind this work.

Want to explore licensing this haptic technology or connecting with the H-Reality team? SciTransfer can arrange an introduction and brief you on integration options.