If you are an e-commerce company dealing with high product return rates because customers say items look different in person — this project developed prototype software for perceptual appearance design that lets you display products under virtual dynamic illumination, helping customers see realistic material textures and finishes before buying. The project had 4 industry partners involved in translating research into practical tools.
Making Virtual Materials Look and Feel Real for Product Design and VR
You know how things in video games or online shopping still look a bit "off" — the leather jacket doesn't quite look like leather, the wooden table seems fake? DyViTo figured out why our brains notice those differences and built software tools that make virtual materials look and feel convincing. They studied how our eyes and hands work together to judge whether something looks right, especially when it moves or the lighting changes. The results are prototype tools that help designers create virtual products and spaces that actually trick your senses into believing they're real.
What needed solving
Businesses creating virtual products, VR environments, or digital showrooms struggle to make materials look and feel convincing. Customers return products bought online because they looked different on screen. VR experiences break immersion because surfaces look artificial, especially when lighting changes or objects move.
What was built
The project built 3 prototype tools: software for intuitively editing virtual material appearance with perceptual accuracy, a product display setup enabling virtual dynamic illumination for non-expert users, and lighting design tools that use light field parameters instead of traditional surface measurements. In total, 27 deliverables were produced across perception research, rendering, and design applications.
Who can put this to work
If you are a lighting design firm struggling to predict how materials will actually look under different lighting conditions — this project built prototype lighting design tools that combine physical measurements with computational modelling for real and virtual spaces, working with light field parameters instead of traditional surface illuminances. These tools were developed across a consortium of 15 partners spanning 6 countries.
If you are a VR studio spending excessive time hand-tuning material appearances to look convincing — this project created prototype software that allows users to intuitively modify virtual material appearance, taking into account how humans actually perceive dynamic materials. The research was backed by 9 university labs specializing in visual and haptic perception, neuroimaging, and material rendering.
Quick answers
What would it cost to license or use these tools?
The project produced prototype-level software tools, not commercial products. Licensing terms would need to be negotiated directly with the University of Bradford or the specific partner that developed each prototype. Because the project was an MSCA training network, IP arrangements may vary across the 15 consortium partners.
Can these tools work at industrial scale in a production pipeline?
The deliverables are described as prototypes — perceptual editing software, product display setups, and lighting design tools. They demonstrate proof of concept but would likely need engineering work to integrate into commercial production pipelines. The 4 industry partners in the consortium may have advanced some tools further.
Who owns the intellectual property?
IP is distributed across a 15-partner consortium spanning 6 countries (DE, ES, FR, NL, TR, UK). Each of the 11 ESR fellows produced research under different institutional arrangements. Specific IP ownership would depend on the partner and the individual deliverable.
How does the perceptual editing tool actually work?
Based on the deliverable description, the prototype software allows users to intuitively modify virtual material appearance while accounting for dynamic aspects of material perception. It combines insights from psychology, computer science, and neuroscience to ensure edits look realistic to human observers rather than just physically accurate.
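To make the idea concrete, here is a minimal sketch of the general approach, not DyViTo's actual code: a perceptual editor exposes controls in units that feel uniform to an observer and converts them to physical shader parameters through a nonlinear mapping derived from psychophysical measurements. The cube-law exponent and function name below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only -- not the project's implementation.
# Perceptual gloss studies report that perceived gloss grows roughly with the
# cube root of physical specular reflectance, so a slider that feels uniform
# to the user must be expanded nonlinearly before it reaches the renderer.

def gloss_slider_to_specular(slider: float) -> float:
    """Map a perceptually uniform gloss slider (0 = matte, 1 = high gloss)
    to a physical specular reflectance value for the shader."""
    slider = float(np.clip(slider, 0.0, 1.0))
    # Equal perceptual steps need increasingly large physical steps near the
    # glossy end, hence the cubic expansion (assumed exponent).
    return slider ** 3.0

if __name__ == "__main__":
    for s in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"slider {s:.2f} -> specular reflectance {gloss_slider_to_specular(s):.3f}")
```

In a tool like the one described, each perceptual dimension (gloss, translucency, roughness) would get its own mapping fitted to observer data, rather than exposing raw physical parameters directly.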
Is this compatible with existing 3D rendering engines?
Based on available project data, the tools were built as standalone prototypes rather than plugins for specific engines. The lighting design tools work with light field parameters instead of traditional surface illuminances, which suggests a different approach from standard rendering pipelines. Integration work would be needed.
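As a rough illustration of what "light field parameters instead of surface illuminances" can mean in practice (a sketch under our own assumptions, not the project's code): rather than a single illuminance on one surface, the local light field is summarized by a scalar light density and a light vector describing the net directional flow of light.

```python
import numpy as np

# Illustrative sketch, not DyViTo's tool: summarize a local light field by a
# scalar "light density" (average radiance over all incoming directions) and a
# "light vector" (radiance-weighted mean direction), rather than by a single
# illuminance value measured on one surface.

def light_field_summary(directions: np.ndarray, radiances: np.ndarray):
    """directions: (N, 3) unit vectors pointing toward the sampled directions.
    radiances: (N,) radiance samples along those directions.
    Assumes roughly uniform sampling of the sphere."""
    density = float(radiances.mean())                         # 0th-order term
    vector = (radiances[:, None] * directions).mean(axis=0)   # 1st-order term
    return density, vector

if __name__ == "__main__":
    # Toy scene: dim ambient light plus a bright patch overhead (+z).
    rng = np.random.default_rng(0)
    dirs = rng.normal(size=(20000, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    rads = 0.1 + 5.0 * (dirs[:, 2] > 0.9)
    density, vector = light_field_summary(dirs, rads)
    print(density)   # overall light density
    print(vector)    # points roughly toward +z, the dominant source
```

A design tool built around such descriptors can manipulate the amount and direction of light in a space directly, which is closer to how illuminated materials are perceived than a single lux value on a workplane.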
What evidence is there that these tools improve outcomes?
The project combined behavioral psychophysics with computational modelling, meaning the tools are grounded in measured human perception data rather than guesswork. The consortium included 9 universities specializing in visual and haptic perception and neuroimaging, providing a strong scientific foundation for the perceptual accuracy claims.
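For readers wondering what "behavioral psychophysics" looks like in this context, the sketch below uses synthetic toy data and an assumed cumulative-Gaussian model (not project results) to show the standard pattern: observers judge rendered materials at several parameter levels, and a psychometric function is fitted to estimate where renderings start to look real.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Illustrative sketch with synthetic toy data -- not project results.
# Observers view renderings at several levels of some appearance parameter and
# respond "looks real" or "looks fake"; a cumulative-Gaussian psychometric
# function is fitted to the proportion of "real" responses.

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: probability of a 'looks real' response at level x."""
    return norm.cdf(x, loc=mu, scale=sigma)

# Synthetic example data: parameter levels and proportion of "real" responses.
levels    = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8])
prop_real = np.array([0.05, 0.10, 0.25, 0.45, 0.70, 0.85, 0.95, 0.98])

(mu, sigma), _ = curve_fit(psychometric, levels, prop_real, p0=[0.4, 0.1])
print(f"50% 'looks real' point: {mu:.2f}, slope parameter: {sigma:.2f}")
```

Thresholds and slopes from fits like this are what allow a rendering or editing tool to target "looks right to people" rather than only "matches the physics".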
How long until these could be deployed commercially?
The project closed in March 2022, and deliverables were at prototype stage. With 4 industry partners already involved, some tools may have progressed further post-project. A commercial deployment timeline would depend on which specific tool and partner you engage with.
Who built it
The DyViTo consortium brings together 15 partners from 6 countries (DE, ES, FR, NL, TR, UK), with a moderate industry presence at 27% (4 industry partners alongside 9 universities and 2 other organizations). Only 1 partner is an SME. The consortium is research-heavy, which is typical for an MSCA-ITN training network — the primary mission was training 11 early-stage researchers, not building market-ready products. For a business looking to adopt this technology, the University of Bradford as coordinator and the 4 industry partners (including Delft-based contributors working on product experience design) would be the most relevant contacts. The multi-country, multi-discipline setup means expertise is spread across perception science, computer graphics, and lighting design.
- UNIVERSITY OF BRADFORD · Coordinator · UK
- THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE · Participant · UK
- UNIVERSIDAD DE ZARAGOZA · Participant · ES
- THE BOARD TRUSTEES OF THE SCIENCE MUSEUM · Partner · UK
- UNIVERSITAET ULM · Participant · DE
- UNIVERSITY OF NEWCASTLE UPON TYNE · Participant · UK
- JUSTUS-LIEBIG-UNIVERSITAET GIESSEN · Participant · DE
- ARUP BV · Partner · NL
- LEDMOTIVE TECHNOLOGIES SL · Partner · ES
- UNIVERSITY OF SOUTHAMPTON · Participant · UK
- BILKENT UNIVERSITESI VAKIF · Participant · TR
- THE NATIONAL GALLERY · Partner · UK
- TECHNISCHE UNIVERSITEIT DELFT · Participant · NL
University of Bradford, UK — contact through project website or university research office
Talk to the team behind this work.
Want to explore how perceptual material rendering tools could reduce your product visualization costs or improve your VR content? SciTransfer can connect you with the right research partner from this consortium.