If you are a VR hardware company struggling with user dropout because your controllers only vibrate instead of delivering real touch — this project developed a soft electronic skin with multi-dimensional micro and nano sensors that captures and transmits rich haptic information. The consortium demonstrated remote human haptic interaction between two physically separated users, which could be the missing piece for truly immersive VR experiences.
Artificial Electronic Skin That Gives Robots and VR Devices a Human-Like Sense of Touch
Imagine shaking someone's hand through a screen and actually feeling the warmth and pressure: that's the direction this project pushes. Our skin has thousands of sensors that work together, and our brain predicts what touch should feel like before it even happens. The ph-coding team figured out how the brain does this and built a flexible electronic skin, packed with tiny sensors, that mimics it. They then wired it into robots and remote communication systems so two people, or a person and a robot, can share rich touch sensations across a distance.
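To make the predictive idea concrete, here is a minimal sketch, written by us as an illustration rather than taken from the project: the system keeps a running prediction of what each sensor should read and reacts to the prediction error instead of the raw signal. The array shape, update rule, and learning rate below are illustrative assumptions, not the consortium's actual algorithm.

```python
import numpy as np

# Illustrative sketch of the predictive-coding idea (not the project's code):
# keep a running prediction per sensor and process only the prediction error.

def predictive_coding_step(prediction, observation, learning_rate=0.2):
    """Update per-sensor predictions from one frame of eSkin readings."""
    error = observation - prediction                  # the touch "surprise"
    prediction = prediction + learning_rate * error   # correct toward reality
    return prediction, error

# Example: a 4x4 patch of pressure sensors with one contact event in the middle.
prediction = np.zeros((4, 4))
observation = np.zeros((4, 4))
observation[1:3, 1:3] = 1.0

for step in range(5):
    prediction, error = predictive_coding_step(prediction, observation)
    print(f"step {step}: mean |error| = {np.abs(error).mean():.3f}")
```

The point of the sketch is that once the prediction matches the contact, the error shrinks, so only unexpected touch needs to be processed or transmitted.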
What needed solving
Current VR systems, remote-controlled robots, and telepresence tools deliver flat, unrealistic touch feedback — mostly just vibrations. Users lose interest quickly because the experience feels nothing like real physical interaction. Companies building these products are stuck with haptic technology that hasn't fundamentally changed in decades, and they're losing users and revenue because of it.
What was built
The project built two concrete demonstration systems: a remote haptic interaction system that lets two physically separated people share rich touch sensations, and a Haptic AI Robotic System with integrated electronic skin (eSkin) made from bendable electronics with micro and nano sensors. In total, 8 deliverables were completed.
Who can put this to work
If you are a robotics company whose machines still crush delicate parts because they lack fine touch feedback — this project built a Haptic AI Robotic System integrated with eSkin that perceives multi-dimensional contact input. With 4 partners across 3 countries combining neuroscience and nanotechnology, the system enables robots to handle objects with human-like sensitivity, reducing breakage and quality rejects.
If you are a medical device company whose remote surgical tools give surgeons zero tactile feedback — this project demonstrated a system for rich haptic communication between two remotely located humans. The predictive coding approach means the system compensates for signal variability, delivering stable and reliable touch sensations even over distance, which is critical when a surgeon needs to feel tissue resistance in real time.
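As a rough illustration of how prediction can smooth out an unreliable link, the sketch below (our own assumptions, not the project's protocol) has the receiving side blend each incoming haptic sample with its local prediction and fall back on the prediction alone when a sample is late or lost. The blend factor, loss rate, and variable names are hypothetical.

```python
import random

# Illustrative sketch (not the project's protocol): blend incoming haptic
# samples with a local prediction; hold the prediction when a sample is lost.

def stabilise(incoming, predicted, blend=0.7):
    """Return the force value actually rendered to the user this frame."""
    if incoming is None:              # packet lost or delayed
        return predicted              # rely on the prediction alone
    return blend * incoming + (1 - blend) * predicted

predicted = 0.0
true_force = 1.0                      # remote user pressing with constant force

for frame in range(10):
    # Simulate an unreliable link: roughly 30% of samples never arrive on time.
    sample = true_force + random.gauss(0, 0.1) if random.random() > 0.3 else None
    rendered = stabilise(sample, predicted)
    predicted = rendered              # next frame's prediction = last rendered value
    print(f"frame {frame}: rendered force = {rendered:.2f}")
```

In this toy setup the rendered force converges toward the true value and holds it through dropped samples, which is the kind of stability the predictive approach is aiming for.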
Quick answers
What would it cost to license or integrate this haptic eSkin technology?
No pricing information is available from the project data. The technology was developed under a EUR 2,988,285 EU-funded research grant with 4 partners. Licensing terms would need to be negotiated directly with the coordinator at Lunds Universitet and the SME partner in the consortium.
Can this scale to industrial production volumes?
The eSkin is based on bendable electronics integrating micro and nano sensors, which suggests manufacturability but not yet mass production. As a Research and Innovation Action under FET Open, the focus was on proving the concept works rather than scaling production lines. Moving to volume manufacturing would require further engineering and investment.
Who owns the intellectual property?
IP was generated under a Horizon 2020 Research and Innovation Action with 4 partners across Sweden, France, and the UK. Ownership typically follows EU grant rules where each partner owns what they created. The consortium includes 1 SME with commercial interest, which may hold key exploitation rights.
How does this compare to existing haptic technology on the market?
Current commercial haptic devices mostly use simple vibration motors that deliver limited sensations. This project's approach is fundamentally different — it mimics how the brain's predictive coding processes touch across large sensor populations. The demonstrated eSkin captures multi-dimensional input, going well beyond what vibration-based systems offer.
What was actually demonstrated and tested?
The consortium delivered 2 working demonstrations: a system for rich haptic communication between two remotely located humans, and a Haptic AI Robotic System integrated with eSkin. These were built across 8 total deliverables during the project period from 2019 to 2022.
Is this ready for regulatory approval in medical or consumer applications?
Based on available project data, no regulatory submissions were part of the project scope. The FET Open funding scheme targets breakthrough research, not market-ready products. Any company looking to deploy this in regulated sectors like medical devices would need to conduct their own certification process.
Who built it
The ph-coding consortium is compact — 4 partners across France, Sweden, and the UK, led by Lunds Universitet. With 3 universities and 1 industry partner (an SME), the team is heavily research-oriented at 75% academic. The 25% industry ratio through a single SME provides some commercial grounding but limited market pull. The EUR 2,988,285 budget is modest for hardware development spanning nanotechnology, robotics, and neuroscience. A business looking to adopt this technology should expect to engage primarily with academic partners and may need to bring additional engineering capacity for product development.
- LUNDS UNIVERSITET · Coordinator · SE
- IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE · Participant · UK
- ACTRONIKA · Participant · FR
- UNIVERSITY OF GLASGOW · Participant · UK
The project coordinator is Lunds Universitet in Sweden. Use SciTransfer's contact service to get a direct introduction to the research team.
Talk to the team behind this work.
Want to explore licensing the eSkin technology or integrating predictive haptics into your product? SciTransfer can arrange a direct introduction to the ph-coding research team and help structure the conversation around your specific use case.