Eye-Tracking and Brain-Computer Interfaces That Let Disabled Users Control Software Hands-Free

Imagine your brain works perfectly but you cannot move your hands — you cannot click a mouse or type on a keyboard. MAMEM built technology that reads your eye movements and brain signals so you can control a computer without touching anything: a remote control powered by where you look and what you think. The project developed prototype applications that combine eye-tracking, EEG recording, and bio-measurement sensors into a unified control system, tested with 3 cohorts of disabled users — people with Parkinson's disease, muscular disorders, and tetraplegia — across 8 partner organizations in 5 countries. Because the system extends OS-level APIs, it integrates with existing software rather than requiring proprietary platforms, letting patients browse social media, create content, and communicate with family again.
What needed solving
Millions of people with neuromuscular conditions like Parkinson's disease, muscular dystrophy, and tetraplegia cannot use standard computer interfaces — they are locked out of digital communication, social media, and workplace tools. Current assistive technologies are often limited, expensive, or require residual motor control that many patients do not have. Companies building accessible software or assistive devices need better input methods that work for people with severe motor impairments.
What was built
The project built prototype interface applications that let users control computers through eye movements and brain signals (EEG). Deliverable D5.3 is the final implemented prototype that integrates with existing operating systems and multimedia tools — including social media sharing, content creation, and general computer navigation — validated through pilot trials with 3 cohorts of disabled users.
Who can put this to work
If you are a software company that needs to meet accessibility requirements — this project created OS-level API extensions that translate eye movements and mental commands into standard interface actions (mouse movements, selections, multi-item choices). The technology was piloted with real disabled users for both leisure and workplace tasks, backed by EUR 2,704,375 in EU funding across 21 deliverables. This could help you serve the growing market of enterprises required to provide accessible digital workplaces.
If you run care facilities or provide rehabilitation technology for patients with progressive neuromuscular conditions — this project built and tested interface applications that help people who have lost voluntary muscle control stay connected through multimedia and social networks. The persuasive design principles and user-ability profiling developed by the consortium of 3 universities and 2 research centers mean the interfaces adapt to each patient's specific level of disability. Pilot results informed the final prototype design documented in deliverable D5.3.
Quick answers
What would it cost to license or adopt this technology?
The project was publicly funded with EUR 2,704,375 from the EU under a Research and Innovation Action (RIA). Licensing terms would need to be negotiated directly with the coordinator (CERTH, Greece) and relevant consortium partners. RIA projects typically allow flexible IP arrangements, but specific costs are not published.
Can this scale to commercial production?
The project delivered prototype interface applications (D5.2 and D5.3) that extend existing operating system APIs, which means the software layer could scale without hardware redesign. However, the system relies on off-the-shelf eye-trackers, EEG recorders, and bio-measurement sensors — so scaling depends on the cost and availability of these devices. Moving from pilot prototypes to a commercial product would require additional engineering and certification.
Who owns the intellectual property?
IP is distributed among the 8 consortium partners across 5 countries (Germany, Greece, Israel, Italy, Netherlands). Under Horizon 2020 RIA rules, each partner typically owns the IP they generated. A licensing or collaboration agreement with the coordinator CERTH would be the starting point for access.
Has this been tested with real users?
Yes. The project engaged 3 different cohorts of disabled users — people with Parkinson's disease, muscular disorders, and tetraplegia. Pilot trials were conducted and their results fed into the final prototype design (deliverable D5.3). The prototypes were evaluated both for user acceptance and for their effectiveness in changing user behavior.
What hardware is required?
The system uses an eye-tracker, an EEG recorder, and bio-measurement sensors. These are commercially available devices. The project's contribution is the software layer — pattern recognition algorithms and OS-level API extensions — that translates signals from these devices into computer control commands.
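To make the software layer concrete, here is a minimal illustrative sketch (not MAMEM's actual code) of the kind of translation it performs: a dwell-time rule that turns a raw stream of gaze samples from an eye-tracker into click events. The threshold and radius values are hypothetical.

```python
import math

DWELL_SECONDS = 1.0  # hypothetical threshold: fixate this long to trigger a click
RADIUS_PX = 40       # gaze samples within this radius count as one fixation

def detect_clicks(samples):
    """samples: list of (timestamp, x, y) gaze points.
    Returns a list of (x, y) points where a dwell-click fired."""
    clicks = []
    anchor = None  # (t, x, y) where the current fixation started
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) <= RADIUS_PX:
            if t - t0 >= DWELL_SECONDS:
                clicks.append((x0, y0))  # emit a click at the fixation point
                anchor = None            # reset so one fixation yields one click
        else:
            anchor = (t, x, y)           # gaze moved away: start a new fixation
    return clicks
```

A production system would add filtering for sensor noise and blinks, but the principle is the same: device signals in, standard interface actions out.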
What regulations apply to this kind of technology?
As a medical assistive device, commercial deployment would likely require CE marking under the EU Medical Device Regulation (MDR) if marketed for clinical use. If positioned as a general accessibility tool rather than a medical device, regulatory requirements may be lighter. Based on available project data, the regulatory pathway was not the primary focus of the research.
How does this compare to existing voice control or switch-based systems?
Unlike voice control, this works for users who have lost speech ability alongside muscle control. Unlike switch-based systems that offer limited input options, MAMEM supports low-level control (mouse movement), mid-level actions (checkbox selection), and high-level operations (selecting multiple items from a list). This multi-level approach was specifically designed for users with severe motor impairments.
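The three control levels above can be sketched in code. This is a hypothetical illustration of the idea, not the project's implementation: the same user input (a gaze delta, a selection command) is routed to a low-, mid-, or high-level action depending on the active mode.

```python
from dataclasses import dataclass, field

@dataclass
class UIState:
    cursor: tuple = (0, 0)
    checkboxes: dict = field(default_factory=dict)  # checkbox name -> bool
    selected: set = field(default_factory=set)      # multi-item selection

def handle(state, mode, event):
    """Route one input event to the action for the active control level."""
    if mode == "low":        # low level: raw cursor movement
        dx, dy = event
        x, y = state.cursor
        state.cursor = (x + dx, y + dy)
    elif mode == "mid":      # mid level: toggle a single checkbox
        state.checkboxes[event] = not state.checkboxes.get(event, False)
    elif mode == "high":     # high level: select multiple items from a list
        state.selected.update(event)
    return state
```

The layered design matters for severe motor impairment: a user who cannot sustain fine cursor control can still operate at the mid or high level, where one command accomplishes a whole interaction.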
Who built it
The MAMEM consortium brings together 8 partners from 5 countries (Germany, Greece, Israel, Italy, Netherlands), with a mix of 3 universities, 2 research centers, and 2 industry players, both SMEs. The 25% industry ratio shows meaningful private-sector involvement, though the consortium leans academic. The coordinator is CERTH (Greece), a major national research center. The multi-country spread across Western Europe and Israel gives the technology exposure to different healthcare systems and accessibility standards. For a business looking to adopt or license this technology, the presence of industrial partners suggests some commercial thinking was built into the project, though the primary orientation was research-driven, with EUR 2,704,375 in EU funding.
- ETHNIKO KENTRO EREVNAS KAI TECHNOLOGIKIS ANAPTYXIS — Coordinator · EL
- ARISTOTELIO PANEPISTIMIO THESSALONIKIS — Participant · EL
- MEDICAL RESEARCH INFRASTRUCTURE DEVELOPMENT AND HEALTH SERVICES FUND BY THE SHEBA MEDICAL CENTER — Participant · IL
- SENSOMOTORIC INSTRUMENTS GESELLSCHAFT FUR INNOVATIVE SENSORIK MBH — Participant · DE
- UNIVERSITAT KOBLENZ — Participant · DE
- TECHNISCHE UNIVERSITEIT EINDHOVEN — Participant · NL
Talk to the team behind this work

CERTH (Centre for Research and Technology Hellas), Greece — contact through SciTransfer for a warm introduction to the project team.

Want to explore licensing or piloting MAMEM's eye-and-brain control technology in your products? SciTransfer can connect you directly with the research team and help structure the conversation.