If you are a digital literacy training company struggling to create up-to-date curriculum on algorithmic bias — this project developed educational materials and showcase tools that demonstrate how algorithms filter information. These resources, created across a 5-partner consortium in 4 countries, could be licensed or adapted for corporate and school training programs.
Algorithmic Bias Detection Tools and Digital Literacy Training for Organizations
You know how social media feeds, search engines, and job platforms all use secret recipes (algorithms) to decide what you see? Those recipes can be biased — showing certain people fewer job ads or skewing the news you read. This project built CyCAT, the Cyprus Center for Algorithmic Transparency, a research center focused on understanding and exposing those hidden biases. The team created prototype tools and educational materials so that teachers, librarians, and everyday people can spot when an algorithm is steering them wrong.
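To make the "secret recipe" idea concrete, here is a minimal toy sketch (not CyCAT's actual software) of a feed-ranking function in which a hidden per-source multiplier quietly changes which stories a user sees first. The item names, weights, and `rank_feed` function are all invented for illustration.

```python
# Toy illustration (NOT the project's tool): a feed-ranking "recipe"
# that silently down-weights items from one source, so the same
# content reaches users in a different order.

def rank_feed(items, source_weight):
    # Each item: (title, source, relevance). The hidden source_weight
    # multiplier is the bias -- it never appears in the interface.
    return sorted(
        items,
        key=lambda it: it[2] * source_weight.get(it[1], 1.0),
        reverse=True,
    )

items = [
    ("Local jobs report", "outlet_a", 0.90),
    ("Election explainer", "outlet_b", 0.85),
    ("Tech layoffs", "outlet_c", 0.80),
]

neutral = rank_feed(items, {})                 # no hidden weighting
biased = rank_feed(items, {"outlet_b": 0.5})   # halve outlet_b's score

print([title for title, _, _ in neutral])
print([title for title, _, _ in biased])   # outlet_b's story drops to last
```

The point of the sketch is that nothing about the biased ranking is visible to the end user — which is exactly the opacity the project's materials teach people to question.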
What needed solving
Organizations increasingly depend on algorithms for hiring, content delivery, customer targeting, and decision-making — but most have no way to detect or measure bias in those systems. With growing regulatory pressure (EU AI Act) and public scrutiny, companies risk both legal liability and reputational damage from opaque algorithmic processes they don't fully understand.
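As a sense of what "measuring bias" can mean in practice, here is a hedged sketch of one widely used generic fairness check — the disparate impact ratio (the "four-fifths rule") — that an organization could run on its own selection data. This is a standard metric from the fairness literature, not the project's WP4 tool, and the numbers below are invented.

```python
# Generic fairness check (NOT the project's WP4 tool): the
# disparate impact ratio, a.k.a. the "four-fifths rule".

def disparate_impact(selected_a, total_a, selected_b, total_b):
    """Ratio of group A's selection rate to group B's.

    Values below ~0.8 are conventionally flagged as possible
    adverse impact against group A.
    """
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return rate_a / rate_b

# Hypothetical example: an ad algorithm shows a job posting to
# 30 of 100 users in group A but 60 of 100 users in group B.
ratio = disparate_impact(30, 100, 60, 100)
print(f"{ratio:.2f}")  # 0.50 -- well below the 0.8 threshold
```

Even a simple check like this requires logging who was shown what — which is precisely the visibility most organizations lack into third-party algorithmic systems.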
What was built
The project built a prototype software tool for detecting algorithmic bias (WP4 intervention tool), educational materials for librarians and teachers on algorithmic transparency, and digital artifacts demonstrating how algorithms filter citizens' access to information. In total, 22 deliverables were produced.
Who can put this to work
If you are a media company concerned about algorithmic filtering distorting how your content reaches audiences — this project built prototype software to analyze algorithmic gatekeeping of information access. With Cyprus ranking 22nd in the EU on the Digital Economy and Society Index, the tools were designed for real-world digital skill gaps and could help audit your content distribution.
If you are a public library system or local authority looking to improve citizens' digital skills — this project created digital artifacts and educational materials specifically designed for librarians and teachers to raise awareness of algorithmic bias. The materials were tested with local authorities and educators across 4 countries.
Quick answers
What would it cost to access CyCAT's tools or educational materials?
The project was funded with EUR 999,965 in EU contribution under a Coordination and Support Action. As an academically led project with no commercial partners, materials may be available through open-access or academic licensing. Contact the Open University of Cyprus for specific terms.
Can these tools work at industrial scale for a large organization?
The project produced a prototype tool-based intervention from WP4, which suggests early-stage software rather than enterprise-grade deployment. Scaling would likely require additional development. The consortium of 5 university partners across 4 countries focused on research and education rather than commercial scalability.
What is the IP situation — can we license or build on this?
With all 5 consortium partners being universities and zero industry partners, IP likely sits with the academic institutions. As an EU-funded Coordination and Support Action, certain outputs may be open-access. Licensing terms should be discussed directly with the Open University of Cyprus as coordinator.
Is this relevant outside Cyprus?
While the project was triggered by Cyprus ranking 22nd in the EU on the 2017 Digital Economy and Society Index, the consortium spanned 4 countries (Cyprus, Israel, Italy, UK). The algorithmic transparency tools and educational materials address a universal challenge not limited to any single country.
What concrete outputs exist that a business could use?
The project delivered 22 total deliverables including 1 demonstrated prototype: a tool-based intervention software from WP4. Additional outputs include educational materials for librarians and teachers, and digital artifacts designed to raise awareness of algorithmic bias in everyday systems.
How current is this work given the project ended in 2021?
The project ran from 2018 to 2021 and is now closed. Algorithmic transparency has only grown more relevant since then, but the specific tools may need updating. Check cycat.io for any post-project continuation or newer outputs.
Who built it
This is a purely academic consortium — all 5 partners across 4 countries (Cyprus, Israel, Italy, UK) are universities, with no industry involvement and no SMEs. The Open University of Cyprus led the effort. For a business considering adoption, this means the outputs are research-grade rather than market-tested. There are no commercial partners who have validated the tools in real-world settings, and no built-in path to market. Any business interested in these outputs would need to invest in adaptation and scaling beyond what the project delivered.
- ANOIKTO PANEPISTIMIO KYPROU (OPEN UNIVERSITY OF CYPRUS) — Coordinator · CY
- UNIVERSITY OF HAIFA — participant · IL
- THE UNIVERSITY OF SHEFFIELD — participant · UK
- UNIVERSITA DEGLI STUDI DI TRENTO — participant · IT
- THE UNIVERSITY OF EDINBURGH — participant · UK
Contact point: Open University of Cyprus — look for the CyCAT center leadership or the university's research office.
Talk to the team behind this work.
SciTransfer can help you evaluate whether CyCAT's algorithmic transparency tools and training materials fit your organization's needs and facilitate an introduction to the research team.