SciTransfer
STEP2DYNA · Project

Bio-Inspired Collision Detection Chips for Safer Vehicles, Drones, and Robots

Transport · Prototype · TRL 4

Imagine how a locust can dodge objects flying at it from any direction — its tiny brain processes visual information incredibly fast with almost no energy. This project studied exactly how those insect visual neurons work and built computer models that mimic them. The team then put those models onto microchips and tested them on robots and drones, creating a miniature collision detection sensor that works like an insect's eye. The goal is a cheap, tiny, low-power alternative to radar and laser systems for avoiding crashes.
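The project's chip designs are not public, but the locust-style mechanism described above, detecting expanding edges through layered excitation and delayed lateral inhibition, can be sketched in a few lines. This is an illustrative simplification, not the project's model: the function names, the 3x3 inhibition spread, and the `w_inhib` weight are all assumptions.

```python
import numpy as np

def blur3(x):
    """Spread activity to each cell's 3x3 neighbourhood (simple box blur)."""
    p = np.pad(x, 1)
    h, w = x.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def lgmd_response(frames, w_inhib=0.4):
    """One membrane-potential value per frame transition.

    P layer: luminance change between frames; I layer: one-frame-delayed,
    spatially spread inhibition; S layer: excitation minus weighted
    inhibition, rectified. A rising output signals expanding edges,
    i.e. an approaching object.
    """
    responses = []
    prev_p = np.zeros(frames[0].shape)
    for prev, cur in zip(frames, frames[1:]):
        p = np.abs(np.asarray(cur, float) - np.asarray(prev, float))
        inhib = blur3(prev_p)                   # delayed lateral inhibition
        s = np.maximum(p - w_inhib * inhib, 0)  # subtractive inhibition, rectified
        responses.append(float(s.sum()))        # sum to one membrane potential
        prev_p = p
    return responses
```

Feeding this a sequence of frames in which an object grows in the image yields a rising response; a simple threshold on that response is the collision alarm.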

By the numbers
- 3,560 people dying daily from vehicle collisions worldwide
- EUR 1,008,000 EU contribution to the project
- 11 consortium partners across 6 countries
- 20 total deliverables produced
The business problem

What needed solving

Every day, 3,560 people die from vehicle collisions worldwide, and autonomous drones and robots remain restricted because current collision detection systems (radar, lidar, GPS) are too expensive, too power-hungry, and too bulky. Industries from automotive to logistics to agriculture need a fundamentally different approach — a cheap, miniature, low-energy sensor that can reliably detect incoming collisions in fast-changing environments.

The solution

What was built

The project built bio-inspired visual neural system models mimicking insect collision detection, identified a VLSI chip architecture for realizing these models in hardware, and produced working demonstrator systems on both robotic and UAV platforms — including a refined UAV demonstrator tested against complex backgrounds.

Audience

Who needs this

- Drone manufacturers needing lightweight collision avoidance for delivery or inspection UAVs
- Automotive Tier-1 suppliers developing next-generation ADAS sensor packages
- Warehouse robotics companies building mobile robots for dynamic environments
- Agricultural drone operators requiring obstacle detection for precision farming
- Consumer electronics companies integrating collision detection into personal mobility devices

Business applications

Who can put this to work

Autonomous drone operations
SME
Target: Drone manufacturers and delivery logistics companies

If you are a drone manufacturer struggling with collision avoidance in cluttered environments, this project developed bio-inspired visual neural chips, tested on UAV platforms, that detect collision risks in dynamic environments. The same chip technology that makes drones safer could also strengthen the case for new commercial airspace approvals for your fleet.

Automotive safety systems
Enterprise
Target: Automotive Tier-1 suppliers and ADAS developers

If you are an automotive supplier looking for lower-cost alternatives to radar and lidar for collision detection — this project built visual neural system models realized in chip form, specifically designed for collision detection in dynamic environments. The chip approach promises lower energy consumption and smaller size than current sensor systems, potentially reducing per-unit sensor costs in mass production.

Industrial robotics and warehouse automation
Mid-size
Target: Robot manufacturers and warehouse automation integrators

If you are a robotics company needing affordable collision avoidance for mobile robots in busy warehouse floors — this project implemented visual neural systems on robotic platforms and demonstrated collision avoidance using bio-inspired spatial-temporal processing. The low-cost, miniaturized sensor design is built for exactly the kind of fast-changing environments your robots operate in.

Frequently asked

Quick answers

What would a collision detection chip like this cost compared to radar or lidar?

The project objective explicitly targets 'low cost, low energy consumption and miniaturized' sensors as an alternative to radar, laser-based lidar, and GPS. Specific per-unit pricing is not available from the project data, but single-chip VLSI sensors are generally cheaper to mass-produce than multi-component radar or lidar assemblies.

Can this technology work at industrial scale in real traffic or warehouse conditions?

The project demonstrated collision avoidance on robotic platforms and refined a demonstrator system on a UAV platform against complex backgrounds. However, these were lab and controlled-environment tests within a research consortium — full-scale deployment in live traffic or high-throughput warehouses has not been documented.

Who owns the intellectual property and can I license this technology?

The project was coordinated by the University of Lincoln (UK) with 10 university partners and 1 industry partner across 6 countries. IP would be governed by the consortium agreement. Based on available project data, no commercial licensing program has been announced — a direct conversation with the coordinator would be needed.

How does this compare to existing camera-based ADAS systems?

Unlike conventional camera systems that rely on heavy software processing, this approach uses dedicated neural vision chips that mimic insect brains for spatial-temporal processing. The project objective highlights advantages in reliability, energy consumption, and physical size over current approaches including radar and laser-based systems.

What is the current development stage — is this ready to integrate?

The consortium produced final visual neural system models, a VLSI chip structure identification, and demonstrator systems on both robot and UAV platforms. This places the technology at a validated prototype stage, likely requiring further engineering and certification before commercial integration.

Which regulations would this need to pass for automotive or drone use?

Based on available project data, regulatory compliance was not a primary focus of this research project. Automotive deployment would require meeting ISO 26262 functional safety standards, while drone use would need to satisfy EASA or FAA sense-and-avoid requirements. These certification steps remain ahead.

Consortium

Who built it

The STEP2DYNA consortium brings together 11 partners from 6 countries (Argentina, China, Germany, Japan, Malaysia, UK), but is overwhelmingly academic — 10 out of 11 partners are universities, with just 1 industry partner and zero SMEs. The industry ratio sits at only 9%. This is typical for an MSCA-RISE staff exchange project focused on building research capacity rather than commercialization. For a business considering this technology, the heavy academic composition means strong scientific foundations but a clear gap in commercial readiness, manufacturing know-how, and market validation. Any company looking to adopt this would likely need to invest in engineering the research outputs into a production-grade product.

How to reach the team

University of Lincoln, UK — reach out to the School of Computer Science robotics research group

Next steps

Talk to the team behind this work.

Want to explore licensing this bio-inspired collision detection technology for your product line? SciTransfer can connect you directly with the research team and help structure the conversation.
