Schematic illustration of the Visually Stimulated Motor Control system for a mobile robot with panoramic vision. Images captured by the panoramic camera of the Khepera II robot are transformed and split into two halves, which are fed to left and right visual neural networks based on the locust's lobula giant movement detector (LGMD). The visual cues are compared, and the results are sent to a controller that outputs motor commands to the left and right wheels in real time. (Credit: Shigang Yue and F. Claire Rind/International Journal of Advanced Mechatronic Systems)

Insects inspiring new robot vision technology for collision avoidance

Reverse-engineering the locust’s motion-sensitive movement-detector interneuron

February 25, 2013

Scientists from the University of Lincoln and Newcastle University have created a computerized system, modeled on the locust’s unique visual system, that allows mobile robots to navigate autonomously.

The work could provide the blueprint for the development of highly accurate vehicle collision sensors, surveillance technology, and even aid video game programming, according to the researchers.

Locusts have a distinctive way of processing information through electrical and chemical signals, giving them an extremely fast and accurate warning system for impending collisions.
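The LGMD-style processing described above can be sketched as a simple frame-difference network: a layer of excitation (luminance change between frames) competes with delayed, spatially spread lateral inhibition, and the summed residue drives a "collision" spike when a looming object makes a large, fast-growing patch of the image change at once. The sketch below is illustrative only; the function name, parameters, and values are assumptions for this example, not the authors' published model.

```python
import numpy as np

def lgmd_step(prev_frame, frame, prev_excitation, tau=0.6, threshold=0.15):
    """One time step of a simplified LGMD-style collision detector.

    Excitation is the absolute luminance change between consecutive
    frames; inhibition is a delayed, spatially blurred copy of the
    previous step's excitation, which suppresses slow or whole-field
    motion while letting rapid looming expansion through.
    """
    excitation = np.abs(frame - prev_frame)

    # 3x3 box blur spreads the (delayed) inhibition to neighbouring cells
    padded = np.pad(prev_excitation, 1, mode="edge")
    h, w = frame.shape
    inhibition = sum(padded[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)) / 9.0

    # Summing layer: excitation minus weighted inhibition, rectified
    s = excitation - tau * inhibition
    s[s < 0] = 0.0

    membrane = float(s.mean())          # LGMD "membrane potential"
    spike = membrane > threshold        # collision alarm
    return spike, membrane, excitation  # excitation feeds next step's inhibition
```

In a setup like the one in the figure, two such detectors would run on the left and right image halves, and a controller could steer away from whichever side reports the higher membrane potential.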

Read more: Insects inspiring new robot vision technology for collision avoidance | KurzweilAI.
