Embodied visual route following by an insect-inspired robot


Abstract

In the visual 'teach-and-repeat' task, a mobile robot must follow a path using visual memory acquired along a route it has previously traversed. Following a visually familiar route is also a critical navigation skill for foraging insects, which accomplish it robustly despite their tiny brains. Inspired by the well-understood associative learning ability of the mushroom body circuit in the insect brain, we develop an embodied model that accomplishes visual teach-and-repeat efficiently. Critical to its performance is steering the robot body reflexively based on the relative familiarity of the left and right visual fields, without stopping to scan regularly for the optimal direction. The model is robust to noise in visual processing and motor control, and performs comparably to pure-pursuit or visual-localisation methods that rely heavily on the estimation of positional information. The model is also tested on a real robot and shown to correct for significant intrinsic steering bias.
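The reflexive steering idea in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's model: `familiarity` here is a hypothetical stand-in (a nearest-view image-difference score) for the mushroom-body-style familiarity measure, and the controller simply turns toward whichever half of the current view looks more familiar relative to stored route views, with no stop-and-scan phase.

```python
import numpy as np

def familiarity(view, memory_views):
    # Hypothetical familiarity measure: higher when the view closely
    # matches some stored route view (negative of the minimum
    # sum-of-squared-differences over the memory bank).
    diffs = ((memory_views - view) ** 2).sum(axis=(1, 2))
    return -diffs.min()

def steering_command(frame, memory_views, gain=0.1):
    # Split the current panoramic frame into left and right halves and
    # steer continuously toward the more familiar side.
    mid = frame.shape[1] // 2
    f_left = familiarity(frame[:, :mid], memory_views[:, :, :mid])
    f_right = familiarity(frame[:, mid:], memory_views[:, :, mid:])
    # Positive output = turn left (toward the more familiar left field).
    return gain * (f_left - f_right)
```

With a frame identical to a stored view the command is zero; disturbing one half of the view produces a turn away from the unfamiliar side, which is the reflexive behaviour the abstract describes.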