Visual social information use in collective foraging

Abstract

Collective dynamics emerge from individual-level decisions, yet the link between individual decision-making processes and collective outcomes in realistic physical systems remains poorly understood. Using collective foraging to study the key trade-off between personal and social information use, we present a mechanistic, spatially explicit agent-based model that combines individual-level evidence accumulation of personal and (visual) social cues with particle-based movement. Under idealized conditions without physical constraints, our mechanistic framework reproduces findings from established probabilistic models, but explains how individual-level decision processes generate collective outcomes in a bottom-up way. In clustered environments, groups performed best when agents reacted strongly to social information, whereas in uniform environments, individualistic search was most beneficial. Incorporating different real-world physical and perceptual constraints profoundly shaped collective performance and could even buffer maladaptive herding by facilitating self-organized exploration. Our study uncovers the mechanisms linking individual cognition to collective outcomes in human and animal foraging and paves the way for decentralized robotic applications.
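The model itself is not reproduced on this page, but as a rough illustration of what coupling evidence accumulation of personal and social cues with particle-based movement might look like, here is a minimal Python sketch. All names, parameters, and the specific accumulator dynamics (a drift-diffusion-style integrator deciding between individual exploration and joining a detected neighbour) are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class Forager:
    """Hypothetical agent: a drift-diffusion-style accumulator decides between
    continuing an individual random walk ('explore') and moving toward a
    visually detected, successful neighbour ('join')."""

    def __init__(self, pos, s_personal=1.0, s_social=1.0,
                 threshold=1.0, noise=0.2, speed=0.1):
        self.pos = np.asarray(pos, dtype=float)
        self.heading = rng.uniform(0, 2 * np.pi)
        self.s_personal = s_personal   # weight on personal cues (illustrative)
        self.s_social = s_social       # weight on social cues (illustrative)
        self.threshold = threshold     # decision threshold of the accumulator
        self.noise = noise             # scale of the noisy evidence increments
        self.speed = speed
        self.evidence = 0.0            # accumulated evidence ('join' vs 'explore')

    def accumulate(self, personal_cue, social_cue, dt=0.1):
        """Integrate noisy evidence; positive drift favours joining others."""
        drift = self.s_social * social_cue - self.s_personal * personal_cue
        self.evidence += drift * dt + self.noise * np.sqrt(dt) * rng.normal()
        if self.evidence >= self.threshold:
            self.evidence = 0.0
            return "join"
        if self.evidence <= -self.threshold:
            self.evidence = 0.0
            return "explore"
        return None  # no decision yet; keep current behaviour

    def step(self, decision, target=None):
        """Particle-style movement: correlated random walk, or a turn toward
        a target position when the accumulator has triggered 'join'."""
        if decision == "join" and target is not None:
            delta = target - self.pos
            self.heading = np.arctan2(delta[1], delta[0])
        else:
            self.heading += rng.normal(scale=0.3)  # gradual turning noise
        self.pos += self.speed * np.array([np.cos(self.heading),
                                           np.sin(self.heading)])

# Toy usage: one agent weighing a weak personal cue against a stronger social cue.
agent = Forager(pos=[0.0, 0.0], s_social=2.0)
neighbour_patch = np.array([5.0, 5.0])
for _ in range(200):
    decision = agent.accumulate(personal_cue=0.1, social_cue=0.5)
    agent.step(decision, target=neighbour_patch)
print("final position:", agent.pos)
```

In a sketch like this, tuning the relative weight on social cues (`s_social`) against personal cues would correspond to the individual-versus-social trade-off the abstract describes, and perceptual constraints could be introduced by limiting which neighbours count toward `social_cue`.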
