Monkey Prefrontal Cortex Learns to Minimize Sequence Prediction Error

Abstract

In this study, we develop a novel recurrent neural network (RNN) model of prefrontal cortex that predicts sensory inputs, actions, and outcomes at the next time step. Synaptic weights in the model are adjusted to minimize sequence prediction error, adapting a deep learning rule similar to those used by large language models. The model, called Sequence Prediction Error Learning (SPEL), is a simple RNN that predicts the state of the world at the next time step, but differs from standard RNNs in that its own prediction errors from previous state predictions are fed back as inputs to the hidden units of the network. We show that the time course of sequence prediction errors generated by the model closely matched the activity time courses of populations of neurons in macaque prefrontal cortex. Hidden units in the model responded to combinations of task variables and exhibited sensitivity to changing stimulus probabilities in ways that closely resembled monkey prefrontal neurons. Moreover, the model, like the monkeys, produced prolonged response times to infrequent, unexpected events. The results suggest that prefrontal cortex may generate internal models of the temporal structure of the world, even during tasks that do not explicitly depend on temporal expectation, using a sequence prediction error minimization learning rule to do so. As such, the SPEL model provides a unified, general-purpose theoretical framework for modeling the lateral prefrontal cortex.
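To make the architecture described above concrete, here is a minimal sketch of the core SPEL idea: a recurrent network that predicts the next world state and feeds its own previous prediction error back into the hidden units, trained to minimize squared sequence prediction error. This is an illustrative reconstruction from the abstract alone, not the authors' implementation; the class name SPELSketch, the use of PyTorch, and all dimensions and hyperparameters are assumptions.

```python
# Hedged sketch of the SPEL idea from the abstract: an RNN whose hidden
# units receive the current state together with the previous step's
# prediction error, trained to minimize sequence prediction error.
# All names and settings here are illustrative assumptions.
import torch
import torch.nn as nn

class SPELSketch(nn.Module):
    """Error-feedback RNN: input to the hidden layer is the current
    observation concatenated with the previous prediction error."""
    def __init__(self, state_dim: int, hidden_dim: int):
        super().__init__()
        # Input = current state + previous prediction error (same dim).
        self.rnn_cell = nn.RNNCell(2 * state_dim, hidden_dim)
        self.readout = nn.Linear(hidden_dim, state_dim)  # next-state prediction
        self.state_dim = state_dim
        self.hidden_dim = hidden_dim

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        # states: (T, batch, state_dim) sequence of observed world states
        T, batch, _ = states.shape
        h = states.new_zeros(batch, self.hidden_dim)
        err = states.new_zeros(batch, self.state_dim)  # no error before t=0
        predictions = []
        for t in range(T - 1):
            x = torch.cat([states[t], err], dim=-1)  # error feedback as input
            h = self.rnn_cell(x, h)
            pred = self.readout(h)        # prediction of states[t + 1]
            err = states[t + 1] - pred    # sequence prediction error
            predictions.append(pred)
        return torch.stack(predictions)   # (T - 1, batch, state_dim)

# Training step: minimize squared sequence prediction error, analogous
# to the next-token objective used by large language models.
model = SPELSketch(state_dim=8, hidden_dim=64)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
states = torch.randn(20, 32, 8)               # toy (T, batch, state) data
preds = model(states)
loss = ((preds - states[1:]) ** 2).mean()     # sequence prediction error
loss.backward()
optimizer.step()
```

Note that, on this reading of the abstract, the prediction error serves two roles: it is the quantity the learning rule minimizes, and it is also recurrently available to the hidden units as an input signal on the next time step.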
