Dissecting the contribution of recent reward versus recent performance history on cognitive effort allocation

Abstract

An extensive body of literature has shown that humans tend to avoid expending cognitive effort, just as they avoid expending physical effort or financial resources. How, then, do we decide whether to invest this effort? Decision-making involves not only choosing our actions but also the meta-decision of how much cognitive effort to invest in making this choice, weighing the costs of cognitive effort against potential rewards. Popular recent theories, grounded in the field of reinforcement learning, suggest that this cost-benefit trade-off can be informed by the opportunity cost of effort investment, which the brain may approximate by the estimated average reward rate per unit time. Intuitively, in a low-reward environment, investing cognitive resources in the task at hand is less likely to lead to missed opportunities. Recent studies have provided support for this idea, showing that people exert more cognitive effort when the reward rate is low. Here, we replicate one of the key previous findings but add an important nuance to this result. Cognitive effort allocation was better explained by participants' recent performance history (i.e., accuracy rate) than by the average reward rate. Combined with the observation that participants were insensitive to the reward currently at stake, this invites a reinterpretation of these previous findings and suggests the need for further studies to assess whether environmental richness indeed serves as a heuristic for modulating cognitive effort allocation.
