Distinct Cortical Populations Drive Multisensory Modulation of Segregated Auditory Sources


Abstract

Auditory perception can be modulated by other sensory stimuli. However, we do not fully understand the neural mechanisms that support multisensory behavior. Here, while male nonhuman primates detected a target vocalization that was embedded in a background chorus of vocalizations, we recorded spiking activity from the primary auditory cortex (A1). We found that a congruent video of a monkey eliciting the target vocalization improved the monkeys' behavior, relative to their performance when we presented only a static image of the monkey. As a proxy for the functional organization of A1, we compared the contribution of neurons with significant spectrotemporal response fields (STRFs) with that of neurons with nonsignificant STRFs (nSTRFs). Because, on average, STRF and nSTRF neurons have different spike-waveform shapes, firing rates, and neural-correlation structure, we hypothesized that they might belong to different neural populations. Consistent with this, we found that, although STRF neurons encode stimulus information through synchronized activity, task-related information in the primate A1 is encoded more as a structured dynamic process in the population of nSTRF neurons. Together, these findings suggest that modulatory multisensory behavior is supported by nSTRF neurons and identify, for the first time, a functional differentiation between the contributions that STRF and nSTRF neurons make to behavior.