Distinct cortical populations drive multisensory modulation of segregated auditory sources

Abstract

Auditory perception can be modulated by other sensory stimuli. However, we do not fully understand the neural mechanisms that support multisensory behavior. Here, we recorded spiking activity from the primary auditory cortex (A1) of non-human primates while they detected a target vocalization embedded in a background chorus of vocalizations. We found that a congruent video of a monkey producing a vocalization improved the monkeys' detection performance, relative to their performance when we presented only a static image of the monkey. As a proxy for the functional organization of A1, we compared the contributions of neurons with significant spectrotemporal response fields (STRFs) with those of neurons with non-significant STRFs (nSTRFs). Based on spike-waveform shape and functional connectivity, STRF and nSTRF neurons appeared to belong to distinct neural populations. Consistent with this, we found that whereas STRF neurons encoded stimulus information through synchronized activity, nSTRF neurons encoded task-related information more as a structured, dynamic population process. Together, these findings demonstrate a functional distinction between the behavioral contributions of nSTRF and STRF neurons.
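The abstract classifies A1 neurons by whether their spectrotemporal response field (STRF) reaches significance, but does not state the estimator or the significance criterion. The sketch below is a minimal illustration of one common approach, assuming spike-triggered averaging of a stimulus spectrogram and a circular-shift permutation test; the function names, parameters, and criterion are hypothetical and are not taken from the paper.

```python
import numpy as np

def estimate_strf(spectrogram, spike_counts, n_lags):
    """Estimate an STRF by spike-triggered averaging (reverse correlation).

    spectrogram  : (n_freq, n_time) array of stimulus power per frequency/time bin.
    spike_counts : (n_time,) array of spikes per time bin.
    n_lags       : number of time bins preceding each spike to include.
    Returns an (n_freq, n_lags) STRF estimate.
    """
    n_freq, n_time = spectrogram.shape
    strf = np.zeros((n_freq, n_lags))
    total_spikes = 0
    for t in range(n_lags, n_time):
        if spike_counts[t] > 0:
            # Accumulate the stimulus window preceding this bin, weighted by spike count.
            strf += spike_counts[t] * spectrogram[:, t - n_lags:t]
            total_spikes += spike_counts[t]
    return strf / max(total_spikes, 1)

def strf_p_value(spectrogram, spike_counts, n_lags, n_shuffles=500, seed=0):
    """Permutation test (assumed criterion): compare the observed STRF peak
    against STRFs computed from circularly shifted spike trains."""
    rng = np.random.default_rng(seed)
    observed = np.abs(estimate_strf(spectrogram, spike_counts, n_lags)).max()
    null = np.empty(n_shuffles)
    for i in range(n_shuffles):
        shift = rng.integers(n_lags, spike_counts.size)
        null[i] = np.abs(estimate_strf(spectrogram, np.roll(spike_counts, shift), n_lags)).max()
    return (null >= observed).mean()  # fraction of shuffles exceeding the observed peak
```

Under this assumed scheme, a neuron would be labeled "STRF" when the permutation p-value falls below a chosen threshold (e.g. 0.05) and "nSTRF" otherwise; the paper's actual criterion may differ.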
