Neural Synchrony Links Sensorimotor Cortices in a Network for Facial Motor Control
Abstract
Primate societies rely on the production and interpretation of social signals, in particular those displayed by the face. According to the dominant neuropsychological schema, facial movements are controlled by two separate circuits: one originating in medial frontal cortex, controlling emotional expressions, and a second originating in lateral motor and premotor areas, controlling voluntary facial movements. Despite this functional dichotomy, cortical anatomy suggests that medial and lateral areas are directly connected and may thus operate as a single network. Here we test these contrasting hypotheses through structural and functional magnetic resonance imaging (fMRI)-guided electrical stimulation and simultaneous multi-channel recordings from key facial motor areas in the macaque monkey brain. These areas include the medial facial motor area M3 (located in the anterior cingulate cortex); two lateral face-related motor areas, M1 (primary motor cortex) and PMv (ventrolateral premotor cortex); and S1 (primary somatosensory cortex). Cortical responses evoked by intracortical stimulation revealed that medial and lateral areas can exert significant functional impact on each other. Simultaneous recordings of local field potentials in all facial motor areas further confirmed that during facial expressions, medial and lateral facial motor areas interact significantly, primarily in the alpha and beta frequency ranges, whereas during voluntary chewing, coupling occurs at lower frequencies. These functional interactions varied across facial movement types. Thus, at the cortical level, the control of facial movements is not mediated by independent (medial/lateral) functional streams, but results from an interacting sensorimotor network.
Significance Statement
Primates communicate through facial expressions, yet how the brain generates these expressions remains poorly understood. To uncover how facial motor-related cortical areas interact to produce facial gestures, we combined fMRI-targeted electrophysiology with intracortical microstimulation while monkeys produced qualitatively different facial movements. This two-pronged experimental approach revealed that facial motor-related cortical areas form an interconnected network, characterized by synchronized neural activity and dynamic, expression-selective activity states coordinated across network nodes. Thus, the multiple facial motor-related cortical areas that project directly to the facial nucleus operate as a single network, in which complex, behavior-specific inter-areal interactions dictate the relevant motor output.