TY - JOUR
T1 - Facial gestures are enacted through a cortical hierarchy of dynamic and stable codes
AU - Ianni, Geena R.
AU - Vázquez, Yuriria
AU - Rouse, Adam G.
AU - Schieber, Marc H.
AU - Prut, Yifat
AU - Freiwald, Winrich A.
N1 - Publisher Copyright:
© 2026 American Association for the Advancement of Science. All rights reserved.
PY - 2026/1/8
Y1 - 2026/1/8
AB - INTRODUCTION: Faces are central to primate social life. Beyond their static features, dynamic facial gestures convey critical information about internal states, intentions, and social hierarchies. Although much is known about how the brain perceives faces, the neural mechanisms that generate facial gestures remain poorly understood. In primates, the muscles of facial expression are under direct cortical control from multiple regions. It is unclear whether gestures are governed by distinct medial and lateral circuits or by an integrated, distributed cortical network. RATIONALE: We investigated how the primate cortex generates naturalistic facial gestures by combining functional localization (functional magnetic resonance imaging) of cortical circuitry with multichannel array recordings in four regions: primary motor (M1), ventral premotor (PMv), cingulate motor (M3), and primary somatosensory (S1) cortex. Subjects engaged in a naturalistic paradigm while we simultaneously recorded from single cells across these regions and continuously tracked ongoing movement, including ethologically recognizable facial gestures. We then investigated single-cell properties and neural population activity within each face motor region to address two questions: How do cortical regions encode facial gestures, and how are these codes organized across space and time? RESULTS: Face motor regions across the cortex were broadly and equally involved in all gesture types, overturning a classic view that the medial cortex encodes emotional movements and the lateral cortex voluntary ones. Instead, every region contained both broadly tuned and gesture-specific cells. Analysis of neural population dynamics revealed that gesture category was decodable in all regions not only during movement but also well before its onset, with distinct activity trajectories underlying each gesture.
Neural activity, especially in motor and somatosensory cortex, predicted moment-by-moment facial kinematics during gesture production, yet neuron-neuron correlations were gesture specific despite shared effectors. Temporal generalization decoding revealed a hierarchy of coding strategies across regions: lateral motor and somatosensory cortices used highly dynamic codes that changed over time, consistent with real-time control of movement. Medial cingulate cortex used a temporally stable code that generalized across premovement and movement epochs, and the premotor cortex showed intermediate, moderately stable dynamics. CONCLUSION: Our findings establish that facial gesture production is supported by a distributed cortical network hierarchically organized by temporal dynamics, balancing dynamic and stable coding strategies. In facial gesture production, muscle kinematics are the end product of a larger sensorimotor process wherein cognitive variables, such as visual information, internal states, and somatosensory feedback, each play a role. Intimate cortical involvement in gesture production, along with the segregation of neural states before movement onset, indicates that facial gestures are not merely reflexive outputs but that they arise from the integration of contextual and sensory information prior to behavior. This temporal hierarchy in cortical codes for facial gestures may provide a mechanism by which a motor system integrates these cues while producing well-timed and interpretable motor outputs. In addition to refining long-standing models of facial motor control, this updated framework has translational implications: Understanding how cortical codes generate naturalistic communication may inform brain-computer interfaces designed to restore these functions in patients.
UR - https://www.scopus.com/pages/publications/105027671016
U2 - 10.1126/science.aea0890
DO - 10.1126/science.aea0890
M3 - Article
C2 - 41505546
AN - SCOPUS:105027671016
SN - 0036-8075
VL - 391
JO - Science
JF - Science
IS - 6781
M1 - eaea0890
ER -