Brain correlates of orofacial movements induced by naturalistic audiovisual stimuli

Poster No:

1689 

Submission Type:

Abstract Submission 

Authors:

Changjin Jung1, Sunyoung Choi1, Hyungjun Kim1

Institutions:

1Korea Institute of Oriental Medicine, Daejeon, South Korea

First Author:

Changjin Jung  
Korea Institute of Oriental Medicine
Daejeon, South Korea

Co-Author(s):

Sunyoung Choi  
Korea Institute of Oriental Medicine
Daejeon, South Korea
Hyungjun Kim  
Korea Institute of Oriental Medicine
Daejeon, South Korea

Introduction:

Facial expressions are known to reveal emotional states and play an important role in social communication (Darwin & Ekman, 1998; Fridlund, 1994). Facial muscles are structurally connected with motor nuclei in the brainstem (Cattaneo & Pavesi, 2014) and with cortical regions (Dobson & Sherwood, 2011; Sherwood et al., 2004), but the specific mechanisms that give rise to different facial expressions remain unclear. In line with Darwin's principles, facial expressions may have originated in biological function (Darwin & Ekman, 1998), such as altering sensory afference (Susskind et al., 2008). Neural circuitries regulating biological function may distinctively engage the facial muscles and thereby reveal different affective properties. In rodents, facial movements (i.e., orofacial reactions) are elicited in response to stimuli with hedonic or aversive valence (Berridge, 2000; Dolensek et al., 2020). Here, we conducted a pilot study to investigate the time-series correlations between brain activity and orofacial movements induced by naturalistic audiovisual stimuli of differing emotional valence.

Methods:

We acquired fMRI scans (MAGNETOM Prisma 3.0T; TR/TE = 1000/30 ms; voxel size = 2.0 x 2.0 x 3.5 mm3; matrix = 110 x 110 x 39 voxels) from healthy participants (n = 4) while presenting naturalistic audiovisual stimuli. The stimuli consisted of 30-second blocks of positive (joyful and neutral) and negative (sad and anxiety-inducing) emotional valence, each followed by a 10-second fixation cross. An MRI-compatible video camera (12M, MRC Systems GmbH) recorded orofacial movements during the fMRI scan. To build a prediction model for assessing orofacial movements, we recorded whole-face images during the same naturalistic audiovisual stimuli in a similar environment within the MRI bore after the scan session. From these facial images, we calculated scores for 12 action units related to orofacial movements (iMotions 7.0), based on the Facial Action Coding System (Ekman & Rosenberg, 1997). We then fine-tuned the pre-trained MobileNetV2 network to predict the scores of these 12 action units from partial images focused on the orofacial area (prediction error < 10% of the score ranges). The trained model was applied to the images recorded during the fMRI scan, yielding time series for the 12 action units. Finally, we performed principal component (PC) analysis on these action-unit time series; a sketch of this pipeline is shown below.
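As a rough illustration of the action-unit prediction and dimensionality-reduction steps, the following sketch fine-tunes a Keras MobileNetV2 backbone with a 12-unit regression head and then applies PCA to the predicted time series. The file names, image size, and training settings are illustrative assumptions, not the study's actual configuration.

```python
# Minimal sketch of the action-unit (AU) regression step, assuming a
# TensorFlow/Keras workflow; file names, image size, and hyperparameters
# are illustrative, not the settings used in the study.
import numpy as np
import tensorflow as tf
from sklearn.decomposition import PCA

# Hypothetical training data: cropped orofacial frames with iMotions AU scores.
# x_train: (n_frames, 224, 224, 3); y_train: (n_frames, 12).
x_train = np.load("orofacial_crops.npy")
y_train = np.load("au_scores.npy")

# Pre-trained MobileNetV2 backbone with a new 12-unit regression head.
backbone = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
backbone.trainable = False  # train only the head first; unfreeze to fine-tune

inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
x = backbone(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(12)(x)  # linear outputs: one score per AU
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")
model.fit(x_train, y_train, epochs=10, batch_size=32, validation_split=0.2)

# Apply the trained model to frames recorded during the scan, then reduce
# the 12 AU time series to principal components.
scan_frames = np.load("scan_crops.npy")          # (n_frames, 224, 224, 3)
au_timeseries = model.predict(scan_frames)       # (n_frames, 12)
pca = PCA(n_components=2)
pc_timeseries = pca.fit_transform(au_timeseries)
print("explained variance:", pca.explained_variance_ratio_.sum())
```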
BOLD images were preprocessed with motion correction and spatial normalization to MNI space. A psychophysiological interaction (PPI) analysis was performed voxelwise across the whole brain (physiological regressor: PC time series; psychological context: stimulus protocol). The group-average PPI map was thresholded at z > 3.28 and cluster-corrected (FWE-corrected p < 0.01).
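For illustration, the interaction regressor at the core of a PPI model is the product of the centered psychological regressor and the physiological time course. Below is a minimal numpy sketch under assumed block timing (30 s stimulus + 10 s fixation at TR = 1 s); it omits the HRF convolution and confound modeling that a full analysis in a package such as FSL FEAT would include, and the file names are hypothetical.

```python
# Minimal sketch of the PPI design, assuming the PC time series has already
# been resampled to TR resolution; a real analysis would also convolve with
# the HRF and model confounds.
import numpy as np

n_trs = 600                                # assumed run length (TR = 1 s)

# Psychological context: stimulus blocks of interest coded 1, fixation 0.
# Block onsets here assume a 30 s stimulus + 10 s fixation cycle.
psy = np.zeros(n_trs)
for onset in range(0, n_trs, 40):
    psy[onset:onset + 30] = 1.0
psy -= psy.mean()                          # center so the interaction is interpretable

# Physiological regressor: e.g., the PC1 time series (assumed file).
phys = np.load("pc1_timeseries.npy")
phys = (phys - phys.mean()) / phys.std()

ppi = psy * phys                           # the interaction regressor

# Voxelwise GLM: Y (n_trs x n_voxels) on [psy, phys, ppi, intercept].
Y = np.load("bold_voxels.npy")             # assumed file
X = np.column_stack([psy, phys, ppi, np.ones(n_trs)])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
ppi_effect = beta[2]                       # interaction betas across voxels
```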

Results:

We identified two PCs that together explained 99% of the total variance in orofacial movements during the naturalistic audiovisual stimuli. PC1 was related to an orofacial movement involving mouth opening, while PC2 was related to an orofacial movement involving lip pressing and lip-corner depression. ANOVA revealed that PC1 scores were higher for joyful stimuli than for neutral, sad, and anxiety-inducing stimuli (P < 0.01). In contrast, PC2 scores were higher for anxiety-inducing stimuli than for neutral and sad stimuli (P < 0.05).
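A minimal sketch of this condition-wise comparison, assuming one block-averaged PC1 score per stimulus block (file names and the uncorrected post-hoc tests are illustrative assumptions):

```python
# Sketch of the condition-wise comparison of PC scores; names hypothetical.
import numpy as np
from scipy import stats

pc1_joy = np.load("pc1_joyful_blocks.npy")      # assumed files: one score
pc1_neu = np.load("pc1_neutral_blocks.npy")     # per stimulus block
pc1_sad = np.load("pc1_sad_blocks.npy")
pc1_anx = np.load("pc1_anxiety_blocks.npy")

f, p = stats.f_oneway(pc1_joy, pc1_neu, pc1_sad, pc1_anx)
print(f"one-way ANOVA: F = {f:.2f}, p = {p:.4f}")

# Post-hoc pairwise contrasts (uncorrected here, for illustration only).
for name, other in [("neutral", pc1_neu), ("sad", pc1_sad), ("anxiety", pc1_anx)]:
    t, p = stats.ttest_ind(pc1_joy, other)
    print(f"joyful vs {name}: t = {t:.2f}, p = {p:.4f}")
```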
The group map for the PPI between PC1 and joyful stimuli demonstrated activations in regions including the visual cortex/cerebellum (peak voxel: -20, -94, 10 mm) and left temporal pole (peak voxel: -50, 8, -20 mm), as well as deactivations in regions including the anterior cingulate cortex/orbitofrontal cortex (peak voxel: 12, 36, -20 mm). In contrast, the group map for the PPI between PC2 and anxiety-inducing stimuli demonstrated activations in regions including the right putamen (peak voxel: 26, 0, 0 mm), left putamen (peak voxel: -26, -6, 8 mm), periaqueductal gray (PAG; peak voxel: 0, -38, -16 mm), and supplementary motor area (SMA; peak voxel: 14, 0, 68 mm).

Conclusions:

Our findings suggest that the brain may engage distinct mechanisms to control orofacial movements elicited by stimuli of different emotional valence.

Emotion, Motivation and Social Neuroscience:

Emotional Perception 2
Emotion and Motivation Other

Motor Behavior:

Motor Planning and Execution
Motor Behavior Other 1

Keywords:

Emotions
Motor
Other - facial expressions

1|2 indicates the priority used for review.


Please indicate below if your study was a "resting state" or "task-activation" study.

Task-activation

Healthy subjects only or patients (note that patient studies may also involve healthy subjects):

Healthy subjects

Was this research conducted in the United States?

No

Was any human subjects research approved by the relevant Institutional Review Board or ethics panel? NOTE: Any human subjects studies without IRB approval will be automatically rejected.

Yes

Was any animal research approved by the relevant IACUC or other animal research panel? NOTE: Any animal studies without IACUC approval will be automatically rejected.

Not applicable

Please indicate which methods were used in your research:

Functional MRI

For human MRI, what field strength scanner do you use?

3.0T

Which processing packages did you use for your study?

FSL
FreeSurfer

Provide references using APA citation style.

1. Berridge, K. C. (2000). Measuring hedonic impact in animals and infants: microstructure of affective taste reactivity patterns. Neurosci Biobehav Rev, 24(2), 173-198. doi:10.1016/s0149-7634(99)00072-x
2. Cattaneo, L., & Pavesi, G. (2014). The facial motor system. Neurosci Biobehav Rev, 38, 135-159. doi:10.1016/j.neubiorev.2013.11.002
3. Darwin, C., & Ekman, P. (1998). The expression of the emotions in man and animals (3rd ed.). New York: Oxford University Press.
4. Dobson, S. D., & Sherwood, C. C. (2011). Correlated evolution of brain regions involved in producing and processing facial expressions in anthropoid primates. Biol Lett, 7(1), 86-88. doi:10.1098/rsbl.2010.0427
5. Dolensek, N., Gehrlach, D. A., Klein, A. S., & Gogolla, N. (2020). Facial expressions of emotion states and their neuronal correlates in mice. Science, 368(6486), 89-94. doi:10.1126/science.aaz9468
6. Ekman, P., & Rosenberg, E. L. (Eds.). (1997). What the face reveals: Basic and applied studies of spontaneous expression using the Facial Action Coding System (FACS). Oxford University Press.
7. Fridlund, A. J. (1994). Human facial expression: An evolutionary view. London, England: Academic Press.
8. Sherwood, C. C., Holloway, R. L., Erwin, J. M., Schleicher, A., Zilles, K., & Hof, P. R. (2004). Cortical orofacial motor representation in Old World monkeys, great apes, and humans. I. Quantitative analysis of cytoarchitecture. Brain Behav Evol, 63(2), 61-81. doi:10.1159/000075672
9. Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nat Neurosci, 11(7), 843-850. doi:10.1038/nn.2138
