Neural Mechanisms of Emotion Authenticity Recognition: An fMRI Study with Dynamic Facial Stimuli

Presented During:

Saturday, June 28, 2025: 11:30 AM - 12:45 PM
Brisbane Convention & Exhibition Centre  
Room: M3 (Mezzanine Level)  

Poster No:

617 

Submission Type:

Abstract Submission 

Authors:

Alexa Schincariol1, Camilla Frangi2, Cristiano Costa1, Giulia Melis2, Nicola Filippini3, Rachele Pezzetta3, Cristina Scarpazza2

Institutions:

1Padova Neuroscience Center, University of Padua, Padova, Italy, 2Department of General Psychology, University of Padua, Padova, Italy, 3IRCCS Ospedale San Camillo, Venezia, Italy

First Author:

Alexa Schincariol  
Padova Neuroscience Center, University of Padua
Padova, Italy

Co-Author(s):

Camilla Frangi  
Department of General Psychology, University of Padua
Padova, Italy
Cristiano Costa  
Padova Neuroscience Center, University of Padua
Padova, Italy
Giulia Melis  
Department of General Psychology, University of Padua
Padova, Italy
Nicola Filippini  
IRCCS Ospedale San Camillo
Venezia, Italy
Rachele Pezzetta  
IRCCS Ospedale San Camillo
Venezia, Italy
Cristina Scarpazza  
Department of General Psychology, University of Padua
Padova, Italy

Introduction:

The ability to discern genuine from posed emotional expressions is crucial for social interactions, as it underpins trust, empathy, and relationship-building (Lange et al., 2022; Van Kleef & Côté, 2022). Facial expressions serve as a universal language of emotion (Ekman, 1993), but they can be deliberately manipulated, complicating interpretation (Crivelli et al., 2015). Misjudging posed expressions as genuine can result in adverse social consequences, including misunderstandings and reduced emotional well-being (Miles & Johnston, 2007). This highlights the importance of accurately interpreting emotion authenticity. Despite its relevance, research in this area has been limited, with an overreliance on posed stimuli that lack ecological validity (Dawel et al., 2017). This study leverages dynamic stimuli to examine the neural mechanisms underlying emotion authenticity recognition, addressing critical methodological gaps (Namba et al., 2018; Zinchenko et al., 2018).

Methods:

Thirty-one participants (15 females; mean age = 25.16 ± 4.47 years) completed a task-based fMRI study at the IRCCS San Camillo Hospital in Venice, Italy. Stimuli (n = 36) were selected via a pilot study from the Padova Emotional Dataset of Facial Expressions (PEDFE; Miolla et al., 2023) and comprised videos of genuine and simulated expressions of happiness, disgust, and fear. Videos were presented in fully randomized order in an event-related design, and participants used a two-button box to classify each expression as genuine or simulated; each participant completed three runs of the task. Scanning was conducted on a 3T Ingenia scanner with T2*-weighted EPI sequences for task-based fMRI (TR = 1000 ms, TE = 23 ms) and diffusion images for distortion correction. Structural scans were acquired with a high-resolution 3D T1-weighted MPRAGE sequence (TR = 8.1 ms, TE = 3.7 ms). Behavioral performance was analyzed with a repeated-measures ANOVA. fMRI data were preprocessed and analyzed in FSL using a three-level approach: the first two levels modeled data within each subject, and the third combined data across subjects (group-level analysis, FWE-corrected). Segmented structural data were included as covariates to account for structural influences on BOLD differences.
Supporting Image: Figure_1.png
   ·Schematic representation of the experimental paradigm and timeline of two example trials.
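The behavioral analysis described above (a 3 emotion × 2 authenticity repeated-measures ANOVA on classification accuracy) can be sketched as follows. This is a minimal illustration with simulated data, not the study's actual pipeline: the column names (`subject`, `emotion`, `authenticity`, `accuracy`) and accuracy values are assumptions chosen only to show the design's structure.

```python
# Sketch of a 3 x 2 repeated-measures ANOVA on per-condition accuracy,
# assuming one accuracy score per subject per cell (simulated data).
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
emotions = ["happiness", "disgust", "fear"]
authenticities = ["genuine", "simulated"]

# One illustrative accuracy value per subject (n = 31) per condition cell.
rows = []
for subject in range(31):
    for emotion in emotions:
        for authenticity in authenticities:
            rows.append({
                "subject": subject,
                "emotion": emotion,
                "authenticity": authenticity,
                "accuracy": float(np.clip(rng.normal(0.7, 0.1), 0, 1)),
            })
df = pd.DataFrame(rows)

# Within-subject ANOVA: main effects of emotion and authenticity,
# plus their interaction (Num DF = 2, Den DF = 60 with 31 subjects).
res = AnovaRM(df, depvar="accuracy", subject="subject",
              within=["emotion", "authenticity"]).fit()
print(res)
```

With 31 subjects, 3 emotions, and 2 authenticity levels, this design yields the degrees of freedom reported in the Results (e.g., df = 2, 60 for the interaction term).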
 

Results:

Behavioral results revealed a main effect of authenticity (F(1,30) = 17.58, p < .001) and an emotion × authenticity interaction (F(2,60) = 27.83, p < .001). Participants showed higher recognition accuracy for posed disgust and fear compared to their genuine counterparts, whereas the reverse was true for happiness. fMRI analyses identified increased activation for genuine versus simulated expressions in the right lateral occipital cortex, superior temporal gyrus, inferior frontal gyrus, and posterior cingulate cortex (Z > 3.1, p < 0.05 FWE-corrected). Additional activation was observed in the angular gyrus and cerebellum (Z > 3.1, p < 0.05 FWE-corrected).
Supporting Image: Figure_2.png
   ·fMRI results: Increased activation (red-yellow) and BOLD signal changes for genuine (orange), simulated (green), and neutral (white) expressions, averaged across runs and emotions. R: right; L: left.
 

Conclusions:

The findings underscore the importance of using dynamic, genuine stimuli to investigate emotion recognition, rather than static posed stimuli, which lack ecological validity and fail to capture the temporal dynamics of emotional expressions (Zinchenko et al., 2018). The observed activation patterns support the hypothesis that perceiving authentic emotional displays recruits a neural network at least partly distinct from the one engaged by posed facial expressions, including areas involved in facial and affective processing. Activations in regions associated with social cognition (angular gyrus and cerebellum) align with prior evidence that dynamic stimuli enhance emotion processing (Dobs et al., 2018; Krumhuber et al., 2021). This work advances knowledge of how the brain differentiates genuine from posed emotions, with implications for clinical populations with social cognition impairments, such as those with autism or schizophrenia (Meyer-Lindenberg & Tost, 2012).

Emotion, Motivation and Social Neuroscience:

Emotional Perception 1
Social Cognition

Modeling and Analysis Methods:

Activation (eg. BOLD task-fMRI) 2

Novel Imaging Acquisition Methods:

Anatomical MRI
BOLD fMRI

Keywords:

Cognition
Emotions
Experimental Design
FUNCTIONAL MRI
MRI
Perception
Social Interactions
STRUCTURAL MRI
Other - emotion authenticity

1|2 indicates the priority used for review

Abstract Information

By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio print and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.

I accept

The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a “source” party and a “reproducing” party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables. Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:

I am submitting this abstract as an original work to be reproduced. I am available to be the “source party” in an upcoming team and consent to have this work listed on the OSSIG website. I agree to be contacted by OSSIG regarding the challenge and may share data used in this abstract with another team.

Please indicate below if your study was a "resting state" or "task-activation" study.

Task-activation

Healthy subjects only or patients (note that patient studies may also involve healthy subjects):

Healthy subjects

Was this research conducted in the United States?

No

Were any human subjects research approved by the relevant Institutional Review Board or ethics panel? NOTE: Any human subjects studies without IRB approval will be automatically rejected.

Yes

Were any animal research approved by the relevant IACUC or other animal research panel? NOTE: Any animal studies without IACUC approval will be automatically rejected.

Not applicable

Please indicate which methods were used in your research:

Functional MRI
Structural MRI

For human MRI, what field strength scanner do you use?

3.0T

Which processing packages did you use for your study?

FSL

Provide references using APA citation style.

1. Crivelli, C., Carrera, P., & Fernández-Dols, J. M. (2015). Are smiles a sign of happiness? Spontaneous expressions of judo winners. Evolution and Human Behavior, 36(1), 52-58.
2. Dobs, K., Bülthoff, I., & Schultz, J. (2018). Use and usefulness of dynamic face stimuli for face perception studies - A review of behavioral findings and methodology. Frontiers in Psychology, 9, 1355.
3. Ekman, P. (1993). Facial expression and emotion. American Psychologist, 48(4), 384-392.
4. Krumhuber, E. G., Küster, D., Namba, S., & Skora, L. (2021). Human and machine validation of 14 databases of dynamic facial expressions. Behavior Research Methods, 53, 686-701.
5. Meyer-Lindenberg, A., & Tost, H. (2012). Neural mechanisms of social risk for psychiatric disorders. Nature Neuroscience, 15(5), 663-668.
6. Miles, L., & Johnston, L. (2007). Detecting happiness: Perceiver sensitivity to enjoyment and non-enjoyment smiles. Journal of Nonverbal Behavior, 31, 259-275.
7. Miolla, A., Cardaioli, M., & Scarpazza, C. (2023). Padova Emotional Dataset of Facial Expressions (PEDFE): A unique dataset of genuine and posed emotional facial expressions. Behavior Research Methods, 55(5), 2559-2574.
8. Namba, S., Kabir, R. S., Miyatani, M., & Nakao, T. (2018). Dynamic displays enhance the ability to discriminate genuine and posed facial expressions of emotion. Frontiers in Psychology, 9, 672.
9. Van Kleef, G. A., & Côté, S. (2022). The social effects of emotions. Annual Review of Psychology, 73(1), 629-658.
10. Zinchenko, O., Yaple, Z. A., & Arsalidou, M. (2018). Brain responses to dynamic facial expressions: A normative meta-analysis. Frontiers in Human Neuroscience, 12, 227.

UNESCO Institute of Statistics and World Bank Waiver Form

I attest that I currently live, work, or study in a country on the UNESCO Institute of Statistics and World Bank List of Low and Middle Income Countries list provided.

No