Exploring Neural Responses to Emotional Semantic and Prosodic Tasks: An fNIRS Study

Poster No:

624 

Submission Type:

Late-Breaking Abstract Submission 

Authors:

Jing Qi1, Yujia Tian1, Gang Peng1

Institutions:

1The Hong Kong Polytechnic University, Hong Kong, China

First Author:

Jing Qi  
The Hong Kong Polytechnic University
Hong Kong, China

Co-Author(s):

Yujia Tian  
The Hong Kong Polytechnic University
Hong Kong, China
Gang Peng  
The Hong Kong Polytechnic University
Hong Kong, China

Introduction:

Emotional expressions in speech are important for social interactions. Understanding a speaker's emotions depends on multiple sources of information, including the prosody and the semantics of the speech. Previous studies have focused on the prosody of emotional speech (Bach et al., 2008; Frühholz et al., 2012; Lei et al., 2021), whereas the comparison between prosodic and semantic processing has received less attention. In addition, the differences in the brain regions involved in understanding different emotions remain underexplored. This study used fNIRS to investigate how speech conveying four basic emotions (angry, fearful, happy, sad) is processed in the cortex under a prosodic task and a semantic task.

Methods:

We recruited 20 native Mandarin speakers (9 male, 23.7 ± 3.45 y) for an experiment with a 4 (emotion: angry vs. fearful vs. happy vs. sad) × 2 (task: prosody vs. semantics) within-subjects design. The experimental materials were spoken sequences of Mandarin disyllabic words. In the semantic task, each of the four types of emotional stimuli consisted of semantically emotional words (4-7 words, all conveying the same emotion within a stimulus) and semantically neutral words (1-5 words), all produced in neutral prosody. In the prosodic task, each stimulus consisted of semantically neutral words produced in one of the four emotional prosodies (4-7 words) and in neutral prosody (1-5 words). Words were separated by about 300-500 ms of silence. In total, 64 stimuli (duration: 10 ± 0.7 s) were generated, 8 per condition. In both tasks, subjects were asked to count the words in the relevant channel (prosody or semantics) and answer the question "Are there at least 3 more emotional words than neutral words in prosody (or semantics)?" Subjects responded 12 s after sound onset, followed by a rest period of random length (12-15 s). The stimuli were presented in random order.
We recorded brain activity using a NIRSport2 system (NIRx Medical Technology) at a sampling rate of 10.2 Hz. The 16 emitters and 23 detectors formed 40 effective observation channels and 8 short channels (see Figure 1). The Δ[HbO] data were used as the physiological indicator. A general linear model (GLM) was used to estimate the task-related β value for each condition, with the averaged short-channel signal included as a covariate. In the block design, the block length equaled the duration of the stimulus, and the regressors were convolved with the canonical hemodynamic response function (HRF). A repeated-measures analysis of variance (ANOVA) was then conducted on the β values for each channel.
Supporting Image: Layout.png
   ·Figure1: Layout of fNIRS channels and optodes.
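For illustration, the following is a minimal sketch of the channel-wise GLM step described above: a boxcar regressor whose length equals the stimulus duration is convolved with a canonical double-gamma HRF and fitted together with the averaged short-channel signal as a nuisance covariate. The NumPy/SciPy implementation and all names here (double_gamma_hrf, block_regressor, glm_betas) are assumptions made for this sketch; the actual analysis was run with SPM, Homer3, and NIRS-KIT.

```python
# Illustrative sketch only; not the toolboxes used in the study (SPM / Homer3 / NIRS-KIT).
import numpy as np
from scipy.stats import gamma

FS = 10.2  # fNIRS sampling rate (Hz)

def double_gamma_hrf(fs, duration=32.0):
    """Canonical double-gamma HRF sampled at the fNIRS rate (assumed form)."""
    t = np.arange(0, duration, 1.0 / fs)
    h = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0  # peak ~6 s, undershoot ~16 s
    return h / h.sum()

def block_regressor(n_samples, onsets_s, durations_s, fs):
    """Boxcar per block (block length = stimulus duration), convolved with the HRF."""
    boxcar = np.zeros(n_samples)
    for onset, dur in zip(onsets_s, durations_s):
        boxcar[int(onset * fs):int((onset + dur) * fs)] = 1.0
    return np.convolve(boxcar, double_gamma_hrf(fs))[:n_samples]

def glm_betas(hbo, onsets_s, durations_s, short_channel_mean, fs=FS):
    """Ordinary-least-squares GLM per long channel.

    hbo: (n_samples, n_channels) delta-HbO time series
    short_channel_mean: (n_samples,) averaged short-channel signal (nuisance covariate)
    Returns the task-related beta for each channel.
    """
    n = hbo.shape[0]
    task = block_regressor(n, onsets_s, durations_s, fs)
    X = np.column_stack([task, short_channel_mean, np.ones(n)])  # design matrix
    betas, *_ = np.linalg.lstsq(X, hbo, rcond=None)
    return betas[0]  # row 0 corresponds to the task regressor
```

In practice one such β is obtained per subject, condition, and channel, and these values feed the repeated-measures ANOVA described above.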
 

Results:

The F-values from the repeated-measures ANOVA on the β values are shown in Figure 2. We observed a significant task effect in channels 34 (F(1,19) = 6.69, p = 0.018) and 39 (F(1,19) = 21.46, p < 0.001), with the prosodic task eliciting more cortical activity. A main effect of emotion appeared in the left Broca's area, STG, MTG, and SMG, and in the right Broca's area and right SMG (channels 5, 11, 14, 17, 29, and 42). Multiple comparisons showed that the left MTG (channel 17) differed significantly between happy and fearful stimuli (p = 0.03). The interaction effect was significant at channel 41 (F(3,57) = 2.83, p = 0.046), located in the right supramarginal gyrus. Activity in this area was significantly lower for sad than for the other emotions in the prosodic task, but significantly higher for sad than for happy in the semantic task.
Supporting Image: beta.png
   ·Figure2: Images of activation of brain regions.
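The per-channel F-values above come from the 4 (emotion) × 2 (task) repeated-measures ANOVA on the β values. As a hedged illustration of this step, the sketch below uses statsmodels' AnovaRM on a long-format table; the column names (subject, channel, emotion, task, beta) are hypothetical, and the original analysis may have been run in a different statistics package.

```python
# Illustrative per-channel repeated-measures ANOVA on GLM beta values.
# Column names (subject, channel, emotion, task, beta) are hypothetical.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

def channel_anova(df: pd.DataFrame, channel: int) -> pd.DataFrame:
    """4 (emotion) x 2 (task) within-subjects ANOVA for a single channel."""
    sub = df[df["channel"] == channel]
    res = AnovaRM(sub, depvar="beta", subject="subject",
                  within=["emotion", "task"]).fit()
    return res.anova_table  # F value, num/den df, and p for each main effect and the interaction
```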
 

Conclusions:

This study is an initial attempt to characterize the integrated processing of semantics and prosody in emotional speech. It showed that the emotional prosodic task elicited stronger activation than the semantic task in Broca's and Wernicke's areas of the right hemisphere. There were main effects of emotion in the left Broca's area, STG, MTG, and SMG, and in the right Broca's area and right SMG. Previous studies have indicated that these regions are essential for the processing of speech and emotional prosody. The right SMG showed different activation patterns across emotions depending on the task, with sad speech eliciting stronger activation there during the semantic task.

Emotion, Motivation and Social Neuroscience:

Emotional Perception 1

Language:

Speech Perception 2

Novel Imaging Acquisition Methods:

NIRS

Perception, Attention and Motor Behavior:

Attention: Auditory/Tactile/Motor
Perception: Auditory/ Vestibular

Keywords:

Emotions
Near Infra-Red Spectroscopy (NIRS)
Perception

1|2Indicates the priority used for review

Abstract Information

By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio print and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.

I accept

The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a "source" party and a "reproducing" party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables. Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:

I do not want to participate in the reproducibility challenge.

Please indicate below if your study was a "resting state" or "task-activation" study.

Task-activation

Healthy subjects only or patients (note that patient studies may also involve healthy subjects):

Healthy subjects

Was this research conducted in the United States?

No

Were any human subjects research approved by the relevant Institutional Review Board or ethics panel? NOTE: Any human subjects studies without IRB approval will be automatically rejected.

Yes

Were any animal research approved by the relevant IACUC or other animal research panel? NOTE: Any animal studies without IACUC approval will be automatically rejected.

Not applicable

Please indicate which methods were used in your research:

Optical Imaging
Behavior

Which processing packages did you use for your study?

SPM
Other, Please list  -   Homer3, NIRS-KIT

Provide references using APA citation style.

Bach, D. R., Grandjean, D., Sander, D., Herdener, M., Strik, W. K., & Seifritz, E. (2008). The effect of appraisal level on processing of affective prosody in meaningless speech. Neuroimage, 42(2), 919–927.
Frühholz, S., Ceravolo, L., & Grandjean, D. (2012). Specific brain networks during explicit and implicit decoding of affective prosody. Cerebral Cortex, 22(5), 1107–1117.
Hou, X., Zhang, Z., Zhao, C., Duan, L., Gong, Y., Li, Z., & Zhu, C. (2021). NIRS-KIT: A MATLAB toolbox for both resting-state and task fNIRS data analysis. Neurophotonics, 8(1), 010802.
Huppert, T., Diamond, S., Franceschini, M., & Boas, D. (2009). HomER: A review of time-series analysis methods for near-infrared spectroscopy of the brain. Applied Optics, 48(10). https://dx.doi.org/10.1364/ao.48.00d280
Lei, Z., Bi, R., Mo, L., Yu, W., & Zhang, D. (2021). The brain mechanism of explicit and implicit processing of emotional prosodies: An fNIRS study. Acta Psychologica Sinica, 53(1), 15–25. https://doi.org/10.3724/SP.J.1041.2021.00015

UNESCO Institute of Statistics and World Bank Waiver Form

I attest that I currently live, work, or study in a country on the UNESCO Institute of Statistics and World Bank List of Low and Middle Income Countries list provided.

No