Poster No:
795
Submission Type:
Abstract Submission
Authors:
Jiaxin Yan¹, Victoria-Anne Flood¹, Motoaki Sugiura², Hyeonjeong Jeong¹
Institutions:
¹Graduate School of International Cultural Studies, Tohoku University, Sendai, Miyagi; ²Department of Human Brain Science, IDAC, Tohoku University, Sendai, Miyagi
First Author:
Jiaxin Yan
Graduate School of International Cultural Studies, Tohoku University
Sendai, Miyagi
Co-Author(s):
Victoria-Anne Flood
Graduate School of International Cultural Studies, Tohoku University
Sendai, Miyagi
Motoaki Sugiura
Department of Human Brain Science, IDAC, Tohoku University
Sendai, Miyagi
Hyeonjeong Jeong
Graduate School of International Cultural Studies, Tohoku University
Sendai, Miyagi
Introduction:
Gestures and speech are not isolated modes of communication, but rather tightly integrated systems that work together to convey meaning and enhance comprehension (McNeill, 1992). Meta-analyses of fMRI studies (Cacciante et al., 2024; Yang et al., 2015) support this notion, highlighting the common neural networks implicated in gesture-speech integration, encompassing the middle temporal gyrus (MTG), inferior frontal gyrus (IFG), and left fusiform gyrus.
In second language (L2) research, Lin (2024) found that gestures enhance speech retention by offering supplementary visual cues that reinforce linguistic content and leave motor traces in memory. However, the brain mechanisms underlying gesture-speech integration in L2 contexts, and how gestures facilitate the retention of speech information, remain poorly understood. Our study investigated how gestures contribute to speech comprehension and retention in bilinguals, focusing on the neural mechanisms of gesture-speech integration in both first language (L1) and L2 contexts.
Methods:
Thirty-eight late bilinguals (native Japanese speakers with L2 English) participated. During fMRI scanning, participants watched five types of videos: (1) English with gestures (EnG), (2) English without gestures (EnNoG), (3) Japanese with gestures (JpG), (4) Japanese without gestures (JpNoG), and (5) filler videos. This 2 × 2 design allowed us to directly compare the impact of gestures on speech processing in the L1 and L2 contexts, as well as their interaction. Each video contained one spoken utterance with one target word, presented either with or without an accompanying gesture. Representational gestures were used to ensure semantic transparency, and audio quality was controlled for consistency. Participants were instructed to focus on understanding the speech; to ensure attentiveness, comprehension questions were included in approximately one-third of the trials. After the fMRI session, participants completed a recall task assessing speech retention in both Japanese and English, providing insight into comprehension and memory performance.
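As an illustration of the design logic, the following is a minimal sketch of the contrast vectors for the 2 × 2 (language × gesture) model. It is written in Python for readability rather than as the study's actual SPM specification, and the regressor ordering [EnG, EnNoG, JpG, JpNoG] is an assumption for this example.

```python
import numpy as np

# Assumed condition order in the first-level design matrix
# (filler videos omitted for brevity); this ordering is illustrative.
conditions = ["EnG", "EnNoG", "JpG", "JpNoG"]

# Main effect of gestures: (EnG + JpG) > (EnNoG + JpNoG)
c_gesture = np.array([1, -1, 1, -1])

# Language x gesture interaction: (EnG - EnNoG) > (JpG - JpNoG)
c_interaction = np.array([1, -1, -1, 1])

# Simple effects used for the separate correlational analyses
c_en_gesture = np.array([1, -1, 0, 0])   # EnG > EnNoG
c_jp_gesture = np.array([0, 0, 1, -1])   # JpG > JpNoG

for name, c in [("gesture main effect", c_gesture),
                ("interaction", c_interaction)]:
    print(f"{name}: {dict(zip(conditions, c.tolist()))}")
```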
Results:
We analyzed data from 35 participants, after excluding those with head movement ≥ 3 mm or comprehension accuracy < 70%. For the main effect of gestures, significant activation was observed in the posterior part of the right MTG, the left middle occipital gyrus, the left superior occipital gyrus, and the right hippocampus (cluster-level FWE-corrected p < 0.05). Although there was no significant interaction effect for (EnG > EnNoG) > (JpG > JpNoG), recall scores showed substantial individual differences, so correlational analyses were conducted separately for EnG > EnNoG and JpG > JpNoG. A significant positive correlation was observed between English-with-gesture recall scores and brain activation in the left posterior MTG (small-volume correction, FWE p < 0.05). In contrast, no significant positive correlation was found for Japanese-with-gesture recall scores.
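As a schematic of the brain-behavior analysis, the sketch below correlates per-participant recall scores with contrast estimates (e.g., EnG > EnNoG betas extracted from a left posterior MTG region of interest). The variable names and the randomly generated data are placeholders, not the study's actual values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 35

# Placeholder data: per-participant EnG > EnNoG contrast estimates
# from a left posterior MTG ROI, and English-with-gesture recall
# scores from the post-scan recall task (both simulated here).
mtg_contrast = rng.normal(size=n_participants)
recall_eng = 0.5 * mtg_contrast + rng.normal(size=n_participants)

# Pearson correlation between brain activation and recall performance
r, p = stats.pearsonr(mtg_contrast, recall_eng)
print(f"r = {r:.2f}, p = {p:.3f}")
```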
Conclusions:
Our findings align with previous literature demonstrating the common brain networks involved in gesture-speech integration in both L1 and L2. Notably, they also reveal a novel role of the left posterior MTG in enhancing speech retention when gestures accompany L2 speech. This area, critical for gesture-speech integration, appears to support encoding, retrieval, and other processes essential for retaining linguistic information in L2.
Language:
Language Acquisition 1
Modeling and Analysis Methods:
Activation (e.g., BOLD task-fMRI)
Neuroanatomy, Physiology, Metabolism and Neurotransmission:
Anatomy and Functional Systems
Perception, Attention and Motor Behavior:
Perception: Multisensory and Crossmodal 2
Keywords:
Acquisition
Cognition
FUNCTIONAL MRI
Language
Perception
Other - gesture; multimodal; second language acquisition
¹|² Indicates the priority used for review
By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio print and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.
I accept
The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a "source" party and a "reproducing" party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables.
Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:
I do not want to participate in the reproducibility challenge.
Please indicate below if your study was a "resting state" or "task-activation" study.
Task-activation
Healthy subjects only or patients (note that patient studies may also involve healthy subjects):
Healthy subjects
Was this research conducted in the United States?
No
Were any human subjects research approved by the relevant Institutional Review Board or ethics panel?
NOTE: Any human subjects studies without IRB approval will be automatically rejected.
Yes
Were any animal research approved by the relevant IACUC or other animal research panel?
NOTE: Any animal studies without IACUC approval will be automatically rejected.
Not applicable
Please indicate which methods were used in your research:
Functional MRI
Behavior
For human MRI, what field strength scanner do you use?
3.0T
Which processing packages did you use for your study?
SPM
Provide references using APA citation style.
Cacciante, L., et al. (2024). Language and gesture neural correlates: A meta-analysis of functional magnetic resonance imaging studies. International Journal of Language & Communication Disorders, 59(3), 902–912.
Lin, Y. L. (2024). Gestures as scaffolding for L2 narrative recall: The role of gesture type, task complexity, and working memory. Language Teaching Research, 28(6), 2059–2081.
McNeill, D. (1992). Hand and mind: What gestures reveal about thought. University of Chicago Press.
Yang, J., et al. (2015). The neural basis of hand gesture comprehension: A meta-analysis of functional magnetic resonance imaging studies. Neuroscience & Biobehavioral Reviews, 57, 88–104.