Enhanced inter-subject synchrony promotes phenotype prediction in naturalistic conditions

Presented During:

Tuesday, June 25, 2024: 12:00 PM - 1:15 PM
COEX  
Room: ASEM Ballroom 202  

Poster No:

1387 

Submission Type:

Abstract Submission 

Authors:

Xuan Li1, Simon Eickhoff1, Susanne Weis1

Institutions:

1Institute of Neuroscience and Medicine (INM-7), Research Centre Jülich, Jülich, North Rhine-Westphalia, Germany

First Author:

Xuan Li  
Institute of Neuroscience and Medicine (INM-7), Research Centre Jülich
Jülich, North Rhine-Westphalia, Germany

Co-Author(s):

Simon Eickhoff  
Institute of Neuroscience and Medicine (INM-7), Research Centre Jülich
Jülich, North Rhine-Westphalia, Germany
Susanne Weis  
Institute of Neuroscience and Medicine (INM-7), Research Centre Jülich
Jülich, North Rhine-Westphalia, Germany

Introduction:

Recent studies have suggested that naturalistic stimuli, such as movie clips, outperform rest and conventional tasks for phenotype prediction [1, 2]. Despite their promise, the impact of stimulus selection on phenotype prediction remains largely unclear: most existing naturalistic-condition datasets lack sufficient justification for the specific stimuli chosen, and many studies to date have relied on a single stimulus. Here, we investigate the impact of stimulus selection on phenotype prediction from two perspectives, namely brain states (inter-subject synchrony) [3] and stimulus features. We focus on the paradigmatic case of sex classification because of the robust and well-established nature of the brain-sex relationship.

Methods:

We used preprocessed fMRI data from 178 subjects watching 13 different short movie clips, provided by the Human Connectome Project. All fMRI data were truncated to 132 TRs (i.e., 2:12 min) to allow a direct comparison across movie clips. For sex classification, we applied our previously proposed approach [1], which allows phenotype prediction from evoked activity. Within each of 436 parcels [4], shared responses were identified as principal components (PCs) of the fMRI time series across subjects, and subject-specific loadings onto these shared responses were computed. These loadings (here, only the PC1 loadings) served as features for classification with a support vector machine with a radial basis function kernel [4]. Performance was quantified as the average balanced accuracy over 10 repetitions of 10-fold cross-validation. Family structure was controlled for, and data leakage was prevented during all phases of the procedure. Group-level brain states were characterised by inter-subject synchrony, quantified as the variance explained by PC1 (the shared response) for each parcel and movie clip, with a larger amount of explained variance indicating stronger inter-subject synchrony. A variety of movie features were extracted and analysed, including total motion energy, visual brightness, loudness, the number of TRs showing human faces, the number of spoken words, and semantic features.
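To make the feature-construction step concrete, the following is a minimal sketch (in Python, not the authors' actual pipeline) of how parcel-wise PC1 loadings and inter-subject synchrony could be computed and fed into the classifier. The array name `data`, the per-subject z-scoring, and the plain stratified cross-validation are illustrative assumptions; the original analysis additionally respected family structure when splitting the data.

# Minimal sketch, not the authors' code: PC1 loadings and inter-subject synchrony
# per parcel, followed by sex classification with an RBF-kernel SVM.
# Assumes `data` has shape (n_subjects, n_timepoints, n_parcels) and `sex` is a
# binary label vector; both names are hypothetical.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

def pc1_loadings_and_synchrony(data):
    n_subj, n_time, n_parc = data.shape
    loadings = np.zeros((n_subj, n_parc))
    synchrony = np.zeros(n_parc)
    for p in range(n_parc):
        X = data[:, :, p]                                               # subjects x time
        X = (X - X.mean(1, keepdims=True)) / X.std(1, keepdims=True)    # z-score per subject
        U, S, Vt = np.linalg.svd(X, full_matrices=False)
        loadings[:, p] = U[:, 0] * S[0]                                 # subject-specific PC1 loadings
        synchrony[p] = S[0] ** 2 / np.sum(S ** 2)                       # variance explained by PC1
    return loadings, synchrony

loadings, synchrony = pc1_loadings_and_synchrony(data)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)
scores = cross_val_score(clf, loadings, sex, cv=cv, scoring="balanced_accuracy")
print("Mean balanced accuracy:", scores.mean())

In this sketch, the first right singular vector of each parcel's subject-by-time matrix plays the role of the shared response (PC1), its subject weights serve as the loading features, and its share of squared singular values serves as the synchrony measure; a family-aware splitter would replace the plain stratified folds in the actual analysis.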

Results:

We observed large variations in both sex classification performance and inter-subject synchrony across movie clips, as well as a significant correlation between the two (r = 0.70, p = 0.007; Fig. 1A-C). An influence of head motion was ruled out (Fig. 1D-E). Better accuracy was significantly associated (r = 0.87, FDR-corrected p < 0.05; Fig. 1F) with higher inter-subject synchrony in the right temporoparietal junction (TPJ), a key brain area for social cognition and attention maintenance [5]. Similar findings were obtained for the 17 networks [6] (Fig. 1G).
High inter-subject synchrony was associated with large variations in auditory loudness across time (RMS_std, r = 0.65, p = 0.017; Fig. 2A-B) and with semantic features (Fig. 2C) related to concrete objects (e.g., "living_thing", "person"), human actions and social interactions (e.g., "move", "travel"), and story structure (e.g., "causal_agent"). Better classification performance was associated with more spoken words (r = 0.55, p = 0.049; Fig. 2D-E). Results on semantic features were highly similar between inter-subject synchrony and classification performance (Fig. 2F).
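As an illustration of the parcel-wise association reported above, a brief sketch (hypothetical variable names, not the original analysis code) of correlating classification accuracy with each parcel's inter-subject synchrony across the 13 clips, with FDR correction across parcels:

# Hedged sketch: across clips, correlate accuracy with per-parcel inter-subject
# synchrony, then FDR-correct across parcels (Benjamini-Hochberg).
# Assumes `acc` has shape (n_clips,) and `sync` has shape (n_clips, n_parcels).
import numpy as np
from scipy.stats import pearsonr
from statsmodels.stats.multitest import multipletests

def parcelwise_association(acc, sync, alpha=0.05):
    r_vals, p_vals = zip(*(pearsonr(acc, sync[:, p]) for p in range(sync.shape[1])))
    reject, p_fdr, _, _ = multipletests(p_vals, alpha=alpha, method="fdr_bh")
    return np.array(r_vals), p_fdr, reject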
Supporting Image: Fig1.jpg
   Fig. 1: Relationship between classification performance and inter-subject synchrony
Supporting Image: Fig2.jpg
   Fig. 2: Influence of features of movie stimuli
 

Conclusions:

Contrary to intuition, we show that movie stimuli promoting similar brain states across subjects may actually enhance phenotype prediction [7]. Notably, our results may provide rather conservative lower bounds given the very short duration of the movie stimuli. Higher inter-subject synchrony may reflect better subject engagement and higher attention levels elicited by stimulus processing [8, 9], suppressing noise (e.g., spontaneous thoughts) while amplifying phenotype-relevant signals. Moreover, we show that stimuli with rich social content and cohesive stories may benefit phenotype prediction by promoting inter-subject synchrony [10]. These results collectively offer valuable insights for future studies in selecting an appropriate naturalistic stimulus for phenotype prediction.

Higher Cognitive Functions:

Higher Cognitive Functions Other

Modeling and Analysis Methods:

Activation (eg. BOLD task-fMRI) 2
Classification and Predictive Modeling 1
Multivariate Approaches

Novel Imaging Acquisition Methods:

BOLD fMRI

Keywords:

Cortex
Data analysis
FUNCTIONAL MRI
Machine Learning
Modeling
Multivariate
Other - Naturalistic viewing; movie stimuli

1|2 Indicates the priority used for review

References:

1. Li, X. (2023), A topography-based predictive framework for naturalistic viewing fMRI. NeuroImage, 120245.
2. Finn, E. S. (2021), Movie-watching outperforms rest for functional connectivity-based prediction of behavior. NeuroImage, 117963.
3. Hasson, U. (2004), Intersubject synchronization of cortical activity during natural vision. Science, 303, 1634–1640.
4. Weis, S. (2020), Sex classification by resting state brain connectivity. Cerebral Cortex, 30, 824–835.
5. Langner, R. (2013), Sustaining attention to simple tasks: A meta-analytic review of the neural mechanisms of vigilant attention. Psychological Bulletin, 139, 870–900.
6. Yeo, B. T. T. (2011), The organization of the human cerebral cortex estimated by intrinsic functional connectivity. Journal of Neurophysiology, 106, 1125–1165.
7. Vanderwal, T. (2017), Individual differences in functional connectivity during naturalistic viewing conditions. NeuroImage, 157, 521–530.
8. Ki, J. J. (2016), Attention strongly modulates reliability of neural responses to naturalistic narrative stimuli. Journal of Neuroscience, 36, 3092–3101.
9. Ohad, T. (2023), Neural synchronization as a function of engagement with the narrative. NeuroImage, 276, 120215.
10. Hasson, U. (2008), Enhanced intersubject correlations during movie viewing correlate with successful episodic encoding. Neuron, 57, 452–462.