Decoding Behaviour and Cognition: A Multimodal Machine Learning Approach in Neuroimaging

Poster No:

1163 

Submission Type:

Late-Breaking Abstract Submission 

Authors:

Ehssan Amini1,2, David Coynel1,2, Leticia de Oliveira3, Andreas Papassotiropoulos4,5,6, Dominique de Quervain7,2,6, Janaina Mourao-Miranda8

Institutions:

1Division of Cognitive Neuroscience, Department of Biomedicine, University of Basel, CH-4055 Basel, Switzerland, 2Research Cluster Molecular and Cognitive Neurosciences, University of Basel, CH-4055, Basel, Switzerland, 3Neurophysiology of Behaviour Laboratory, Biomedical Institute, Rio de Janeiro, Brazil, 4Research Cluster Molecular and Cognitive Neurosciences, University of Basel, CH-4055, Basel, Switzerland, 5Division of Molecular Neuroscience, Department of Biomedicine, University of Basel, CH-4055, Basel, Switzerland, 6Psychiatric University Clinics, University of Basel, CH-4055, Basel, Switzerland, 7Division of Cognitive Neuroscience, Department of Biomedicine, University of Basel, CH-4055 Basel, Switzerland, 8UCL Hawkes Institute, Department of Computer Science, University College London, London, UK

First Author:

Ehssan Amini  
Division of Cognitive Neuroscience, Department of Biomedicine, University of Basel, CH-4055 Basel|Research Cluster Molecular and Cognitive Neurosciences, University of Basel, CH-4055
Basel, Switzerland|Basel, Switzerland

Co-Author(s):

David Coynel  
Division of Cognitive Neuroscience, Department of Biomedicine, University of Basel, CH-4055 Basel|Research Cluster Molecular and Cognitive Neurosciences, University of Basel, CH-4055
Basel, Switzerland|Basel, Switzerland
Leticia de Oliveira  
Neurophysiology of Behaviour Laboratory, Biomedical Institute
Rio de Janeiro, Brazil
Andreas Papassotiropoulos  
Research Cluster Molecular and Cognitive Neurosciences, University of Basel, CH-4055|Division of Molecular Neuroscience, Department of Biomedicine, University of Basel, CH-4055|Psychiatric University Clinics, University of Basel, CH-4055
Basel, Switzerland|Basel, Switzerland|Basel, Switzerland
Dominique de Quervain  
Division of Cognitive Neuroscience, Department of Biomedicine, University of Basel, CH-4055 Basel|Research Cluster Molecular and Cognitive Neurosciences, University of Basel, CH-4055|Psychiatric University Clinics, University of Basel, CH-4055
Basel, Switzerland|Basel, Switzerland|Basel, Switzerland
Janaina Mourao-Miranda  
UCL Hawkes Institute, Department of Computer Science, University College London
London, UK

Late Breaking Reviewer(s):

Andreia Faria  
Johns Hopkins University
Baltimore, MD
Jaehee Kim  
Duksung Women's University
Seoul, South Korea
Lena Oestreich, PhD  
The University of Queensland
Brisbane, Queensland

Introduction:

The availability of large-scale multimodal neuroimaging datasets, combined with advancements in machine learning, has opened new avenues for exploring the brain–behaviour relationship through a more integrative and holistic lens. Recent studies using the ABCD and HCP datasets have demonstrated that cognitive measures can be predicted with higher accuracy than mood and personality measures (Ooi et al., 2022; Chen et al., 2022). These studies have also shown that combining multiple neuroimaging modalities improves prediction accuracy. This study aims to replicate these findings by leveraging a large-scale neuroimaging dataset that includes various structural and functional MRI modalities. Furthermore, we extend previous research by integrating all available modalities into single models to predict behavioural measures across five categories: cognition, personality, emotion, mood, and reaction time.

Methods:

Data from 1,157 healthy young participants were analysed. The analyses were implemented in PRoNTo v3 (Schrouff et al., 2016). We computed a linear kernel for each modality and trained a kernel ridge regression model on the sum of the kernels (which is equivalent to concatenating the modalities' features), using a five-fold nested cross-validation scheme to predict behavioural variables in the following five categories:
• Mood (multidimensional mood questionnaire, depression, state and trait anxiety)
• Emotion (affect intensity, pictorial arousal rating, pictorial valence rating)
• Personality (Big Five personality traits)
• Reaction time (measured for each fMRI task)
• Cognition (free recall, attention, working memory, recognition performance)
The following neuroimaging modalities were included in the models:
• Brain t-maps from three fMRI tasks (IAPS picture viewing with emotional valence and arousal ratings, IAPS picture recognition, and N-back)
• Functional connectivity matrices from the N-back task
• Gray matter and white matter probabilistic maps (derived from structural MRI)
• Atlas-based brain region volumes (extracted from structural MRI using the FreeSurfer pipeline)
• Structural connectomes (derived from diffusion-weighted imaging)
• Demographic variables (sex, age, weight, height, smoking habits, alcohol and cannabis consumption).
All modalities were combined to predict mood, personality, and affect intensity; for the other behavioural variables, fMRI modalities unrelated to the task of interest were excluded. Model performance was evaluated using the mean correlation between predicted and observed scores and the normalised mean square error (nMSE). Statistical significance was assessed with a permutation test (1,000 permutations). Feature weights for each modality were computed and compared across models to examine similarities within and between behavioural domains.
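For illustration, the sketch below shows one way the kernel-sum approach and the five-fold nested cross-validation could be implemented. It is not the PRoNTo code used in the study; it assumes each modality is available as a subjects × features NumPy array, and all function and parameter names (summed_linear_kernel, nested_cv_predict, the alpha grid) are illustrative.

```python
# Minimal sketch (not the PRoNTo implementation): kernel ridge regression on a
# sum of per-modality linear kernels, with five-fold nested cross-validation.
# Assumes `modalities` is a list of subjects x features NumPy arrays and `y`
# is a 1-D behavioural score; names and the alpha grid are illustrative.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import KFold, GridSearchCV

def summed_linear_kernel(modalities):
    """Sum of per-modality linear kernels (equivalent to feature concatenation)."""
    n = modalities[0].shape[0]
    K = np.zeros((n, n))
    for X in modalities:
        Xc = X - X.mean(axis=0)   # centring shown globally for brevity; a strict
        K += Xc @ Xc.T            # analysis would centre within training folds
    return K

def nested_cv_predict(modalities, y, alphas=(0.1, 1.0, 10.0, 100.0), seed=0):
    """Out-of-sample predictions from five-fold nested CV on a precomputed kernel."""
    K = summed_linear_kernel(modalities)
    outer = KFold(n_splits=5, shuffle=True, random_state=seed)
    y_pred = np.zeros(len(y), dtype=float)
    for train, test in outer.split(K):
        # Inner five-fold loop selects the ridge penalty on the training folds only.
        inner = GridSearchCV(KernelRidge(kernel="precomputed"),
                             {"alpha": list(alphas)}, cv=5)
        inner.fit(K[np.ix_(train, train)], y[train])
        y_pred[test] = inner.predict(K[np.ix_(test, train)])
    return y_pred
```

Summing the per-modality linear kernels gives K = Σ_m X_m X_mᵀ, the same Gram matrix obtained from the concatenated (centred) features of all modalities, so the model is mathematically equivalent to ridge regression on the concatenated feature vector.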

Results:

We achieved higher prediction accuracy for emotion (corr: 0.38 ± 0.17; nMSE: 0.84 ± 0.13), cognition (corr: 0.34 ± 0.15; nMSE: 0.88 ± 0.12), and reaction time (corr: 0.33 ± 0.11; nMSE: 0.91 ± 0.08) compared to mood (corr: 0.09 ± 0.07; nMSE: 1.0 ± 0.02) and personality (corr: 0.12 ± 0.09; nMSE: 1.0 ± 0.03). Overall, variables measured during scanning sessions were predicted more accurately than those measured outside the scanner. For nearly all neuroimaging modalities, feature weight similarity was higher within models of the same behavioural category than across categories. However, the similarity pattern of gray matter, white matter, and fMRI modalities closely mirrored the correlation matrix of behavioural variables (cosine similarity range: 0.78–0.96; Figure 2), suggesting that this similarity may primarily reflect correlations among the target variables.
Supporting Image: results.png
   ·Correlation and normalised mean square error (nMSE) for predictive models across all behavioural variables. Each bar represents the model's performance for a specific variable, averaged across folds.
Supporting Image: similaritymatrix.png
   ·Feature weight similarity matrix across models for each modality, compared to the behavioral target correlation matrix. White boxes indicate distinct behavioral domains.
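As an illustration of how the reported metrics and the weight-similarity analysis could be computed, the sketch below gives hedged NumPy implementations of nMSE, the permutation p-value, the recovery of per-modality weights from the dual solution of a linear-kernel model (w_m = X_mᵀ α), and the cosine similarity between weight vectors. Function names are assumptions, not the study's code; `cv_predict` stands for a function such as `nested_cv_predict` from the previous sketch.

```python
# Illustrative sketch (assumed names): evaluation metrics, permutation test and
# per-modality weight similarity for a linear-kernel ridge model.
import numpy as np

def nmse(y_true, y_pred):
    """Mean square error normalised by the variance of the target."""
    return np.mean((y_true - y_pred) ** 2) / np.var(y_true)

def permutation_pvalue(y, cv_predict, n_perm=1000, seed=0):
    """One-sided p-value for the cross-validated correlation: the whole
    cross-validation (cv_predict) is repeated on permuted targets."""
    rng = np.random.default_rng(seed)
    observed = np.corrcoef(y, cv_predict(y))[0, 1]
    null = np.empty(n_perm)
    for i in range(n_perm):
        y_perm = rng.permutation(y)
        null[i] = np.corrcoef(y_perm, cv_predict(y_perm))[0, 1]
    return (np.sum(null >= observed) + 1) / (n_perm + 1)

def modality_weights(modalities, dual_coef, train_idx):
    """Primal weights per modality from the dual coefficients: w_m = X_m^T @ alpha.
    Use the same (centred) features that entered the kernel."""
    return [X[train_idx].T @ dual_coef for X in modalities]

def cosine_similarity(w1, w2):
    """Cosine similarity between two models' weight vectors for one modality."""
    return float(w1 @ w2 / (np.linalg.norm(w1) * np.linalg.norm(w2)))
```

Per-modality weight vectors obtained this way can then be compared across behavioural models with cosine_similarity, mirroring the similarity matrices shown in the second supporting image.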
 

Conclusions:

Emotion, cognition, and reaction time could be predicted more accurately than mood and personality. However, we also observed that target variables assessed during the scanning sessions were generally predicted better than those assessed outside the scanner, suggesting that measurement context may contribute to these differences.

Learning and Memory:

Learning and Memory Other

Modeling and Analysis Methods:

Activation (eg. BOLD task-fMRI) 2
Classification and Predictive Modeling 1
Diffusion MRI Modeling and Analysis

Novel Imaging Acquisition Methods:

Anatomical MRI
BOLD fMRI
Diffusion MRI

Keywords:

Cognition
Emotions
FUNCTIONAL MRI
Machine Learning
STRUCTURAL MRI
WHITE MATTER IMAGING - DTI, HARDI, DSI, ETC
Other - Mood, Personality, Reaction time

1|2Indicates the priority used for review

Abstract Information

By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio, print and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.

I accept

The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a “source” party and a “reproducing” party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables. Click here for more information. Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:

I am submitting this abstract as an original work to be reproduced. I am available to be the “source party” in an upcoming team and consent to have this work listed on the OSSIG website. I agree to be contacted by OSSIG regarding the challenge and may share data used in this abstract with another team.

Please indicate below if your study was a "resting state" or "task-activation” study.

Task-activation

Healthy subjects only or patients (note that patient studies may also involve healthy subjects):

Healthy subjects

Was this research conducted in the United States?

No

Were any human subjects research approved by the relevant Institutional Review Board or ethics panel? NOTE: Any human subjects studies without IRB approval will be automatically rejected.

No

Were any animal research approved by the relevant IACUC or other animal research panel? NOTE: Any animal studies without IACUC approval will be automatically rejected.

Not applicable

Please indicate which methods were used in your research:

Functional MRI
Structural MRI
Diffusion MRI
Behavior
Computational modeling

For human MRI, what field strength scanner do you use?

3.0T

Which processing packages did you use for your study?

SPM
Free Surfer
Other, Please list  -   PRoNTo

Provide references using APA citation style.

• Ooi, L. Q. R., Chen, J., Zhang, S., Kong, R., Tam, A., Li, J., ... & Yeo, B. T. (2022). Comparison of individualized behavioral predictions across anatomical, diffusion and functional connectivity MRI. NeuroImage, 263, 119636.
• Chen, J., Tam, A., Kebets, V., Orban, C., Ooi, L. Q. R., Asplund, C. L., ... & Yeo, B. T. (2022). Shared and unique brain network features predict cognitive, personality, and mental health scores in the ABCD study. Nature Communications, 13(1), 2217.
• Schrouff, J., Mourão-Miranda, J., Phillips, C., & Parvizi, J. (2016). Decoding intracranial EEG data with multiple kernel learning method. Journal of Neuroscience Methods, 261, 19-28.

UNESCO Institute of Statistics and World Bank Waiver Form

I attest that I currently live, work, or study in a country on the UNESCO Institute of Statistics and World Bank List of Low and Middle Income Countries list provided.

No