BrainCAP: an open-source neuroimaging toolkit to analyze brain co-activation patterns

Presented During:

Saturday, June 28, 2025: 11:30 AM - 12:45 PM
Brisbane Convention & Exhibition Centre  
Room: M4 (Mezzanine Level)  

Poster No:

1369 

Submission Type:

Abstract Submission 

Authors:

Kangjoo Lee1, Samuel Brege1, Zailin Tamayo1, Catie Chang2, Youngsun Cho1,3

Institutions:

1Department of Psychiatry, Yale University School of Medicine, New Haven, CT, 2Department of Electrical and Computer Engineering, Vanderbilt University, Nashville, TN, 3Child Study Center, Yale University School of Medicine, New Haven, CT

First Author:

Kangjoo Lee  
Department of Psychiatry, Yale University School of Medicine
New Haven, CT

Co-Author(s):

Samuel Brege  
Department of Psychiatry, Yale University School of Medicine
New Haven, CT
Zailin Tamayo  
Department of Psychiatry, Yale University School of Medicine
New Haven, CT
Catie Chang  
Department of Electrical and Computer Engineering, Vanderbilt University
Nashville, TN
Youngsun Cho  
Department of Psychiatry, Yale University School of Medicine; Child Study Center, Yale University School of Medicine
New Haven, CT

Introduction:

The analysis of moment-to-moment changes in co-activation patterns (CAPs) in functional MRI (fMRI) has proven useful for studying dynamic properties of neural activity. The method clusters fMRI time-frames into a small set of recurrent spatial patterns within and across subjects [1]. Studies have also quantified properties of the temporal organization of CAPs, such as fractional occupancy [2]. Co-activation analyses are computationally intensive, requiring the clustering of high-dimensional data concatenated across subjects. Moreover, although CAP studies involve many analytic choices, the field lacks a unified open-source platform supporting the robust feature selection required for reproducible mapping between brain and behavioral measurements.

Methods:

We developed BrainCAP, an open-source Python-based toolkit for quantifying CAPs from fMRI data in cross-sectional and longitudinal studies (Figure 1). Using resting-state fMRI from the Human Connectome Project dataset [3], we previously estimated highly reproducible spatiotemporal features of neural CAPs linked to behavioral phenotypes such as cognition, emotion regulation, and alcohol and substance use [4]. Building on this work, BrainCAP provides a comprehensive framework for the full CAP analysis pipeline:
(i) concatenation of resting-state fMRI across sessions, subjects, and groups;
(ii) temporal sampling of the concatenated time-series data;
(iii) clustering of fMRI time-frames by spatial similarity to identify CAPs as cluster centroids;
(iv) evaluation of spatial characteristics through cosine similarity with canonical resting-state networks;
(v) quantification of temporal CAP metrics, including fractional occupancy, dwell time, transition probability, and fractional entry;
(vi) second-level analyses of CAP-derived metrics, such as feature selection, dimension reduction (e.g., identifying a low-dimensional representation of individual differences), and behavior prediction (e.g., multiple linear regression); and
(vii) quality control.
The second-level analyses account for confounds such as age, sex, and motion while identifying individual differences and predicting behavior from CAP-derived features. The reproducibility of co-activation measures can be assessed with a split-half permutation-based approach validated in [4].
Supporting Image: Fig1.jpg
   ·Figure 1. An overview of BrainCAP workflow
 
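As a sketch of pipeline steps (i)-(iii), CAP identification amounts to k-means clustering over time-frames concatenated across subjects. The following is a minimal, self-contained illustration in plain NumPy; the function and variable names are hypothetical and do not reflect BrainCAP's actual API.

```python
# Illustrative sketch of CAP identification: k-means over concatenated
# fMRI time-frames. Names are hypothetical, not the BrainCAP API.
import numpy as np

def identify_caps(timeseries_per_subject, n_caps=4, n_iter=50, seed=0):
    """Concatenate (time x parcel) arrays across subjects and cluster
    time-frames by spatial similarity; cluster centroids are the CAPs."""
    frames = np.vstack(timeseries_per_subject)   # (total_frames, n_parcels)
    rng = np.random.default_rng(seed)
    centroids = frames[rng.choice(len(frames), n_caps, replace=False)].copy()
    for _ in range(n_iter):
        # assign each time-frame to its nearest centroid
        dists = np.linalg.norm(frames[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # update each centroid as the mean of its assigned frames
        for k in range(n_caps):
            if np.any(labels == k):
                centroids[k] = frames[labels == k].mean(axis=0)
    return centroids, labels

# Toy data: 3 subjects, 100 time-frames each, 50 parcels
rng = np.random.default_rng(1)
data = [rng.standard_normal((100, 50)) for _ in range(3)]
caps, labels = identify_caps(data, n_caps=4)
print(caps.shape, labels.shape)   # (4, 50) (300,)
```

In BrainCAP itself, the clustering choices come from standard Python libraries, and clustering runs on data concatenated across all groups or time-points by default.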

Results:

BrainCAP is compatible with the BIDS data structure [5] and with high-performance computing systems running the Slurm job scheduler. Multiple sessions of large-N neuroimaging data in both volumetric NIfTI and grayordinate CIFTI formats can be used. BrainCAP supports analyses at both the voxel and parcel levels, with or without temporal sampling (e.g., selecting time-points associated with a seed time-course, or excluding time-points with motion artifacts). Diverse clustering options are available through standard Python libraries. While the spatial clustering of fMRI time-frames is performed across all groups (cross-sectional) or time-points (longitudinal) by default, users can choose to run the clustering separately for each group or time-point. After clustering, BrainCAP assigns each time-point to a CAP state (the cluster centroid of its time-frames) and quantifies the temporal characteristics of state variations over time [4]. The entire workflow is driven by a single YAML configuration file, allowing users to tailor every element of the analysis to their specific hypothesis. Quantitative outputs and quality-control statistics can be visualized using Python, R, and Connectome Workbench [3].
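A single-file YAML configuration of the kind described above might look like the following; every key name here is illustrative and does not reflect BrainCAP's actual schema.

```yaml
# Hypothetical BrainCAP configuration sketch; key names are illustrative.
input:
  bids_dir: /data/study_bids
  format: cifti            # or: nifti
  sessions: [ses-1, ses-2]
sampling:
  motion_scrub: true
  fd_threshold: 0.5        # mm, frame-displacement cutoff
clustering:
  method: kmeans
  n_clusters: 5
  split_by_group: false    # cluster across all groups by default
metrics:
  - fractional_occupancy
  - dwell_time
  - transition_probability
  - fractional_entry
second_level:
  confounds: [age, sex, motion]
  prediction: multiple_linear_regression
```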
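Once each time-point has a CAP label, the temporal metrics named in the Methods follow directly from the label sequence. The sketch below illustrates the standard definitions of fractional occupancy, mean dwell time, and transition probability; it is a hedged illustration, not BrainCAP source code.

```python
# Illustrative temporal CAP metrics from a sequence of per-frame CAP labels.
# Definitions follow the common usage in the CAP literature; names are
# hypothetical, not the BrainCAP API.
import numpy as np

def temporal_metrics(labels, n_caps):
    """Return per-CAP fractional occupancy, mean dwell time, and a
    row-normalized transition-probability matrix."""
    labels = np.asarray(labels)
    # fractional occupancy: fraction of frames spent in each CAP
    frac_occ = np.array([(labels == k).mean() for k in range(n_caps)])
    # dwell time: mean length of consecutive runs of the same CAP
    change = np.flatnonzero(np.diff(labels)) + 1
    runs = np.split(labels, change)
    dwell = np.zeros(n_caps)
    for k in range(n_caps):
        lens = [len(r) for r in runs if r[0] == k]
        dwell[k] = np.mean(lens) if lens else 0.0
    # transition probability between successive frames
    trans = np.zeros((n_caps, n_caps))
    for a, b in zip(labels[:-1], labels[1:]):
        trans[a, b] += 1
    row = trans.sum(axis=1, keepdims=True)
    trans = np.divide(trans, row, out=np.zeros_like(trans), where=row > 0)
    return frac_occ, dwell, trans

# Toy label sequence: fractional occupancy is 3/8, 3/8, 2/8
seq = [0, 0, 1, 1, 1, 0, 2, 2]
fo, dt, tp = temporal_metrics(seq, n_caps=3)
```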

Conclusions:

BrainCAP provides an open analytic platform for quantifying fMRI co-activations, supporting reproducible mapping between brain and behavioral measurements. A development version of BrainCAP is available on GitHub [6], and a user guide will be added to the project wiki with the official release. Future updates will include group comparisons of CAP-derived features for cross-sectional studies and longitudinal data analysis, such as mixed-effects models.

Modeling and Analysis Methods:

Classification and Predictive Modeling
Connectivity (eg. functional, effective, structural)
fMRI Connectivity and Network Modeling 1
Methods Development 2

Neuroinformatics and Data Sharing:

Workflows

Keywords:

Data analysis
FUNCTIONAL MRI
Informatics
Open-Source Software
Workflows

1|2 Indicates the priority used for review

Please indicate below if your study was a "resting state" or "task-activation” study.

Resting state

Healthy subjects only or patients (note that patient studies may also involve healthy subjects):

Healthy subjects

Was this research conducted in the United States?

Yes

Are you Internal Review Board (IRB) certified? Please note: Failure to have IRB, if applicable will lead to automatic rejection of abstract.

Yes, I have IRB or AUCC approval

Were any human subjects research approved by the relevant Institutional Review Board or ethics panel? NOTE: Any human subjects studies without IRB approval will be automatically rejected.

Yes

Were any animal research approved by the relevant IACUC or other animal research panel? NOTE: Any animal studies without IACUC approval will be automatically rejected.

Not applicable

Please indicate which methods were used in your research:

Functional MRI
Behavior

For human MRI, what field strength scanner do you use?

3.0T

Which processing packages did you use for your study?

Other, please list: We developed a new open-source package

Provide references using APA citation style.

[1] Liu, X. (2013). Time-varying functional network information extracted from brief instances of spontaneous brain activity. Proc Natl Acad Sci U S A, 110(11), 4392-7.
[2] Liu, X. (2018). Co-activation patterns in resting-state fMRI signals. Neuroimage, 180(Pt B), 485-494.
[3] Van Essen, DC. (2012). The Human Connectome Project: a data acquisition perspective. Neuroimage, 62(4), 2222-31.
[4] Lee, K. (2024). Human brain state dynamics are highly reproducible and associated with neural and behavioral features. PLoS Biol, 22(9), e3002808.
[5] Gorgolewski, KJ. (2016). The brain imaging data structure, a format for organizing and describing outputs of neuroimaging experiments. Sci Data, 3:160044.
[6] https://github.com/Kangjoo/BrainCAP/tree/develop
