Effects of Parameter Selection on Connectome-Based Reservoir Computing

Poster No:

1274 

Submission Type:

Late-Breaking Abstract Submission 

Authors:

Iva Ilioska1, Ágoston Mihalik2, Benjamin Chidiac3, Maroš Rovný3, Laura Suárez4, Sanjukta Krishnagopal5, Koen Helwegen6, Martijn van den Heuvel7, Sarah Morgan8, Bratislav Misic9, Duncan Astle1, Petra Vértes3

Institutions:

1Cambridge University, Cambridge, United Kingdom, 2University of Cambridge, Cambridge, United Kingdom, 3University of Cambridge, Cambridge, Cambridgeshire, 4Mila - Quebec Artificial Intelligence Institute, Montréal, Quebec, 5University of California, Santa Barbara, CA, 6Vrije Universiteit, Amsterdam, Netherlands, 7Vrije Universiteit, Amsterdam, North Holland, 8School of Biomedical Engineering and Imaging Sciences, King's College London, London, London, 9Montreal Neurological Institute, Montreal, Quebec

First Author:

Iva Ilioska  
Cambridge University
Cambridge, United Kingdom

Co-Author(s):

Ágoston Mihalik  
University of Cambridge
Cambridge, United Kingdom
Benjamin Chidiac  
University of Cambridge
Cambridge, Cambridgeshire
Maroš Rovný  
University of Cambridge
Cambridge, Cambridgeshire
Laura Suárez  
Mila - Quebec Artificial Intelligence Institute
Montréal, Quebec
Sanjukta Krishnagopal  
University of California
Santa Barbara, CA
Koen Helwegen  
Vrije Universiteit
Amsterdam, Netherlands
Martijn van den Heuvel  
Vrije Universiteit
Amsterdam, North Holland
Sarah Morgan  
School of Biomedical Engineering and Imaging Sciences, King's College London
London, London
Bratislav Misic  
Montreal Neurological Institute
Montreal, Quebec
Duncan Astle  
Cambridge University
Cambridge, United Kingdom
Petra Vértes  
University of Cambridge
Cambridge, Cambridgeshire

Late Breaking Reviewer(s):

Giulia Baracchini  
The University of Sydney
Sydney, New South Wales
Andreia Faria  
Johns Hopkins University
Baltimore, MD
Wei Zhang  
Washington University in St. Louis
Saint Louis, MO

Introduction:

In reservoir computing, the reservoir is a recurrent neural network with fixed internal weights and trained output weights [1]. The fixed connections of reservoir networks resemble structural brain connectivity, making them particularly suitable models for studying the impact of structural brain organization on network dynamics and computational capacity [2, 3]. When using neuromorphic reservoirs, researchers make various parameter choices that influence both the topology of the reservoir and the dynamics of the system. For example, network topology depends on how connectomes are thresholded (determining density), while system dynamics are shaped by input signal scale, spectral radius, and activation function. In turn, the reservoir's architecture and dynamics determine both task performance and the system's sensitivity to the precise topology and weighting of empirical connectomes.

Here we dissect the complex interplay of a range of parameters. Specifically, we show how network density and input scale affect task performance, and the sensitivity of connectome-based reservoirs to connectome topology and weighting, in prediction and memory tasks on the Mackey-Glass time series [4].

Methods:

We selected 69 DTI connectomes from participants from the CALM cohort [5], based on WASI score > 55 [6]. Connectomes were parcellated using the Human Brainnetome Atlas [7] and Yeo 7-network assignments [8], creating subject-specific matrices.
We performed parameter sweeps across network densities from sparse (0.01, maximum spanning tree) to dense (0.1) with three input scaling factors (1e-5, 1e-3, 1).
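The density sweep can be illustrated with a short sketch. The function below is our own illustrative reconstruction, not the study's code: it keeps the strongest edges of a symmetric, non-negative connectome until a target edge density is reached, always retaining a maximum spanning tree so the sparsest network (density 0.01) remains connected.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def threshold_connectome(W, density):
    """Threshold a symmetric, non-negative connectome to a target edge
    density, always keeping a maximum spanning tree so the network
    stays connected. Illustrative sketch; the study's exact
    thresholding pipeline may differ in detail."""
    n = W.shape[0]
    iu = np.triu_indices(n, k=1)
    # Maximum spanning tree via SciPy's minimum version on negated weights
    mst = minimum_spanning_tree(-W).toarray()
    keep = (mst != 0) | (mst.T != 0)
    target = int(round(density * len(iu[0])))
    count = keep[iu].sum()
    # Add the strongest remaining edges until the target density is met
    for idx in np.argsort(W[iu])[::-1]:
        if count >= target:
            break
        i, j = iu[0][idx], iu[1][idx]
        if not keep[i, j]:
            keep[i, j] = keep[j, i] = True
            count += 1
    return np.where(keep, W, 0.0)
```

Sweeping `density` over 0.01 to 0.1 per subject yields the family of reservoir topologies described above.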
For each configuration, we measured: (1) task performance (cumulative R² across horizons 1-30 for prediction, -1 to -30 for memory), (2) topology sensitivity (performance difference between intact connectome and degree-preserved randomized model), and (3) weight location sensitivity (performance difference between structural connectome and topology-preserved weight-randomized model).
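The weight-location sensitivity in measure (3) compares performance against a null model in which edge weights are permuted among existing edges. A minimal sketch of such a null, under our own naming (not the study's code):

```python
import numpy as np

def shuffle_weights(W, rng):
    """Topology-preserved null model: permute the edge weights of a
    symmetric connectome among its existing edges, leaving the binary
    topology (which edges exist) untouched.

    Hypothetical helper illustrating the null model behind the
    weight-location sensitivity measure."""
    n = W.shape[0]
    iu = np.triu_indices(n, k=1)
    tri = W[iu].copy()
    mask = tri != 0
    tri[mask] = rng.permutation(tri[mask])
    out = np.zeros_like(W)
    out[iu] = tri
    return out + out.T
```

Weight-location sensitivity is then the performance gap, e.g. `r2_intact - r2_weight_shuffled`; topology sensitivity is computed analogously with a degree-preserving rewiring null instead.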
We tested Mackey-Glass chaotic time series (τ=30) memory and prediction across spectral radius values (0.2 to 4.2, step=0.2) spanning stable, critical, and chaotic regimes [9]. Visual network nodes served as inputs, and somatomotor nodes as outputs. Output weights were trained using ridge regression (λ=1e-8 for memory, λ=1e-5 for prediction). Reservoir dynamics are defined by:

x(t+1) = tanh(W_in u(t+1) + W x(t)) (1)

where x(t) is the vector of reservoir states, tanh is the element-wise node activation function, W_in is the input weight vector whose nonzero entries (at the input nodes) equal the input scale, u(t) is the input signal, and W is the reservoir adjacency matrix.
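Equation (1) and the ridge-regression readout can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the study's implementation: `w_in` is zero except at input nodes, where it equals the input scale, and W is rescaled to a chosen spectral radius before the run.

```python
import numpy as np

def run_reservoir(W, w_in, u, spectral_radius=0.9, washout=100):
    """Iterate x(t+1) = tanh(W_in u(t+1) + W x(t)) (eq. 1) and return
    the state trajectory after discarding an initial washout period.
    W is rescaled to the given spectral radius before the run."""
    W = W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))
    x = np.zeros(W.shape[0])
    states = np.empty((len(u), W.shape[0]))
    for t in range(len(u)):
        x = np.tanh(w_in * u[t] + W @ x)
        states[t] = x
    return states[washout:]

def train_readout(states, target, lam=1e-5):
    """Ridge-regression readout: w_out = (S^T S + lam*I)^(-1) S^T y."""
    S = states
    return np.linalg.solve(S.T @ S + lam * np.eye(S.shape[1]), S.T @ target)
```

For the memory task the target is the input shifted back by the horizon; for prediction, shifted forward. In the study only somatomotor states feed the readout; here, for simplicity, all states do.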

Results:

Network density had only modest effects on task performance (except at the lowest density, corresponding to a spanning tree structure). Sensitivity, however, varied substantially, with the densest networks (0.1 density) exhibiting the highest sensitivity to topology and weight location (Figure 1, A: panels 2, 3, 5, and 6).

Input scaling emerged as a key factor influencing all three measured variables: performance, topology sensitivity, and weight location sensitivity. Across both tasks, an input scale of 1 (original signal amplitude: -1 to 1) provided the best performance and consistently positive sensitivity (Figure 1, B). However, the optimal input scale depends on the activation function and weight distribution.
Supporting Image: Figure_OHBM2025.png
 

Conclusions:

Our findings emphasize the importance of carefully calibrating parameters to fully leverage the topological and topographic advantages of neurobiological architectures, a point that is critical when investigating the relationship between brain organization and computational capacity.

Modeling and Analysis Methods:

Connectivity (e.g. functional, effective, structural) 1
Diffusion MRI Modeling and Analysis 2

Keywords:

Computational Neuroscience
Computing
Informatics
Machine Learning
Modeling

1|2Indicates the priority used for review

Abstract Information

By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio print and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.

I accept

The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a “source” party and a “reproducing” party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables. Click here for more information. Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:

I do not want to participate in the reproducibility challenge.

Please indicate below if your study was a "resting state" or "task-activation" study.

Other

Healthy subjects only or patients (note that patient studies may also involve healthy subjects):

Healthy subjects

Was this research conducted in the United States?

No

Were any human subjects research approved by the relevant Institutional Review Board or ethics panel? NOTE: Any human subjects studies without IRB approval will be automatically rejected.

Yes

Were any animal research approved by the relevant IACUC or other animal research panel? NOTE: Any animal studies without IACUC approval will be automatically rejected.

Not applicable

Please indicate which methods were used in your research:

Diffusion MRI
Computational modeling

For human MRI, what field strength scanner do you use?

3.0T

Which processing packages did you use for your study?

FSL

Provide references using APA citation style.

1. Lukoševičius, M., & Jaeger, H. (2009). Reservoir computing approaches to recurrent neural network training. Computer Science Review, 3(3), 127–149.
2. Suárez, L. E., et al. (2021). Learning function from structure in neuromorphic networks. Nature Machine Intelligence, 3(9), 771–786.
3. Suárez, L. E., et al. (2024). Connectome-based reservoir computing with the conn2res toolbox. Nature Communications, 15(1), 656.
4. Mackey, M. C., & Glass, L. (1977). Oscillation and chaos in physiological control systems. Science, 197(4300), 287–289.
5. Holmes, J., et al. (2019). Protocol for a transdiagnostic study of children with problems of attention, learning and memory (CALM). BMC Pediatrics, 19, 1–11.
6. Wechsler, D. (1999). Wechsler Abbreviated Scale of Intelligence.
7. Fan, L., et al. (2016). The Human Brainnetome Atlas: A new brain atlas based on connectional architecture. Cerebral Cortex, 26(8), 3508–3526.
8. Yeo, B. T., et al. (2011). The organization of the human cerebral cortex estimated by intrinsic functional connectivity. Journal of Neurophysiology.
9. O'Byrne, J., & Jerbi, K. (2022). How critical is brain criticality? Trends in Neurosciences, 45(11), 820–837.

UNESCO Institute of Statistics and World Bank Waiver Form

I attest that I currently live, work, or study in a country on the UNESCO Institute of Statistics and World Bank List of Low and Middle Income Countries list provided.

No