The Q1K EEG/Eye Tracking Experimental Test Battery: Contribution to Open Science

Poster No:

340 

Submission Type:

Abstract Submission 

Authors:

Christian O'Reilly1, James Desjardins2, Gabriel Blanco-Gomez3, Scott Huberty3, Anthony Hosein Poitras Loewen4, Diksha Srishyla1, Inga Knoth4, Charles-Olivier Martin4, Jonathan Batten5, Stefon van Noordt6, Noémie Hébert-Lalonde3, Fabienne Samson3, Tim Smith7, Sarah Lippé4, Mayada Elsabbagh3

Institutions:

1University of South Carolina, Columbia, SC, 2Compute Ontario, St. Catharines, Ontario, 3McGill University, Montreal, Quebec, 4Université de Montréal, Montreal, Quebec, 5SR Research Ltd., Ottawa, Ontario, 6Mount Saint Vincent University, Halifax, Nova Scotia, 7University of the Arts, London, London

First Author:

Christian O'Reilly  
University of South Carolina
Columbia, SC

Co-Author(s):

James Desjardins  
Compute Ontario
St. Catharines, Ontario
Gabriel Blanco-Gomez  
McGill University
Montreal, Quebec
Scott Huberty  
McGill University
Montreal, Quebec
Anthony Hosein Poitras Loewen  
Université de Montréal
Montreal, Quebec
Diksha Srishyla  
University of South Carolina
Columbia, SC
Inga Knoth  
Université de Montréal
Montreal, Quebec
Charles-Olivier Martin  
Université de Montréal
Montreal, Quebec
Jonathan Batten  
SR Research Ltd.
Ottawa, Ontario
Stefon van Noordt  
Mount Saint Vincent University
Halifax, Nova Scotia
Noémie Hébert-Lalonde  
McGill University
Montreal, Quebec
Fabienne Samson  
McGill University
Montreal, Quebec
Tim Smith  
University of the Arts
London, London
Sarah Lippé  
Université de Montréal
Montreal, Quebec
Mayada Elsabbagh  
McGill University
Montreal, Quebec

Introduction:

The Quebec 1000 Families (Q1K) project aims to study a large cohort of families with at least one member with an autism spectrum condition (ASC). As part of its Open Science philosophy, it also aims to design an open and reusable experimental protocol and open-source tools for ASC research. Q1K is a three-phase multi-site project, including 1) a research registry, 2) light phenotyping, and 3) deep phenotyping. The latter phase builds on collected demographics, questionnaires, and biosamples and adds multimodal deep phenotyping, including high-density EEG, eye tracking (ET), behavioral, and 7T MRI data. Here, we focus on Q1K's EEG/ET experimental protocol and discuss its general approach, design philosophy, multimodal EEG/ET integration, and open release.

Methods:

We aim to deep-phenotype 200 persons with ASC and at least one of their family members. We have already collected about a quarter of that target sample. We aim to build a strongly multimodal dataset assessing a wide array of constructs, built on well-established tasks, and spanning the full spectrum of symptom severity and a wide age range (5–89 years). We used tasks developed and tested in infants and toddlers to ensure inclusivity. We aimed for a 1-hour protocol. Understanding that some participants may be unable to complete the whole protocol (due to lack of time or sensory sensitivity), we ordered the tasks from higher to lower priority. We also considered the comfort of the participants by avoiding running back-to-back boring (e.g., oddball), passive (e.g., resting state), or "abrasive" tasks (e.g., the pupillary light reflex and visual steady-state tasks). We ran the resting-state task first to avoid "spillover" effects from more engaging/stimulating tasks on oscillatory measures. In order, the protocol includes the following tasks: 1) resting-state (Neuhaus et al., 2021), 2) tone oddball (Green et al., 2020), 3) gap overlap task (Portugal et al., 2021), 4) visual steady-state (Lalancette et al., 2022), 5) auditory steady-state (Edgar et al., 2016), 6) naturalistic social preference (Saez de Urabain et al., 2017), 7) pupillary light reflexes (Nyström et al., 2018), and 8) visual search task (Gliga et al., 2015). These tasks were selected to cover a wide range of constructs, including general brain state (1), sensory (2, 4, 5), perception (4, 5, 8), attention (2, 3), executive function (3), social motivation (6), and sensorimotor (7).
The recording system includes an EGI 128-channel system synchronized with an EyeLink 1000+ eye-tracking system. Robust synchronization with visual and auditory stimuli is ensured by a Cedrus StimTracker box connected to the audio output and two photodiodes mounted on the presentation screen. EEG and ET data are synchronized and integrated into BIDS format recordings, allowing direct comparison of eye gaze position and pupil signals with EEG signals. A software infrastructure has been implemented to preprocess the dataset into a state optimal for sharing and reuse.
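The kind of event-based EEG/ET alignment this setup enables can be sketched in a few lines. The example below is purely illustrative (hypothetical trigger timestamps and a made-up clock offset, not the project's actual synchronization code): two devices that logged the same trigger events can be put on a common clock by estimating the offset between their event timestamps.

```python
# Sketch: align two recording clocks (e.g., EEG amplifier and eye tracker)
# that logged the same trigger events, by estimating a constant clock offset.
# All timestamps here are hypothetical.

def estimate_offset(eeg_events, et_events):
    """Estimate the constant offset (ET clock minus EEG clock) from
    paired trigger timestamps, using the median for robustness to jitter."""
    diffs = sorted(et - eeg for eeg, et in zip(eeg_events, et_events))
    mid = len(diffs) // 2
    if len(diffs) % 2:
        return diffs[mid]
    return (diffs[mid - 1] + diffs[mid]) / 2

def to_eeg_time(et_timestamp, offset):
    """Map an eye-tracker timestamp onto the EEG clock."""
    return et_timestamp - offset

# Trigger times (seconds) as seen by each device, with ~0.250 s offset
eeg_triggers = [1.000, 2.000, 3.000, 4.000]
et_triggers = [1.250, 2.251, 3.249, 4.250]

offset = estimate_offset(eeg_triggers, et_triggers)
print(round(offset, 3))                       # 0.25
print(round(to_eeg_time(5.250, offset), 3))   # 5.0
```

In practice this alignment is handled during BIDS conversion so that gaze, pupil, and EEG samples share a common time base; the median offset is preferred over the mean because single mistimed triggers then have no influence.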

Results:

Figure 1 illustrates the high quality of the recordings (high inter-trial coherence, indicating reliable synchronization, and high signal-to-noise ratio) and the excellent between-site reliability. Figure 2 demonstrates the EEG/ET integration. Besides sharing the data, we are sharing the protocol we designed (as Experiment Builder files) and the software infrastructure we implemented to convert the source files into BIDS format, preprocess them using the PyLossless pipeline (Huberty, Desjardins, et al., 2024), and post-process them into epoched data for easier analysis. The resources released by this project will support the reuse and easy deployment of this protocol in other projects.
Supporting Image: vep_caption.png
   ·Figure 1. Inter-trial coherence (ITC) for the Visual Steady-State Task.
Supporting Image: plr_caption.png
   ·Figure 2. Pupil and EEG responses during the Pupillary Light Reflex task.
 

Conclusions:

Q1K was designed as a large-cohort, inclusive, Open Science project. By releasing the dataset, the protocol, and the software architecture used to run its EEG/ET protocol, we hope to support standardization and data pooling into larger, more diverse datasets for the study of ASC.

Disorders of the Nervous System:

Neurodevelopmental/ Early Life (eg. ADHD, autism) 1

Modeling and Analysis Methods:

EEG/MEG Modeling and Analysis

Neuroinformatics and Data Sharing:

Databasing and Data Sharing 2

Novel Imaging Acquisition Methods:

EEG

Keywords:

Autism
Data analysis
Design and Analysis
Development
DISORDERS
Electroencephalography (EEG)
Experimental Design
Open Data
Open-Source Code
Open-Source Software

1|2 Indicates the priority used for review


Please indicate below if your study was a "resting state" or "task-activation" study.

Resting state
Task-activation

Healthy subjects only or patients (note that patient studies may also involve healthy subjects):

Patients

Was this research conducted in the United States?

No

Was any human subjects research approved by the relevant Institutional Review Board or ethics panel? NOTE: Any human subjects study without IRB approval will be automatically rejected.

Yes

Was any animal research approved by the relevant IACUC or other animal research panel? NOTE: Any animal study without IACUC approval will be automatically rejected.

No

Please indicate which methods were used in your research:

EEG/ERP
Other, Please specify  -   Eye tracking

Which processing packages did you use for your study?

Other, Please list  -   MNE-Python

Provide references using APA citation style.

Edgar, J. C., Fisk, C. L., Liu, S., Pandey, J., Herrington, J. D., Schultz, R. T., & Roberts, T. P. L. (2016). Translating Adult Electrophysiology Findings to Younger Patient Populations: Difficulty Measuring 40-Hz Auditory Steady-State Responses in Typically Developing Children and Children with Autism Spectrum Disorder. Developmental Neuroscience, 38(1), 1–14.
Gliga, T., Bedford, R., Charman, T., Johnson, M. H., Baron-Cohen, S., Bolton, P., Cheung, C., Davies, K., Liew, M., Fernandes, J., Gammer, I., Maris, H., Salomone, E., Pasco, G., Pickles, A., Ribeiro, H., & Tucker, L. (2015). Enhanced Visual Search in Infancy Predicts Emerging Autism Symptoms. Current Biology, 25(13), 1727–1730.
Green, H. L., Shuffrey, L. C., Levinson, L., Shen, G., Avery, T., Randazzo Wagner, M., Sepulveda, D. M., Garcia, P., Maddox, C., Garcia, F., Hassan, S., & Froud, K. (2020). Evaluation of mismatch negativity as a marker for language impairment in autism spectrum disorder. Journal of Communication Disorders, 87, 105997.
Lalancette, E., Charlebois-Poirier, A.-R., Agbogba, K., Knoth, I. S., Jones, E. J. H., Mason, L., Perreault, S., & Lippé, S. (2022). Steady-state visual evoked potentials in children with neurofibromatosis type 1: Associations with behavioral rating scales and impact of psychostimulant medication. Journal of Neurodevelopmental Disorders, 14(1), 42.
Neuhaus, E., Lowry, S. J., Santhosh, M., Kresse, A., Edwards, L. A., Keller, J., Libsack, E. J., Kang, V. Y., Naples, A., Jack, A., Jeste, S., McPartland, J. C., Aylward, E., Bernier, R., Bookheimer, S., Dapretto, M., Van Horn, J. D., Pelphrey, K., Webb, S. J., & and the ACE GENDAAR Network. (2021). Resting state EEG in youth with ASD: Age, sex, and relation to phenotype. Journal of Neurodevelopmental Disorders, 13(1), 33.
Nyström, P., Gliga, T., Nilsson Jobs, E., Gredebäck, G., Charman, T., Johnson, M. H., Bölte, S., & Falck-Ytter, T. (2018). Enhanced pupillary light reflex in infancy is associated with autism diagnosis in toddlerhood. Nature Communications, 9(1), Article 1.
Portugal, A. M., Bedford, R., Cheung, C. H. M., Gliga, T., & Smith, T. J. (2021). Saliency-Driven Visual Search Performance in Toddlers With Low- vs High-Touch Screen Use. JAMA Pediatrics, 175(1), 96–97.
Saez de Urabain, I. R., Nuthmann, A., Johnson, M. H., & Smith, T. J. (2017). Disentangling the mechanisms underlying infant fixation durations in scene perception: A computational account. Vision Research, 134, 43–59.
