Mapping Auditory Conscious Perception with Machine Learning - Beyond Subjective Report

Poster No:

2029 

Submission Type:

Abstract Submission 

Authors:

Shanae Aerts1, Anjali Mangla1, Will Sanok-Dufallo1, Sharif Kronemer1, Aya Khalaf1, Thomas Xin1, Ayushe Sharma2, Taruna Yadav1, Kate Christison-Lagay1, Al Powers1, Hal Blumenfeld1

Institutions:

1Yale University School of Medicine, New Haven, CT, 2Yale University, New Haven, CT

First Author:

Shanae Aerts  
Yale University School of Medicine
New Haven, CT

Co-Author(s):

Anjali Mangla  
Yale University School of Medicine
New Haven, CT
Will Sanok-Dufallo  
Yale University School of Medicine
New Haven, CT
Sharif Kronemer, PhD  
Yale University School of Medicine
New Haven, CT
Aya Khalaf, PhD  
Yale University School of Medicine
New Haven, CT
Thomas Xin  
Yale University School of Medicine
New Haven, CT
Ayushe Sharma, PhD  
Yale University
New Haven, CT
Taruna Yadav, PhD  
Yale University School of Medicine
New Haven, CT
Kate Christison-Lagay  
Yale University School of Medicine
New Haven, CT
Al Powers, MD, PhD  
Yale University School of Medicine
New Haven, CT
Hal Blumenfeld, MD, PhD  
Yale University School of Medicine
New Haven, CT

Introduction:

Patients with disorders of consciousness (DoC) cannot easily report their experience, and as a result up to 40% of patients are misdiagnosed. This has serious consequences for end-of-life decision making. Further, the neural activity linked to the report of perception introduces confounding signals that are not specific to perceptual awareness. Prior work from our lab in the visual modality identified cortical and subcortical networks related to perception without report. The auditory domain is well suited to investigating DoC because of its relative portability and the ability to test patients with their eyes closed. Here, we identify brain networks for report-independent auditory perception using machine learning and fMRI.

Methods:

Our study presents auditory stimuli to each ear separately at individualized perceptual threshold intensities. Subjects report at-threshold sounds in the reported-on ear as perceived or nonperceived while simultaneous, asynchronous sounds are presented in the nonreported-on ear. Behavioral responses and reflexive eye metrics (pupil diameter, blink rate, microsaccade rate) are recorded during MRI acquisition. Eye metrics from the report condition train a machine learning classifier, which subsequently predicts perceptual events in nonreported trials (see the sketch below). Functional MRI analyses assess perception-related network activity for perceived versus nonperceived trials.
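To make the classification step concrete, the following is a minimal sketch of the general approach (not the authors' actual pipeline): train a classifier on post-stimulus eye-metric features from report-condition trials, check performance with ROC AUC, and then apply the fitted model to label no-report trials. The feature construction, the choice of logistic regression, and all variable names are illustrative assumptions; the abstract does not specify the algorithm used.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Simulated report-condition trials: one row per trial, columns are summary
# eye-metric features (e.g., post-stimulus pupil dilation, blink rate,
# microsaccade rate); y is the subject's perceived (1) / nonperceived (0) report.
n_trials, n_features = 200, 3
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)

# Cross-validated probability estimates within the report condition.
clf = make_pipeline(StandardScaler(), LogisticRegression())
proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
print(f"ROC AUC on report-condition trials: {roc_auc_score(y, proba):.2f}")

# Fit on all report-condition data, then label no-report trials
# so they can be contrasted (predicted perceived vs. nonperceived) in fMRI.
clf.fit(X, y)
X_noreport = rng.normal(size=(50, n_features))   # eye metrics, no-report condition (simulated)
predicted_perceived = clf.predict(X_noreport)    # 1 = predicted perceived

In practice, the classifier would be trained and evaluated per subject or with cross-validation that respects subject boundaries; those details are omitted here.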

Results:

Behavioral data (n=24) for the report ear show threshold-level stimuli were heard in 55% of trials, sound identification accuracy (whistle, laser, or water drop) was 89% for these heard stimuli (chance = 33%), and blank trials resulted in a 10% false-positive rate. Eye metrics (n=24) demonstrated significant differences in post-stimulus timecourses between perceived and nonperceived stimuli (p<0.05), including pupil dilation (mm across 200–2500 ms), blink rate (% across 1130–1880 ms), and microsaccade rate (% across 153–510 ms). Machine learning based on pilot data alone achieved acceptable classification of auditory perception (ROC AUC = 0.74). fMRI results (n=10) during the report condition revealed significant activation (p<0.05) in detection, arousal, and salience (DAS) networks (e.g., frontal eye fields and anterior cingulate) 3 seconds after perceived stimuli, followed by activation in task-positive network (TPN) areas (e.g., anterior medial frontal gyrus and dorsal inferior parietal lobule) and deactivation of default-mode network (DMN) areas (e.g., ventral medial prefrontal cortex and posterior cingulate) at 6 seconds.

Conclusions:

Reflexive eye metrics effectively classify auditory perception, supporting their utility in disentangling perceptual processes from report-related confounds. Functional neuroimaging results corroborate findings in visual perception, suggesting modality-independent mechanisms for conscious perception. Future work will expand this framework to investigate potential targets for clinical stimulation in disorders of consciousness.

Modeling and Analysis Methods:

Activation (eg. BOLD task-fMRI)
Classification and Predictive Modeling

Perception, Attention and Motor Behavior:

Consciousness and Awareness 2
Perception: Auditory/ Vestibular 1

Keywords:

Autonomics
Consciousness
Machine Learning
MRI
Perception
Other - Auditory

1|2 Indicates the priority used for review

Abstract Information

By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio, print, and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.

I accept

The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a “source” party and a “reproducing” party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables. Click here for more information. Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:

I am submitting this abstract as an original work to be reproduced. I am available to be the “source party” in an upcoming team and consent to have this work listed on the OSSIG website. I agree to be contacted by OSSIG regarding the challenge and may share data used in this abstract with another team.

Please indicate below if your study was a "resting state" or "task-activation" study.

Resting state
Task-activation

Healthy subjects only or patients (note that patient studies may also involve healthy subjects):

Healthy subjects

Was this research conducted in the United States?

Yes

Are you Internal Review Board (IRB) certified? Please note: Failure to have IRB, if applicable will lead to automatic rejection of abstract.

Yes, I have IRB or AUCC approval

Were any human subjects research approved by the relevant Institutional Review Board or ethics panel? NOTE: Any human subjects studies without IRB approval will be automatically rejected.

Yes

Were any animal research approved by the relevant IACUC or other animal research panel? NOTE: Any animal studies without IACUC approval will be automatically rejected.

Not applicable

Please indicate which methods were used in your research:

Functional MRI
Structural MRI
Behavior
Computational modeling
Other, Please specify - Eye tracking

For human MRI, what field strength scanner do you use?

3.0T

Which processing packages did you use for your study?

SPM

Provide references using APA citation style.

Kronemer, S. I., et al. (2022). Human visual consciousness involves large scale cortical and subcortical networks independent of task report and eye movement activity. Nature Communications, 13, Article 7342. https://doi.org/10.1038/s41467-022-35117-4

Schnakers, C., et al. (2009). Diagnostic accuracy of the vegetative and minimally conscious state: Clinical consensus versus standardized neurobehavioral assessment. BMC Neurology, 9, Article 35. https://doi.org/10.1186/1471-2377-9-35

UNESCO Institute of Statistics and World Bank Waiver Form

I attest that I currently live, work, or study in a country on the UNESCO Institute of Statistics and World Bank List of Low and Middle Income Countries list provided.

No