Auditory stimuli extend the temporal window of visual integration by modulating alpha oscillations

Poster No:

2028 

Submission Type:

Late-Breaking Abstract Submission 

Authors:

Qi Chen1

Institutions:

1South China Normal University, Guangzhou, Guangdong Province

First Author:

Qi Chen  
South China Normal University
Guangzhou, Guangdong Province

Late Breaking Reviewer(s):

Sylvain Baillet  
Montreal Neurological Institute
Montreal, Quebec
Michael Breakspear, PhD  
The University of Newcastle
New Lambton Heights, NSW
Sofie Valk  
Max Planck Institute for Human Cognitive and Brain Sciences
Leipzig, Saxony

Introduction:

In daily life, our brains integrate sensory information into coherent experiences, a process shaped by cross-modal interactions (Shams et al., 2002). The temporal relationship between visual and auditory stimuli strongly affects visual temporal resolution (Fendrich & Corballis, 2001; Shimojo et al., 2001), suggesting that auditory stimuli can influence visual processing, although the underlying neural mechanisms remain unclear. Temporal windows, critical for visual perception, are the intervals within which successive stimuli interact to shape a single percept (Samaha & Romei, 2023), and alpha oscillations are thought to define these windows (Baumgarten et al., 2015). In audiovisual processing, adding a sound to two visual flashes increases fusion illusions (Andersen et al., 2004), possibly by modulating alpha oscillations.

Methods:

Thirty-four healthy, right-handed participants were recruited. Visual stimuli were two white circles (2° radius) on a gray background, each displayed for 10 ms, 5° below fixation. Auditory stimuli were 10 ms, 3000 Hz tones (~50 dB) delivered via speakers. Two conditions were tested: F2 (two flashes) and F2B1 (two flashes plus one beep).
A pretest determined each participant's fusion threshold, which served as the ISI for bistable trials in the EEG experiment. Participants completed 560 trials in 10 blocks: 15% short-ISI trials (30 ms), 15% long-ISI trials (100 ms), and 70% bistable trials (ISI at the individual fusion threshold). Each trial began with a 500–1500 ms fixation period, followed by a 10 ms flash (with or without a beep), a variable ISI, and a second 10 ms flash; responses were accepted within 2000 ms.
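As a rough illustration (not the authors' code), the Python sketch below builds a trial list with the proportions and timing parameters stated above; the fusion-threshold value, the split between F2 and F2B1 trials, and all variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 560
fusion_threshold_ms = 65                 # hypothetical individual pretest threshold

n_short = n_long = int(0.15 * n_trials)          # 84 trials each
n_bistable = n_trials - n_short - n_long         # 392 trials (70%)
isi_ms = np.concatenate([
    np.full(n_short, 30),                        # short ISI (30 ms)
    np.full(n_long, 100),                        # long ISI (100 ms)
    np.full(n_bistable, fusion_threshold_ms),    # bistable ISI
])
# Assumption: F2 and F2B1 are randomly interleaved in equal proportion.
beep = rng.permutation(np.arange(n_trials) % 2 == 0)
fixation_ms = rng.integers(500, 1501, size=n_trials)   # 500-1500 ms fixation

order = rng.permutation(n_trials)
trials = [dict(fixation_ms=int(fixation_ms[i]), beep=bool(beep[i]), isi_ms=int(isi_ms[i]))
          for i in order]
print(trials[0])
```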
EEG was recorded at 1000 Hz using a 64-channel Neuroscan SynAmps2 system. Data were filtered, down-sampled to 500 Hz, re-referenced to the common average, and epoched. Baseline correction and ICA-based artifact removal were applied.
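A minimal preprocessing sketch along these lines, written with MNE-Python rather than the authors' pipeline, is shown below; the file name, filter band, epoch window, and ICA settings are assumptions.

```python
import mne

# Hypothetical Neuroscan .cnt recording; all parameters below are assumed.
raw = mne.io.read_raw_cnt("sub01.cnt", preload=True)
raw.filter(l_freq=0.1, h_freq=40.0)          # band-pass filter (assumed cutoffs)
raw.resample(500)                            # down-sample to 500 Hz
raw.set_eeg_reference("average")             # common average reference

# ICA-based artifact removal (components to exclude chosen after inspection)
ica = mne.preprocessing.ICA(n_components=20, random_state=0)
ica.fit(raw)
ica.apply(raw)

# Epoching with baseline correction (epoch limits and baseline window assumed)
events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.8, tmax=1.0, baseline=(-0.2, 0.0), preload=True)
```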
Analyses focused on bistable trials in F2 and F2B1. Power spectra (1–30 Hz) were computed via FFT. Instantaneous alpha frequency (IAF) and alpha power were computed for occipital electrodes. The phase opposition sum (POS) quantified phase differences between 1-flash and 2-flash trials, with significance assessed by permutation testing.
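The sketch below illustrates two of these steps under an assumed data layout (occipital-averaged single trials sampled at 500 Hz): instantaneous alpha frequency from the Hilbert phase, and POS computed with the commonly used formulation POS = ITC_1flash + ITC_2flash - 2*ITC_all. The authors' exact implementation may differ.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0
b, a = butter(4, [8.0, 13.0], btype="bandpass", fs=fs)   # assumed alpha band (8-13 Hz)

def inst_alpha_freq(data):
    """data: (n_trials, n_times) -> instantaneous alpha frequency in Hz."""
    alpha = filtfilt(b, a, data, axis=-1)
    phase = np.unwrap(np.angle(hilbert(alpha, axis=-1)), axis=-1)
    return np.diff(phase, axis=-1) * fs / (2 * np.pi)

def itc(phases):
    """Inter-trial coherence per time point; phases: (n_trials, n_times)."""
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

def phase_opposition_sum(phases, two_flash_mask):
    """POS between 1-flash and 2-flash trials at each time point."""
    return (itc(phases[~two_flash_mask]) + itc(phases[two_flash_mask])
            - 2 * itc(phases))
```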
Cluster-based correction addressed multiple comparisons. For time-frequency maps, clusters were formed from contiguous suprathreshold elements (p < 0.05). For time courses, paired t-tests identified temporal clusters, and cluster-level statistics were computed as the sum of t-values within each cluster.
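As an illustration of the time-point variant described above, here is a sketch of a cluster-based permutation test with a sign-flip null distribution; the threshold, permutation count, and one-dimensional clustering are assumptions about the analysis.

```python
import numpy as np
from scipy import stats

def cluster_masses(tvals, thresh):
    """Cluster masses (summed t) for runs of t > thresh and runs of t < -thresh."""
    masses = []
    for sign in (1.0, -1.0):
        current = 0.0
        for t in sign * np.asarray(tvals):
            if t > thresh:
                current += t
            elif current:
                masses.append(sign * current)
                current = 0.0
        if current:
            masses.append(sign * current)
    return masses

def cluster_perm_test(cond_a, cond_b, n_perm=1000, alpha=0.05, seed=0):
    """Paired cluster test over time; cond_a, cond_b: (n_subjects, n_times)."""
    rng = np.random.default_rng(seed)
    diff = cond_a - cond_b
    n_sub = diff.shape[0]
    t_crit = stats.t.ppf(1 - alpha / 2, df=n_sub - 1)      # suprathreshold criterion
    t_obs = stats.ttest_rel(cond_a, cond_b, axis=0).statistic
    obs_masses = cluster_masses(t_obs, t_crit)
    null_max = np.empty(n_perm)
    for p in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n_sub, 1))    # sign-flip permutation
        t_perm = stats.ttest_rel(diff * signs, np.zeros_like(diff), axis=0).statistic
        m = cluster_masses(t_perm, t_crit)
        null_max[p] = max(np.abs(m)) if m else 0.0
    p_vals = [(np.sum(null_max >= abs(m)) + 1) / (n_perm + 1) for m in obs_masses]
    return obs_masses, p_vals
```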
EEG data simulations were conducted using MATLAB.
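A minimal sketch of the kind of phase-resetting simulation reported in the Results is given below (in Python rather than the authors' MATLAB code): ongoing ~10 Hz alpha activity with random phase across trials is phase-reset and slightly slowed at sound onset, which raises inter-trial coherence after onset without a systematic power change. All parameter values are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs, n_trials = 500, 200
t = np.arange(-0.5, 0.8, 1 / fs)        # time relative to sound onset (s)
rng = np.random.default_rng(1)

f_pre, f_post = 10.0, 9.3               # assumed alpha frequency before / after reset (Hz)
trials = np.empty((n_trials, t.size))
for k in range(n_trials):
    phi0 = rng.uniform(0, 2 * np.pi)                 # random ongoing phase
    pre = np.cos(2 * np.pi * f_pre * t + phi0)
    post = np.cos(2 * np.pi * f_post * t)            # phase reset to 0 at onset, lower frequency
    trials[k] = np.where(t < 0, pre, post) + 0.5 * rng.standard_normal(t.size)

# Alpha-band inter-trial coherence before vs. after the simulated sound
b, a = butter(4, [8.0, 13.0], btype="bandpass", fs=fs)
phase = np.angle(hilbert(filtfilt(b, a, trials, axis=-1), axis=-1))
itc = np.abs(np.mean(np.exp(1j * phase), axis=0))
print("mean ITC pre:", itc[t < -0.1].mean(), "post:", itc[(t > 0.1) & (t < 0.4)].mean())
```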

Results:

Auditory Input Extends Visual Integration via Alpha Frequency Reduction:
EEG data showed that auditory input lowered occipital alpha frequency in F2B1, extending the visual integration window. The reduction in alpha frequency correlated with longer fusion thresholds (r = 0.45, p = 7.55 × 10⁻³), a relationship confirmed by control analyses.
Auditory Input Disrupts Alpha Frequency Predictive Role:
In F2, higher alpha frequency predicted 2-flash percepts pre- and post-stimulus (p = 0.02; p = 4.00 × 10⁻³). In F2B1, this predictive role was absent, with a significant interaction (F(1,33) = 9.86, p = 3.55 × 10⁻³).
Auditory Input Enhances Alpha Phase Predictive Role:
F2B1 showed significant phase opposition between 1-flash and 2-flash trials (p = 3.03 × 10⁻³), absent in F2. Phase modulation was stronger in F2B1 (F(1,33) = 6.45, p = 0.02).
Phase-Resetting Model Explains Findings:
Simulations replicated the empirical results, showing that auditory input resets alpha phase, lowers alpha frequency, and extends the integration window. Increased alpha inter-trial coherence (ITC) in F2B1, in the absence of power changes, indicated phase resetting of spontaneous oscillations in occipital regions.

Conclusions:

Auditory input extended the visual integration window: poststimulus alpha frequency decreased and correlated with the prolonged window, supporting the role of alpha oscillations in perceptual timing. The predictive role of prestimulus alpha frequency decreased, while that of alpha phase increased. A phase-resetting model replicated these findings, indicating that auditory input resets alpha phase in visual cortex, lowering alpha frequency and extending the integration window.

Novel Imaging Acquisition Methods:

EEG 2

Perception, Attention and Motor Behavior:

Perception and Attention Other 1

Keywords:

Other - Multisensory perception, alpha oscillations, fusion illusion, phase reset

1|2 Indicates the priority used for review

Abstract Information

By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio print and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.

I accept

The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a “source” party and a “reproducing” party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables. Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:

I am submitting this abstract as an original work to be reproduced. I am available to be the “source party” in an upcoming team and consent to have this work listed on the OSSIG website. I agree to be contacted by OSSIG regarding the challenge and may share data used in this abstract with another team.

Please indicate below if your study was a "resting state" or "task-activation" study.

Task-activation

Healthy subjects only or patients (note that patient studies may also involve healthy subjects):

Healthy subjects

Was this research conducted in the United States?

No

Were any human subjects research approved by the relevant Institutional Review Board or ethics panel? NOTE: Any human subjects studies without IRB approval will be automatically rejected.

Not applicable

Were any animal research approved by the relevant IACUC or other animal research panel? NOTE: Any animal studies without IACUC approval will be automatically rejected.

Not applicable

Please indicate which methods were used in your research:

EEG/ERP
Behavior
Computational modeling

Provide references using APA citation style.

Andersen, T. S., Tiippana, K., & Sams, M. (2004). Factors influencing audiovisual fission and fusion illusions. Cognitive Brain Research, 21, 301–308.
Baumgarten, T. J., Schnitzler, A., & Lange, J. (2015). Beta oscillations define discrete perceptual cycles in the somatosensory domain. Proceedings of the National Academy of Sciences, 112, 12187–12192.
Fendrich, R., & Corballis, P. M. (2001). The temporal cross-capture of audition and vision. Perception & Psychophysics, 63, 719–725.
Samaha, J., & Romei, V. (2023). Alpha-band frequency and temporal windows in perception: A review and living meta-analysis of 27 experiments (and counting). Journal of Cognitive Neuroscience, 1–15.
Shams, L., Kamitani, Y., & Shimojo, S. (2002). Visual illusion induced by sound. Cognitive Brain Research, 14, 147–152.
Shimojo, S., Scheier, C., Nijhawan, R., Shams, L., Kamitani, Y., & Watanabe, K. (2001). Beyond perceptual modality: Auditory effects on visual perception. Acoustical Science and Technology, 22, 61–67.

UNESCO Institute of Statistics and World Bank Waiver Form

I attest that I currently live, work, or study in a country on the UNESCO Institute of Statistics and World Bank List of Low and Middle Income Countries list provided.

No