Poster No:
2033
Submission Type:
Abstract Submission
Authors:
Anna Behler1, Megan Campbell2, Brodi Bathgate3, Johan van der Meer4, Saurabh Sonkusare5, Michael Breakspear6
Institutions:
1The University of Newcastle, Callaghan, NSW, 2The University of Queensland, Brisbane, QLD, 3The University of Newcastle, Newcastle, NSW, 4Queensland University of Technology, Brisbane, QLD, 5University of Newcastle, Newcastle, NSW, 6The University of Newcastle, New Lambton Heights, NSW
First Author:
Anna Behler
The University of Newcastle
Callaghan, NSW
Co-Author(s):
Megan Campbell, The University of Queensland, Brisbane, QLD
Brodi Bathgate, The University of Newcastle, Newcastle, NSW
Johan van der Meer, Queensland University of Technology, Brisbane, QLD
Saurabh Sonkusare, University of Newcastle, Newcastle, NSW
Michael Breakspear, The University of Newcastle, New Lambton Heights, NSW
Introduction:
With the rise of streaming platforms, closed captions have become a staple of daily life, with over 50% of viewers opting for subtitles while streaming multimedia (Preply, 2024). Watching captioned media demands rapid shifts between observing content and reading captions, potentially increasing cognitive load and limiting information processing (Alzahabi & Becker, 2013). Yet the integration of these inputs typically appears seamless to the viewer (d'Ydewalle & De Bruycker, 2007). During movie viewing, the brain passes through well-defined functional states, with transitions that are temporally aligned to specific features of the movie, such as scene changes or narrative elements (van der Meer et al., 2020). Closed captions add a further visual stream to this multimodal sensory experience, highlighting the intricate interplay between visual and auditory processing. This raises critical questions about the mechanisms underlying visual attention and the strategies the brain uses to optimise the integration and acquisition of sensory information.
Methods:
Using eye-tracking and functional Magnetic Resonance Imaging (fMRI), this study explores how eye movement behaviour and brain states vary between reading captions and gazing at visual scenes, and how these dynamics are modulated by the presence or absence of audio.
Eye-tracking data were collected from 18 healthy participants who viewed a 20-minute movie with alternating audio availability. From these data, eye movements such as saccades and fixations were analysed within designated areas of interest. In addition, fMRI data with concurrent eye-tracking were acquired from 15 additional participants who underwent the same movie-viewing paradigm, allowing brain states and their transitions to be identified using a Hidden Markov Model (HMM). The availability of gaze positions made it possible to link the visual input being processed, i.e., text or scene, with brain state transitions, providing a comprehensive approach to understanding both the behavioural and neural correlates of multisensory processing during media consumption.
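As a minimal sketch of this kind of brain-state analysis (not the authors' actual pipeline), an HMM can be fit to parcellated fMRI time series with the hmmlearn package in Python; the number of states K, the data dimensions, and the placeholder data below are assumptions for illustration only:

import numpy as np
from hmmlearn.hmm import GaussianHMM

# X: regional fMRI time series, shape (n_timepoints, n_regions).
# Random data stand in here for preprocessed BOLD signals (e.g. from fMRIPrep).
rng = np.random.default_rng(0)
X = rng.standard_normal((600, 50))

K = 6  # hypothetical number of brain states; in practice chosen by model comparison
hmm = GaussianHMM(n_components=K, covariance_type="full", n_iter=200, random_state=0)
hmm.fit(X)

state_sequence = hmm.predict(X)   # most likely brain state at each volume
transition_probs = hmm.transmat_  # K x K matrix of state transition probabilities

The inferred state sequence can then be aligned with the concurrent gaze data to test whether state transitions coincide with shifts between text and scene viewing.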
Results:
Viewers seamlessly integrated captions and visual scenes by frequently shifting gaze between the two inputs (Fig. 1). In the absence of audio, viewers adapted by adjusting the transition probabilities between reading captions and observing scenes, rather than by altering the duration of reading intervals. The fMRI data, analysed using Hidden Markov Models, revealed distinct temporal dynamics of brain states. These states showed a reshaping of functional network activity and reliable transitions accompanying the switch from visual-only to audiovisual perception, reflecting dynamic adaptation to the missing auditory input. Language, emotional content, and narrative elements strongly influenced engagement with captions, directing attention within the multisensory media experience.
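For illustration, gaze transition probabilities of this kind can be estimated from a labelled fixation sequence as below; this is a minimal sketch assuming each fixation has already been assigned to a "caption" or "scene" area of interest (the function and example data are hypothetical, not the authors' code):

import numpy as np

def transition_matrix(labels, states=("caption", "scene")):
    # Count first-order transitions between consecutive fixation labels,
    # then normalise each row into probabilities.
    idx = {s: i for i, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(labels[:-1], labels[1:]):
        counts[idx[a], idx[b]] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums,
                     out=np.zeros_like(counts), where=row_sums > 0)

# Example: a short fixation sequence alternating between the two AOIs.
seq = ["scene", "scene", "caption", "scene", "caption", "caption", "scene"]
P = transition_matrix(seq)  # P[i, j] = probability of moving from state i to j

Comparing such a matrix between audio-on and audio-off segments captures the adjustment in switching behaviour described above.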
Conclusions:
These findings demonstrate how gaze behaviour and brain states are shaped by perceptual and contextual features, with viewers dynamically adapting to auditory absence. By investigating the integration of captions into media experiences, this study provides new insights into how the brain flexibly manages attention and processes sensory streams in complex, real-world environments.
Higher Cognitive Functions:
Higher Cognitive Functions Other
Perception, Attention and Motor Behavior:
Perception: Multisensory and Crossmodal 1
Perception: Visual 2
Keywords:
FUNCTIONAL MRI
Hearing
Modeling
Perception
Vision
1|2 Indicates the priority used for review
By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio, print, and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.
I accept
The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a “source” party and a “reproducing” party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables.
Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:
I do not want to participate in the reproducibility challenge.
Please indicate below if your study was a "resting state" or "task-activation" study.
Task-activation
Healthy subjects only or patients (note that patient studies may also involve healthy subjects):
Healthy subjects
Was this research conducted in the United States?
No
Was any human subjects research approved by the relevant Institutional Review Board or ethics panel?
NOTE: Any human subjects studies without IRB approval will be automatically rejected.
Yes
Was any animal research approved by the relevant IACUC or other animal research panel?
NOTE: Any animal studies without IACUC approval will be automatically rejected.
Not applicable
Please indicate which methods were used in your research:
Functional MRI
Other, Please specify:
eye-tracking
For human MRI, what field strength scanner do you use?
3.0T
Which processing packages did you use for your study?
Other, Please list:
fMRIPrep
Provide references using APA citation style.
Alzahabi, R., & Becker, M. W. (2013). The association between media multitasking, task-switching, and dual-task performance. Journal of Experimental Psychology: Human Perception and Performance, 39(5), 1485–1495. https://doi.org/10.1037/a0031208
Behler, A., Campbell, M. E. J., Bathgate, B., van der Meer, J., Sonkusare, S., & Breakspear, M. (2024). Adaptation of eye movement behaviour during closed-captioned movie viewing. Open Science Framework. https://doi.org/10.31219/osf.io/8mvwe
d’Ydewalle, G., & De Bruycker, W. (2007). Eye movements of children and adults while reading television subtitles. European Psychologist, 12(3), 196–205. https://doi.org/10.1027/1016-9040.12.3.196
Preply. (2024). America’s subtitle habits: What the data tells us about the rise in subtitles usage. Retrieved December 11, 2024, from https://preply.com/en/blog/americas-subtitles-use/
van der Meer, J. N., Breakspear, M., Chang, L. J., Sonkusare, S., & Cocchi, L. (2020). Movie viewing elicits rich and reliable brain state dynamics. Nature Communications, 11(1), 5004. https://doi.org/10.1038/s41467-020-18717-w
No