Poster No:
1967
Submission Type:
Abstract Submission
Authors:
Iktae Kim1, Adam Santorelli1, Nicole Burke1, Nathalia Bianchini Esper1, Bryan Gonzalez1, John d’Antonio-Bertagnolli1, Stan Colcombe1, Alexandre Franco1, Gregory Kiar1, Michelle Freund1, Michael Milham1
Institutions:
1Child Mind Institute, New York City, NY
First Author:
Iktae Kim
Child Mind Institute
New York City, NY
Co-Author(s):
Introduction:
Mobile Brain/Body Imaging (MoBI) research seeks to understand neural activity as participants engage in naturalistic, free-moving behaviors. Such work often entails collecting electroencephalogram (EEG), motion capture, physiological signals, and other sensor data under variable, ecologically valid conditions (Gramann, 2019). Integrating, monitoring, and assessing these inputs in real time is challenging because existing tools commonly emphasize post-hoc analysis or single-modality visualization (Makeig et al., 2009). Real-time visualization and basic processing of diverse data streams provide immediate feedback, guide protocol adjustments, and minimize costly data loss. Prior MoBI experiments demonstrated the feasibility and benefits of capturing neural signals during real-world tasks (Makeig et al., 2009), yet existing approaches rarely provide integrated, real-time multimodal feedback. Current alternatives, such as Stream Viewer (Intheon, 2016) and mobilab (SCCN, 2017), are limited by maintenance issues, heavy dependence on external packages, and a lack of live visualization. To address these limitations, we present MoBI-View, a standalone platform that uses the Lab Streaming Layer (LSL) (Kothe, 2014) to synchronize and visualize multiple data streams as they are acquired, with reduced dependence on external packages. By displaying EEG, motion capture, and other sensor data together, the platform aims to give researchers immediate insight into data quality and consistency, offering a base for incorporating more advanced quality control (QC) and analytic methods.
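To make the LSL-based synchronization concrete, the short Python sketch below shows how concurrently advertised streams can be discovered and read on a shared clock with pylsl. It is a minimal usage illustration under our own assumptions (wait times, buffer and chunk sizes), not code taken from MoBI-View.

# Minimal pylsl sketch: discover all advertised LSL streams and open one
# inlet per stream, as a multimodal viewer would. Illustrative only.
from pylsl import StreamInlet, resolve_streams

streams = resolve_streams(wait_time=2.0)
inlets = {info.name(): StreamInlet(info, max_buflen=60) for info in streams}

for name, inlet in inlets.items():
    meta = inlet.info()
    print(f"{name}: type={meta.type()}, "
          f"{meta.channel_count()} channels @ {meta.nominal_srate()} Hz")

# LSL timestamps are expressed on a common clock, which is what allows
# EEG, motion capture, and physiological streams to be aligned live.
for name, inlet in inlets.items():
    samples, timestamps = inlet.pull_chunk(timeout=0.0, max_samples=32)
    if timestamps:
        print(f"{name}: newest sample at t = {timestamps[-1]:.3f} s")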
Methods:
MoBI-View is implemented as a Python application with a PyQt5-based interface, following a Model-View-Presenter architecture. Data streams are acquired through LSL, which enables multiple streams to be synchronized concurrently. Basic processing functions, including adjustable filtering and channel selection, support immediate quality assessments. The software's modular design encourages incremental integration of more advanced QC methods and analysis pipelines, and the codebase is documented to facilitate contributions from external users. Future developments can therefore incorporate established QC strategies or tailored analytic algorithms without extensive refactoring.

·Unified Modeling Language (UML)-style schematic of the MoBI-View architecture.
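A simplified sketch of the Model-View-Presenter separation and the adjustable filtering and channel selection described above is given below. The class names, filter settings, and console stand-in for the PyQt5 view are hypothetical assumptions for illustration, not MoBI-View's actual API.

# Illustrative Model-View-Presenter sketch (hypothetical names, not
# MoBI-View's real classes): the model owns acquisition, the presenter
# applies filtering/channel selection, the view only displays.
from dataclasses import dataclass, field
import numpy as np
from scipy.signal import butter, sosfilt
from pylsl import StreamInlet


@dataclass
class StreamModel:
    """Model: owns one LSL inlet and its raw sample buffer."""
    inlet: StreamInlet
    buffer: list = field(default_factory=list)

    def poll(self):
        samples, _ = self.inlet.pull_chunk(timeout=0.0)
        if samples:
            self.buffer.extend(samples)


class StreamPresenter:
    """Presenter: applies adjustable band-pass filtering and channel
    selection before handing data to the view."""
    def __init__(self, model, srate, band=(1.0, 40.0), channels=None):
        self.model = model
        self.channels = channels  # None = show all channels
        self.sos = butter(4, band, btype="bandpass", fs=srate, output="sos")

    def update(self, view):
        self.model.poll()
        if not self.model.buffer:
            return
        data = np.asarray(self.model.buffer).T  # channels x samples
        if self.channels is not None:
            data = data[self.channels]
        view.draw(sosfilt(self.sos, data, axis=-1))


class ConsoleView:
    """Stand-in for the PyQt5 view: reports what it would plot."""
    def draw(self, data):
        print(f"would plot {data.shape[0]} channels x {data.shape[1]} samples")

Because the presenter is the only component that touches both the inlet and the display, filter parameters or additional QC checks can be swapped in without altering acquisition or plotting code, which is the kind of extensibility the modular design is intended to support.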
Results:
Preliminary evaluations of MoBI-View show reliable real-time visualization of multiple synchronized data feeds. The integrated view allowed early identification of unexpected packet loss, connection issues, and external signal disturbances. Performance will next be evaluated across several metrics, including synchronization accuracy, latency, system resource utilization, and user experience feedback, to assess the platform's ability to maintain data integrity and operational efficiency under varying experimental conditions. The system's extensible architecture further sets the stage for integrating robust QC measures and analytic models, including those explored in previous MoBI studies (Jungnickel et al., 2019).

·A screenshot of MoBI-View during an ongoing data collection session.
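One plausible way the planned latency metric could be computed is sketched below, using pylsl clock corrections; this is an assumption about the evaluation procedure rather than the method reported in this abstract.

# Hedged sketch: estimate per-stream display latency by comparing the newest
# LSL timestamp (plus its clock correction) with the local clock.
from pylsl import StreamInlet, resolve_streams, local_clock

for info in resolve_streams(wait_time=2.0):
    inlet = StreamInlet(info)
    offset = inlet.time_correction()  # remote-to-local clock offset
    _, timestamps = inlet.pull_chunk(timeout=1.0)
    if timestamps:
        latency_ms = (local_clock() - (timestamps[-1] + offset)) * 1000.0
        print(f"{info.name()}: approximate display latency {latency_ms:.1f} ms")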
Conclusions:
MoBI-View offers a practical foundation for real-time, multimodal data integration and preliminary processing in MoBI research. By reducing reliance on post-hoc corrections and allowing immediate adjustments, MoBI-View aligns data collection more closely with the intended experimental protocols. As additional QC and analysis features are integrated, we anticipate that the platform will further improve data quality, reduce the need for data re-collection, and enhance the overall efficiency and reliability of MoBI workflows.
Modeling and Analysis Methods:
Methods Development
Multivariate Approaches
Neuroinformatics and Data Sharing:
Workflows
Informatics Other 2
Novel Imaging Acquisition Methods:
Multi-Modal Imaging 1
Keywords:
Computational Neuroscience
Data analysis
Data Organization
Electroencephalography (EEG)
ELECTROPHYSIOLOGY
Multivariate
Open-Source Code
Open-Source Software
Workflows
Other - Mobile Brain/Body Imaging (MoBI)
1|2 Indicates the priority used for review
By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio, print, and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.
I accept
The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a “source” party and a “reproducing” party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables.
Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:
I am submitting this abstract as an original work to be reproduced. I am available to be the “source party” in an upcoming team and consent to have this work listed on the OSSIG website. I agree to be contacted by OSSIG regarding the challenge and may share data used in this abstract with another team.
Please indicate below if your study was a "resting state" or "task-activation" study.
Other
Healthy subjects only or patients (note that patient studies may also involve healthy subjects):
Patients
Was this research conducted in the United States?
Yes
Are you Internal Review Board (IRB) certified?
Please note: Failure to have IRB approval, if applicable, will lead to automatic rejection of the abstract.
Yes, I have IRB or AUCC approval
Was any human subjects research approved by the relevant Institutional Review Board or ethics panel?
NOTE: Any human subjects studies without IRB approval will be automatically rejected.
Yes
Was any animal research approved by the relevant IACUC or other animal research panel?
NOTE: Any animal studies without IACUC approval will be automatically rejected.
Not applicable
Please indicate which methods were used in your research:
EEG/ERP
Neurophysiology
Behavior
Neuropsychological testing
Computational modeling
Provide references using APA citation style.
Gramann, K. (2019). Mobile Brain/Body Imaging (MoBI) – Understanding Brain Function During Active Behavior. IEEE Pervasive Computing, 18(1), 20–29. https://doi.org/10.1109/MPRV.2019.2899335
Makeig, S., Gramann, K., Jung, T.-P., Sejnowski, T. J., & Poizner, H. (2009). Linking brain, mind and behavior. International Journal of Psychophysiology, 73(2), 95–100. https://doi.org/10.1016/j.ijpsycho.2008.11.008
Intheon. (2016). Stream Viewer [Computer software]. GitHub. https://github.com/intheon/stream_viewer
Swartz Center for Computational Neuroscience (SCCN). (2017). mobilab [Computer software]. GitHub. https://github.com/sccn/mobilab
Kothe, C. A. (2014). Lab Streaming Layer (LSL) [Computer software]. GitHub. https://github.com/sccn/labstreaminglayer
Jungnickel, E., Gehrke, L., Klug, M., & Gramann, K. (2019). MoBI—Mobile brain/body imaging. Neuroergonomics, 59–63.
No