Poster No:
1936
Submission Type:
Abstract Submission
Authors:
Michelangelo Tani1,2,3,4, Federica Bencivenga1,5, Krishnendu Vyas2,3,4, Federico Giove4,6, Steve Gazzitano4, Sabrina Pitzalis4,7, Gaspare Galati2,4
Institutions:
1Shared co-first authorship, Rome, Italy, 2Department of Psychology, Sapienza, University of Rome, Rome, Italy, 3PhD Program in Behavioral Neuroscience, Department of Psychology, Sapienza, University of Rome, Rome, Italy, 4Neuroimaging Laboratory, Santa Lucia Foundation (IRCCS Fondazione Santa Lucia), Rome, Italy, 5Department of Neuroscience, University of Montréal, QC, Canada, 6Marbilab, Enrico Fermi Centre, Rome, Italy, 7Department of Movement, Human and Health Sciences, “Foro Italico” University, Rome, Italy
First Author:
Michelangelo Tani
Shared co-first authorship|Department of Psychology, Sapienza, University of Rome|PhD Program in Behavioral Neuroscience, Department of Psychology, Sapienza, University of Rome|Neuroimaging Laboratory, Santa Lucia Foundation (IRCCS Fondazione Santa Lucia)
Rome, Italy|Rome, Italy|Rome, Italy|Rome, Italy
Co-Author(s):
Federica Bencivenga
Shared co-first authorship|Department of Neuroscience, University of Montréal
Rome, Italy|QC, Canada
Krishnendu Vyas
Department of Psychology, Sapienza, University of Rome|PhD Program in Behavioral Neuroscience, Department of Psychology, Sapienza, University of Rome|Neuroimaging Laboratory, Santa Lucia Foundation (IRCCS Fondazione Santa Lucia)
Rome, Italy|Rome, Italy|Rome, Italy
Federico Giove
Neuroimaging Laboratory, Santa Lucia Foundation (IRCCS Fondazione Santa Lucia)|Marbilab, Enrico Fermi Centre
Rome, Italy|Rome, Italy
Steve Gazzitano
Neuroimaging Laboratory, Santa Lucia Foundation (IRCCS Fondazione Santa Lucia)
Rome, Italy
Sabrina Pitzalis
Neuroimaging Laboratory, Santa Lucia Foundation (IRCCS Fondazione Santa Lucia)|Department of Movement, Human and Health Sciences, “Foro Italico” University
Rome, Italy|Rome, Italy
Gaspare Galati
Department of Psychology, Sapienza, University of Rome|Neuroimaging Laboratory, Santa Lucia Foundation (IRCCS Fondazione Santa Lucia)
Rome, Italy|Rome, Italy
Introduction:
Neuroimaging research on sensorimotor interactions faces theoretical and methodological challenges. fMRI studies on the topic are usually restricted to oversimplified, impoverished experimental setups, with static environmental stimuli and absent or constrained visual feedback of the limbs. Here, we developed and tested MOTUM (Motion Online Tracking Under MRI), a hardware and software setup for fMRI that combines virtual reality (VR) and motion capture to track body movements and reproduce them in real time in a VR environment. MOTUM allows the generation of dynamically responsive experimental scenarios that capture the complexity of body-environment interactions with fine precision. As a proof of concept, we present an investigation of visually guided reach-to-grasp movements, disentangling the effect of visual feedback of one's own movements.
Methods:
MOTUM is composed of two acquisition devices (Fig. 1): an MR-compatible glove (Data Glove Ultra, 5DT, Pretoria, South Africa) with 14 sensors tracking flexion and extension movements of the right-hand fingers, and a motion capture system of three MR-compatible cameras (Oqus, Qualisys, Göteborg, Sweden) mounted on the front wall of the MR room, used to track limb movements (in this study, rotational and translational movements of the right forearm and wrist). Data are fed into a control PC running Qualisys Track Manager, which reconstructs the camera images into a 3D skeleton representation; this representation is then streamed into Unity (Unity Technologies, San Francisco, US) and used to animate a first-person humanoid avatar through an ad-hoc software package (publicly available). The visual output is presented through a binocular MR-compatible headset (Visual System HD, NordicNeuroLab, Bergen, Norway), providing an immersive experience.
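To make the data path concrete, the following minimal Python sketch (standard library only) illustrates the kind of relay that could run on the control PC: it receives labelled 3D marker frames from the motion-capture stream and forwards a reduced payload to the Unity application animating the avatar. The addresses, ports, JSON frame layout, marker names, and the read_marker_frame helper are illustrative assumptions, not the actual MOTUM implementation, which streams data from Qualisys Track Manager into Unity through its own publicly available package.

# Purely illustrative sketch of the control-PC data path (all endpoints,
# ports, and the frame format are assumptions made for this example).
import json
import socket
import time

MOCAP_PORT = 22223                  # assumed port of the incoming marker stream
UNITY_ADDR = ("127.0.0.1", 9000)    # assumed Unity-side UDP listener

def read_marker_frame(sock):
    # Placeholder parser: assumes one JSON object per datagram,
    # mapping marker labels to [x, y, z] positions in mm.
    raw, _ = sock.recvfrom(65535)
    return json.loads(raw.decode())

def main():
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("", MOCAP_PORT))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        frame = read_marker_frame(rx)
        # Forward only the markers needed to drive the avatar's right arm.
        payload = {
            "t": time.time(),
            "markers": {k: v for k, v in frame.items()
                        if k in ("wrist", "forearm", "elbow")},
        }
        tx.sendto(json.dumps(payload).encode(), UNITY_ADDR)

if __name__ == "__main__":
    main()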
We tested the system in a Siemens Prisma scanner on 7 right-handed healthy volunteers, who were asked to grasp narrow or wide parts of a custom 3D object with a precision or power grip. In separate blocks, participants (a) grasped without visual feedback, (b) grasped with online feedback of the hand movement through MOTUM, (c) observed replays of movements recorded in previous trials, or (d) observed the static scene (baseline). Data were analyzed using fMRIPrep and SPM12. We quantified head motion for each volume by computing the framewise displacement (FD) and estimated the trial-by-trial amount of arm motion. Both head- and arm-motion estimates were used as modulatory regressors in the GLM to control for motion artifacts.
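As an illustration of the head-motion measure, FD can be computed from the six rigid-body realignment parameters as the sum of their absolute volume-to-volume differences, with rotational differences converted to millimetres on a sphere approximating the head (the Power et al., 2012 formulation). The Python sketch below shows this computation; the column order, file name, and 50 mm head radius are assumptions for illustration rather than details reported here.

# Hypothetical sketch: framewise displacement (FD) per volume from six
# realignment parameters (3 translations in mm, 3 rotations in radians).
import numpy as np

def framewise_displacement(motion_params, head_radius_mm=50.0):
    """motion_params: (n_volumes, 6) array, columns = [tx, ty, tz, rx, ry, rz]."""
    deltas = np.abs(np.diff(motion_params, axis=0))
    # Convert rotational differences (radians) to mm on a sphere of
    # ~50 mm radius approximating the head.
    deltas[:, 3:] *= head_radius_mm
    fd = deltas.sum(axis=1)
    # The first volume has no predecessor, so its FD is set to 0.
    return np.concatenate([[0.0], fd])

# Example usage with an illustrative realignment-parameter file:
# params = np.loadtxt("sub-01_task-grasp_motion.txt")
# fd = framewise_displacement(params)
# print("volumes above 0.9 mm:", int((fd > 0.9).sum()))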

Results:
The system successfully tracked and streamed the reach-to-grasp kinematics of all participants in real time, with occasional sub-second losses of tracking, which were less frequent when the hands were closer to the end of the scanner bore (e.g., for taller volunteers). Head motion did not substantially exceed commonly accepted thresholds (i.e., FD < 0.9 mm).
The main effect of the movement factor showed the expected clusters in contralateral M1, dPMC, and vPMC, and in bilateral aIPS, SMA, S1, SPL, mCC, and cerebellum.
The main effect of movement observation showed activity in visual areas such as hMT+ and SOG, as well as bilaterally in aIPS, lateral SPL, and the cerebellum. The interaction resulted in activity within dorsal secondary visual areas in most subjects and in the M1-S1 hand area in four of them.
Conclusions:
The MOTUM system provides a high-quality immersive experience without introducing evident movement-related artifacts, and it is a promising tool for studying human visuomotor functions. Although we tested it during hand movements, it can easily be extended to other body parts (e.g., the lower limbs) by adding cameras, which would also improve the quality of hand tracking. This paves the way for a wide range of real-life actions to be performed during fMRI scans, with an extensive repertoire of possible virtual (realistic or non-realistic) scenarios, potentially representing a breakthrough for research on sensorimotor integration and beyond.
Motor Behavior:
Motor Planning and Execution
Visuo-Motor Functions 2
Motor Behavior Other
Novel Imaging Acquisition Methods:
Imaging Methods Other 1
Keywords:
FUNCTIONAL MRI
Motor
Open-Source Code
Other - fMRI apparatus; Virtual Reality; Kinematics; Sensorimotor Integration
1|2 Indicates the priority used for review
By submitting your proposal, you grant permission for the Organization for Human Brain Mapping (OHBM) to distribute your work in any format, including video, audio print and electronic text through OHBM OnDemand, social media channels, the OHBM website, or other electronic publications and media.
I accept
The Open Science Special Interest Group (OSSIG) is introducing a reproducibility challenge for OHBM 2025. This new initiative aims to enhance the reproducibility of scientific results and foster collaborations between labs. Teams will consist of a “source” party and a “reproducing” party, and will be evaluated on the success of their replication, the openness of the source work, and additional deliverables. Click here for more information.
Propose your OHBM abstract(s) as source work for future OHBM meetings by selecting one of the following options:
I am submitting this abstract as an original work to be reproduced. I am available to be the “source party” in an upcoming team and consent to have this work listed on the OSSIG website. I agree to be contacted by OSSIG regarding the challenge and may share data used in this abstract with another team.
Please indicate below if your study was a "resting state" or "task-activation” study.
Task-activation
Healthy subjects only or patients (note that patient studies may also involve healthy subjects):
Healthy subjects
Was this research conducted in the United States?
No
Were any human subjects research approved by the relevant Institutional Review Board or ethics panel?
NOTE: Any human subjects studies without IRB approval will be automatically rejected.
Yes
Were any animal research approved by the relevant IACUC or other animal research panel?
NOTE: Any animal studies without IACUC approval will be automatically rejected.
Not applicable
Please indicate which methods were used in your research:
Functional MRI
For human MRI, what field strength scanner do you use?
3.0T
Which processing packages did you use for your study?
AFNI
SPM
FSL
Free Surfer
Other, Please list
-
fMRIPrep
Provide references using APA citation style.
Not applicable.
No