Time-dependent contextual feedback signals increase V1 responses to unpredictable scene information

Zirui Zhang, Presenter
University of Glasgow
School of Psychology and Neuroscience
Glasgow, Scotland
United Kingdom
 
Wednesday, Jun 26: 11:30 AM - 12:45 PM
3065 
Oral Sessions 
COEX 
Room: Grand Ballroom 104-105 
Introduction

Information processing in the visual cortex is tuned to the statistical regularities of sensory inputs and is crucially dependent on context. Neurobiologically inspired computational frameworks of visual processing emphasise functional interactions between higher and lower cortical areas, whereby higher areas send feedback signals that influence feedforward processing in lower areas. Using a partial visual occlusion approach, in which a mask covers the lower right quadrant of natural scene images, we can isolate feedback signals in the retinotopic visual cortex that processes the occluded image portion (Muckli et al., 2015; Morgan, 2019; Muckli, 2023). Based on our earlier findings with apparent motion stimulation, in which feedback signals suppress predictable sensory inputs (Alink et al., 2010), we hypothesised that a priming contextual scene would increase the response to subsequent unpredictable sensory information, while reducing or stabilising the response to consistent, expected sensory information.
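
To make the occlusion paradigm concrete, the following is a minimal sketch (not taken from the study's stimulus code) of how the lower right quadrant of a scene image can be replaced with a uniform mask, so that voxels retinotopically mapped to that quadrant receive no feedforward scene input. The file name, image format, and mid-gray mask value are illustrative assumptions.

# Minimal sketch of the partial-occlusion stimulus construction described
# above. Assumptions (not from the study): input file "scene.png", RGB
# images, and a mid-gray mask value of 128.
import numpy as np
from PIL import Image

def occlude_lower_right(image: np.ndarray, mask_value: int = 128) -> np.ndarray:
    """Return a copy of `image` with its lower-right quadrant set to a
    uniform value (mid-gray by default)."""
    occluded = image.copy()
    h, w = occluded.shape[:2]
    # Rows below the vertical midline, columns right of the horizontal midline.
    occluded[h // 2:, w // 2:] = mask_value
    return occluded

if __name__ == "__main__":
    scene = np.asarray(Image.open("scene.png").convert("RGB"))  # hypothetical file
    Image.fromarray(occlude_lower_right(scene)).save("scene_occluded.png")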