Research Focus
The Visual and Cognitive Neuroscience Lab investigates how information from audition, emotion, action and cognition influences the brain processes that create our visual perception of the world.
Human visual cortex is often thought of as mainly processing information arriving from the eyes. However, research from our lab and others has shown that visual cortex also receives and represents information from the rest of the brain. The aim of our research is to find out how visual processing is influenced by non-visual information and how this affects our visual perception. We focus on auditory, emotional, action-related, predictive and cognitive influences on vision (see Research Topics). In addition, we investigate whether these influences on visual processing occur on a conscious or unconscious level, and in the presence and absence of sight, e.g. in congenital blindness.
The lab is currently funded by a PRIMA grant from the Swiss National Science Foundation (CHF 1.5 million, 5 years; Principal Investigator: Prof. Petra Vetter), a research grant from the BIAL Foundation (EUR 40,000, 3 years; PI: Prof. Petra Vetter, Co-PI: Prof. Carolyn McGettigan, UCL) and, from 2024, an SNSF Consolidator Grant (CHF 1.75 million, 5 years; PI: Prof. Petra Vetter).
Petra Vetter co-hosts an international online seminar series on multisensory perception and plasticity, together with Prof. Stephanie Badde at Tufts University. Feel free to join our seminars here: https://www.world-wide.org/Neuro/Multisensory-Perception-and-Plasticity/
Methods
We employ a range of methods from cognitive neuroscience: fMRI (including multivariate pattern analysis and retinotopic mapping), eye-tracking, psychophysics (continuous flash suppression, binocular rivalry, dual-task paradigms), psychophysiology (fear conditioning, skin conductance responses), and transcranial magnetic stimulation (TMS).
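To give a flavour of one of these methods, multivariate pattern analysis (MVPA) asks whether a classifier can decode an experimental condition (e.g. which sound was heard) from the pattern of responses across voxels. The sketch below is a minimal, self-contained illustration using synthetic data and a cross-validated linear support vector machine; it is not the lab's actual analysis pipeline, and all variable names and parameters are illustrative assumptions.

```python
# Minimal MVPA-style decoding sketch on synthetic "voxel" data.
# In a real fMRI analysis, X would contain per-trial response
# estimates (e.g. betas) for each voxel in a region of interest.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 50
y = np.repeat([0, 1], n_trials // 2)      # two stimulus categories
X = rng.normal(size=(n_trials, n_voxels)) # baseline noise
X[y == 1, :10] += 0.8                     # weak category signal in 10 voxels

# Standardize features, then classify with a linear SVM,
# evaluated with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")  # chance level is 0.5
```

If the classifier performs reliably above chance, the voxel patterns carry information about the stimulus category, which is the logic behind decoding sounds or reach directions from "visual" cortex.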
Key Publications
Hu, J., & Vetter, P. (2024). How the eyes respond to sounds. Annals of the New York Academy of Sciences, 1532(1), 18–36.
Bola, Ł.*, Vetter, P.*, Wenger, M. & Amedi, A. (2023). Decoding reach direction in early “visual” cortex of congenitally blind individuals. Journal of Neuroscience, 43(46), 7868–7878.
Vetter, P.*, Bola, Ł.*, Reich, L., Bennett, M., Muckli, L., & Amedi, A. (2020). Decoding natural sounds in early “visual” cortex of congenitally blind individuals. Current Biology, 30(15), 3039–3044.e2.
Vetter, P., Smith, F. W., & Muckli, L. (2014). Decoding sound and imagery content in early visual cortex. Current Biology, 24(11), 1256–1262.
Vetter, P.*, Badde, S.*, Phelps, E. A., & Carrasco, M. (2019). Emotional faces guide the eyes in the absence of awareness. eLife, 8, e43467.
Vetter, P., Grosbras, M.-H., & Muckli, L. (2015). TMS over V5 disrupts motion prediction. Cerebral Cortex, 25(4), 1052–1059.
Vetter, P., & Newen, A. (2014). Varieties of cognitive penetration in visual perception. Consciousness & Cognition, 27C, 62–75.