Our sensory world contains statistical regularities, and we use those regularities to shape what we perceive. In this talk, I will present psychophysical, modelling and neuroimaging (EEG and 7T MRI) work from the lab that considers how this is achieved. The proposed mechanisms involve using predictions about what is likely to be there to render perception (1) accurate in the face of continuous streams of noisy information, and (2) optimised for future learning. How humans achieve optimisation on both fronts is a tricky puzzle to solve, and I will present the possibilities we are examining.
I started my academic life in 2003, as a Research Assistant at the Institute of Cognitive Neuroscience, UCL, with Patrick Haggard, examining visual-tactile integration mechanisms. I subsequently undertook my PhD with Cecilia Heyes, asking about the role of domain-general associative learning processes in visual-motor integration. My doctoral work was followed by three fellowships: the first based again with Celia, the second with Martin Eimer, and the final with James Kilner and Karl Friston. Across these, I asked a variety of questions about sensorimotor interactions, usually concerning their role in social processes. I took up my first faculty position at the University of Reading in 2010, before moving to Birkbeck in 2012. In 2024 I moved to UCL, where I am a PI at both the Department of Experimental Psychology and the Wellcome Centre for Human Neuroimaging. I am head of the Action and Perception Lab, where we ask how action and perception mechanisms operate together to allow us to control our actions and perceive our world.