Today neuroscientists can manipulate what an animal sees by intervening on its visual neurons and can decode what people see from fMRI scans of their brains. Together with existing work on how our brains encode and process sensory signals, these advances suggest that our perceptual experience of the world is just a neural reconstruction in the head. My research questions this idea. I develop the view that some of our experiences go beyond neural representations in sensory cortex: they emerge out of our interaction with distal stimuli and explain how perception connects us to the world.
I take a hybrid approach on which perceptual experience has both relational and representational components. Neural activity does process sensory input to build a perceptual representation of the distal environment, and this representation becomes part of our perceptual experience, but it does not exhaust that experience. Our sensory systems also engage in activities besides representation, such as tracking distal stimuli in space and guiding our movement. These activities involve interactions with distal stimuli, which we thereby perceive. Our overall perceptual experience is thus constituted both by representational content encoded in sensory neural activity and by the distal objects with which we interact through our sensory systems. This view accommodates the empirical evidence for neural representationalism while explaining how we could consciously perceive things not explicitly represented by the brain, and it captures the kind of connection perception in fact provides to the world.