The purpose of this project is to investigate across-channel processing in human audition. Across-channel processing denotes processing that integrates information across different frequency channels, between the two ears, and across phonemic boundaries. In this project we aim to elucidate the brain mechanisms underlying such processing.
Experiments on gap detection and speech identification are conducted at the behavioural level and combined with electrophysiological measurements of brain activity, such as the auditory brainstem response (ABR) and magnetoencephalography (MEG). Furthermore, we extend the notion of across-channel processing to the perception of geminate consonants as well as to visual perception. Modelling and theory are an integral part of the project, providing a means to probe the underlying mechanisms from the periphery to behaviour.
This project will provide a novel approach to studying human perception of time, space, and speech by investigating their common mechanisms, and will offer a new perspective on how across-channel processing also operates in human vision.