Visual facilitation of auditory encoding and the role of slow network activity

Seeing a speaker’s face enhances speech intelligibility in adverse listening environments. Many previous studies suggest that slow rhythmic network activity plays a role in this process. Here I first review studies demonstrating a direct link between local rhythmic activity in auditory cortex and the excitability of individual neurons. Furthermore, I explore recent evidence demonstrating a direct link between the state of pre-stimulus activity and subjects’ perceptual performance. I then review two recent MEG studies investigating the network mechanisms underlying the visual facilitation of speech perception. In one study we quantified local speech representations and directed connectivity in MEG data obtained while human participants listened to speech of varying acoustic signal-to-noise ratio (SNR) and visual context. We found that at high acoustic SNR, speech encoding by entrained brain activity was strong in temporal and inferior frontal cortex, whereas at low SNR strong entrainment emerged in premotor and superior frontal cortex. These changes in local encoding were accompanied by changes in directed connectivity along the ventral stream and the auditory-premotor axis. Importantly, the behavioural benefit arising from seeing the speaker’s face was predicted not by changes in local encoding but by enhanced functional connectivity between temporal and inferior frontal cortex. These results demonstrate a role for auditory-motor interactions in visual speech representations and suggest that functional connectivity along the ventral pathway facilitates speech comprehension in multisensory environments.
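As a concrete illustration of how speech entrainment of this kind can be quantified, the sketch below computes spectral coherence between a speech envelope and a neural time series. This is not the study's actual analysis pipeline: the synthetic signals, the 200 Hz sampling rate, and the 2-8 Hz band are assumptions chosen purely for illustration.

```python
# Hedged sketch: speech-brain coherence as a proxy for entrainment.
# All signals here are synthetic stand-ins for a speech envelope and
# an MEG channel; only numpy and scipy are required.
import numpy as np
from scipy.signal import coherence

fs = 200.0                          # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)        # 60 s of data
rng = np.random.default_rng(0)

# "Speech envelope": slow (~4 Hz) amplitude fluctuations plus noise.
envelope = np.abs(np.sin(2 * np.pi * 4 * t)) + 0.3 * rng.standard_normal(t.size)

# "MEG channel": partially entrained to the envelope, plus neural noise.
meg = 0.5 * envelope + rng.standard_normal(t.size)

# Magnitude-squared coherence, averaged over the 2-8 Hz delta/theta
# range commonly associated with speech tracking.
f, cxy = coherence(envelope, meg, fs=fs, nperseg=int(4 * fs))
band = (f >= 2) & (f <= 8)
print(f"mean 2-8 Hz speech-brain coherence: {cxy[band].mean():.3f}")
```

Directed connectivity can likewise be illustrated with a minimal time-domain Granger-causality test: signal x is said to influence signal y if x's past improves the prediction of y beyond y's own past. The study used its own connectivity measures; the following is only a generic sketch of that logic.

```python
# Hedged sketch: bivariate Granger causality via two AR-model fits.
import numpy as np

def granger_f(x, y, order=5):
    """F-statistic for the hypothesis that x Granger-causes y."""
    n = len(y) - order
    Y = y[order:]
    # Restricted model: predict Y from its own past only.
    lags_y = np.column_stack([y[order - k:-k] for k in range(1, order + 1)])
    # Full model: additionally include the past of x.
    lags_x = np.column_stack([x[order - k:-k] for k in range(1, order + 1)])

    def rss(X):
        X = np.column_stack([np.ones(len(Y)), X])   # add intercept
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
        return resid @ resid

    rss_r, rss_f = rss(lags_y), rss(np.hstack([lags_y, lags_x]))
    df1, df2 = order, n - 2 * order - 1
    return ((rss_r - rss_f) / df1) / (rss_f / df2)

# Toy check: b is driven by the past of a, so a->b should dominate.
rng = np.random.default_rng(1)
a = rng.standard_normal(2000)
b = np.zeros(2000)
for i in range(2, 2000):
    b[i] = 0.5 * b[i - 1] + 0.4 * a[i - 2] + 0.1 * rng.standard_normal()
print(f"F(a->b) = {granger_f(a, b):.1f}, F(b->a) = {granger_f(b, a):.1f}")
```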
