Dr Yifei He | Exploring gesture-speech interaction using multimodal neuroscientific methods: a translational perspective

Guest Lecture

  • Date: Nov 12, 2019
  • Time: 11:00 - 12:00
  • Speaker: Dr Yifei He
  • Translational Neuroimaging Lab, Department of Psychiatry and Psychotherapy, Philipps University Marburg, Germany
  • Location: MPI for Human Cognitive and Brain Sciences
  • Room: Wilhelm Wundt Room (A400)

Human daily communication is realized in a multisensory manner. Besides auditory speech, visual input such as hand gesture also plays an important role. Despite advances in neuroscientific investigations of language processing, only limited research has been conducted on how gesture interacts with language processing during daily communication. In this talk, I will present evidence from EEG, fMRI, and simultaneous EEG-fMRI studies using both controlled and naturalistic stimuli, showing the brain dynamics of how gesture and language interact with each other during comprehension, and how the two input channels are integrated into coherent semantic representations. In addition to basic research, I will present fMRI results from patients with schizophrenia, showing their modality- and category-specific neural impairments in gesture and language processing at the semantic level. Lastly, I will present findings from non-invasive brain stimulation (e.g., tDCS) showing how this method can be used to modulate or normalize gesture-speech interaction both in healthy populations and in patients with schizophrenia.