My research examines the cognitive and neural mechanisms by which the body shapes communication through hand gesture and sign language. A major focus of my work is how the relationship between gesture and speech differs in typical and atypical language development and processing. In vivo neuroscience techniques such as electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and functional near-infrared spectroscopy (fNIRS) have been instrumental in revealing the neural mechanisms of hand gesture and sign language.
In addition, eye tracking provides insight into the real-time processing of hand gesture and sign language. My work was the first to show that observing hand gestures conveying pitch contours helps speakers of non-tonal languages, such as English, learn lexical tones in tonal languages such as Mandarin. I'm currently using EEG and fNIRS to investigate the neural signatures of this effect.
This protocol provides a means of investigating gesture production in the presence of communication challenges. I've used it successfully to study gesture production in second language learners and in individuals with autism spectrum disorders in in-person settings, and I plan to extend it to additional populations and to virtual settings. My findings provide insight into how observing and producing hand gestures can facilitate language acquisition and processing.
Moreover, they're helping to reveal potential biomarkers both of language atypicalities and of treatment efficacy.