An fMRI Study of Audiovisual Speech Perception Reveals Multisensory Interactions in Auditory Cortex





Research on the neural basis of speech-reading implicates a network of auditory language regions involving inferior frontal cortex, premotor cortex, and sites along superior temporal cortex. In audiovisual speech studies, neural activity is consistently reported in the posterior superior temporal sulcus (pSTS), and this site has been implicated in multimodal integration. Traditionally, multisensory interactions are considered high-level processing that engages heteromodal association cortices such as STS. Recent work, however, challenges this notion and suggests that multisensory interactions may occur in low-level unimodal sensory cortices. While previous audiovisual speech studies demonstrate that high-level multisensory interactions occur in pSTS, it remains unclear how early in the processing hierarchy these multisensory interactions may occur. The goal of the present fMRI experiment is to investigate how visual speech can influence activity in auditory cortex above and beyond its response to auditory speech. In an audiovisual speech experiment, subjects were presented with auditory speech with and without congruent visual input. Holding the auditory stimulus constant across the experiment, we investigated how the addition of visual speech influences activity in auditory cortex. We demonstrate that congruent visual speech increases activity in auditory cortex.



Authors: Kayoko Okada, Jonathan H. Venezia, William Matchin, Kourosh Saberi, Gregory Hickok

Source: http://plos.srce.hr/






