International Meeting for Autism Research (London, May 15-17, 2008)

Gesture and Speech Integration in High-Functioning Autism

Thursday, May 15, 2008
Champagne Terrace/Bordeaux (Novotel London West)
10:30 AM
L. B. Silverman, Strong Center for Developmental Disabilities, University of Rochester Medical Center, Rochester, NY
E. Campana, Department of Arts, Media, and Engineering, Arizona State University
L. Bennetto, Department of Clinical and Social Sciences in Psychology, University of Rochester
M. K. Tanenhaus, Department of Brain and Cognitive Sciences, University of Rochester
Background: Iconic gestures routinely accompany speech and provide vital communicative information to the listener. Individuals with autism show a constellation of social and communicative impairments, yet it is unknown whether difficulties with iconic gesture comprehension contribute to the core features of autism.

Objectives: The purpose of this study was to examine iconic gesture comprehension in autism, and to assess whether cross-modal processing difficulties may impede gesture and speech integration in this population.

Methods: Participants were 19 adolescents with high-functioning autism (mean age: 15.6 years) and 20 typically developing controls (mean age: 15.2 years), matched on age, gender, VIQ, and SES. Iconic gesture comprehension was assessed through quantitative analyses of eye fixations during a video-based task. Participants watched videos of a woman describing one of four shapes shown on a computer screen. Half of the videos depicted natural speech-and-gesture combinations, while the other half depicted speech-only descriptions (using comparable verbal information). Participants clicked on the shape that the speaker described. Since gesture typically precedes speech, we hypothesized that typically developing controls would visually fixate on the target shape earlier on speech-and-gesture trials compared to speech-only trials, indicating immediate integration of visual and auditory information across sensory modalities. We further hypothesized that participants with autism would not show this effect.

Results: Analyses of eye movements revealed that controls identified the target more quickly when iconic gestures accompanied speech. Conversely, individuals with autism showed slowed comprehension when gestures were present compared to when speech occurred alone. This effect was not accounted for by unimodal speech-only or gesture-only processing difficulties.

Conclusions: These findings suggest that individuals with autism have cross-modal processing difficulties that significantly hinder gesture and speech comprehension. The findings also implicate brain regions responsible for social cognition and biological motion perception.
