Socially Contextualized Multisensory Integration in Autism

Friday, May 18, 2012
Sheraton Hall (Sheraton Centre Toronto)
2:00 PM
J. I. Borjon1, S. V. Shepherd2, A. Trubanova1, W. Jones1, A. Klin1 and A. A. Ghazanfar2, (1)Marcus Autism Center, Children's Healthcare of Atlanta & Emory School of Medicine, Atlanta, GA, (2)Neuroscience Institute, Princeton University, Princeton, NJ
Background: The human brain is both pervasively social and deeply multisensory. Remarkably, perceived social cues can alter sensory perception. If a burst of noise is presented after a face exhibiting averted gaze, typically developing (TD) adults will systematically, and erroneously, perceive a shift in the sound’s location. This perceptual shift is in accordance with the direction of perceived gaze: rightward eyes bias listeners to perceive sounds as if from the right, while leftward eyes bias them to the left. This shift also occurs with images of arrows. Distinct neural pathways for the initial visual processing of gaze and arrows have been demonstrated in human lesion case studies, yet fMRI studies have demonstrated only a subtle differentiation between the orienting networks activated by gaze and arrows. Thus, the extent to which gaze and arrow cues tap into purely “social” mechanisms versus more generic “attentional” mechanisms is still unknown. Individuals with autism (ASD) exhibit impaired gaze following, but intact following of non-social directional cues.

Objectives: A psychophysical paradigm was used to examine the extent to which perceived gaze and arrow cues influence sound localization in individuals with autism and matched controls.

Methods: Fifteen individuals with autism and fifteen matched controls volunteered for the study. The paradigm was a variant of the Posner attention-cuing psychophysics paradigm. Participants were presented with a visual cue: a face with neutral affect gazing 30° to the right or left; a double-headed arrow pointing to the right or left; or a centered fixation cross. A brief, directionally tuned broadband noise was delivered via headphones 300 ms after the visual cue. Participants were instructed to gaze continuously towards the screen and indicate by button press the sound’s origin: right or left. The visual cue then disappeared and the next trial began. Reaction time and performance data were collected. Participants were monitored via video to ensure task completion.
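
To make the trial sequence concrete, the following is a minimal sketch, in Python, of how one cue-then-sound trial in this kind of cuing paradigm could be structured. The cue labels, stub functions (present_cue, play_noise, get_keypress), and trial counts are illustrative placeholders, not the study's actual stimulus-delivery code.

import random
import time

CUES = ["gaze_left", "gaze_right", "arrow_left", "arrow_right", "fixation"]
SOUND_SIDES = ["left", "right"]
SOA_S = 0.300  # 300 ms between visual cue onset and the noise burst

# --- placeholder I/O: stand-ins for the real stimulus/response hardware ---
def present_cue(cue):
    pass  # e.g., draw the face, arrow, or fixation cross on screen

def play_noise(side):
    pass  # e.g., play a directionally tuned broadband burst over headphones

def get_keypress():
    return random.choice(["left", "right"])  # simulated button press

def build_trials(n_per_condition=10):
    """Fully crossed, shuffled list of (cue, sound side) pairs."""
    trials = [(c, s) for c in CUES for s in SOUND_SIDES] * n_per_condition
    random.shuffle(trials)
    return trials

def run_trial(cue, sound_side):
    """One trial: show the cue, wait the SOA, play the noise, collect a response."""
    present_cue(cue)
    time.sleep(SOA_S)
    play_noise(sound_side)
    t0 = time.perf_counter()
    response = get_keypress()                 # "left" or "right"
    rt = time.perf_counter() - t0
    congruent = cue.endswith(sound_side)      # cue direction matches sound side
    return {"cue": cue, "sound": sound_side, "response": response,
            "rt": rt, "congruent": congruent}

if __name__ == "__main__":
    data = [run_trial(cue, side) for cue, side in build_trials()]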

Results: Preliminary data suggest TD participants exhibit a perceptual shift induced by gaze cues while ASD participants do not. When presented in the same paradigm as gaze cues, arrows exerted no significant bias on sound localization for either ASD or TD participants. Reaction time data for both ASD and TD participants were significantly influenced by the difficulty of identifying the sound’s location regardless of the presented visual cue. Further, in both the ASD and TD groups, arrow cues produced a significant congruency effect, with participants responding more quickly to congruent than to incongruent cue-sound pairings.
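
A short sketch of how such a congruency effect might be summarized from per-trial records like those produced by the trial sketch above; the field names are assumptions carried over from that sketch, not the authors' analysis code.

from statistics import mean

def congruency_summary(trials):
    """Mean reaction time for congruent vs. incongruent cue-sound pairings.
    Expects dicts with 'cue', 'rt', and 'congruent' keys (see the trial
    sketch above); fixation trials carry no direction and are excluded."""
    directional = [t for t in trials if t["cue"] != "fixation"]
    congruent_rts = [t["rt"] for t in directional if t["congruent"]]
    incongruent_rts = [t["rt"] for t in directional if not t["congruent"]]
    return {"congruent_mean_rt": mean(congruent_rts),
            "incongruent_mean_rt": mean(incongruent_rts)}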

Conclusions: Prior research has shown that arrow and gaze cues each induce a perceptual shift in sound localization when presented in separate paradigms. Within the same paradigm, gaze cues exerted a significant shift in perception in controls while arrows did not. For ASD participants, neither face nor arrow cues induced a significant shift in perception. These results indicate TD controls are uniquely sensitive to social cues, integrating them into a unified percept, while ASD participants do not exhibit a response consistent with a perceptual shift.
