Children with Autism Are Hypo-Responsive to Human Eyes Presented without Sound, but Hyper-Responsive to Eyes Presented after Social Sounds

Thursday, May 12, 2016: 11:30 AM-1:30 PM
Hall A (Baltimore Convention Center)
J. L. Kleberg1, E. Thorup2 and T. Falck-Ytter3, (1) Box 1225, Uppsala University, Uppsala, Sweden, (2) Uppsala universitet, Hägersten, Sweden, (3) Uppsala University, Uppsala, Sweden
Background: Children with autism spectrum disorder (ASD) often fail to make use of human gaze to learn about the social world. A subcortical network involving the superior colliculus, amygdala and putamen underlies rapid orienting to human faces. Influential theories have suggested that altered functioning of this network is an important part of the ASD phenotype. Recent studies have questioned these theories by demonstrating intact orienting to face-like configurations in people with ASD.

Objectives: Our study was based on the prediction that visual orienting to human eyes, rather than to faces as a whole, is reduced in ASD. We therefore hypothesized that children with ASD would be slower than typically developing (TD) children to orient to isolated eyes. This prediction was motivated by single-cell recordings in monkeys and electrophysiological data from humans demonstrating that the neural mechanisms underlying responses to eyes and to whole faces are partly dissociable. In addition, we examined the effect of spatially nonpredictive social and nonsocial sounds on visual orienting in children with and without ASD.

Methods: Seventeen children with ASD (mean age: 6.5 years) and a typically developing (TD) control group matched for age and nonverbal IQ participated in the experiment. We presented images of human eyes with direct gaze together with three nonsocial objects. Trials could be preceded by a social sound (a single phoneme spoken by a human voice), by a common nonsocial sound, or by no sound. The eye stimuli were the eye regions of fearful faces; fearful eyes were chosen to maximize the likelihood of rapid orienting of attention, based on previous research. Saccadic latencies were measured with a corneal reflection eye tracker at a sampling rate of 60 Hz.
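As a rough illustration of how saccadic latency can be derived from 60 Hz corneal reflection gaze samples, the sketch below applies a simple velocity threshold to successive gaze positions. The threshold value, the data layout, and the function name are assumptions made for illustration; they are not taken from the study's actual processing pipeline.

```python
import numpy as np

SAMPLE_RATE_HZ = 60          # sampling rate of the eye tracker described above
VELOCITY_THRESHOLD = 30.0    # deg/s; hypothetical saccade-detection threshold

def saccade_latency_ms(gaze_deg, stimulus_onset_idx):
    """Latency (ms) from stimulus onset to the first gaze sample whose
    point-to-point velocity exceeds the threshold; None if no saccade is found.

    gaze_deg: (n_samples, 2) array of gaze positions in degrees of visual angle.
    stimulus_onset_idx: index of the sample at which the stimulus appeared.
    """
    dt = 1.0 / SAMPLE_RATE_HZ
    # velocity between consecutive samples, in deg/s
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) / dt
    for i in range(stimulus_onset_idx, len(velocity)):
        if velocity[i] > VELOCITY_THRESHOLD:
            return (i - stimulus_onset_idx) * dt * 1000.0
    return None
```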

Results: We found a strong interaction effect between group and sound type (p < .001), indicating that the presence of sound had a different impact on saccadic latency in the two groups. As expected, the ASD group was slower to orient to the eyes than the TD group on silent trials (p = .01). In contrast, children with ASD were faster to orient to the eyes than the TD group after social sounds (p < .001). No group difference was found after nonsocial sounds. A higher degree of ASD symptoms, as measured with the ADOS-2, predicted faster orienting to the eyes after social sounds (r = .61; p < .05).
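For readers who wish to run this kind of analysis on their own data, the sketch below fits a 2 (group) x 3 (sound) mixed-design ANOVA and a symptom-severity correlation. The column names, the use of the pingouin library, and the restriction of the correlation to the ASD group are assumptions for illustration; the abstract does not specify the authors' statistical software or exact model.

```python
import pandas as pd
import pingouin as pg
from scipy.stats import pearsonr

def analyze_latencies(trials: pd.DataFrame, ados: pd.DataFrame):
    """trials: one row per participant x sound condition, with columns
    'id', 'group' ('ASD'/'TD'), 'sound' ('silent'/'social'/'nonsocial'),
    and 'latency_ms'. ados: columns 'id' and 'ados2' (ADOS-2 severity)."""
    # 2 (group) x 3 (sound) mixed ANOVA; the Interaction row corresponds to
    # the group-by-sound-type effect reported above
    aov = pg.mixed_anova(data=trials, dv='latency_ms', within='sound',
                         subject='id', between='group')
    # correlation of ADOS-2 severity with latency after social sounds
    # (ASD group only); the sign depends on whether latency or speed is used
    asd_social = trials[(trials.group == 'ASD') & (trials.sound == 'social')]
    merged = asd_social.merge(ados, on='id')
    r, p = pearsonr(merged['ados2'], merged['latency_ms'])
    return aov, (r, p)
```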

Conclusions: Our results provide clear evidence of reduced social orienting in ASD, but suggest that brain mechanisms underlying responses to eyes, rather than to whole faces, are the source of this effect. The strong modulation by non-visual information supports the view that ASD is characterized by network-wide atypicalities in information integration. Our results also show that the multisensory environment is highly important for early social attention in children with ASD, with implications for our understanding of early attentional impairments in the disorder.