Cue-Driven Face Scanning in Typical and Atypical Development

Friday, May 18, 2012
Sheraton Hall (Sheraton Centre Toronto)
1:00 PM
R. Bedford1, M. Elsabbagh2, A. Senju2, T. Charman1, A. Pickles3, M. H. Johnson4 and the BASIS Team2, (1)Centre for Research in Autism and Education, Institute of Education, London, United Kingdom, (2)Centre for Brain and Cognitive Development, Birkbeck, London, United Kingdom, (3)Institute of Psychiatry, King's College London, London, United Kingdom, (4)Centre for Brain and Cognitive Development, Birkbeck, University of London, London, United Kingdom
Background: From immediately after birth, human infants preferentially attend to socially relevant stimuli such as faces. Atypical scanning of social scenes, i.e., reduced fixation on the eyes (e.g., Klin et al., 2002), may contribute to the subsequent development of the social communication problems that characterise individuals with an autism spectrum disorder (ASD). Further, Young et al. (2009) showed that individual differences in face scanning, with increased mouth relative to eye fixation, predicted subsequent expressive language.

Objectives:  The primary aim of this study was to investigate the origins and the developmental consequences of variability in face scanning, both in typical development and in the broader autism phenotype. Our participants were a longitudinal sample of infants, seen at 7 and 14 months, at high risk for autism (due to having an older sibling with a diagnosis) and low-risk controls. We aimed to establish whether any early differences in allocation of attention to a face in high-risk infants might contribute to subsequent outcomes at 36 months.

Methods:  Participants were 54 infants at high risk for ASD and 50 low-risk controls recruited through the British Autism Study of Infant Siblings (BASIS). We used Tobii eye tracking to record infants' looking behaviour. There were four trial types, each repeated once with a different actress. Each trial began with a 5-second period in which the face was still, followed by one of four dynamic sequences. In conditions 1-3, a single region moved while the rest of the face remained still: (1) the eyes shifted gaze towards or away from the infant; (2) the mouth displayed vowel articulation movements; (3) the hands moved up and down next to the face. In condition 4, the eyes, mouth, and hands all moved in a 'peekaboo' sequence. The measures calculated were the proportion of looking to the cued location in conditions 1-3, and an eyes-to-mouth looking index (henceforth EMI) in condition 4.

Results:  Structural equation modelling, controlling for non-verbal Mullen t-score, demonstrated that EMI at 7 and 14 months did not predict either risk group or subsequent clinical outcome at 36 months. However, an autoregressive cross-lagged model showed that EMI at 7 months did predict 36-month expressive language (EL), though not receptive language (RL). Finally, we found that a latent variable reflecting infants' looking behaviour during the simple, single-cue conditions was a strong predictor of EMI during the peekaboo condition, and indirectly predicted EL.

Conclusions:  Individual variation in attentional distribution reflects default biases, orienting to motion cues, and learning from prior experience. Looking behaviour during the simple cue conditions and the complex dynamic condition is strongly related, with the latter directly predicting subsequent expressive language. Scanning of social scenes is not atypical early in development in high-risk infants who subsequently develop ASD, but it is possible that atypical interactions with the social environment result in increasing behavioural differences over the course of development.

  * The BASIS Team: S. Baron-Cohen, P. Bolton, S. Chandler, J. Fernandes, H. Garwood, K. Hudry, L. Tucker, A. Volein.
