23031
Enhancing Joint Attention Skills in Children with Autism Spectrum Disorder through an Augmented Reality Technology-Based Intervention

Friday, May 13, 2016: 11:30 AM-1:30 PM
Hall A (Baltimore Convention Center)
P. Perez Fuster1, G. Herrera1, A. Ferrer2, M. Mademtzi3 and F. Shic3, (1)Autism Research Group, Robotics Institute, University of Valencia, Valencia, Spain, (2)Educational and Developmental Psychology, School of Psychology, University of Valencia, Valencia, Spain, (3)Yale Child Study Center, Yale University School of Medicine, New Haven, CT
Background: Joint attention (JA), defined as the ability to coordinate attention with an interactive social partner to share awareness of an object or event, has been identified as a key deficit in children with ASD (Mundy et al., 1986). In particular, children with ASD do not engage in responding to JA (RJA) behaviors, such as gaze following and pointing, in the same way as typically developing individuals (Mundy, 2003). Because these skills are crucial for child development and learning (Charman, 2003), behavioral interventions targeting them have been implemented (Kasari et al., 2006). Several studies have also used technology as a mediating tool to improve RJA skills, leading to significant overall improvements in some children (Cheng & Huang, 2012).

Objectives: The objective of this study was to enhance the RJA skills of gaze following and pointing in children with ASD through an intervention based on the use of Pictogram Room. This technology is a Kinect-based augmented reality (AR) system comprising multiple educational video games designed to enhance a variety of skills, including RJA, in children with ASD (Herrera et al., 2012).

Methods: Three children with ASD (two males and one female, aged 3, 5, and 8 years) participated in the study. The ADOS-2 (Lord et al., 2012) and the ESCS (Mundy et al., 2003) were used to assess participants’ RJA skills. A single-subject multiple-baseline design was used over twelve weeks. The intervention consisted of six 30-minute sessions: 15 minutes of intervention (participants used Pictogram Room to learn how to follow the gaze of a virtual dummy and touch the object of shared attention in an AR environment) and 15 minutes of assessment (the dependent variable was the number of times participants followed the gaze of a real dummy and pointed to the object of shared attention in the physical world). Maintenance was evaluated through follow-up assessments. A further assessment, in which the children had to respond to a person’s gaze, was used to evaluate generalization.

Results: The ADOS-2 and ESCS assessments confirmed the participants’ difficulties with RJA, making them eligible for the study. To measure the dependent variable, two blind raters independently coded the videos, reaching an inter-observer reliability above 90%. The Percentage of All Non-Overlapping Data (PAND) was 96%, and the Pearson Phi statistic showed an effect size of 0.92 (p < .01), indicating that the intervention was highly effective in improving the RJA skills of gaze following and pointing in all three children. Improvements were maintained after one month and generalized to following a person’s gaze.
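As a point of reference, the reported Phi value follows from PAND under the simplifying assumption (not stated in the abstract) of balanced phase lengths with overlapping points split evenly between phases: Phi ≈ 2 × PAND − 1 = 2 × 0.96 − 1 = 0.92, which matches the reported effect size.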

Conclusions: This study suggests that a novel AR technology-based intervention can be effective in improving RJA skills in children with ASD. Although the intervention was implemented with only three children, the findings are significant and promising. Future work should therefore evaluate the impact of this intervention on a larger sample, and a randomized controlled trial would help to establish its efficacy.