Robot-Led Intervention for Improving Emotional Cognition in Children with ASD

Friday, May 13, 2016: 10:00 AM-1:30 PM
Hall A (Baltimore Convention Center)
S. A. Koch1, J. B. Lebersfeld1, C. D. Clesi1, C. E. Stevens1, M. E. McNew1,2, A. G. Parker1, F. J. Biasini1 and M. I. Hopkins1, (1)Psychology, University of Alabama at Birmingham, Birmingham, AL, (2)Psychology, Florida International University, Miami, FL

Mounting research supports the use of technology, including social robotics, to improve social communication in individuals with Autism Spectrum Disorder (ASD). Robot-assisted therapy has been shown to increase engagement and promote novel social behaviors (e.g., turn-taking, imitation, joint attention) during human-led sessions. However, a fully robot-led intervention would add value by being easier to replicate and to access. SAM (Socially Animated Machine) was designed to independently lead a social skills intervention aimed at improving emotional cognition in children with ASD.


The aims of this study were to examine whether the robot-led intervention 1) improves task-specific emotion recognition skills, 2) improves generalized social perception skills, and 3) creates an enjoyable and engaging environment.


Thirteen children with ASD and average cognitive skills (ages 5-11) completed this study. Participants were randomly assigned to intervention (n=7) and control (n=6) groups. Participants in the intervention group completed eight sessions with SAM involving several games designed to teach children to identify emotions using pictures, drawings, and social scenarios. Participants in the control group completed pre- and post-intervention sessions with SAM, but were otherwise assigned to a waiting list. Social perception skills, including emotion-matching accuracy and scores on the NEPSY-II Affect Recognition and Theory of Mind subtests, were compared across groups at pre- and post-intervention. Self-reported levels of enjoyment while interacting with SAM were also obtained.


Analyses were performed using a series of univariate ANCOVAs with adjustment for pre-intervention scores. Although participants in the intervention group improved in their percent accuracy for matching SAM’s emotions from pre- to post-intervention (M=80.95 to M=93.45) and control participants did not (M=79.17 to M=81.94), the difference between groups was not significant, F(1,10) = 3.106, p = .108. On the NEPSY-II, there were no significant differences in post-intervention scores between the groups on Affect Recognition, F(1,10) = .826, p = .385, or Theory of Mind, F(1,10) = 4.171, p = .068. Descriptive statistics were used to examine enjoyment ratings at post-intervention. All participants, regardless of group placement, reported feeling very happy (M=9.31, SD=1.18) and comfortable (M=9.09, SD=1.80) while talking with SAM, and were eager to have additional interactions (M=8.23, SD=2.86).


Preliminary findings suggest that children who completed the robot-led intervention improved in their ability to match SAM’s emotional expressions; we expect this trend to reach significance with a larger sample size (estimated N=20). Children in the intervention group did not show improved performance on a measure of social perception compared to the control group. Notably, many of the participants achieved scores at or above the average range on this measure at pre-intervention, leaving less room for improvement following intervention (a possible ceiling effect). Overall, participants enjoyed working with SAM and requested additional interactions. These findings suggest that SAM’s robotic design fits the needs of this population and that the intervention shows promise for teaching task-specific emotion recognition skills. Future data will be collected to assess the efficacy of the intervention and its effect on generalized social perception skills in a group of children with lower cognitive and emotional functioning.