Socially Animated Machine (SAM): A Novel Design for Robotic Research in Children with Autism Spectrum Disorders

Friday, May 15, 2015: 10:00 AM-1:30 PM
Imperial Ballroom (Grand America Hotel)
S. A. Koch1, C. D. Clesi1, J. B. Lebersfeld1, C. E. Stevens1, A. G. Parker1, M. E. McNew1, M. I. Hopkins2, F. R. Amthor2 and F. J. Biasini2, (1)University of Alabama at Birmingham, Birmingham, AL, (2)Psychology, University of Alabama at Birmingham, Birmingham, AL
Background:  

Children with Autism Spectrum Disorder (ASD) display significant impairments in social communication that can have serious implications for long-term social and emotional functioning. Recent research suggests the value of using robots to improve social communication. Researchers investigating robots as tools for therapy in ASD have reported increased engagement and novel social behaviors (i.e., turn-taking, imitation, joint attention) when robots are part of the social interaction. Although a wide variety of robotic designs have been employed in previous studies, research shows that children with ASD respond more positively to cartoonish robots than to robots with a humanoid form. However, because these animal-like robots typically offer only a limited range of facial expressions, their expressions may not translate well to the human face, and skills learned within the session may therefore fail to generalize to real-world social situations.

Objectives:  

The overall objective of this study is to design a novel social robot, or Socially Animated Machine (SAM), with a unique mix of anthropomorphic and non-humanoid features. The aims of this study are to 1) conduct a pilot study to explore whether SAM is capable of forming complex facial expressions similar to those observed in the human face, and 2) conduct a usability study to examine the acceptability of SAM.

Methods:  

Approximately 100 typically developing children, ages 7 to 12 years, will participate in the pilot study. These children will be asked to label and match photos of SAM expressing various emotions to schematic drawings and photos of human faces displaying these emotions. Approximately 20 children with ASD, ages 7 to 12 years, will participate in the usability study. These children will take part in a fifteen-minute social interaction with SAM. During the interaction, level of engagement will be measured from eye-gaze patterns using faceLAB™ eye-tracking technology. Self-reported levels of enjoyment will also be measured using ten-point Likert-type scale items.

Results:  

Preliminary results of the pilot study (N=9) suggest that typically developing children are able to label SAM’s emotions with 67% accuracy and match SAM’s emotions to schematic drawings and human expressions with 69% and 54% accuracy, respectively. Preliminary results of the usability study (N=8) suggest that children with ASD are able to maintain gaze on SAM’s face during the social interaction. Children who completed the robot-based interaction reported feeling very happy (M=9.57, SD=1.13) and comfortable (M=10, SD=0) while talking with SAM. They were also very eager to have an additional interaction with SAM (M=9.86, SD=0.38).

Conclusions:  

Preliminary findings suggest that SAM is capable of forming facial expressions that can be labeled and matched with adequate accuracy. The degree of accuracy is expected to increase with a larger sample size. Additionally, results indicate that children with ASD are engaged by and enjoy interacting with SAM. Overall, these findings suggest that SAM’s robotic design helps fill a gap in current research by offering an animal-like, approachable appearance while still retaining some subtle details of the human face. Additional data collection is expected to clarify and strengthen these results.