18119
Children-Robot Interaction: Eye Gaze Analysis of Children with Autism during Social Interactions

Friday, May 16, 2014
Meeting Room A601 & A602 (Marriott Marquis Atlanta)
S. Mavadati1, H. Feng1, S. Silver2, A. Gutierrez3 and M. H. Mahoor1, (1)Electrical and Computer Engineering, University of Denver, Denver, CO, (2)University of Denver, Denver, CO, (3)Psychology, Florida International University, Miami, FL
Background:

 Children with Autism Spectrum Disorder (ASD) demonstrate an atypical pattern of mutual eye gaze that is visible at early ages and may lead to other social deficits (e.g., delayed development of social cognition and affective construal processes). Current clinical work employs intensive behavioral and educational programs to teach individuals with ASD appropriate social skills in an effort to make them more successful in social situations. However, an empirical question remains regarding the effectiveness of these training approaches in teaching fundamental face-to-face communication skills, such as understanding and regulating facial expressions and visual attention in social environments. Recent studies reveal that children with ASD show greater engagement with robot-based interaction, which can effectively elicit positive behaviors (e.g., eye gaze attention). This suggests that interacting with robots may be a promising intervention approach for children with ASD.

Objectives:

 The main objective of this multidisciplinary research is to utilize humanoid robot technology, together with psychological and engineering sciences, to improve the social skills of children with High Functioning Autism (HFA). The designed intervention protocol focuses on different skillsets, such as eye gaze attention, joint attention, and facial expression recognition and imitation. The current study is designed to evaluate the eye gaze patterns of children with ASD during verbal communication with a humanoid robot.

Methods:

 Participants in this study are 14 male children ages 7-17 (M=11 years) diagnosed with ASD. The study employs NAO, an autonomous, programmable humanoid robot, to interact with children with ASD in a series of conversations and interactive games across 3 sessions. During different game segments, NAO and the children exchange stories and converse on different topics. During every session, five cameras recorded the entire interaction between the child and NAO. Videos were later scored to analyze the children's gaze patterns in two contexts, studying eye gaze fixation and eye gaze shifting while 1) NAO is speaking and 2) the child is speaking.

Results:

To analyze the eye gaze of participants, every video frame was manually coded as Gaze Averted ('0') or Gaze At ('1') with respect to NAO. To accurately analyze the gaze patterns of children during the conversation, the video segments of 'NAO speaking' and 'child speaking' were selected. The averages of four measures were employed to report the static and dynamic properties of the eye gaze patterns:

1)     'NAO speaking': Gaze At NAO (GAN) = 55.3%, Gaze Shifting (GS) = 3.4%, GAN/GS = 34.10, Entropy GS = 0.20

2)     'child speaking': GAN = 43.8%, GS = 4.2%, GAN/GS = 11.6, Entropy GS = 0.27

where 'Entropy GS' denotes the uncertainty of eye gaze shifting and lies in the range [0, 1].
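The measures above can be sketched computationally. The following is a minimal illustration, not the authors' actual scoring pipeline: it assumes GAN is the percentage of frames coded '1', GS is the percentage of frame-to-frame transitions where the code changes, and Entropy GS is the binary Shannon entropy of the shift/no-shift event (which falls in the stated [0, 1] range). The exact definitions used in the study are not given in the abstract.

```python
import math

def gaze_metrics(frames):
    """Compute gaze measures from per-frame codes (0 = Gaze Averted, 1 = Gaze At NAO).

    Assumed definitions (the abstract does not specify them):
    - GAN: percentage of frames coded Gaze At NAO.
    - GS: percentage of frame-to-frame transitions where the code changes.
    - Entropy GS: binary Shannon entropy of the shift probability.
    """
    n = len(frames)
    gan = 100.0 * sum(frames) / n
    # Count frame-to-frame gaze shifts (0->1 or 1->0).
    shifts = sum(1 for a, b in zip(frames, frames[1:]) if a != b)
    p = shifts / (n - 1)
    gs = 100.0 * p
    # Binary entropy of the shift/no-shift event; 0 by convention at p = 0 or 1.
    if p in (0.0, 1.0):
        entropy = 0.0
    else:
        entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return gan, gs, entropy

# Example on a toy coded sequence: 3 of 5 frames 'Gaze At', 2 of 4 transitions are shifts.
gan, gs, ent = gaze_metrics([1, 1, 0, 0, 1])
print(gan, gs, ent)  # 60.0 50.0 1.0
```

In practice such metrics would be averaged per child and per context ('NAO speaking' vs. 'child speaking'), as in the results above.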

Conclusions:  

The results indicate that children with ASD make more eye contact and shift their gaze less while NAO is speaking (higher GAN/GS and lower Entropy GS); however, they shift their gaze more often and fixate less on the robot while they themselves are speaking. These results will serve as an important basis for advancing the emerging field of robot-assisted therapy for children with ASD.