International Meeting for Autism Research (May 7-9, 2009)

Toward Designing Interactive Technologies for Supporting Research in Autism Spectrum Disorders

Friday, May 8, 2009
Boulevard (Chicago Hilton)
D. Feil-Seifer, Computer Science, University of Southern California, Los Angeles, CA
M. P. Black, Electrical Engineering, University of Southern California, Los Angeles, CA
M. J. Mataric, Computer Science, University of Southern California, Los Angeles, CA
S. Narayanan, Electrical Engineering, University of Southern California, Los Angeles, CA
Background:

Mounting evidence suggests that children with autism spectrum disorders (ASD) tend to increase their levels of social behavior when interacting with a robot. There are many possible explanations for the effects that a robot has on children with ASD. Many individuals with ASD have shown the ability to display higher-level behaviors in structured social settings. Since children with ASD have difficulty with self-initiation of social behavior, and since such initiation is important for social skill development, a robot that encourages the initiation of social behavior could be valuable in both research and intervention.

Objectives:

The aim of this work is to better define the link between interactive robots and increased social activity in children with ASD. Specifically, we wish to determine whether such a link exists and, if so, which elements of a robot's form and function (contingency, anthropomorphism, embodiment, etc.) correlate with changes in social behavior. We also aim to detect and interpret the child's interactions, especially social behaviors relevant to ASD (body position, head direction, gestures, vocal prosody, etc.). Finally, we are designing a robot system that can act appropriately in a social setting and that could be used to augment established diagnostic and therapeutic regimens for children with ASD.

Methods:

To accomplish these objectives, we are designing an interactive, robot-based experimental environment. The robot consists of a mobile base and a humanoid upper torso with actuated arms, neck, and face. The robot can communicate through movement and through speakers that play synthetic and pre-recorded speech sounds. It can also be equipped with relevant toys, such as a bubble blower. We have created variations on this design that isolate contingent behavior from random behavior, an anthropomorphic appearance from a more mechanical one, and physical embodiment from non-embodiment (e.g., a virtual agent on a screen). Both the robot and the environment are instrumented with cameras and microphones to capture the interactions from multiple perspectives.
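As an illustration of how these design dimensions can be organized for experimental control, the sketch below enumerates the robot configurations as a simple data structure. This is a minimal, hypothetical example; the class and condition names are assumptions for illustration and do not describe the actual implementation.

```python
from dataclasses import dataclass
from enum import Enum
from itertools import product

class Behavior(Enum):
    CONTINGENT = "contingent"   # robot responds to the child's actions
    RANDOM = "random"           # robot acts on a random schedule

class Appearance(Enum):
    ANTHROPOMORPHIC = "anthropomorphic"
    MECHANICAL = "mechanical"

class Embodiment(Enum):
    PHYSICAL = "physical_robot"
    VIRTUAL = "virtual_agent"   # same agent rendered on a screen

@dataclass(frozen=True)
class RobotCondition:
    """One experimental configuration of the robot."""
    behavior: Behavior
    appearance: Appearance
    embodiment: Embodiment

# Enumerate all combinations for counterbalancing across sessions.
ALL_CONDITIONS = [RobotCondition(b, a, e)
                  for b, a, e in product(Behavior, Appearance, Embodiment)]

if __name__ == "__main__":
    for condition in ALL_CONDITIONS:
        print(condition)
```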

We are planning single- and multi-session within-subject experiments to test the effectiveness of the various robot configurations. Because we wish to observe changes in both human-human and human-robot interaction, a parent will always be in the room; in some cases, a clinical psychologist may also be present. We will evaluate the interactions using established human-rated behavior metrics. We are also developing methods for autonomously detecting and quantifying the children's social behavior from multi-modal audio-video signals. This automatic sensing and interpretation could be used to measure interaction quality online and could become a valuable tool for psychologists and therapists.
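To make the automatic-sensing idea concrete, here is a minimal sketch of how per-interval audio and video cues might be fused into a rough interaction-quality score. The feature names, weights, and distance threshold are illustrative assumptions, not the measures or algorithm used in our system.

```python
import numpy as np

def interaction_score(vocalization_ratio: np.ndarray,
                      gaze_toward_robot: np.ndarray,
                      approach_distance_m: np.ndarray,
                      weights=(0.4, 0.4, 0.2),
                      max_distance_m=3.0) -> np.ndarray:
    """Fuse per-interval multimodal cues into a hypothetical engagement score.

    Each argument is a 1-D array with one value per fixed time interval:
      vocalization_ratio  -- fraction of the interval the child vocalized (0..1)
      gaze_toward_robot   -- fraction of the interval gaze was on the robot (0..1)
      approach_distance_m -- mean child-robot distance in meters
    Returns a score in [0, 1] per interval (higher = more engagement).
    """
    # Closer approach maps to a higher proximity cue, clipped to [0, 1].
    proximity = np.clip(1.0 - approach_distance_m / max_distance_m, 0.0, 1.0)
    w_voc, w_gaze, w_prox = weights
    return (w_voc * np.clip(vocalization_ratio, 0, 1)
            + w_gaze * np.clip(gaze_toward_robot, 0, 1)
            + w_prox * proximity)

# Example: three 30-second intervals of made-up sensor output.
score = interaction_score(np.array([0.1, 0.3, 0.5]),
                          np.array([0.2, 0.6, 0.8]),
                          np.array([2.5, 1.5, 0.8]))
print(score.round(2))
```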

Results:

Our experiments are currently in progress, so we will report results in the future.

Conclusions:

At the workshop, we plan to share our experiences with the iterative design of these systems and the insights and findings from our ongoing experiments with children with ASD.