International Meeting for Autism Research (May 7 - 9, 2009): Rich Spontaneous, Social Engagement with a Dinosaur Robot

Rich Spontaneous, Social Engagement with a Dinosaur Robot

Friday, May 8, 2009
Boulevard (Chicago Hilton)
E. S. Kim , Computer Science, Yale University, New Haven, CT
D. Leyzberg , Computer Science, Yale University, New Haven, CT
E. Short , Computer Science, Yale University, New Haven, CT
R. Paul , Child Study Center, Yale University School of Medicine, New Haven, CT
B. Scassellati , Computer Science, Yale University, New Haven, CT
Background: Children with ASD exhibit atypical social behaviors, including reduced eye contact. They have also been shown to take a special interest in mechanical objects, including robots.

Objectives: We compare the social behaviors that children with ASD exhibit during interaction with a socially expressive Pleo robot to those they exhibit during interaction with an adult. Our long-term goal is to assess the feasibility of robots as therapeutic tools that promote conventional social behavior.

Methods: We compared social engagement in verbal interactions between four males with ASD and a socially expressive robot against engagement in structured interviews with a trained experimenter, using the Yale In-vivo Pragmatics Probe (YIPP). Participants included one nine-year-old, ten-year-old twins, and one 15-year-old.

The Pleo robot is a two-foot-long, commercially distributed toy dinosaur robot. Custom software, behaviors, and sounds give Pleo its social expressiveness. The robot's behavior is triggered by an experimenter with a hidden remote control, and its responses and vocalizations are designed to engage and sustain social interaction with the child.

Pleo walked across a four-foot-long mat illustrated with a jungle scene. For each participant, Pleo crossed four painted rivers; at each, Pleo stopped walking and exclaimed in surprise. Participants were instructed to help Pleo walk across the mat by talking in an encouraging tone of voice when Pleo expressed fear of the water. If the participant did not respond encouragingly, he received an increasingly directive sequence of cues suggesting that he help Pleo, ending with explicit instruction and, finally, modeling by the experimenter.

In the YIPP, participants spoke freely on any subject. They were presented with (1) a tacit opportunity to help the experimenter solve problems and (2) role-playing scenarios requiring pragmatically appropriate language and affective expression.

Pleo and YIPP interactions were video-recorded. An independent experimenter, who was blind to diagnosis, annotated social behaviors in the videos. Eight 30-second clips were sampled throughout each video to normalize for differing interaction durations. For each 30-second clip, eye contact and affective prosody were annotated.

Results: Pleo interactions lasted 5-13 minutes. YIPP interactions lasted 16-20 minutes.

For all participants, the maximum-duration episode of sustained eye contact in each 30-second clip was longer with Pleo (m = 19.9 s, sd = 8.2 s) than with the human YIPP interviewer (m = 2.0 s, sd = 1.4 s).

For one twin, the annotator observed a greater variety of affect types expressed in prosody (e.g., chiding, frustrated, soothing) in his speech to Pleo than in his speech to the YIPP interviewer (17 versus 10 total, over all clips). For the other twin, the average intensity of his affective prosody, rated on a scale of 0 (no affect) to 2 (strongly emotional), was greater in speech to Pleo (m = 1.8) than in speech to the YIPP interviewer (m = 1.2).

Conclusions: All four participants with ASD spontaneously exhibited longer eye contact with the Pleo robot than with a human. For some participants, we also observed greater variety and intensity of affective prosody during interaction with Pleo.