International Meeting for Autism Research (May 7 - 9, 2009)

Inferences on Cognition in Nonverbal Children Via Real-Time Analysis of Eye Gaze

Friday, May 8, 2009
Boulevard (Chicago Hilton)
J. Munson, Autism Center, University of Washington, Seattle, WA
Background:  

The assessment of cognition and language in children with autism who have little spontaneous communication presents clinicians, educators, and researchers with numerous challenges, owing both to the nature of the impairments in autism and to characteristics of the assessment situation itself.  Most standardized assessments involve an examiner, whom the child does not know, attempting to elicit responses from the child while referencing various test materials.  For this common methodology to yield meaningful results, the child must sustain an extended bout of successful responsive joint attention.  Without that foundation, these tests are simply unable to provide much meaningful information, often resulting in the commonly observed “floor score.”

Objectives:

As an example, in recent research at the University of Washington Autism Center, 483 cognitive assessments of preschoolers with autism spectrum disorders were conducted using the Mullen Scales of Early Learning.  Thirty percent of these children obtained a composite score at the floor of the scale, while 63% scored at the floor on at least one subscale.  Given the design requirements of many research protocols, children who fail to reach some minimum threshold are often excluded from participation.  Thus, we have few tools that provide insight into why these children struggle with these tasks.  In addition, traditional examiner-driven assessments provide little information about the kinds of information-processing tasks these children spontaneously solve when interacting with their environment.  As a field, we need to capitalize on innovative uses of technology to provide child-driven experiences that aid our understanding of the cognition of this understudied group of children.

Methods:

This project uses eye tracking and real-time 3D graphics to provide a virtual environment that the child can visually explore at his or her own initiative.  The child watches a virtual scene presented via a 3D graphics engine of the kind used in contemporary video games, while a remote infrared-camera eye tracker (SmartEye Pro 5.3) estimates where the child is looking.  Various scenarios will be presented to the child, including an onscreen character speaking and presenting written language, as well as physics-based interactions of objects (e.g., stacked blocks falling, a ball rolling).  The child’s gaze location is fed back to the display computer in real time (at 60 Hz), allowing aspects of the virtual scene to be made contingent on the child’s gaze behavior.  No explicit behavioral response is required of the child beyond sitting in the chair and watching the scene.
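
As a rough sketch of how such a gaze-contingent loop might work (this is illustrative, not the project's actual code; the UDP port, packet layout, region names, and coordinates are hypothetical stand-ins for whatever interface the SmartEye software exposes):

    import socket
    import struct

    SCREEN_W, SCREEN_H = 1280, 1024
    GAZE_PORT = 5005  # hypothetical port for the tracker's network stream

    # Hypothetical on-screen regions of interest (pixel rectangles).
    ROIS = {
        "character_face": (200, 100, 500, 400),
        "stacked_blocks": (700, 500, 1000, 800),
    }

    def roi_hit(x, y):
        """Return the name of the region containing the gaze point, if any."""
        for name, (x0, y0, x1, y1) in ROIS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

    def trigger_scene_event(name):
        # Placeholder for a call into the graphics engine's scripting API.
        print("scene event:", name)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", GAZE_PORT))

    blocks_toppled = False
    while not blocks_toppled:
        # Assumed packet layout: two little-endian floats giving normalized
        # screen coordinates in [0, 1], arriving at roughly 60 Hz.
        data, _ = sock.recvfrom(64)
        nx, ny = struct.unpack("<2f", data[:8])
        gaze_x, gaze_y = nx * SCREEN_W, ny * SCREEN_H

        # Make the scene contingent on gaze: topple the blocks only once
        # the child has actually looked at them.
        if roi_hit(gaze_x, gaze_y) == "stacked_blocks":
            trigger_scene_event("blocks_fall")
            blocks_toppled = True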

Results:

The demonstration of this system will include two computers: one running the SmartEye software on a recording of a child with autism and very limited expressive communication watching the virtual scene, and a second running the display program and receiving input from the SmartEye computer.  Together, these will fully simulate the system in operation.  Summaries of gaze behavior during different phases of the scene will be provided.
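
As an illustration of the kind of per-phase summary that can be computed from the logged gaze stream (the log format, phase boundaries, and region names below are hypothetical):

    from collections import defaultdict

    # Illustrative gaze log: one (timestamp_sec, roi_or_None) entry per
    # 60 Hz sample, where roi names the region of interest being fixated.
    samples = [(0.000, "character_face"), (0.017, "character_face"),
               (0.033, None), (0.050, "stacked_blocks")]

    # Illustrative phase boundaries: (start_sec, end_sec, label).
    phases = [(0.0, 30.0, "speech"), (30.0, 60.0, "physics")]

    # Tally samples per phase and region, then convert to seconds.
    dwell = defaultdict(lambda: defaultdict(int))
    for t, roi in samples:
        for start, end, label in phases:
            if start <= t < end and roi is not None:
                dwell[label][roi] += 1

    for label, rois in dwell.items():
        for roi, n in rois.items():
            # At 60 Hz, each sample represents about 16.7 ms of looking time.
            print(f"{label}: {roi} = {n / 60.0:.2f} s")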

Conclusions:

It is hoped that this demonstration will provide the field with one example of an assessment methodology that allows the exploration of cognition in children with autism, a group that has historically been understudied but is very important.