International Meeting for Autism Research (May 7 - 9, 2009): Museum Hunt: a Computerized Eye-Tracking Game

Friday, May 8, 2009
Boulevard (Chicago Hilton)
F. Hurewitz, Department of Psychology, Drexel University, Philadelphia, PA
M. Brennan, Computer Science, Drexel University, Philadelphia, PA
E. Boucher, Digital Media & Design, Drexel University, Philadelphia, PA
F. Lee, Computer Science & Psychology, Drexel University, Philadelphia, PA
Background: There has been increasing interest in creating technology-based interventions to teach social skills, such as facial recognition, to individuals with autism.  One limitation of current programs is that they typically use static faces and measure the user's attention to relevant information through explicit responses (key presses), rather than exercising the on-line abilities required to navigate real social interactions.  A further limitation is that programs built on rote memorization can strike users as boring and may not be reinforcing enough for long-term use.
Objectives: As a proof of concept, we present a demonstration of an interactive video game designed to help individuals with autism learn reciprocal social skills such as gaze following, social referencing, and attention to facial expressions and facial configurations.  The game uniquely uses eye-tracking technology (implemented on a Tobii T60 eye tracker) to assess in real time whether users are attending to relevant aspects of the virtual world.  Based on the user's eye gaze, the game increases or decreases the number and explicitness of social/attentional cues.  Furthermore, eye-tracking data are used to determine whether the software-based intervention increases the automaticity with which the game user notices these cues.
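The gaze-contingent cue adjustment described above can be sketched as a simple control loop: fixations that land inside a target area of interest (AOI) let the game fade toward subtler cues, while repeated misses escalate cue explicitness. The class, thresholds, and cue levels below are illustrative assumptions, not the actual Museum Hunt implementation.

```python
# Hypothetical sketch of a gaze-contingent cue loop (assumed names and
# thresholds; not the actual Museum Hunt code). An AOI is the screen
# rectangle the player should be attending to, e.g. a face an avatar
# is looking toward.

class CueController:
    """Raise or lower the explicitness of social cues based on whether
    the player's fixations land in the target area of interest (AOI)."""

    MAX_LEVEL = 3  # 0 = subtle gaze shift ... 3 = arrow plus verbal prompt
    MISS_LIMIT = 5  # consecutive off-target fixations before escalating

    def __init__(self):
        self.cue_level = 0
        self.misses = 0

    def on_fixation(self, fixation_xy, target_aoi):
        """Update the cue level for one fixation sample.

        fixation_xy: (x, y) gaze point from the eye tracker.
        target_aoi: (left, top, right, bottom) rectangle in screen coords.
        Returns the cue level to display next.
        """
        x, y = fixation_xy
        left, top, right, bottom = target_aoi
        hit = left <= x <= right and top <= y <= bottom
        if hit:
            # Player noticed the cue: reset misses and fade support.
            self.misses = 0
            self.cue_level = max(0, self.cue_level - 1)
        else:
            self.misses += 1
            # After repeated misses, make the cue more explicit.
            if self.misses >= self.MISS_LIMIT:
                self.cue_level = min(self.MAX_LEVEL, self.cue_level + 1)
                self.misses = 0
        return self.cue_level
```

In this sketch the same data stream serves both purposes the abstract names: the hit/miss history drives cue explicitness moment to moment, and the logged fixations can later be analyzed for changes in how quickly cues are noticed.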

Methods: The game, Museum Hunt, comprises a series of expandable and reconfigurable scenarios that encourage children to solve a mystery by assisting an avatar detective.  The detective (and the computer user) must use active cues that arise over the course of a real-time adventure to solve the mystery.  Activities include matching mugshots to the faces of avatars that passed by carrying a purloined object, following the gaze of a crowd to find where the thief is hiding, and determining the emotion/facial expression of an avatar.  Difficulty levels can be adjusted; for example, to make the mugshot task harder, the game may use a disguised suspect with dyed hair and glasses. Presenting these interactions in a gaming format makes social-skills practice fun and reinforcing. Additionally, the game procedurally generates stories and activity sets, giving players a new experience each time they play. It is developed in Flash and includes a mode without eye tracking that runs in any standard web browser.
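One way to picture the procedurally generated, difficulty-adjustable scenarios described above is as a small scenario builder: each mystery is a shuffled set of activities, with harder settings (such as disguised suspects in the mugshot task) unlocked at higher levels. The activity names, disguise list, and difficulty rule below are assumptions for illustration only.

```python
import random

# Illustrative sketch of procedural scenario generation with adjustable
# difficulty (assumed activity and disguise names; not the actual
# Museum Hunt content).

ACTIVITIES = ["mugshot_match", "gaze_follow", "expression_read"]
DISGUISES = ["dyed_hair", "glasses", "hat"]

def generate_scenario(difficulty, rng=None):
    """Build one mystery: a shuffled sequence of activities, with
    disguised suspects appearing in the mugshot task at difficulty >= 2."""
    rng = rng or random.Random()
    steps = []
    for activity in rng.sample(ACTIVITIES, k=len(ACTIVITIES)):
        step = {"activity": activity, "difficulty": difficulty}
        if activity == "mugshot_match" and difficulty >= 2:
            # Harder levels disguise the suspect (e.g. dyed hair, glasses).
            n = min(difficulty - 1, len(DISGUISES))
            step["disguise"] = rng.sample(DISGUISES, k=n)
        steps.append(step)
    return steps
```

Because each call reshuffles the activity order and redraws the disguise set, replays yield new experiences from the same reconfigurable building blocks, which is the property the abstract attributes to the game.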

Results: Our presentation will include a demonstration of the eye-tracking paradigm and a discussion of the prospects for extending this technology to new scenarios, including gaze-contingent interactions with avatars.

Conclusions: We demonstrate the feasibility of creating fun, usable, and dynamic software-based interventions that let individuals with autism practice social skills.  We also establish, for the first time, that monitor-embedded eye trackers can serve as input devices for avatar-based gaming.  As eye-tracking technology becomes more accessible in cost and usability, it is a promising medium for delivering interventions that train individuals to modulate attention and the social use of eye gaze.