International Meeting for Autism Research: Using Virtual Reality to Provide Controlled Ecologically Valid Social Interaction Paradigms for Studying Cognitive Control of Initiating Joint Attention

Thursday, May 20, 2010
Franklin Hall B Level 4 (Philadelphia Marriott Downtown)
2:00 PM
W. L. Jarrold , MIND Institute, U.C. Davis, Sacramento, CA
M. Solomon , UC Davis Department of Psychiatry and Behavioral Sciences, MIND Institute, Imaging Research Center, Sacramento, CA
J. Bailenson , Department of Communication, Stanford, Stanford, CA
M. Gwaltney , MIND Institute, U.C. Davis, Sacramento, CA
S. Ozonoff , Psychiatry and Behavioral Sciences, MIND Institute, U.C. Davis, Sacramento, CA
P. C. Mundy , MIND Institute, U.C. Davis, Sacramento, CA
Background: The inability to spontaneously initiate joint attention to share experience with others is a cardinal symptom of autism (Mundy, 2003; Mundy & Sigman, 1989). This joint attention disturbance reflects social information processing/cognitive control disturbances, including difficulty in self-monitoring visual attention, a diminished tendency to attend to others’ gaze and affect, and a failure to flexibly integrate this self/other information. It has been suggested that these three functions are supported by a distributed cortical network involving components of the dorsal medial frontal, orbito-frontal, parietal (precuneus), and temporal cortices (Mundy et al., 2000; Mundy & Newell, 2007). These same systems appear to be engaged when virtual reality (VR) agents make eye contact with a participant, or when a participant follows the gaze of a VR agent (Schilbach et al., 2006).
Objectives: To develop and validate VR paradigms for the assessment and examination of social attention in higher-functioning children with autism; in particular, to capitalize on the features of VR to provide controlled measures of the spontaneous initiation of social attention in children with autism.
Methods: Ten individuals with HFA and ten with typical development, all aged 8-17, will be recruited via the MIND Institute subject tracking system. Participants will be presented with VR tasks in a 10 x 14 foot laboratory room, using a VR system produced by WorldViz. The paradigm is based on methods used to improve the use of social gaze in teaching situations (Bailenson et al., 2008). In the no-fade condition of our paradigm, participants are told to do their best to make eye contact with each of 9 virtual peers while telling a story. In the fade condition, ignored agents fade; this prompts a better distribution of attention because an agent’s opacity is not restored until the participant's head "looks" at that agent’s face. For subjects for whom the fade condition precedes the no-fade condition, we investigate whether “training” from the first condition improves shared attention during the second condition, when the agent-fading prompt is removed.
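The fade-condition mechanic described above can be sketched as a simple per-frame opacity update. This is an illustrative assumption only, not the actual WorldViz implementation: the fade rate, gaze threshold, and function name are hypothetical, and real head tracking would compare full 3D head orientation rather than a single yaw angle.

```python
FADE_RATE = 0.02       # assumed opacity lost per frame while an agent is ignored
GAZE_THRESHOLD = 10.0  # assumed: head yaw within 10 degrees of an agent counts as "looking"

def update_opacities(opacities, agent_yaws, head_yaw):
    """One frame of the fade condition: ignored agents fade toward invisible;
    an agent the participant's head is oriented toward is restored to fully opaque."""
    updated = []
    for opacity, agent_yaw in zip(opacities, agent_yaws):
        if abs(agent_yaw - head_yaw) <= GAZE_THRESHOLD:
            updated.append(1.0)                             # restored by gaze
        else:
            updated.append(max(0.0, opacity - FADE_RATE))   # fades while ignored
    return updated
```

Called once per rendered frame over the 9 virtual peers, this yields the prompt described above: agents the participant neglects gradually disappear until the participant looks back at them.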
Results: We have implemented Bailenson’s paradigm with several small changes. First, agents’ eyes blink, to maximize realism. Second, agents’ head motion, derived from measurements of actual human head motion during lectures, now also includes head nods. Third, agents can respond interactively to the subject's attention, for example by nodding, raising their eyebrows, or re-engaging attention (e.g., if the virtual agent’s attention starts to wander). Videos of the virtual environment will be shown. Data collected to date indicate that individual differences in the deployment of social attention are detectable and that attention prompts (i.e., fading) measurably increase the distribution of attention.
Conclusions: VR provides a potentially highly beneficial experimental platform for studying the cognitive control processes involved in joint social attention. Stimuli can interact in complex, socially meaningful, and ecologically valid ways with minimal error variance compared to studies using human confederates. Extensive fine-grained behavioral measurements (e.g., head orientation over time, mean looking times, patterns of looking at individual agents) can be collected and analyzed using multivariate methods and machine learning algorithms.
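As one illustration of the kind of fine-grained analysis such head-orientation logs afford, the sketch below computes each agent's share of looking time and a normalized-entropy index of how evenly attention was spread across agents. All names and the 9-agent default are hypothetical assumptions, not measures reported in this study.

```python
import math
from collections import Counter

def attention_distribution(gaze_samples, n_agents=9):
    """gaze_samples: per-frame labels of which agent the head was oriented toward.
    Returns (shares, evenness): each observed agent's share of looking time, and
    the Shannon entropy of that distribution normalized by log(n_agents), so
    1.0 means attention was spread perfectly evenly and 0.0 means one agent
    received all of it."""
    counts = Counter(gaze_samples)
    total = len(gaze_samples)
    shares = {agent: n / total for agent, n in counts.items()}
    entropy = -sum(p * math.log(p) for p in shares.values() if p > 0)
    return shares, entropy / math.log(n_agents)
```

A prompt that broadens attention (such as agent fading) would be expected to raise the evenness index; per-condition comparisons of this kind of summary statistic are one way the paradigm's fine-grained data could feed multivariate analyses.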