International Meeting for Autism Research

A Pilot Investigation of Visual Exploration During Face-to-Face Social Interaction in Virtual Reality

Friday, May 13, 2011
Elizabeth Ballroom E-F and Lirenta Foyer Level 2 (Manchester Grand Hyatt)
10:00 AM
O. Grynszpan1, J. Constant2, J. C. Martin3, J. Simonin4 and J. Nadel5, (1)CNRS USR 3246, Université Pierre et Marie Curie, Paris, France, (2)Hôpitaux de Chartres, Chartres, France, (3)LIMSI-CNRS, Université Paris Sud, Orsay, France, (4)Holo3 Inc., Schiltigheim, France, (5)CNRS USR 3246, Paris, France
Background: Individuals with High Functioning Autism Spectrum Disorders (HFASD) exhibit profound pragmatic difficulties that have been linked to atypicalities in the visual exploration of facial expressions. While most studies have focused on difficulties in recognizing emotions and attending to relevant facial features, little is known about the ability to regulate one’s own eye movements in a conversational context.

Objectives: The present study investigates whether the self-monitoring of eye movements is impaired during social interactions.

Methods: We designed a task in which a virtual character addresses the participant and utters a key sentence that can be interpreted either literally or non-literally. The character’s facial expressions make it possible to disambiguate this key sentence and thus to understand the whole message. After each animated scenario, participants answer two closed-choice questions that evaluate their ability to derive the non-literal interpretation. Sixty different social scenarios were constructed using two virtual characters, a female and a male, embedded in videos of real-life settings, thus providing a naturalistic context. Thirteen adolescents and adults with HFASD and fourteen typical individuals were assessed on this task under two conditions. The experimental condition relied on an eye-tracking system that simulated a gaze-contingent lens: the entire visual display was blurred in real time, except for an area centred on the participant’s focal point. In the control condition, the participant’s eye movements were merely tracked. A remote eye-tracker was used so that participants were not constrained by head-mounted equipment. The experimental protocol followed an ABA design: the gaze-contingent lens was first deactivated (baseline condition), then activated (experimental condition), and finally deactivated again (final condition). At the end of the experiment, participants were asked whether they had noticed that they were controlling the lens. The gaze data were analysed with a software prototype, adapted for the present study, capable of handling eye tracking on dynamic visual displays. The data were analysed using an ANOVA with Condition as the within-subjects variable and Group (HFASD vs. typical) as the between-subjects variable.
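For illustration only (the abstract does not describe the implementation), a minimal sketch of such a gaze-contingent lens could blur each video frame and restore a sharp disc around the tracked gaze point. All names below, such as gaze_contingent_frame, radius and blur_ksize, are hypothetical, and Python with OpenCV is just one plausible toolchain:

    import cv2
    import numpy as np

    def gaze_contingent_frame(frame, gaze_x, gaze_y, radius=80, blur_ksize=31):
        # Hypothetical sketch: blur the whole frame, then restore the original
        # sharp pixels inside a disc centred on the current gaze position.
        blurred = cv2.GaussianBlur(frame, (blur_ksize, blur_ksize), 0)
        mask = np.zeros(frame.shape[:2], dtype=np.uint8)
        cv2.circle(mask, (int(gaze_x), int(gaze_y)), radius, 255, thickness=-1)
        mask3 = np.repeat(mask[:, :, None], 3, axis=2)  # one copy per colour channel
        return np.where(mask3 == 255, frame, blurred)

In a real gaze-contingent loop, this function would be called on every frame with the latest eye-tracker sample, so that the sharp window follows the participant's fixation in real time.

The mixed-design analysis (Condition within subjects, Group between subjects) could likewise be run on long-format data with, for instance, the pingouin library; the column names and synthetic values below are assumptions for demonstration, not the study's data:

    import numpy as np
    import pandas as pd
    import pingouin as pg

    # Synthetic long-format data matching the design: 13 HFASD and 14 typical
    # participants, each measured in the three ABA phases.
    rng = np.random.default_rng(0)
    rows = [dict(participant=f"{g}_{i}", group=g, condition=c,
                 face_fixation=rng.normal(10.0, 2.0))
            for g, n in (("HFASD", 13), ("typical", 14))
            for i in range(n)
            for c in ("baseline", "experimental", "final")]
    df = pd.DataFrame(rows)
    print(pg.mixed_anova(data=df, dv="face_fixation", within="condition",
                         subject="participant", between="group"))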

Results: Fixation data revealed that the HFASD group did not modulate their eye movements in the experimental condition as efficiently as the typical group. Additionally, significantly fewer participants with HFASD than typical participants noticed that they were controlling the lens. Finally, for the HFASD group, performance on the task correlated with the time spent fixating the face in the experimental condition, but not in the control condition.

Conclusions: This experiment provides some direct evidence of impairments in the self-monitoring of eye movements in HFASD and, consistently, suggests an alteration in the sense of agency. This outcome may help characterize the atypicalities of visual exploration in HFASD. Additionally, by constraining the visual field, the gaze-contingent lens presumably hindered compensatory strategies based on peripheral vision, thus yielding a setting that might prove highly beneficial for educational purposes. The presentation will include interactive demonstrations of the virtual environment, the associated eye-tracking analysis tools, and videos of visual exploration patterns taken from case examples in our study.
