International Meeting for Autism Research (May 7 - 9, 2009): Visual Search in Static and Dynamic Self-Motion Environments: An Eye-Tracking Study

Friday, May 8, 2009
Northwest Hall (Chicago Hilton)
3:30 PM
E. Sheppard, School of Psychology, University of Nottingham Malaysia Campus, Semenyih, Malaysia
D. Ropar, School of Psychology, University of Nottingham, Nottingham, United Kingdom
G. Underwood, School of Psychology, University of Nottingham, Nottingham, United Kingdom
E. Van Loon, School of Psychology, University of Nottingham, Nottingham, United Kingdom
Background: Research has shown that those with ASD excel on visual search tasks that require them to find a target stimulus within a figure or array of objects (e.g. Shah & Frith, 1983; O’Riordan et al., 2001). These findings can be explained by various theories of perceptual processing in ASD, including Enhanced Perceptual Functioning (Mottron & Burack, 2001), Weak Central Coherence (Frith, 1989), and superior systemising skills (Baron-Cohen, 2002). However, previous studies have used only static stimuli, so it is unclear whether superior visual search skills would be evident with moving stimuli. This is important because in everyday life we frequently need to search for objects within moving arrays, often whilst being in motion ourselves. Additionally, previous research has not explored visual search for targets embedded in social stimuli. Individuals with ASD tend not to orient to social aspects of their environment (e.g. Klin et al., 2002), suggesting that they might find search for targets embedded within social stimuli more challenging.

Objectives:

This study aimed to explore the effects of motion and social relevance on visual search ability in participants with and without ASD. It was predicted that those with ASD would show superior visual search performance to comparison participants in both static and dynamic conditions, but that this advantage would disappear when the target was embedded within a social stimulus.

Methods:

Twenty adult males with high-functioning autism (HFA) or Asperger syndrome (AS), and 40 matched comparison participants (20 male, 20 female), took part. They viewed 40 three-dimensional graphical animations of a driver’s view of road scenes and 40 still images of similar scenes. The dynamic scenes contained simulated self-motion, as though the viewer were moving through the environment. Each scene contained a target shape (circle or triangle) hidden within the scene. In each condition (static and dynamic) the target appeared on a social stimulus (i.e. a person) 25% of the time, and on a non-social object (e.g. car, road, building) 75% of the time, roughly corresponding to the proportion of the screen covered by social and non-social stimuli respectively. Participants were instructed to respond with a key press as soon as they located the hidden shape. They then identified verbally which shape was present and where it was within the scene. Accuracy and response time were recorded, and participants’ eye movements were recorded using a Tobii portable eye-tracker.

Results: Initial analyses suggest that, whilst the groups did not differ in reaction time for either task (static or dynamic), participants with ASD correctly identified more targets than male comparison participants on the static version of the task. Analysis of the eye movement data will also be presented.

Conclusions: Implications of the findings for theories of perceptual and social processing in ASD will be discussed.
