21134
Differences in Reaction Time to Detect Emotion Faces Vary Based on Autistic Social Skills and Communication Abilities in Young Adults

Thursday, May 12, 2016: 11:30 AM-1:30 PM
Hall A (Baltimore Convention Center)
J. Burk1, C. Dickter1, J. Zeman2 and K. M. Fleckenstein3, (1)College of William & Mary, Williamsburg, VA, (2)Psychology, College of William and Mary, Williamsburg, VA, (3)College of William and Mary, Williamsburg, VA
Background:  Individuals with autism spectrum disorders (ASD) show challenges with tasks that involve rule switching or ignoring distracting stimuli, but superior performance on visual search tasks that require identifying specific features of the target stimulus.

Objectives:  The goal of the present study was twofold. First, we tested a group of participants who fell along the broader autism phenotype (BAP) in a visual search task. Second, we modified the task to include emotion faces to test whether advantages in visual search performance would be stronger with these stimuli than with neutral stimuli.

Methods:  Participants were 136 non-ASD undergraduate students (44.4% men; Mage = 19.1 years; 76.3% Caucasian). Participants completed the Autism Quotient (AQ; Baron-Cohen et al., 2001), a self-report measure of autistic-like behaviors. The AQ has five subscales assessing autistic behaviors: social skills, attention switching, communication, imagination, and attention to detail. Participants completed a modified visual search task, a procedure that typically measures attention by assessing response speed when there is a discrepant stimulus among other stimuli displaying the same properties (Plaisted et al., 2010). The visual search task was modified by using faces that displayed either basic emotions (i.e., happiness, anger) or complex emotions (i.e., surprise, fear). Participants indicated whether a group of 8 or 16 faces on the screen depicted the same emotion or different emotions by pressing a designated key on a keyboard.
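
For illustration only, the sketch below shows one way the trial conditions described above could be crossed and a search display assembled. The emotion labels, function names, and data structures are assumptions made for this sketch, not the authors' actual stimulus code.

```python
import itertools
import random

# Emotion categories taken from the Methods description (labels are assumed).
BASIC_EMOTIONS = ["happiness", "anger"]
COMPLEX_EMOTIONS = ["surprise", "fear"]


def build_display(target_type, distractor_type, set_size, same_trial):
    """Assemble one search display of `set_size` emotion faces.

    On 'same' trials every face shows the target emotion; on 'different'
    trials one discrepant face from the distractor category replaces a
    target-emotion face at a random position.
    """
    target_pool = BASIC_EMOTIONS if target_type == "basic" else COMPLEX_EMOTIONS
    distractor_pool = BASIC_EMOTIONS if distractor_type == "basic" else COMPLEX_EMOTIONS
    target = random.choice(target_pool)
    faces = [target] * set_size
    if not same_trial:
        distractor = random.choice([e for e in distractor_pool if e != target])
        faces[random.randrange(set_size)] = distractor
    return {"target_type": target_type, "distractor_type": distractor_type,
            "set_size": set_size, "same_trial": same_trial, "faces": faces}


# Fully cross the factors analyzed in the Results: target emotion type x
# distractor emotion type x number of faces, plus same/different displays.
conditions = itertools.product(["basic", "complex"],  # target emotion type
                               ["basic", "complex"],  # distractor emotion type
                               [8, 16],               # number of faces on screen
                               [True, False])         # same vs. different display
trials = [build_display(*c) for c in conditions]
```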

Results:  For the total AQ and each subscale, the top and bottom thirds of participants were compared. Reaction time data were analyzed using a 2 (Target: basic, complex) x 2 (Distractor: basic, complex) x 2 (Number of Stimuli: 8, 16) x 2 (AQ scale: bottom third [low AQ], top third [high AQ]) mixed model ANOVA. Only significant effects involving AQ are reported below. For total AQ, there was a between-subjects main effect, with low AQ participants responding more slowly (M = 3905.99 ms, SE = 263.01) than high AQ participants (M = 3105.19 ms, SE = 241.93), F(1, 70) = 5.02, p = .03. For the Social Skills subscale, there was a Target x AQ interaction, F(1, 81) = 5.04, p = .03: for complex targets, low AQ participants were significantly slower than high AQ participants, F(1, 82) = 5.69, p = .02 (Figure 1A). There was also a Distractor x AQ interaction, F(1, 81) = 4.19, p = .04: for complex distractors, low AQ participants were significantly slower than high AQ participants, F(1, 82) = 6.20, p = .02 (Figure 1B). For the Communication subscale, there was a Distractor x AQ interaction, F(1, 70) = 4.71, p = .03: for both basic-emotion, F(1, 70) = 5.63, p = .02, and complex-emotion, F(1, 70) = 8.39, p = .01, distractors, low AQ participants were slower than high AQ participants.
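
As a rough illustration of the grouping and analysis logic (a sketch under stated assumptions, not the authors' pipeline), the snippet below splits participants into bottom and top AQ tertiles and fits a linear mixed model with the Target x Distractor x Number of Stimuli x AQ-group structure. The file name, column names, and long data format are all assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed long-format data: one row per trial with columns
# subject, aq_total, target, distractor, set_size, rt (reaction time in ms).
rt_long = pd.read_csv("rt_long.csv")  # hypothetical file name

# Split participants into AQ tertiles and keep the bottom and top thirds,
# mirroring the low-AQ / high-AQ comparison in the Results.
subject_aq = rt_long.groupby("subject")["aq_total"].first()
low_cut, high_cut = subject_aq.quantile([1 / 3, 2 / 3])
rt_long["aq_group"] = pd.cut(rt_long["aq_total"],
                             bins=[-float("inf"), low_cut, high_cut, float("inf")],
                             labels=["low", "mid", "high"])
extremes = rt_long[rt_long["aq_group"] != "mid"].copy()
extremes["aq_group"] = extremes["aq_group"].cat.remove_unused_categories()

# A linear mixed model with a random intercept per subject approximates the
# 2 x 2 x 2 x 2 mixed-model ANOVA structure reported above; it is not an
# exact reproduction of that analysis.
model = smf.mixedlm("rt ~ C(target) * C(distractor) * C(set_size) * C(aq_group)",
                    data=extremes, groups=extremes["subject"])
print(model.fit().summary())
```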

Conclusions:  Individuals who scored high on the total AQ and on the socioemotional AQ subscales, and who may fall along the BAP, showed faster identification on a visual search task than those who scored low on these measures, even when the target stimuli were emotion faces.