Emotion Attribution from Dynamic Faces: BOLD Differences Pre- and Post-Decision in ASDs

Saturday, May 16, 2015: 11:30 AM-1:30 PM
Imperial Ballroom (Grand America Hotel)
L. S. McKay1, R. S. Brezis2, T. Wong3, L. Bidaut4 and J. Piggot5, (1)University of Dundee, Dundee, United Kingdom, (2)Interdisciplinary Center, Herzliya, Israel, (3)Department of Radiology, University of Washington, Seattle, WA, (4)Clinical Research Imaging Facility (CRIF), University of Dundee, Dundee, United Kingdom, (5)Psychiatry, University of Dundee, Dundee, United Kingdom
Background:  

Atypical emotion attribution from facial expressions in Autism Spectrum Disorder (ASD) has been widely reported. However, to our knowledge, previous studies have not disentangled neural activation pre- and post-emotion attribution in ASD. To address this limitation, a novel dynamic facial expression paradigm (DFEP) was developed to elucidate the neural processes engaged pre- and post-attribution of emotion from naturalistic facial expressions as they unfold.

Objectives:  

To determine whether neural activation differs pre- and/or post-attribution of emotion between individuals with ASD and typically developing individuals.

Methods:  

Twenty subjects with ASD and 15 matched typically developing (TD) controls (8-18 years) watched 10 s displays of dynamic faces inside an MRI scanner.  Subjects pressed a button once they “were sure” that the face, which started with a neutral expression, was expressing happiness, expressing sadness, or remaining neutral.  Subjects completed two runs, each containing 16 blocks of each emotion.
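
For illustration only, the following minimal Python sketch lays out the block structure described above. It assumes details the abstract does not specify (randomized block order, no inter-block interval, and hypothetical field names) and is not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List, Optional
import random

EMOTIONS = ["happy", "sad", "neutral"]   # expression the initially neutral face develops into
DISPLAY_DURATION = 10.0                  # each dynamic face display lasts 10 s

@dataclass
class Block:
    emotion: str                           # target expression of the block
    onset: float                           # onset in seconds from run start
    decision_time: Optional[float] = None  # button-press latency within the block, if recorded

def make_run(blocks_per_emotion: int = 16, seed: int = 0) -> List[Block]:
    """Lay out one run as a randomized sequence of emotion blocks
    (assumes back-to-back 10 s displays with no gap, an illustrative simplification)."""
    order = EMOTIONS * blocks_per_emotion
    random.Random(seed).shuffle(order)
    return [Block(emotion=e, onset=i * DISPLAY_DURATION) for i, e in enumerate(order)]
```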

Using each subject's time-to-decision, subject-specific design files were created that split each display into a pre- and a post-decision phase (DP), giving a 2 (Group) x 3 (Emotion) x 2 (DP) design. BOLD signal was compared using a 2x3x2 random-effects ANCOVA, with age and verbal IQ entered as covariates to control for potentially confounding effects.  Main effects and interactions were thresholded at p < 0.005, corrected for multiple comparisons using cluster-size threshold estimation.
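
As a hedged sketch of how the recorded decision time could split each 10 s display into pre- and post-decision events for the subject-specific design files, the snippet below builds a BIDS-style events table. The tuple format, column names, and handling of blocks with no button press are illustrative assumptions, not the authors' actual pipeline.

```python
import pandas as pd

def split_events(blocks, display_duration: float = 10.0) -> pd.DataFrame:
    """blocks: iterable of (emotion, onset, decision_time) tuples, with decision_time
    in seconds from block onset (None if no button press was recorded).
    Returns an events table with separate pre- and post-decision trial types."""
    rows = []
    for emotion, onset, rt in blocks:
        if rt is None:
            # No recorded decision: model the whole display as pre-decision (assumption).
            rows.append(dict(onset=onset, duration=display_duration,
                             trial_type=f"{emotion}_pre"))
            continue
        rows.append(dict(onset=onset, duration=rt,
                         trial_type=f"{emotion}_pre"))
        rows.append(dict(onset=onset + rt, duration=display_duration - rt,
                         trial_type=f"{emotion}_post"))
    return pd.DataFrame(rows)

# Example: a happy block at 0 s with a decision at 4.2 s, and a neutral block at 10 s with no press.
events = split_events([("happy", 0.0, 4.2), ("neutral", 10.0, None)])
print(events)
```

Pre- and post-decision regressors of this kind would then feed subject-level contrasts, on which the group-level 2x3x2 ANCOVA with age and verbal IQ as covariates is estimated.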

Results:  

We found a main effect of Group in the Postcentral Gyrus (PCG), with BOLD activity being higher in the ASD than in the TD group.  Social Responsiveness Scale (SRS; Constantino 2000) score significantly predicted activity in this region during the pre-decision phase for Sad faces in the ASD, but not the TD, group.  A Group x DP interaction was found in the Caudate, driven by increased activation post- relative to pre-decision in the TD group, but not in the ASD group. A Group x Emotion interaction was found in the Supramarginal Gyrus (SMG): the TD group showed decreased activation for Neutral relative to Sad faces, whereas the ASD group showed decreased activation for Sad relative to Happy faces.  Finally, there was a complex three-way interaction in the left Middle Frontal Gyrus (MFG), driven by differences between the groups and emotions in the post-decision phase.

Conclusions:  

With the DFEP, the ASD group showed significantly greater PCG activation than the TD group across all emotions and DPs, and this activation correlated with autistic symptoms during the pre-decision phase for sad faces.  Only the TD group demonstrated increased caudate activation post- relative to pre-decision, consistent with reduced activation to social stimuli in reward areas of the brain in individuals with ASD.  The SMG showed decreased activation for Neutral relative to Sad faces in the TD group, but decreased activation for Sad relative to Happy faces in ASD subjects, suggesting that SMG engagement may subserve the processing of different emotions in the ASD and TD groups.  The left MFG activation differences between the ASD and TD groups across emotions in the post-decision phase suggest that this area, which is involved in emotion attribution and empathy, is atypically activated after the attribution of emotion in individuals with ASD.