Neural Correlates of Emotion Processing during Simulated Social Interactions in Adults with Autism Spectrum Disorder and Schizophrenia

Friday, May 13, 2016: 4:20 PM
Room 310 (Baltimore Convention Center)
K. Deckert1, J. H. Foss-Feig2, A. Naples2, E. J. Levy1, K. K. Stavropoulos2, M. Rolison2, L. Mohamed1, C. Schleifer3, N. Santamauro4, A. Anticevic4, V. Srihari4 and J. McPartland2, (1)Yale Child Study Center, New Haven, CT, (2)Child Study Center, Yale School of Medicine, New Haven, CT, (3)Yale University, New Haven, CT, (4)Yale University School of Medicine, New Haven, CT
Background: Both autism spectrum disorder (ASD) and schizophrenia (SCZ) are characterized by social deficits, including impairments in emotion recognition, eye contact, and theory of mind. Previous research using electrophysiology to measure event-related potentials (ERPs) has shown atypical structural encoding of faces, as indexed by N170 amplitude and latency, in adults with ASD and in adults with SCZ.

Objectives: Using gaze-contingent ERPs elicited by happy and fearful faces that responded to participant gaze, the current study examined emotion processing and ERP components (N170, P300) in adults with ASD, SCZ, and typical development (TD). We examined diagnosis-specific findings as well as transdiagnostic associations between neural processes and behavioral correlates of emotion recognition ability. These results address divergence between the disorders while also highlighting common biological processes underlying conditions characterized by social communicative deficits.

Methods: 42 adults (TD = 16, ASD = 12, SCZ = 14) completed EEG, along with self-report questionnaires and direct-assessment clinical measures. Groups were matched on IQ, age, and sex. Emotion recognition was assessed using the Reading the Mind in the Eyes Task (RMET). EEG was recorded with high-density 128-channel Geodesic Sensor Nets with concurrent eye tracking, in response to neutral faces that dynamically changed to express either happy or fearful emotional expressions upon participant gaze toward the eyes. ERPs were time-locked to emotion onset. N170 amplitude and latency were analyzed over bilateral occipitotemporal electrodes; P300 was analyzed over midline parietal sites. A repeated-measures ANOVA was conducted on the amplitudes and latencies of the ERP components (within-subject factors: Hemisphere, Emotion; between-subjects factor: Diagnosis). Bivariate correlations with RMET scores were also conducted.
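The peak-extraction and correlation steps described above can be sketched as follows. This is a minimal illustration on synthetic data, not the study's actual pipeline: the 130–200 ms search window, the single simulated electrode, and the peak-picking approach are assumptions for demonstration, and the RMET scores here are simulated.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
fs = 500                                  # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.5, 1 / fs)      # epoch time-locked to emotion onset

n_subjects = 42
# Synthetic subject-average ERPs at one right occipitotemporal electrode:
# a negative deflection near 170 ms plus background noise.
true_lat = rng.normal(0.170, 0.015, n_subjects)       # seconds
true_amp = rng.normal(-6.0, 1.5, n_subjects)          # microvolts
erps = np.array([
    amp * np.exp(-((times - lat) ** 2) / (2 * 0.015 ** 2))
    + rng.normal(0, 0.3, times.size)
    for amp, lat in zip(true_amp, true_lat)
])

# Extract N170 peak amplitude and latency within a 130-200 ms window
win = (times >= 0.130) & (times <= 0.200)
peak_idx = erps[:, win].argmin(axis=1)                # most negative sample
n170_amp = erps[:, win][np.arange(n_subjects), peak_idx]
n170_lat = times[win][peak_idx]

# Simulated RMET scores, loosely coupled to latency for illustration only
rmet = 28 - 50 * (n170_lat - 0.170) + rng.normal(0, 1.0, n_subjects)

# Bivariate correlations of N170 measures with emotion recognition
r_amp, p_amp = pearsonr(n170_amp, rmet)
r_lat, p_lat = pearsonr(n170_lat, rmet)
print(f"amplitude-RMET: r = {r_amp:.3f}, p = {p_amp:.3f}")
print(f"latency-RMET:   r = {r_lat:.3f}, p = {p_lat:.3f}")
```

A full mixed-design ANOVA (within: Hemisphere, Emotion; between: Diagnosis) would additionally require trial-averaged measures per condition per subject and dedicated statistics software; the sketch above covers only the peak extraction and bivariate correlation steps.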

Results: Adults with SCZ showed a trend toward longer N170 latency compared to controls, F(2,39) = 2.67, p = .081. Irrespective of diagnostic group, right hemisphere N170 amplitude (r = -.324, p = .038) and latency (r = -.410, p = .007) were correlated with emotion recognition ability, with faster and more robust N170 responses in individuals with better emotion identification. In the left hemisphere, the N170 response to happy versus fearful faces showed greater amplitude in ASD relative to SCZ and controls, F(2,39) = 5.49, p = .008. Left hemisphere N170 latency did not differ by group, but happy faces elicited a faster N170 response across groups, F(2,39) = 5.04, p = .031. P300 amplitude was enhanced to fearful versus happy faces in ASD and TD, but not in SCZ, F(2,39) = 4.95, p = .012. SCZ was characterized by a trend toward attenuated P300 amplitude across emotions, F(2,39) = 2.73, p = .077.

Conclusions: Our results show that individuals with ASD and SCZ have altered neural responses to emotional face stimuli. A nuanced pattern of results indicates both common and distinct mechanisms in these disorders; moreover, delayed and attenuated neural responses to faces were associated with reduced emotion recognition ability across clinical and non-clinical populations alike. These findings have important implications for determining the neurobiological underpinnings of emotion recognition and social functioning deficits in ASD and other disorders affecting social cognition, and for evaluating the utility of diagnostic taxonomies based on behavioral symptoms versus dimensional measurement of specific functional processes.