Eye Gaze and ERP Correlates of Emotion Processing Across Adults with ASD and Schizophrenia

Saturday, May 16, 2015: 11:30 AM-1:30 PM
Imperial Ballroom (Grand America Hotel)
E. J. Levy1, A. Naples1, J. H. Foss-Feig1, R. Tillman1, H. S. Reuman1, K. Law1, H. Samson2, V. Srihari3, A. Anticevic3 and J. McPartland1, (1)Child Study Center, Yale University, New Haven, CT, (2)Yale University, New Haven, CT, (3)Psychiatry, Yale University, New Haven, CT
Background: Both Autism Spectrum Disorder (ASD) and schizophrenia (SZ) are characterized by difficulty with social cognition. In particular, emotion processing is a notable weakness across ASD and SZ, with both groups displaying deficits in emotion recognition and theory of mind. Eye-tracking (ET) studies reveal atypical gaze patterns to emotional faces in ASD and SZ, while event-related potential (ERP) experiments also suggest common dysfunction in neural systems subserving facial emotion processing. In line with the Research Domain Criteria (RDoC) initiative, the current study is the first to co-register ET and ERP methodologies for the study of emotion processing across multiple neurodevelopmental disorders.

Objectives: Using ERPs, eye-tracking, and behavioral data, the current study aimed to (i) identify overlap in emotion-specific brain responses and gaze patterns across ASD, SZ, and typical development (TD), (ii) characterize ASD-specific electrophysiological and behavioral markers of atypical emotion processing, and (iii) identify relationships between behavioral measures of social functioning and biological indices of emotion processing.

Methods: EEG was recorded from 7 adults with ASD, 8 with SZ, and 10 with TD using a 128-electrode HydroCel Geodesic Sensor Net. ET was recorded simultaneously with an EyeLink 1000 remote eye tracker. Participants viewed a crosshair followed by a neutral face. Contingent upon participants’ fixation on the eye region, the face changed to display a happy or fearful expression. ERPs were time-locked to the onset of the emotional expression. To investigate the earliest stages of emotion processing, N170 amplitude and latency were extracted from occipitotemporal electrodes. ET data were used to quantify gaze patterns to the eye and mouth regions. Behavioral and self-report measures of emotion processing and social cognition were collected, including the Reading the Mind in the Eyes Test (RMET), the Broader Autism Phenotype Questionnaire (BAPQ), and the Social Responsiveness Scale (SRS).
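The N170 extraction step described above can be sketched in code. This is a minimal illustrative example, not the authors' analysis pipeline: the 130–200 ms search window, the simulated waveform, and the function name are assumptions for demonstration only, and a real analysis would operate on baseline-corrected epochs averaged over occipitotemporal electrodes.

```python
import numpy as np

def extract_n170(epoch, times, window=(0.13, 0.20)):
    """Return peak amplitude (in uV) and latency (in s) of the most
    negative deflection within the N170 search window.

    Note: the window bounds are illustrative, not the authors' values.
    """
    mask = (times >= window[0]) & (times <= window[1])
    seg = epoch[mask]
    idx = np.argmin(seg)  # N170 is a negative-going component
    return seg[idx], times[mask][idx]

# Simulated occipitotemporal ERP sampled at 500 Hz (-100 to 500 ms):
# a negative Gaussian deflection peaking near 170 ms post-stimulus.
times = np.arange(-0.1, 0.5, 0.002)
erp = -4.0 * np.exp(-((times - 0.17) ** 2) / (2 * 0.015 ** 2))

amp, lat = extract_n170(erp, times)
print(round(amp, 2), round(lat, 3))  # peak amplitude and latency
```

In practice, peak-picking like this would be applied to each participant's average waveform, yielding the per-subject amplitude and latency values entered into the group analyses.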

Results: Analyses revealed no significant between-group differences in N170 amplitude or gaze toward the eyes of emotional faces. However, across all participants, a more robust N170 response to emotional faces correlated with better emotion recognition (RMET), all rs<-.500, ps=.015. Analyses of time spent looking at the mouth revealed a main effect of emotion (F(1,24)=4.531, p=.045) and a group-by-emotion interaction (F(2,24)=3.463, p=.049). Whereas individuals with SZ and TD directed their gaze toward the mouth more when viewing happy compared to fearful faces, the ASD group showed the opposite pattern (i.e., more mouth-looking to fearful faces). Across happy and fearful faces, more time looking at the eyes correlated with lower autistic symptomatology (BAPQ) and higher social responsiveness (SRS), all rs<-.413, ps<.045.

Conclusions: This study assessed electrophysiological and behavioral markers of atypical emotion processing across ASD, SZ, and TD. Neural indices of face decoding were associated with emotion recognition abilities across groups, while preferential gaze to the eyes of emotional faces was associated with measures of social cognition. Individuals with ASD were differentiated from those with SZ and TD by increased gaze toward the mouth for fearful relative to happy faces. Thus, in addition to identifying biomarkers of social dysfunction across diagnostic groups, the current study also identifies ASD-specific markers of emotion processing.