Processing of Facial Expressions and Their Mental Imagery in ASD: An EEG Study with Feasibility Analysis for a Neurofeedback Approach

Thursday, May 12, 2016: 11:30 AM-1:30 PM
Hall A (Baltimore Convention Center)
M. Simoes1, J. Andrade1, R. Monteiro1, S. Mouga1, P. Carvalho1, G. G. Oliveira2 and M. Castelo-Branco1, (1)University of Coimbra, Coimbra, Portugal, (2)Hospital Pediátrico de Coimbra, Coimbra, Portugal
Background:

Facial expression (FE) processing deficits have been identified in ASD. Studies on this topic typically use static photographs of facial expressions. Here we explored the brain dynamics induced by facial expression morphing. Such dynamic stimuli may be relevant for helping individuals with ASD attribute mental states to others, by facilitating the ability to imagine another person performing an action such as a facial expression.

Objectives:

In this study we investigated brain responses to dynamic FE stimuli. Additionally, we searched for the neural correlates of imagining a third person performing FEs, and assessed the viability of a neurofeedback approach based on those correlates.

Methods:

EEG data from 58 scalp locations have so far been collected from eleven male teenagers with high-functioning ASD (16.91 ± 2.51 years old) and seven neurotypical male teenagers (15.57 ± 3.31 years old), who performed a task divided into two parts: visual stimulation and mental imagery. In the visual stimulation part, a virtual male teenager (always present on the screen) performed dynamic happy and sad facial expressions, starting from and returning to a neutral expression (morphing duration: 250 ms; FE duration: 1500 ms). In the mental imagery part, the participant was asked to imagine the virtual person performing the FE (happy or sad) after a visual instruction and an auditory trigger. EEG data were preprocessed and cleaned of noise and artifacts. Event-related potentials (ERPs) were computed for each FE stimulus, and their peak amplitudes and respective latencies were extracted. For the imagery part, the event-related spectral perturbation (ERSP) was computed for each FE. Finally, a linear Support Vector Machine (SVM) was used to discriminate EEG segments as FE imagery or no imagery, using power variations from a neutral baseline as features, after applying a Common Spatial Patterns (CSP) algorithm to the training data. Cross-validation was used to assess the accuracy of the classifier on both the cleaned and the original data, as a simulation of an online application.
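
For concreteness, the following is a minimal sketch of such a CSP-plus-linear-SVM decoding pipeline, using MNE-Python and scikit-learn. The array shapes, the synthetic data, and the 5-fold cross-validation setup are assumptions for illustration; the abstract does not specify the authors' actual implementation.

```python
# Minimal sketch of the imagery-vs-no-imagery classification step,
# assuming `epochs_data` holds band-pass filtered EEG epochs shaped
# (n_trials, n_channels, n_samples) and `labels` marks each trial as
# FE imagery (1) or neutral baseline (0). All names and the synthetic
# data are illustrative, not the authors' exact code.
import numpy as np
from mne.decoding import CSP
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 80, 58, 500   # 58 scalp locations, as in the study
epochs_data = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, n_trials)           # 1 = FE imagery, 0 = no imagery

# CSP projects the epochs onto spatial filters that maximize the power
# difference between the two classes; the log-power of the filtered
# signals then serves as the feature vector for the linear SVM.
clf = make_pipeline(CSP(n_components=4, log=True), SVC(kernel="linear"))
scores = cross_val_score(
    clf, epochs_data, labels,
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0))
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

Because CSP is fitted inside the pipeline, the spatial filters are re-estimated on each training fold, avoiding leakage into the test folds; with the random data above, accuracy stays near the 50% chance level, as expected.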

Results:

ERP responses to sad expressions were more sustained than those to happy stimuli. Peak responses to sad stimuli were delayed in the ASD group over frontal and left central cortex (p<0.05). ERSP analysis showed power decreases in theta rhythms during the imagery process, mainly in fronto-temporal and parieto-occipital areas. Theta power decreases in the ASD group were significantly smaller than in controls (p<0.05), yet still significantly different from baseline in the occipital region. Regarding the classifier, an accuracy of 74(±2)% was achieved at the single-trial level for both groups. These results differed significantly from chance level for every participant (p<0.01).
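
As an illustration of how single-trial accuracy can be compared against the 50% chance level of the binary imagery/no-imagery decision, the sketch below applies a one-sided binomial test; the per-participant trial count is a hypothetical value, as the abstract does not report it, and the abstract does not state which test the authors used.

```python
# Illustrative per-participant test of classification accuracy against
# the 50% chance level using a one-sided binomial test. The trial
# count n_trials = 120 is hypothetical; only the 74% accuracy is
# taken from the reported results.
from scipy.stats import binomtest

n_trials = 120                      # hypothetical trials per participant
n_correct = round(0.74 * n_trials)  # 74% reported single-trial accuracy
result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"{n_correct}/{n_trials} correct, p = {result.pvalue:.2e}")
```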

Conclusions:

Results from the ERP analysis suggest abnormal responses to sad facial expressions in the ASD group. Theta event-related desynchronization in frontal and parieto-occipital areas was present during the FE imagery task in both groups, although with significantly lower intensity in the ASD group. This group difference opens the possibility of using the theta band as a target for a neurofeedback approach based on FE imagery, as supported by the SVM classifier, which identified imagery segments with 74% accuracy at the single-trial level.