Level of Autistic Traits Modulates Activity in Face and Action Perception Systems

Saturday, May 19, 2012: 11:30 AM
Grand Ballroom West (Sheraton Centre Toronto)
J. McPartland1, M. Coffman1, S. Faja2, A. Kresse2, C. Mukerji1, A. Naples1 and R. Bernier2, (1)Yale Child Study Center, New Haven, CT, (2)University of Washington, Seattle, WA
Background: The social motivation hypothesis posits that reduced social drive leads to inattention to people and, consequently, to a failure of developmental specialization in experience-driven brain systems for processing faces. In individuals with ASD, abnormalities in face perception and recognition and in the action perception system are evident early in life and have been documented throughout the lifespan. The current work focuses on these two facets of social brain circuitry and their relationship to social perception and autistic traits.

Objectives: To apply an innovative experimental paradigm to (a) examine electrophysiological markers of both face processing and action perception, (b) test models of effective connectivity between these systems and real-world social behavior, (c) characterize connectivity in social brain systems, and (d) relate that connectivity to the level of autistic traits.

Methods: The paradigm employed a novel stimulus set of 210 unique 3D photorealistic face stimuli capable of producing movements consistent with human musculoskeletal structure. Typically developing adult participants viewed a 500 ms static initial pose that segued into 500 ms of facial movement of three types: (1) affective movement (fearful expression); (2) neutral movement (puffed cheeks); and (3) biologically impossible movement (upward dislocation of eyes and mouth). ERPs (reflecting stages of face processing) were time-locked to the onset of the static face stimuli, and oscillatory EEG power in the mu range (reflecting activation of the action perception system) was extracted during the periods of facial movement. Autistic traits were assessed via self-report on the Autism-Spectrum Quotient (AQ), and social perception with the Reading the Mind in the Eyes Task (RMET).
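To make the two analysis streams concrete, the sketch below (Python with the MNE-Python library; the filename, event codes, and epoch windows are assumptions for illustration, not the authors' actual pipeline) shows epoching time-locked to static-face onset for ERP averaging and extraction of mu-band (8-13 Hz) power over the movement window:

import mne

# Hypothetical recording; the file name and event codes are assumptions
# made for illustration, not the study's actual data.
raw = mne.io.read_raw_fif("sub01_raw.fif", preload=True)
events = mne.find_events(raw)

# ERPs: epochs time-locked to static-face onset (event code 1 assumed),
# baseline-corrected to the 200 ms before stimulus onset; N170 and N250
# would then be measured from the per-condition averages.
erp_epochs = mne.Epochs(raw, events, event_id={"static_face": 1},
                        tmin=-0.2, tmax=0.5, baseline=(-0.2, 0.0),
                        preload=True)
evoked = erp_epochs.average()

# Mu power: spectral power during the 500 ms movement period (event code 2
# assumed), restricted to the mu range; attenuation would be expressed
# relative to power in a comparable baseline window.
mu_epochs = mne.Epochs(raw, events, event_id={"movement": 2},
                       tmin=0.0, tmax=0.5, baseline=None, preload=True)
psd = mu_epochs.compute_psd(method="welch", fmin=8.0, fmax=13.0)
mu_power = psd.get_data().mean(axis=-1)  # mean mu power: (n_epochs, n_channels)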

Results: An ERP index of emotion decoding (N250) differentiated conditions, such that fear (-0.43 µV) elicited enhanced amplitude relative to neutral (-0.01 µV), impossible (0.162 µV), or puffed (-0.05 µV) poses. N250 amplitude correlated with RMET score across conditions (rs ranging from -.55 to -.62), such that larger amplitude was associated with stronger social perception and lower levels of autistic traits. For an ERP index of face structural encoding (N170), amplitude correlated with AQ score across conditions (rs ranging from .24 to .31), such that attenuated amplitude was associated with higher levels of autistic traits. For mu attenuation, fearful movement elicited greater attenuation than neutral or impossible movements (p < .05). Bayesian structural equation models were applied to examine shared versus distinct latent sources of variability in ERP and oscillatory EEG responses to faces, as well as an integrative model incorporating brain activity and behavior, as a function of level of autistic traits.
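The brain-behavior associations reported above are rank correlations; as a minimal sketch of that step (SciPy; the per-participant arrays below are simulated stand-ins, not the study's data):

import numpy as np
from scipy.stats import spearmanr

# Simulated per-participant values, for illustration only: mean N250
# amplitude in one condition (µV) and RMET accuracy.
rng = np.random.default_rng(0)
n250_amp = rng.normal(-0.4, 0.3, size=32)
rmet_score = rng.integers(20, 34, size=32)

# A negative rs here would mirror the reported pattern: more negative
# (larger) N250 amplitude accompanying higher RMET scores.
rs, p = spearmanr(n250_amp, rmet_score)
print(f"Spearman rs = {rs:.2f}, p = {p:.3f}")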

Conclusions: Neural responses differentiated emotional facial expressions in terms of both ERPs to static poses and mu attenuation to dynamic movement. Furthermore, ERP markers of face perception correlated with behavioral measures of social function: a stronger response at the emotion decoding component was associated with better emotion perception, and an attenuated response at the early face structural encoding component was associated with higher levels of autistic traits. The integration of EEG and ERP in a latent variable framework with standardized measures of subclinical traits holds promise for empirically deriving models of effective connectivity in the brain systems subserving social behavior.
