Thursday, May 7, 2009
Northwest Hall (Chicago Hilton)
2:30 PM
Background:
Impairment in social interaction is a core symptom of autism spectrum disorders (ASD), and the processing of nonverbal affective information is considered a critical skill underpinning social reciprocity. Although numerous studies have documented weaknesses in facial affect recognition at both behavioral and neurological levels, few studies have documented that children with ASD also have difficulty understanding the feelings of others as conveyed through vocal prosody.
Objectives:
The purpose of the study was to measure sensitivity to affective expressions in three nonverbal modalities (facial, vocal, and situational) in children with ASD compared with typically developing (TD) children.
Methods:
Seventy-nine children ages 4-8 underwent diagnostic and neurocognitive assessments, enabling classification of 37 children into the ASD group and 42 into the TD group, as well as neuropsychological characterization of the ASD children. Affect measures included a computerized version of the Facial and Situational Affect Matching Tasks (Fein et al., 1985, 1992), in which participants used a computer touch screen to point to one of four affects matching either the facial emotion of a targeted figure (Facial Affect Task) or the emotion conveyed by a person, face covered, engaged in a conventional social situation such as a child at a birthday party (Situational Affect Task). In the Vocal Affect Recognition Task, participants pointed to one of four affects matching the feeling conveyed in a verbal communication. Professional actors (male and female, adult and child) read neutral sentences (e.g., "It is round") embedded in a context designed to elicit the target emotion (e.g., scared: "I see what's hidden—it is round! It's a bomb!"), and the utterances were then extracted from context. Using multiple speakers ensured that deep processing of affect was required, rather than mere recognition of one speaker's repetitive acoustic patterns. Practice items established that instructions were understood, and all subjects were trained to 100% mastery on the response options.
Results:
Data were analyzed using analysis of covariance, with age as a covariate. Compared with the ASD group, the TD group performed significantly better (p<0.001) on Vocal Affect Recognition (63% vs. 48%), Situational Affect (78% vs. 58%), and Facial Affect (78% vs. 65%). On two control tasks, the TD group performed somewhat better on a Facial Recognition Task (p<0.025; 80% vs. 69%) and not significantly better on an Object Recognition Task (p>0.15; 77% vs. 71%). Planned comparisons showed two-way interactions (p<0.015) between Group (ASD vs. TD) and Task, both for the average of the three affect tasks vs. the Object Recognition Task and for each individual affect task vs. the Object Recognition Task, indicating that the between-group differences on the affect tasks could not be attributed to a broad visual processing deficit.
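
For illustration, a minimal Python sketch of this style of analysis (an age-covaried ANCOVA plus a Group x Task interaction against the Object Recognition control) is given below. The abstract does not report the statistical software used; the data-frame layout, column names, and synthetic scores here are all hypothetical, and a repeated-measures or mixed model would more properly account for each child contributing scores on multiple tasks.

    # Illustrative sketch only: software, data layout, and column names
    # (subject, group, age, task, score) are assumptions, not the authors' code.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 79  # 37 ASD, 42 TD, per the abstract

    # Synthetic long-format data: one row per subject x task accuracy score
    df = pd.DataFrame({
        "subject": np.repeat(np.arange(n), 2),
        "group": np.repeat(["ASD"] * 37 + ["TD"] * 42, 2),
        "age": np.repeat(rng.uniform(4, 8, n), 2),
        "task": np.tile(["affect_mean", "object_recognition"], n),
    })
    df["score"] = rng.uniform(0.4, 0.9, len(df))  # placeholder accuracies

    # ANCOVA for a single task: Group effect with age as covariate
    single = df[df.task == "affect_mean"]
    ancova = smf.ols("score ~ C(group) + age", data=single).fit()
    print(sm.stats.anova_lm(ancova, typ=2))

    # Group x Task interaction (affect composite vs. object-recognition
    # control), still covarying age, analogous to the planned comparisons
    interaction = smf.ols("score ~ C(group) * C(task) + age", data=df).fit()
    print(sm.stats.anova_lm(interaction, typ=2))

A significant C(group):C(task) term in the second model is what would license the conclusion that the group difference is specific to the affect tasks rather than reflecting a general visual-processing disadvantage.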
Conclusions:
Findings confirm prior evidence of significantly weak recognition of facial and situational affect in children with ASD (Fein et al.; Ozonoff et al.). The significant difficulty children with ASD showed in this study in discerning the emotional tone of verbal communications adds importantly to our understanding of their social interaction and communicative reciprocity difficulties.