22410
Adaptive Neural Mechanisms in Individuals with Autism for Integrating Multisensory Real-World Stimuli

Thursday, May 12, 2016: 11:30 AM-1:30 PM
Hall A (Baltimore Convention Center)
P. J. Webster1, C. Frum1, C. Bauer1, A. Kurkowski-Burt2, N. Mardmomen3, M. Gray4 and J. W. Lewis1, (1)Neurobiology & Anatomy, West Virginia University, Morgantown, WV, (2)Occupational Therapy, West Virginia University, Morgantown, WV, (3)Biology, West Virginia University, Morgantown, WV, (4)Psychology, West Virginia University, Morgantown, WV
Background: Individuals on the autism spectrum can over-respond or under-respond to everyday sights, sounds, smells, and other sensations. This sensory processing dysfunction is a pervasive aspect of autism that increases anxiety and negatively impacts language development and social interactions. The ability to appropriately process sensory information develops early in life based on experience. After birth, the brain learns to integrate what we see and hear so that the information comes together in the correct timeframe, e.g., listening to someone speak while watching their mouth move. It is critical that the brain process auditory information in sync with visual information, as this increases attention and is crucial for language development. Individuals with autism have been shown to integrate sensory information, but over a wider timeframe than their peers. This wider temporal binding window can impair the brain's ability to benefit from multiple sensory inputs and may contribute to aberrant sensory processing. Exactly how the brain can be reorganized to process audiovisual interactions remains unresolved.

Objectives: We used functional magnetic resonance imaging (fMRI) to characterize adaptive cortical mechanisms for processing audiovisual information in high-functioning individuals with autism. Neuroimaging results will be correlated with behavioral measures of sensory processing and integration. We also used socially relevant stimuli to more precisely examine how the brain responds when processing real-world events in the environment.

Methods: While in the 3T MRI scanner, participants watched a video of someone bouncing a basketball (a socially relevant stimulus). Their task was to press a button whenever they perceived the ball to touch the ground. The video included an audiovisual condition (bimodal; seeing and hearing the action), a visual-only condition (unimodal; seeing the ball dribbled without sound), and a resting condition (baseline control; the actor holding the basketball). Participants included high-functioning individuals with autism (aged 18-28 years) as well as individuals without autism matched for age and gender. Brain regions activated during the unimodal and bimodal conditions, relative to the baseline condition, were modeled using multiple linear regression analyses (NIH AFNI software). Individual datasets were transformed into Talairach coordinate space, and groups were compared and contrasted across conditions using t-tests.
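
Analysis sketch (illustrative only): the voxelwise modeling described above was performed in AFNI, but the general logic of a per-subject general linear model followed by a between-group t-test can be sketched in Python with numpy/scipy. The function name, toy data, and simplified single-voxel design below are hypothetical assumptions for illustration and are not the authors' actual pipeline.

    # Hypothetical re-expression of the analysis logic, NOT the AFNI pipeline.
    # Step 1: fit a per-subject GLM with unimodal and bimodal regressors.
    # Step 2: compare per-subject contrast estimates between groups with a t-test.
    import numpy as np
    from scipy import stats

    def subject_condition_betas(bold, design):
        """Fit a least-squares GLM at one voxel.

        bold   : (n_timepoints,) BOLD time series for one voxel
        design : (n_timepoints, 2) columns modeling the unimodal and
                 bimodal conditions (baseline is the implicit intercept)
        Returns one beta weight per condition, relative to baseline.
        """
        X = np.column_stack([np.ones(len(bold)), design])  # add intercept
        betas, *_ = np.linalg.lstsq(X, bold, rcond=None)
        return betas[1:]  # drop the intercept term

    rng = np.random.default_rng(0)

    # Toy single-voxel demonstration with simplified boxcar regressors
    # (synthetic data; real designs would convolve with a hemodynamic model).
    n_tr = 120
    design = np.zeros((n_tr, 2))
    design[10:30, 0] = 1.0   # unimodal blocks
    design[50:70, 1] = 1.0   # bimodal blocks
    bold = 0.4 * design[:, 0] + 0.9 * design[:, 1] + rng.normal(0.0, 0.2, n_tr)
    uni_beta, bi_beta = subject_condition_betas(bold, design)
    print(f"unimodal beta = {uni_beta:.2f}, bimodal beta = {bi_beta:.2f}")

    # Hypothetical group comparison: one contrast value per subject
    # (e.g., bimodal minus unimodal beta at a voxel), compared across
    # groups with an independent-samples t-test.
    autism_contrasts = rng.normal(0.8, 0.3, size=15)    # toy values
    control_contrasts = rng.normal(0.5, 0.3, size=15)   # toy values
    t, p = stats.ttest_ind(autism_contrasts, control_contrasts)
    print(f"group difference: t = {t:.2f}, uncorrected p = {p:.4f}")
    # In the real analysis, p-values were corrected for multiple
    # comparisons across voxels (the abstract reports p < 0.01, corrected).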

Results: Both groups showed similar activation in primary auditory and primary visual cortices when processing audiovisual information. However, differences emerged between the two groups across the two conditions (unimodal vs. bimodal; p < 0.01, corrected). The group with autism showed increased activation in portions of the cingulate cortex and middle temporal gyrus compared to individuals without autism. Conversely, individuals without autism showed increased activation in the left inferior medial-posterior insula, a difference that was present in all individual datasets.

Conclusions: The functional roles of the differentially activated cortical regions provide important clues about the adaptive mechanisms that high-functioning adults with autism may be using at a systems level to cope with audiovisual interactions. Correlations between neuroimaging results and sensory profile sub-scores may reveal autism subtypes that can be explored further.