Friday, 3 May 2013: 09:00-13:00
Banquet Hall (Kursaal Centre)
12:00
Background: Accessibility of computing technology comprises at least two components, economic and ergonomic. Until very recently these two aspects of accessibility have been at odds: cheap, mass-produced computers for the home market offered only a video monitor for visual-spatial output and a keyboard for motor-spatial or motor-symbolic input. Conventional keyboards demand a high degree of fine motor skill and closed-loop proprioceptive-motor and/or visual-motor accuracy, a skill impaired in autism (Haswell et al., 2009). A monitor spatially distinct from the keyboard demands spatial re-mapping, and even touch-screen software usually assumes typical fine motor targeting and execution. A computer that can teach pointing amongst multiple response options in space, and sequencing these pointing actions to build symbolic representations, might provide a manual-motor communication alternative for persons with autism whose oral-motor dyspraxia may preclude communicative speech.
Objectives: To design and pilot-test an iPad game that will develop the foundational skill of pointing to select amongst multiple response options, and the higher-level skill of representing objects with sequences of motor outputs, both in the manual motor domain of pointing and in the oral motor domain of speaking, and in which movements made by the client are distinguished from movements of the device made by the therapist.
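One way to draw that last distinction is to log device motion alongside touch input, so that episodes in which the therapist repositions the iPad can be separated from the client's own pointing. The following is a minimal sketch in Swift using Apple's CoreMotion framework; the MovementLogger type, its acceleration threshold, and its sampling rate are illustrative assumptions, not the pilot software's actual code.

```swift
import UIKit
import CoreMotion

/// Hypothetical logger: tags each event as coming either from the client's touch
/// or from movement of the device itself (e.g. when the therapist repositions the iPad).
final class MovementLogger {
    private let motion = CMMotionManager()
    private(set) var events: [(time: TimeInterval, kind: String, detail: String)] = []

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 0.05   // 20 Hz sampling; an assumption
        motion.startDeviceMotionUpdates(to: .main) { [weak self] sample, _ in
            guard let self = self, let sample = sample else { return }
            let a = sample.userAcceleration
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            // Threshold chosen arbitrarily for illustration: large accelerations are
            // attributed to the device being moved rather than to the client's pointing.
            if magnitude > 0.3 {
                self.events.append((time: sample.timestamp, kind: "device-moved",
                                    detail: String(format: "%.2f g", magnitude)))
            }
        }
    }

    /// Called from the game's touch handlers (e.g. touchesBegan) for client input.
    func logTouch(at point: CGPoint, timestamp: TimeInterval) {
        events.append((time: timestamp, kind: "client-touch", detail: "\(point)"))
    }
}
```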
Methods: The design exploits the autistic fascination with iconic rather than symbolic representations, and with multi-sensory redundancies and sensory-motor contingencies, but avoids occasions for repetitive behaviour and renders the communicative content spatially and temporally coincident with the most physically salient, attention-capturing stimulus (Chen et al., 2012). By pairing perception of the spatial sequences inherent in jigsaw-puzzle pictures with production of the manual motor or oral motor sequences making up the typed or spoken words for those pictures, the game aims to develop manual motor and oral motor skills and to bootstrap the development of symbolic representations from iconic ones. In several modes, spatial sequences of jigsaw pieces can be dragged into place with a finger across the display, typed into place by pointing to the sequence of characters spelling the object's name, or spoken into place by vocalising the sequence of sounds in the object's spoken name. In all of these input modalities the game tolerates a high degree of manual targeting error or oral articulatory error, as sketched below. All manual and oral inputs from the user, as well as movements of the iPad itself, are logged.
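The tolerance for manual targeting error can be realised as nearest-target selection: a touch is attributed to the closest response option within a generous radius rather than being required to land inside it. A minimal sketch of this idea follows; the ResponseOption type, the selectOption function, and the tolerance radius are hypothetical, not taken from the pilot software.

```swift
import CoreGraphics

/// A response option on screen (a letter key or a jigsaw piece); hypothetical type.
struct ResponseOption {
    let identifier: String
    let centre: CGPoint
}

/// Attribute a touch to the nearest option within a generous radius, so that
/// approximate, open-loop pointing still registers as a choice.
func selectOption(at touch: CGPoint,
                  from options: [ResponseOption],
                  toleranceRadius: CGFloat = 120) -> (option: ResponseOption, error: CGFloat)? {
    let scored: [(ResponseOption, CGFloat)] = options.map { option in
        let dx = touch.x - option.centre.x
        let dy = touch.y - option.centre.y
        return (option, (dx * dx + dy * dy).squareRoot())
    }
    guard let best = scored.min(by: { $0.1 < $1.1 }), best.1 <= toleranceRadius else {
        return nil   // too far from every option: no selection, though the miss can still be logged
    }
    return (option: best.0, error: best.1)   // the distance is the targeting error to log
}
```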
Results: The software is being piloted at two clinics specialising in autism spectrum conditions, one in Bangalore, India and one in Providence, Rhode Island, in clinically diagnosed autistic children aged 3 to 7 years who lack functional communicative speech. Initial measures internal to the game software suggest that, with continued game play, motor targeting error and response latency decrease and the multiple-choice set size at which children respond successfully increases.
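Measures of this kind can be computed directly from the logged trials. A minimal sketch of such a summary follows; the Trial record, the field names, and the 80% mastery threshold are assumptions for illustration rather than the pilot's actual logging schema.

```swift
import Foundation

/// One logged trial; a hypothetical record, not the pilot software's actual schema.
struct Trial {
    let setSize: Int              // number of response options on screen
    let targetingError: Double    // distance (in points) from touch to the chosen option
    let latency: TimeInterval     // time from prompt to response
    let correct: Bool
}

struct SessionSummary {
    let meanTargetingError: Double
    let meanLatency: TimeInterval
    let largestMasteredSetSize: Int
}

/// Summarise a session's trials; the 80% mastery threshold is an illustrative assumption.
func summarise(_ trials: [Trial]) -> SessionSummary? {
    guard !trials.isEmpty else { return nil }
    let n = Double(trials.count)
    let meanError = trials.map(\.targetingError).reduce(0, +) / n
    let meanLatency = trials.map(\.latency).reduce(0, +) / n
    // Group trials by choice-set size and keep the largest size answered correctly often enough.
    let bySize = Dictionary(grouping: trials, by: \.setSize)
    let mastered = bySize
        .filter { _, group in Double(group.filter(\.correct).count) / Double(group.count) >= 0.8 }
        .keys.max() ?? 0
    return SessionSummary(meanTargetingError: meanError,
                          meanLatency: meanLatency,
                          largestMasteredSetSize: mastered)
}
```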
Conclusions: Mass-market touch-screen devices, combined with user interfaces that allow for open-loop, approximate motor performance and with game designs that work with, rather than against, detail-oriented autistic cognitive profiles to develop skill at representing objects as symbolic sequences, can promote the development of communicative skills.