International Meeting for Autism Research (May 7 - 9, 2009)

User-Centered Design of Technology for Just-in-Time, In-Situ Exploration of Facial Affect for Persons on the Autism Spectrum

Friday, May 8, 2009
Boulevard (Chicago Hilton)
M. Eckhardt , Media Laboratory, Massachusetts Institute of Technology, Cambridge, MA
M. Madsen , Media Laboratory, Massachusetts Institute of Technology, Cambridge, MA
Y. Kashef , Media Laboratory, Massachusetts Institute of Technology, Cambridge, MA
A. R. Nasser , Media Laboratory, Massachusetts Institute of Technology, Cambridge, MA
M. E. Hoque , Media Laboratory, Massachusetts Institute of Technology, Cambridge, MA
R. E. Kaliouby , Media Laboratory, Massachusetts Institute of Technology, Cambridge, MA
M. S. Goodwin , Media Laboratory, Massachusetts Institute of Technology, Cambridge, MA
R. W. Picard , Media Laboratory, Massachusetts Institute of Technology, Cambridge, MA
Background: Many people on the autism spectrum understand the semantics involved in social interaction; however, embodied information such as facial expressions, gestures, and voice often proves elusive. First-hand accounts from people with autism highlight the challenges inherent in processing these complex and unpredictable social cues. These challenges can be debilitating, complicating social interaction and making integration with society difficult. While many intervention methods have been developed to help, the majority fail to include rich, real-world social interactions in their methodology.

Objectives: Our goal is to develop a technology-based intervention that helps individuals on the autism spectrum capture, analyze, systemize, and reflect on social-emotional signals communicated by facial and head movements in natural, everyday social interactions. Our approach utilizes an ultra-mobile computer customized with a video camera and pattern analysis algorithms that can automatically identify facial expressions using facial feature tracking. In an effort to make our system robust to real-world conditions and usable by individuals with cognitive, motor, and sensory impairments, we have engaged in a number of user-centered design sessions with people on the autism spectrum and their caregivers.
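
The abstract describes this pipeline only at a high level: tracked facial feature points feed pattern-analysis algorithms that label the expression. Purely as an illustration of that idea, here is a minimal Python sketch; the landmark normalization and the nearest-prototype classifier are assumptions for exposition, not the authors' actual algorithm.

```python
import numpy as np

# Hypothetical label set; the system described above lets these be swapped.
AFFECT_LABELS = ["happy", "sad", "angry", "confused", "excited"]

def extract_features(landmarks: np.ndarray) -> np.ndarray:
    """Turn tracked facial landmarks (an (N, 2) array of x, y points,
    e.g. brow, eye, and mouth corners) into a pose-invariant vector."""
    center = landmarks.mean(axis=0)
    scale = np.linalg.norm(landmarks - center, axis=1).mean()
    # Normalize out head position and size so the classifier sees
    # expression shape, not where the face sits in the frame.
    return ((landmarks - center) / scale).ravel()

def classify(features: np.ndarray, prototypes: dict) -> str:
    """Nearest-prototype labeling: return the affect whose stored
    prototype feature vector is closest to the observed one."""
    return min(prototypes,
               key=lambda label: np.linalg.norm(features - prototypes[label]))

# Toy demo with random "prototypes" standing in for trained models.
rng = np.random.default_rng(0)
prototypes = {label: rng.normal(size=16) for label in AFFECT_LABELS}
observed = extract_features(rng.normal(size=(8, 2)))
print(classify(observed, prototypes))
```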

Methods: We conducted five usability sessions with seven verbal adolescents on the autism spectrum and their teachers to address hardware and software functionality issues in our system.

Results: Our initial interface design, which used facial expression graphs and points superimposed on the video to indicate facial features, was confusing and not engaging enough for participants. Based on iterative feedback, we added interactive affective tagging components and made the interface customizable to suit each participant's interests and difficulties in recognizing particular facial expressions. For example, some participants were already good at recognizing happiness, sadness, and anger; for them, we could instantly customize the interface to present a more challenging set of affect labels, such as confusion and excitement. In terms of form factor, many participants found the mobile computer's keyboard and trackpad distracting. To overcome this, we made custom covers that shield the exterior input controls and used the ultra-mobile computer's touch screen for data entry. We also adjusted the placement and size of touch-screen buttons so participants could interact with their thumbs. Finally, some participants had difficulty reading the text labels describing identified facial expressions; we are currently exploring the use of images instead of text to accommodate reading difficulties.
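
The abstract does not say how these per-participant customizations were represented internally. As a rough sketch of the kind of configuration the results describe (swappable affect label sets, image versus text labels, large thumb-friendly buttons, shielded keyboard), the following Python fragment may help; every field name and value here is hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ParticipantConfig:
    """Hypothetical per-participant settings mirroring the
    customizations reported above; all names are illustrative."""
    affect_labels: List[str] = field(
        default_factory=lambda: ["happy", "sad", "angry"])
    use_image_labels: bool = False  # images instead of text for non-readers
    button_size_px: int = 96        # large touch targets for thumb use
    shield_keyboard: bool = True    # cover distracting exterior controls

# A participant fluent in the basic expressions can be switched on the
# spot to a more challenging label set, with image-based labels.
advanced = ParticipantConfig(
    affect_labels=["confused", "excited"],
    use_image_labels=True,
)
print(advanced)
```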

Conclusions: The user-centered design sessions provided insights into the usability of the system and were critical to the development of our technology, underscoring the importance of including people on the autism spectrum and their caregivers in the design process for new technologies. For these technologies to be effective, they need to accommodate the perceptual, motor, and cognitive disabilities of their users. An experimental evaluation of our redesigned system is forthcoming to determine whether just-in-time, in-situ assistance can facilitate learning of facial expressions and their underlying emotions for persons on the autism spectrum.