Objectives: Our goal is to develop a technology-based intervention that helps individuals on the autism spectrum capture, analyze, systemize, and reflect on social-emotional signals communicated by facial and head movements in natural, everyday social interactions. Our approach uses an ultra-mobile computer fitted with a video camera and pattern analysis algorithms that automatically identify facial expressions via facial feature tracking. To make the system robust to real-world conditions and usable by individuals with cognitive, motor, and sensory impairments, we conducted a series of user-centered design sessions with people on the autism spectrum and their caregivers.
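For context, the sketch below illustrates one way such a camera-to-label pipeline can be structured, here using the stock face detector from the opencv-python package; it is not the system described in this paper, and classify_expression is a hypothetical placeholder for the facial-feature-tracking classifier.

```python
# Minimal sketch of a camera -> face detection -> expression label pipeline.
# Assumes the opencv-python package; classify_expression is a hypothetical
# stand-in for the pattern analysis algorithms described above.
import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_roi) -> str:
    """Placeholder: a real system would track facial features in the region
    of interest and map their motion to an affect label such as 'happy'."""
    return "unknown"

def process_frame(frame):
    # Detect faces in a grayscale copy of the frame, then label each one.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [(x, y, w, h, classify_expression(gray[y:y + h, x:x + w]))
            for (x, y, w, h) in faces]

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # built-in or attached video camera
    ok, frame = cap.read()
    if ok:
        for (x, y, w, h, label) in process_frame(frame):
            print(f"face at ({x},{y}) size {w}x{h}: {label}")
    cap.release()
```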
Methods: We conducted five usability sessions with seven verbal adolescents on the autism spectrum and their teachers to identify and address hardware and software issues with our system.
Results: Our initial interface design, which used facial expression graphs and points superimposed on the video to mark facial features, was confusing and insufficiently engaging for participants. Based on iterative feedback, we added interactive affective tagging components and made the interface customizable to each participant's interests and to the particular facial expressions he or she found difficult to recognize. For example, some participants readily recognized happiness, sadness, and anger; for them, we could instantly customize the interface to present a more challenging set of affect labels, such as confusion and excitement. In terms of form factor, many participants found the mobile computer's keyboard and trackpad distracting. To overcome this, we made custom covers that shield the exterior input controls and relied on the ultra-mobile computer's touch screen for data entry. We also adjusted the placement and size of the touch screen buttons so that participants could interact with their thumbs. Finally, some participants had difficulty reading the text labels describing identified facial expressions. We are currently exploring the use of images in place of text to accommodate reading difficulties.
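As a concrete illustration of this kind of per-participant customization, a minimal sketch follows; the profile fields, label sets, and icon-path convention are illustrative assumptions, not the actual interface code.

```python
# Hypothetical sketch of per-participant customization along the lines
# described above: swapping in a harder set of affect labels once the basic
# ones are mastered, and substituting images for text labels when reading
# is difficult. All names and paths here are illustrative assumptions.
from dataclasses import dataclass, field

BASIC_LABELS = ["happy", "sad", "angry"]
ADVANCED_LABELS = ["confused", "excited", "interested", "bored"]

@dataclass
class ParticipantProfile:
    participant_id: str
    mastered: set = field(default_factory=set)  # expressions reliably recognized
    prefers_images: bool = False                # accommodate reading difficulties

def active_labels(profile: ParticipantProfile) -> list:
    """Skip mastered expressions; advance to a more challenging label set."""
    if set(BASIC_LABELS) <= profile.mastered:
        return ADVANCED_LABELS
    return [label for label in BASIC_LABELS if label not in profile.mastered]

def render_button(label: str, profile: ParticipantProfile) -> str:
    """Return an image path instead of text when reading is difficult;
    the icons/ path convention is purely illustrative."""
    return f"icons/{label}.png" if profile.prefers_images else label

if __name__ == "__main__":
    # A participant who already recognizes the basic three expressions and
    # reads with difficulty gets image buttons for the advanced set.
    p3 = ParticipantProfile("P3", mastered={"happy", "sad", "angry"},
                            prefers_images=True)
    print([render_button(label, p3) for label in active_labels(p3)])
```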
Conclusions: The user-centered design sessions provided insights into the usability of the system and were critical to the development of our technology, underscoring the importance of including people on the autism spectrum and their caregivers in the design of new technologies. To be effective, these technologies must accommodate the perceptual, motor, and cognitive disabilities of their users. An experimental evaluation of the redesigned system is forthcoming to determine whether just-in-time, in-situ assistance can facilitate learning of facial expressions and the emotions underlying them for persons on the autism spectrum.