Grant is 10 years old and has severe cerebral palsy. He is fully included with his typical peers in a grade 5 classroom. He is currently struggling to keep pace with the curriculum, not because of the learning demands, but because of his slow access to AAC and other technologies. He lacks the range and precision of movement in any single modality to use direct selection and has to rely on scanning, a very inefficient method of access. He is exhausted by the end of the school day. He requires a better, more efficient access technique.
Challenge: Significant advances in AAC access technologies have enabled more individuals with complex communication needs (CCN) who have severe motor impairments to operate computer technology using eye tracking, head tracking, and touch-sensitive interfaces. However, all of these methods rely on a single access modality. There are significant challenges associated with using a single access modality (e.g., using eye gaze alone) to control technology, including fatigue, overuse injuries, and inefficiency.
Goals: In this development project, we will create new integrated access methods that combine multiple access modalities to control smart technologies (smartphones, tablet devices, and computers).
Objectives: RERC on AAC engineers will build on our prior work and will integrate multiple access modalities such as speech, gestures, eye tracking, and head tracking to best meet the needs of individuals with CCN and severe motor impairments. Evaluation will focus on comparing multimodal access strategies to single access modalities to determine the impact on rate, efficiency of access, and personal preference of these individuals. Specifically, we will test two hypotheses:
- These individuals will demonstrate increased accuracy and efficiency using multimodal access strategies compared to a single modality; and
- They will report higher ratings of consumer satisfaction for multimodal access compared to single-mode access.

The evaluation will be conducted in a series of separate but parallel studies, each designed to test the effects of multimodal access on three populations with severe motor impairments who require alternate access:
- children with CP,
- adults with CP, and
- adults with cervical spinal cord injury.
Multimodal project example: Eye-tracking and switch scanning
The video below demonstrates a prototype access method that combines eye tracking with switch scanning. Eye tracking is used to locate a larger group of letters. Once the group containing the target letter is highlighted, the switch is pressed to begin scanning the letters within that group, and the target letter is then selected with the switch. With this approach, the individual using the system is not required to have optimal eye-tracking capabilities: the eye tracking narrows the set of letters to be scanned, eliminating the need to scan every letter on the onscreen keyboard.
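The two-stage selection described above can be sketched in a few lines of Python. This is a minimal illustration only, not the prototype's actual implementation: the group layout, function names, and the scan-step model (first switch press starts the scan, a later press selects the highlighted letter) are assumptions made for the example.

```python
from typing import List

# Letters partitioned into coarse groups. Gaze only needs enough precision
# to land on one group, not on an individual key, so users with imprecise
# eye tracking can still use stage 1. (Grouping here is illustrative.)
GROUPS: List[List[str]] = [
    list("ABCDEF"),
    list("GHIJKL"),
    list("MNOPQR"),
    list("STUVWX"),
    list("YZ"),
]

def select_group_by_gaze(gaze_group_index: int) -> List[str]:
    """Stage 1: eye tracking highlights the group the user is looking at."""
    return GROUPS[gaze_group_index]

def select_letter_by_scanning(group: List[str], scan_steps: int) -> str:
    """Stage 2: a switch press starts scanning within the chosen group;
    the highlight then advances one letter per scan step, and a second
    switch press selects the currently highlighted letter."""
    return group[scan_steps % len(group)]

# Example: selecting "P". Gaze lands on the third group ("MNOPQR"),
# then three scan steps move the highlight M -> N -> O -> P.
group = select_group_by_gaze(2)
letter = select_letter_by_scanning(group, 3)
print(letter)  # P
```

Note the efficiency gain this models: scanning is confined to at most one group (here, six letters) rather than the full 26-letter keyboard, because gaze has already eliminated the other groups.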
Fager, S., Jakobs, T., & Beukelman, D. (2015, November). Integrating speech recognition into AAC technology. Presentation at the annual conference of the American Speech-Language-Hearing Association, Denver, CO. Handout as a PDF
Fager, S., & Jakobs, T. (2015, January). Integrating speech recognition into AAC technology. Presentation at the annual conference of the Assistive Technology Industry Association (ATIA), Orlando, FL. Handout as a PDF