Ahani, A., Moghadamfalahi, M., & Erdogmus, D. (2018). Language-model assisted and icon-based communication through a brain–computer interface with different presentation paradigms. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 26, 1835-1844. DOI: 10.1109/TNSRE.2018.2859432
Abstract: Augmentative and alternative communication (AAC) is typically used by people with severe speech and physical disabilities and is one of the main application areas for brain–computer interface (BCI) technology. The target population includes people with cerebral palsy, amyotrophic lateral sclerosis, and locked-in syndrome. Word-based AAC systems are generally faster than letter-based counterparts and are usually supplemented by icons to aid users. Icon-based AAC systems that use binary signaling methods, such as a single click, can be converted into single-input BCI systems based on event-related potential (ERP) detection. The matrix speller paradigm is typically used to help users identify their target icon on the screen; however, it ties screen space to vocabulary size and navigation complexity, which may require users to make repetitive head, neck, or eye movements to visually locate their intended targets. Rapid serial visual presentation (RSVP) is an alternative interface that minimizes required movement by displaying all icons at a fixed location, one at a time. IconMessenger is an icon-based BCI–AAC system that combines ERP signal detection with a unified framework for different presentation paradigms, including RSVP, matrix speller row-and-column presentation, and matrix speller single-character presentation. IconMessenger also takes advantage of a unique sem-gram language model, tightly incorporated into the inference engine. In this paper, we assess the ERP shape, classification accuracy, and typing performance of the different presentation paradigms with 10 healthy participants.
Submitted to NARIC database