Using Large Language Models to Accelerate Communication for Users with Severe Motor Impairments

Kavli Affiliate: Michael P. Brenner

| First 5 Authors: Shanqing Cai, Subhashini Venugopalan, Katie Seaver, Xiang Xiao, Katrin Tomanek

| Summary:

Finding ways to accelerate text input for individuals with profound motor
impairments has been a long-standing area of research. Closing the speed gap
for augmentative and alternative communication (AAC) devices such as
eye-tracking keyboards is important for improving the quality of life for such
individuals. Recent advances in neural-network language models create new
opportunities for rethinking strategies and user interfaces to enhance text
entry for AAC users. In this paper, we present SpeakFaster, consisting of
large language models (LLMs) and a co-designed user interface for text entry
in a highly abbreviated form, which saves 57% more motor actions than
traditional predictive keyboards in offline simulation. A pilot study with 19
traditional predictive keyboards in offline simulation. A pilot study with 19
non-AAC participants typing on a mobile device by hand demonstrated gains in
motor savings in line with the offline simulation, while introducing relatively
small effects on overall typing speed. Lab and field testing on two eye-gaze
typing users with amyotrophic lateral sclerosis (ALS) demonstrated text-entry
rates 29-60% faster than traditional baselines, owing to significant savings in
expensive keystrokes achieved through phrase and word predictions from
context-aware LLMs. These findings provide a strong foundation for further
exploration of substantially-accelerated text communication for motor-impaired
users and demonstrate a direction for applying LLMs to text-based user
interfaces.
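
As a concrete illustration of the abbreviated-entry idea, the sketch below shows how a context-aware LLM prompt might expand a word-initial abbreviation into candidate phrases for the user to pick with a single action. The prompt wording, the `complete` placeholder, and the example abbreviation are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of context-aware abbreviation expansion with an LLM.
# The `complete` callable stands in for any text-completion model; the
# prompt format and example are assumptions for illustration only.
from typing import Callable, List


def expand_abbreviation(
    complete: Callable[[str], str],
    conversation_context: str,
    abbreviation: str,
    num_options: int = 5,
) -> List[str]:
    """Ask an LLM to expand a word-initial abbreviation into full phrases."""
    prompt = (
        "Each letter of the abbreviation is the first letter of a word.\n"
        f"Conversation so far: {conversation_context}\n"
        f"Abbreviation: {abbreviation}\n"
        f"List {num_options} likely expansions, one per line:"
    )
    response = complete(prompt)
    # Keep non-empty lines; the user selects one expansion with one action.
    return [line.strip() for line in response.splitlines() if line.strip()][:num_options]


if __name__ == "__main__":
    # Stub model for demonstration; a real system would call an LLM API.
    def fake_model(prompt: str) -> str:
        return "I want to go home\nI want to get help\nI will try going home"

    options = expand_abbreviation(
        fake_model,
        conversation_context="Are you feeling tired?",
        abbreviation="iwtgh",
    )
    print(options)
```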

| Search Query: ArXiv Query: search_query=au:"Michael P. Brenner"&id_list=&start=0&max_results=3
