Abstract
Sign-to-911 is a compact mobile system for fast, real-time translation between American Sign Language (ASL) and English. It is designed as a 911 call service for ASL users with hearing disabilities in emergencies, enabling bidirectional ASL-to-English and English-to-ASL translation. The signer wears AR glasses, runs Sign-to-911 on their smartphone and glasses, and interacts with a 911 operator. The design of Sign-to-911 departs from the popular deep-learning paradigm and instead adopts simpler traditional AI/machine learning (ML) models. The key idea is to exploit ASL linguistic features to simplify model structures while improving accuracy and speed. It further leverages recent component solutions from graphics, vision, natural language processing, and AI/ML. Our evaluation with six ASL signers and 911 call records confirms the system's viability.
Guo, Y., Zhao, J., Ding, B., Tan, C., Ling, W., Tan, Z., … Lu, S. (2023). Sign-to-911: Emergency Call Service for Sign Language Users with Assistive AR Glasses. In Proceedings of the Annual International Conference on Mobile Computing and Networking, MOBICOM (pp. 691–705). Association for Computing Machinery. https://doi.org/10.1145/3570361.3613260