Machine Translation between Spoken Languages and Signed Languages Represented in SignWriting

Abstract

This paper presents work on novel machine translation (MT) systems between spoken and signed languages, where signed languages are represented in SignWriting, a sign language writing system. Our work seeks to address the lack of out-of-the-box support for signed languages in current MT systems and is based on the SignBank dataset, which contains pairs of spoken language text and SignWriting content. We introduce novel methods to parse, factorize, decode, and evaluate SignWriting, leveraging ideas from neural factored MT. In a bilingual setup—translating from American Sign Language to (American) English—our method achieves over 30 BLEU, while in two multilingual setups—translating in both directions between spoken languages and signed languages—we achieve over 20 BLEU. We find that common MT techniques used to improve spoken language translation similarly affect the performance of sign language translation. These findings validate our use of an intermediate text representation for signed languages to include them in NLP research.
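The paper's parsing and factorization methods are not reproduced here, but as a rough illustration of the idea, the sketch below splits a Formal SignWriting (FSW) string into per-symbol factors (base shape, fill, rotation, and placement), the kind of decomposition a factored MT model could consume as separate input streams. The example string, regular expression, and factor names are illustrative assumptions based on the public FSW notation, not the authors' implementation.

```python
import re
from typing import Dict, List

# Hypothetical FSW string for one sign: an "M" sign box followed by two symbols.
EXAMPLE_FSW = "M518x529S14c20481x471S27106503x489"

# An FSW symbol is "S" + 3 hex chars (base shape) + 1 fill digit + 1 rotation char,
# followed by its x/y placement inside the sign box, e.g. "S14c20481x471".
SYMBOL_RE = re.compile(r"S([0-9a-f]{3})([0-5])([0-9a-f])(\d{3})x(\d{3})")

def factorize_sign(fsw: str) -> List[Dict[str, str]]:
    """Split one FSW sign into per-symbol factors (shape, fill, rotation, x, y)."""
    factors = []
    for shape, fill, rotation, x, y in SYMBOL_RE.findall(fsw):
        factors.append({
            "shape": f"S{shape}",   # base symbol identifier
            "fill": fill,           # fill / plane variant of the symbol
            "rotation": rotation,   # rotation step
            "x": x,                 # horizontal placement in the sign box
            "y": y,                 # vertical placement in the sign box
        })
    return factors

if __name__ == "__main__":
    for symbol in factorize_sign(EXAMPLE_FSW):
        print(symbol)
```

In a factored setup, each of these fields could be embedded separately and combined per token, rather than treating the raw FSW string as a flat character sequence.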

Cite

APA

Jiang, Z., Moryossef, A., Müller, M., & Ebling, S. (2023). Machine Translation between Spoken Languages and Signed Languages Represented in SignWriting. In EACL 2023 - 17th Conference of the European Chapter of the Association for Computational Linguistics, Findings of EACL 2023 (pp. 1661–1679). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-eacl.127
