Abstract
Sequence labelling for language understanding can benefit from approaches inspired by semantic priming phenomena. We propose that an attention-based RNN architecture can be used to simulate semantic priming for sequence labelling. Specifically, we employ pre-trained word embeddings to characterize the semantic relationship between utterances and labels. We validate the approach on varying sizes of the ATIS and MEDIA datasets, showing F1-score improvements of 1.4–1.9%. The resulting framework can enable more explainable and generalizable spoken language understanding systems.
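The core idea of scoring each utterance token against label representations via attention over pre-trained embeddings can be sketched roughly as follows. This is an illustrative toy, not the paper's actual model: the token and label embeddings are randomly initialized stand-ins for pre-trained vectors, and the slot labels are hypothetical ATIS-style examples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for pre-trained embeddings (e.g. GloVe/word2vec).
dim = 8
tokens = ["flights", "from", "boston", "to", "denver"]
labels = ["O", "B-fromloc", "B-toloc"]          # example ATIS-style slot labels
token_emb = {t: rng.normal(size=dim) for t in tokens}
label_emb = {l: rng.normal(size=dim) for l in labels}

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def prime_scores(token_vec, label_matrix):
    """Attention-style similarity between one token and every label embedding."""
    scores = label_matrix @ token_vec           # dot-product similarity per label
    return softmax(scores)                      # normalized "priming" weights

L = np.stack([label_emb[l] for l in labels])    # (num_labels, dim)
for t in tokens:
    weights = prime_scores(token_emb[t], L)
    print(t, labels[int(weights.argmax())], np.round(weights, 2))
```

In the full model, these attention weights would condition an RNN tagger rather than directly assign labels; the sketch only shows the embedding-based token–label affinity that drives the priming signal.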
Citation
Wu, J., Banchs, R. E., D’Haro, L. F., Krishnaswamy, P., & Chen, N. (2018). Attention-based Semantic Priming for Slot-filling. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 22–26). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w18-2404