Abstract
Semantic dependency parsing, which aims to find rich bi-lexical relationships, allows words to have multiple dependency heads, resulting in graph-structured representations. We propose an approach to semi-supervised learning of semantic dependency parsers based on the CRF autoencoder framework. Our encoder is a discriminative neural semantic dependency parser that predicts the latent parse graph of the input sentence. Our decoder is a generative neural model that reconstructs the input sentence conditioned on the latent parse graph. Our model is arc-factored and therefore parsing and learning are both tractable. Experiments show our model achieves significant and consistent improvement over the supervised baseline.
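Because the model is arc-factored and words may take multiple heads, each head–modifier arc can be scored and decided independently. The following is a minimal sketch of that idea, assuming a toy biaffine-style scorer over random stand-in embeddings; the shapes, weight matrix, and 0.5 threshold are illustrative assumptions, not the paper's actual parser.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sentence of 4 tokens plus a ROOT node; embeddings are random stand-ins
# for what a neural encoder would produce.
n, d = 5, 8
embeddings = rng.standard_normal((n, d))

# Hypothetical bilinear weight matrix (learned in a real parser).
W = rng.standard_normal((d, d))

# Arc-factored scoring: one score per (head, modifier) pair.
scores = embeddings @ W @ embeddings.T  # shape (n, n)

# Since a word may have multiple dependency heads, each arc is an
# independent binary decision, e.g. a sigmoid with a 0.5 cutoff.
probs = 1.0 / (1.0 + np.exp(-scores))
graph = probs > 0.5  # boolean adjacency matrix of the predicted parse graph
```

Arc-factorization is what keeps inference tractable here: the predicted graph is just the set of arcs whose individual scores pass the threshold, with no global structural search required.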
Jia, Z., Ma, Y., Cai, J., & Tu, K. (2020). Semi-supervised semantic dependency parsing using CRF autoencoders. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 6795–6805). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.607