In this paper we introduce a semantic role labeler for Korean, an agglutinative language with rich morphology. First, we create a novel training source by semantically annotating a Korean corpus containing fine-grained morphological and syntactic information. We then develop a supervised SRL model by leveraging morphological features of Korean that tend to correspond with semantic roles. Our model also employs a variety of latent morpheme representations induced from a larger body of unannotated Korean text. These elements lead to state-of-the-art performance of 81.07% labeled F1, representing the best SRL performance reported to date for an agglutinative language.
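To make the abstract's core idea concrete, the sketch below shows one simplified way that Korean case particles (josa) from a morphologically analyzed argument eojeol could feed a supervised role classifier. It is a minimal illustration, not the authors' feature set or model: the Sejong-style POS tags, the toy training examples, and the choice of logistic regression are all assumptions made here for illustration.

```python
# Minimal sketch (illustrative only): case particles as features for a
# semantic role classifier. Morpheme segmentation, tag names, and the toy
# data are assumptions, not the paper's actual annotation or feature set.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

def morph_features(morphemes):
    """Map a (surface, POS) morpheme list for one argument eojeol to features."""
    feats = {"head_lemma=" + morphemes[0][0]: 1.0}
    for surface, pos in morphemes:
        if pos.startswith("J"):          # Sejong-style particle tags (JKS, JKO, JKB, ...)
            feats["josa=" + surface] = 1.0
            feats["josa_pos=" + pos] = 1.0
    return feats

# Toy annotated examples: (morphemes of the argument eojeol, gold role label).
train = [
    ([("철수", "NNP"), ("가", "JKS")], "ARG0"),      # nominative particle, agent-like
    ([("밥",   "NNG"), ("을", "JKO")], "ARG1"),      # accusative particle, patient-like
    ([("학교", "NNG"), ("에", "JKB")], "ARGM-LOC"),  # adverbial particle, locative
]

vec = DictVectorizer()
X = vec.fit_transform([morph_features(m) for m, _ in train])
y = [label for _, label in train]
clf = LogisticRegression(max_iter=1000).fit(X, y)

test = [("영희", "NNP"), ("가", "JKS")]
print(clf.predict(vec.transform([morph_features(test)])))  # e.g. ['ARG0']
```

In this toy setup the particle features do most of the work, which mirrors the abstract's point that Korean case marking tends to correlate with semantic roles; the paper additionally uses latent morpheme representations learned from unannotated text, which are not shown here.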
CITATION STYLE
Kim, Y. B., Chae, H., Snyder, B., & Kim, Y. S. (2014). Training a Korean SRL system with rich morphological features. In 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Proceedings of the Conference (Vol. 2, pp. 637–642). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p14-2104