Abstract
Discovering latent topics from text corpora has been studied for decades. Many existing topic models adopt a fully unsupervised setting, and their discovered topics may not cater to users' particular interests due to their inability to leverage user guidance. Although there exist seed-guided topic discovery approaches that leverage user-provided seeds to discover topic-representative terms, they are less concerned with two factors: (1) the existence of out-of-vocabulary seeds and (2) the power of pre-trained language models (PLMs). In this paper, we generalize the task of seed-guided topic discovery to allow out-of-vocabulary seeds. We propose a novel framework, named SeeTopic, in which the general knowledge of PLMs and the local semantics learned from the input corpus mutually benefit each other. Experiments on three real datasets from different domains demonstrate the effectiveness of SeeTopic in terms of topic coherence, accuracy, and diversity.
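The core idea, that a PLM's general knowledge can represent seeds missing from the corpus vocabulary, can be illustrated with a short sketch. The snippet below is a hedged illustration, not the authors' implementation: the BERT checkpoint, mean pooling over subword states, and the toy vocabulary are all assumptions made for demonstration. It embeds an out-of-vocabulary seed with a PLM and ranks in-corpus terms by cosine similarity.

```python
# Minimal sketch (not the SeeTopic code): use a PLM to embed an
# out-of-vocabulary seed so it can be matched against in-corpus terms.
# The model choice and mean pooling are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(term: str) -> torch.Tensor:
    """Mean-pool the PLM's last hidden states over the term's subwords."""
    inputs = tokenizer(term, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0)

# A seed term that may never appear in the input corpus's vocabulary.
seed = "covid-19"
vocab = ["pandemic", "vaccine", "inflation", "earthquake"]  # hypothetical in-corpus terms

seed_vec = embed(seed)
scores = {w: torch.cosine_similarity(seed_vec, embed(w), dim=0).item() for w in vocab}
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

In the full framework, such PLM-side representations would be combined with embeddings learned from the input corpus itself, so that general and corpus-specific semantics reinforce each other.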
Citation
Zhang, Y., Meng, Y., Wang, X., Wang, S., & Han, J. (2022). Seed-Guided Topic Discovery with Out-of-Vocabulary Seeds. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL 2022) (pp. 279–290). Association for Computational Linguistics. https://doi.org/10.18653/v1/2022.naacl-main.21