We present a general framework for incorporating sequential data and arbitrary features into language modeling. The general framework consists of two parts: a hidden Markov component and a recursive neural network component. We demonstrate the effectiveness of our model by applying it to a specific application: predicting topics and sentiments in dialogues. Experiments on real data demonstrate that our method is substantially more accurate than previous methods.
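The abstract names the two components but not how they interact, so the following is only a minimal sketch under explicit assumptions: the hidden Markov component is taken to supply label-transition log-probabilities, the neural component is taken to supply per-turn emission scores from a toy recurrent encoder, and the two are combined additively and decoded with Viterbi. All names, dimensions, and the combination rule are illustrative placeholders, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_LABELS = 3   # e.g. topic/sentiment classes per dialogue turn (assumed)
EMB_DIM = 8      # toy feature dimension per turn (assumed)
HID_DIM = 16     # recurrent hidden size (assumed)

# --- neural component: a toy Elman-style recurrent encoder ------------------
W_xh = rng.normal(scale=0.1, size=(EMB_DIM, HID_DIM))
W_hh = rng.normal(scale=0.1, size=(HID_DIM, HID_DIM))
W_hy = rng.normal(scale=0.1, size=(HID_DIM, NUM_LABELS))

def emission_scores(turn_features):
    """Run the recurrent encoder over a dialogue; return per-turn label scores."""
    h = np.zeros(HID_DIM)
    scores = []
    for x in turn_features:              # one feature vector per dialogue turn
        h = np.tanh(x @ W_xh + h @ W_hh)
        scores.append(h @ W_hy)           # unnormalized label scores
    return np.stack(scores)               # shape: (num_turns, NUM_LABELS)

# --- Markov component: label-transition log-probabilities -------------------
trans = rng.random((NUM_LABELS, NUM_LABELS))
log_trans = np.log(trans / trans.sum(axis=1, keepdims=True))

def viterbi(emissions, log_trans):
    """Best label sequence under additive emission + transition scores."""
    T, L = emissions.shape
    score = emissions[0].copy()
    back = np.zeros((T, L), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + log_trans + emissions[t][None, :]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Usage: decode a toy 5-turn dialogue with random features.
dialogue = rng.normal(size=(5, EMB_DIM))
print(viterbi(emission_scores(dialogue), log_trans))
```

In practice both components would be trained on labeled dialogues; the random weights here only illustrate how transition and emission scores could be composed at decoding time.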
Yang, M., Tu, W., Yin, W., & Lu, Z. (2015). Deep markov neural network for sequential data classification. In ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 32–37). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p15-2006