Abstract
We present a novel Bayesian topic model for learning discourse-level document structure. Our model leverages insights from discourse theory to constrain latent topic assignments in a way that reflects the underlying organization of document topics. We propose a global model in which both topic selection and ordering are biased to be similar across a collection of related documents. We show that this space of orderings can be effectively represented using a distribution over permutations called the Generalized Mallows Model. We apply our method to three complementary discourse-level tasks: cross-document alignment, document segmentation, and information ordering. Our experiments show that incorporating our permutation-based model in these applications yields substantial improvements in performance over previously proposed methods. © 2009 AI Access Foundation. All rights reserved.
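The Generalized Mallows Model mentioned above factors a permutation into independent inversion counts, each penalized exponentially by its own dispersion parameter, which concentrates probability mass around a canonical ordering. As a minimal illustration (not the authors' implementation; the function names and parameterization here are hypothetical), a permutation of n items can be sampled by drawing each truncated-geometric inversion count and reconstructing the ordering by insertion:

```python
import math
import random

def sample_inversion_count(max_v, rho, rng):
    # p(v) is proportional to exp(-rho * v) for v in 0..max_v
    # (a truncated geometric distribution).
    weights = [math.exp(-rho * v) for v in range(max_v + 1)]
    u = rng.random() * sum(weights)
    for v, w in enumerate(weights):
        u -= w
        if u <= 0:
            return v
    return max_v

def sample_gmm_permutation(n, rho, rng=None):
    """Draw a permutation of 1..n from a Generalized Mallows Model.

    rho is a list of n-1 dispersion parameters, one per inversion
    count; larger values concentrate mass on the identity ordering.
    """
    rng = rng or random.Random()
    # v[j] = number of items larger than j+1 that precede it,
    # with v[j] ranging over 0..n-j-2 .. actually 0..n-(j+1).
    v = [sample_inversion_count(n - (j + 1), rho[j], rng)
         for j in range(n - 1)]
    # Rebuild the permutation by inserting items from largest to
    # smallest: inserting item at index v[item-1] places exactly
    # that many larger items before it.
    perm = [n]
    for item in range(n - 1, 0, -1):
        perm.insert(v[item - 1], item)
    return perm
```

With large dispersion parameters the sampled orderings stay close to the identity permutation, mirroring the paper's bias toward a shared topic ordering across related documents.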
Chen, H., Branavan, S. R. K., Barzilay, R., & Karger, D. R. (2009). Content modeling using latent permutations. Journal of Artificial Intelligence Research, 36, 129–163. https://doi.org/10.1613/jair.2830