Learning hidden Markov model topology based on KL divergence for information extraction

Abstract

To locate information embedded in documents, information extraction systems based on rule-based pattern matching have long been used. To further improve extraction generalization, the hidden Markov model (HMM) has recently been adopted to model temporal variations of the target patterns, with promising results. In this paper, a state-merging method is adopted for learning the HMM topology with the use of a localized Kullback-Leibler (KL) divergence. The proposed system has been applied to a set of domain-specific job advertisements, and preliminary experiments show promising results.

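The abstract does not spell out the localized KL divergence criterion used for merging, but the general idea of KL-guided state merging can be sketched. The snippet below is a minimal illustration, not the authors' algorithm: it assumes discrete emission distributions per state, scores candidate pairs with a plain symmetrized KL divergence, merges greedily while the score stays under a fixed threshold, and pools merged emissions by simple averaging. All function names, the pooling rule, and the threshold value are illustrative assumptions.

```python
# Sketch of KL-divergence-guided state merging for HMM topology learning.
# Assumptions (not from the paper): symmetrized KL over discrete emission
# distributions, a greedy merge loop with a fixed threshold, and uniform
# averaging when two states' emissions are pooled.

import numpy as np


def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions over the same symbol set."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))


def merge_score(dists, i, j):
    """Symmetrized KL divergence between the emission distributions of states i and j."""
    return 0.5 * (kl_divergence(dists[i], dists[j]) + kl_divergence(dists[j], dists[i]))


def greedy_state_merging(emissions, threshold=0.1):
    """Greedily merge the most similar state pair while the score stays below threshold.

    `emissions` is a list of per-state emission distributions (each sums to 1).
    Returns a partition of the original state indices into merged groups.
    """
    groups = [[s] for s in range(len(emissions))]
    dists = [np.asarray(e, dtype=float) for e in emissions]
    while len(groups) > 1:
        # Find the closest pair among the current (possibly already merged) states.
        score, i, j = min(
            ((merge_score(dists, i, j), i, j)
             for i in range(len(dists)) for j in range(i + 1, len(dists))),
            key=lambda t: t[0],
        )
        if score > threshold:
            break
        # Merge state j into state i: pool emissions (uniform weighting assumed).
        dists[i] = (dists[i] + dists[j]) / 2.0
        groups[i].extend(groups[j])
        del dists[j], groups[j]
    return groups


if __name__ == "__main__":
    # Four states over a 3-symbol alphabet; states 0/1 and 2/3 have similar emissions.
    emissions = [
        [0.70, 0.20, 0.10],
        [0.65, 0.25, 0.10],
        [0.10, 0.20, 0.70],
        [0.15, 0.15, 0.70],
    ]
    print(greedy_state_merging(emissions, threshold=0.1))  # -> [[0, 1], [2, 3]]
```

In a full system the merge criterion would also account for transition structure and the data likelihood of the merged model, which is where a localized divergence measure becomes relevant; the sketch above only captures the emission-similarity part of the idea.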
Citation

Au, K. C., & Cheung, K. W. (2004). Learning hidden Markov model topology based on KL divergence for information extraction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3056, pp. 590–594). Springer-Verlag. https://doi.org/10.1007/978-3-540-24775-3_70
