We propose a hierarchical attention network for document classification. Our model has two distinctive characteristics: (i) it has a hierarchical structure that mirrors the hierarchical structure of documents; (ii) it has two levels of attention mechanisms, applied at the word and sentence levels, enabling it to attend differentially to more and less important content when constructing the document representation. Experiments conducted on six large-scale text classification tasks demonstrate that the proposed architecture outperforms previous methods by a substantial margin. Visualization of the attention layers illustrates that the model selects qualitatively informative words and sentences.
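The two-level attention described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: it replaces the paper's one-layer MLP plus learned context vector with a plain dot-product score, omits the GRU encoders, and uses hypothetical names (`attention_pool`, `document_vector`). Word-level attention pools each sentence's word vectors into a sentence vector; sentence-level attention pools those into a document vector.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(vectors, context):
    """Weight each vector by the softmax of its dot product with a
    (learned) context vector, then take the weighted sum.
    A simplified stand-in for the paper's MLP-based attention."""
    weights = softmax([sum(v_i * c_i for v_i, c_i in zip(v, context))
                       for v in vectors])
    dim = len(vectors[0])
    pooled = [sum(w * v[i] for w, v in zip(weights, vectors))
              for i in range(dim)]
    return pooled, weights

def document_vector(sentences, word_context, sent_context):
    """Hierarchical pooling: words -> sentence vectors -> document vector."""
    # Word-level attention: pool each sentence's word vectors.
    sent_vecs = [attention_pool(words, word_context)[0]
                 for words in sentences]
    # Sentence-level attention: pool sentence vectors into one document vector.
    return attention_pool(sent_vecs, sent_context)
```

The returned sentence-level weights are what the paper's visualizations display: they show which sentences (and, one level down, which words) the model attends to most when classifying a document.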
Yang, Z., Yang, D., Dyer, C., He, X., Smola, A., & Hovy, E. (2016). Hierarchical Attention Networks for Document Classification. In Proceedings of NAACL-HLT 2016 (pp. 1480–1489). Association for Computational Linguistics. https://doi.org/10.18653/v1/n16-1174