Interpretable click-through rate prediction through hierarchical attention


Abstract

Click-through rate (CTR) prediction is a critical task in online advertising and marketing. Existing approaches to this problem, whether shallow or deep, have three major drawbacks. First, they typically lack persuasive rationales to explain the outcomes of the models. Unexplainable predictions and recommendations are difficult to validate and thus unreliable and untrustworthy; in many applications, inappropriate suggestions may even bring severe consequences. Second, existing approaches are inefficient at analyzing high-order feature interactions. Third, the polysemy of feature interactions in different semantic subspaces is largely ignored. In this paper, we propose InterHAt, which employs a Transformer with multi-head self-attention for feature learning. On top of that, hierarchical attention layers are used to predict CTR while simultaneously providing interpretable insights into the prediction results. InterHAt captures high-order feature interactions through an efficient attentional aggregation strategy with low computational complexity. Extensive experiments on four public real-world datasets and one synthetic dataset demonstrate the effectiveness and efficiency of InterHAt.
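To make the "attentional aggregation" idea concrete, here is a minimal NumPy sketch of hierarchical attention over feature-field embeddings. It is an illustration only, not the paper's implementation: the variable names (`X1`, `u1`, the context vector `w`) and the element-wise product used to form higher-order interaction candidates are assumptions for exposition. Each order-k interaction set is summarized by an attention-weighted sum, and the next order is formed by combining that summary with the first-order embeddings, which keeps the cost linear in the number of fields rather than enumerating all feature pairs.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attentional_aggregate(X, w):
    # X: (m, d) embeddings of m feature fields; w: (d,) learned context vector.
    scores = X @ w           # (m,) relevance of each field to the context
    alpha = softmax(scores)  # attention weights: the interpretable part
    return alpha @ X, alpha  # (d,) weighted-sum summary of this order

rng = np.random.default_rng(0)
m, d = 5, 8
X1 = rng.normal(size=(m, d))   # first-order (raw) field embeddings
w = rng.normal(size=d)         # hypothetical attention context vector

u1, a1 = attentional_aggregate(X1, w)  # summary of 1st-order features
X2 = X1 * u1                           # broadcast product: 2nd-order
                                       # candidates in O(m), not O(m^2)
u2, a2 = attentional_aggregate(X2, w)  # summary of 2nd-order interactions

# Final CTR estimate from the concatenated order summaries.
logit = np.concatenate([u1, u2]) @ rng.normal(size=2 * d)
ctr = 1.0 / (1.0 + np.exp(-logit))
```

The attention weights `a1` and `a2` are what make the prediction inspectable: they indicate which fields and which interactions contributed most at each order. Extending to order k + 1 repeats the same product-then-aggregate step.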

Cite

CITATION STYLE

APA

Li, Z., Cheng, W., Chen, Y., Chen, H., & Wang, W. (2020). Interpretable click-through rate prediction through hierarchical attention. In WSDM 2020 - Proceedings of the 13th International Conference on Web Search and Data Mining (pp. 313–321). Association for Computing Machinery, Inc. https://doi.org/10.1145/3336191.3371785
