The Transformer has been successfully applied to many natural language processing tasks. For textual sequence matching, however, simple matching between the representations of a pair of sequences may introduce unnecessary noise. In this paper, we propose a new approach to sequence pair matching with the Transformer that learns head-wise matching representations at multiple levels. Experiments show that our approach achieves new state-of-the-art performance on multiple tasks that rely only on pre-computed sequence vector representations, such as SNLI, MNLI-match, MNLI-mismatch, QQP, and SQuAD-binary.
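To make "head-wise matching representations at multiple levels" concrete, here is a minimal PyTorch sketch of one plausible reading, not the paper's exact architecture: it assumes pooled per-head vectors for each sequence are already available from several Transformer layers, matches each head with the common heuristic features [a; b; a − b; a ⊙ b], and aggregates the per-head match vectors with learned attention. All names and dimensions (HeadwiseMatcher, head_dim, and so on) are hypothetical.

```python
import torch
import torch.nn as nn

class HeadwiseMatcher(nn.Module):
    """Match two sequences head by head at multiple layers, then aggregate.

    Illustrative sketch only; assumes inputs of shape
    (batch, num_layers, num_heads, head_dim) holding pooled per-head
    vectors for each sequence.
    """

    def __init__(self, head_dim: int, num_classes: int):
        super().__init__()
        # Small matcher shared across heads and layers; its input is the
        # concatenated heuristic matching features [a; b; a - b; a * b].
        self.match = nn.Sequential(
            nn.Linear(4 * head_dim, head_dim),
            nn.ReLU(),
        )
        # Scores for attention-based aggregation over all match vectors.
        self.score = nn.Linear(head_dim, 1)
        self.classify = nn.Linear(head_dim, num_classes)

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        # Heuristic matching features per layer and head.
        feats = torch.cat([a, b, a - b, a * b], dim=-1)
        m = self.match(feats)                    # (batch, L, H, head_dim)
        m = m.flatten(1, 2)                      # (batch, L*H, head_dim)
        w = torch.softmax(self.score(m), dim=1)  # attention over match vectors
        pooled = (w * m).sum(dim=1)              # (batch, head_dim)
        return self.classify(pooled)

# Example: 2 layers, 8 heads, 64-dim heads, 3-way NLI labels.
matcher = HeadwiseMatcher(head_dim=64, num_classes=3)
a = torch.randn(4, 2, 8, 64)  # e.g., premise per-head vectors
b = torch.randn(4, 2, 8, 64)  # e.g., hypothesis per-head vectors
logits = matcher(a, b)
print(logits.shape)  # torch.Size([4, 3])
```

The key design point this sketch illustrates is that matching happens per head and per layer before any pooling, so noisy heads can be down-weighted by the aggregation attention rather than averaged into a single sequence vector up front.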
Wang, S., Lan, Y., Tay, Y., Jiang, J., & Liu, J. (2020). Multi-level head-wise match and aggregation in transformer for textual sequence matching. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 9209–9216). AAAI Press. https://doi.org/10.1609/aaai.v34i05.6458