Table Search Using a Deep Contextualized Language Model

Abstract

Pretrained contextualized language models such as BERT have achieved impressive results on various natural language processing benchmarks. Benefiting from multiple pretraining tasks and large-scale training corpora, pretrained models can capture complex syntactic word relations. In this paper, we use the deep contextualized language model BERT for the task of ad hoc table retrieval. We investigate how to encode table content considering the table structure and the input length limit of BERT. We also propose an approach that incorporates features from prior literature on table retrieval and jointly trains them with BERT. In experiments on public datasets, we show that our best approach outperforms the previous state-of-the-art method and BERT baselines by a large margin under different evaluation metrics.
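The two ideas the abstract describes, flattening a structured table into BERT's 512-token input window and jointly training BERT with hand-crafted table-retrieval features, can be sketched briefly. The following is a minimal illustration under assumptions of our own, not the authors' implementation: the linearization scheme, the placeholder features, and the single-layer scoring head are all hypothetical.

```python
# A minimal sketch (not the paper's code): linearize a table into BERT's
# input window and score it jointly with hand-crafted retrieval features.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def linearize_table(caption, headers, rows, max_cells=50):
    """Flatten table structure (caption, headers, a sample of cells) into
    one string; capping the cells keeps the input within BERT's limit."""
    cells = [cell for row in rows for cell in row][:max_cells]
    return " ".join([caption] + headers + cells)

class BertTableRanker(nn.Module):
    """Relevance score = linear layer over the [CLS] embedding
    concatenated with external features, trained end to end."""
    def __init__(self, n_features):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.score = nn.Linear(self.bert.config.hidden_size + n_features, 1)

    def forward(self, query, table_text, features):
        # Encode the (query, table) pair; truncate to BERT's 512-token limit.
        enc = tokenizer(query, table_text, truncation=True,
                        max_length=512, return_tensors="pt")
        cls = self.bert(**enc).last_hidden_state[:, 0]  # [CLS] vector
        return self.score(torch.cat([cls, features], dim=-1))

# Usage: the three features (e.g., #rows, #cols, a lexical match score)
# are placeholders standing in for features from prior table-retrieval work.
model = BertTableRanker(n_features=3)
text = linearize_table("world gdp", ["country", "gdp"], [["US", "21T"]])
score = model("gdp by country", text, torch.tensor([[1.0, 2.0, 12.3]]))
```

Concatenating the [CLS] vector with the external features before a shared scoring layer is what allows the hand-crafted features and BERT to be optimized jointly under a single ranking loss, which is the joint-training setup the abstract refers to.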

Citation (APA)

Chen, Z., Trabelsi, M., Heflin, J., Xu, Y., & Davison, B. D. (2020). Table Search Using a Deep Contextualized Language Model. In SIGIR 2020 - Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 589–598). Association for Computing Machinery, Inc. https://doi.org/10.1145/3397271.3401044
