Sentence Representation

  • Zhiyuan Liu
  • Yankai Lin
  • Maosong Sun

Abstract

The sentence is an important linguistic unit of natural language. Sentence representation has remained a core task in natural language processing, because many important applications in related fields rely on understanding sentences, for example, summarization, machine translation, sentiment analysis, and dialogue systems. Sentence representation aims to encode the semantic information of a sentence into a real-valued vector, which can then be used in downstream tasks such as sentence classification or matching. With large-scale text data available on the Internet and recent advances in deep neural networks, researchers tend to employ neural networks (e.g., convolutional neural networks and recurrent neural networks) to learn low-dimensional sentence representations, achieving great progress on relevant tasks. In this chapter, we first introduce the one-hot representation for sentences and the n-gram sentence representation (i.e., the probabilistic language model). Then we extensively introduce neural-based models for sentence modeling, including the feedforward neural network, the convolutional neural network, the recurrent neural network, the more recent Transformer, and pre-trained language models. Finally, we introduce several typical applications of sentence representations.
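To make the contrast between one-hot and learned dense representations concrete, here is a minimal Python sketch (illustrative only, not code from the chapter); the toy vocabulary, the random embeddings, and the mean-pooling encoder are all assumptions chosen for brevity.

    import numpy as np

    # Toy vocabulary; in practice it is built from a large corpus.
    vocab = ["i", "love", "natural", "language", "processing"]
    word2id = {w: i for i, w in enumerate(vocab)}

    def one_hot_sentence(tokens):
        # Bag-of-words one-hot representation: |V|-dimensional and sparse.
        vec = np.zeros(len(vocab))
        for t in tokens:
            vec[word2id[t]] = 1.0
        return vec

    # Dense word embeddings, randomly initialized here; a real model
    # would learn them from data (e.g., with a CNN, RNN, or Transformer).
    rng = np.random.default_rng(0)
    embeddings = rng.normal(size=(len(vocab), 8))

    def dense_sentence(tokens):
        # Low-dimensional dense representation: mean of word embeddings.
        ids = [word2id[t] for t in tokens]
        return embeddings[ids].mean(axis=0)

    tokens = "i love natural language processing".split()
    print(one_hot_sentence(tokens))  # size-|V| sparse vector
    print(dense_sentence(tokens))    # 8-dimensional dense vector

The one-hot vector grows with the vocabulary and ignores word order and similarity, while the dense vector stays low-dimensional; this is the embedding space in which the neural encoders surveyed in the chapter operate.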

Citation (APA)

Liu, Z., Lin, Y., & Sun, M. (2020). Sentence Representation. In Representation Learning for Natural Language Processing (pp. 59–89). Springer Nature Singapore. https://doi.org/10.1007/978-981-15-5573-2_4
