Resurgence of deep learning: Genesis of word embedding


Abstract

As the structural complexity of natural language increases, its input, output, and processing become more challenging for a computer system. The development of computational techniques and models for the automatic analysis and representation of natural language is known as natural language processing (NLP). The base unit of any natural language is the word, and representing it is a challenging task, as decoding its actual semantic role is vital for any NLP application. One of the most popular computational models is the artificial neural network (ANN). With the birth of deep learning, a new era has begun in computational linguistics research, as the representation of words has been redefined in terms of word embeddings, which capture word semantics in the form of real-valued vectors. This paper traces the lifespan of ANNs from the discovery of the first artificial neuron to the current era of deep learning. Further, it follows the journey of word embeddings, analyzes their generation methods along with their objective functions, and concludes with current research gaps.
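To make the notion of word embeddings concrete, the sketch below is a toy illustration (not taken from the paper; the vocabulary and vectors are invented for demonstration) of how word semantics encoded as real-valued vectors can be compared using cosine similarity.

```python
import numpy as np

# Toy word embeddings: each word maps to a small real-valued vector.
# These vectors are invented for illustration; real embeddings (e.g. word2vec,
# GloVe) are learned from large corpora and typically have 100-300 dimensions.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1 indicate similarity."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up with nearby vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~1.0)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```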

Citation (APA)

Soni, V. K., Gopalani, D., & Govil, M. C. (2019). Resurgence of deep learning: Genesis of word embedding. In Advances in Intelligent Systems and Computing (Vol. 669, pp. 129–139). Springer Verlag. https://doi.org/10.1007/978-981-10-8968-8_11
