In recent years, the process by which humans learn a foreign language has moved from the strict "Grammar-Translation" method, based mainly on grammar and syntax rules, to more innovative processes, culminating in the modern "Communicative approach". As its name states, this approach focuses on coherent communication with native speakers and the cultivation of oral skills, without taking into consideration, at least in the first stages, the rules that govern the language. The same trend seems to apply to the way machines can be "educated" to comprehend and reproduce unfamiliar human language. The "rule-based" Natural Language Generation (NLG) and Natural Language Understanding (NLU) algorithms, on the one hand, and the "text-based" Large Language Models (LLMs), on the other, are two subareas of Natural Language Processing (NLP) that are analogous to the two human foreign-language learning processes. This paper presents these two alternative approaches, LLMs (a technology that has surfaced in recent years as an influential catalyst of NLP) on the one hand and NLG/NLU on the other, highlighting their applications, technologies, capabilities, differences, strengths and weaknesses, and the challenges they present, thereby contributing to a deeper comprehension of the evolving landscape of Artificial Intelligence and human-computer communication.
Citation
Karanikolas, N., Manga, E., Samaridi, N., Tousidou, E., & Vassilakopoulos, M. (2023). Large Language Models versus Natural Language Understanding and Generation. In ACM International Conference Proceeding Series (pp. 278–290). Association for Computing Machinery. https://doi.org/10.1145/3635059.3635104