This tutorial reviews the design of common meaning representations, state-of-the-art (SoTA) models for predicting meaning representations, and the use of meaning representations in a wide range of downstream NLP tasks and real-world applications. Presented by a diverse team of NLP researchers from academia and industry with extensive experience in designing, building, and using meaning representations, the tutorial has three components: (1) an introduction to common meaning representations, including basic concepts and design challenges; (2) a review of SoTA methods for building models of meaning representations; and (3) an overview of applications of meaning representations in downstream NLP tasks and real-world settings. We propose a full-day, cutting-edge tutorial for all stakeholders in the AI community, including NLP researchers, domain-specific practitioners, and students.
Citation:
Bonn, J., Flanigan, J., Hajič, J., Jindal, I., Li, Y., & Xue, N. (2024). Meaning Representations for Natural Languages: Design, Models and Applications. In 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation, LREC-COLING 2024 - Tutorial Summaries (pp. 13–18). European Language Resources Association (ELRA). https://doi.org/10.18653/v1/2022.emnlp-tutorials.1