Masked Measurement Prediction: Learning to Jointly Predict Quantities and Units from Textual Context

Citations: 4
Readers: 31 (Mendeley users who have this article in their library)

Abstract

Physical measurements constitute a large portion of the numbers in academic papers, engineering reports, and web tables. Current benchmarks fall short of properly evaluating the numeracy of pretrained language models on measurements, hindering research on developing new methods and applying them to numerical tasks. To address this gap, we introduce a novel task, Masked Measurement Prediction (MMP), in which a model learns to reconstruct a number together with its associated unit given masked text. MMP is useful both for training new numerically informed models and for evaluating the numeracy of existing systems. To address this task, we introduce a new Generative Masked Measurement (GeMM) model that jointly learns to predict numbers along with their units. We perform fine-grained analyses comparing our model with various ablations and baselines. Using linear probing of traditional pretrained transformer models (RoBERTa), we show that they significantly underperform jointly trained number-unit models, highlighting the difficulty of this new task and the benefits of our proposed pre-training approach. We hope this framework accelerates progress towards building more robust numerical reasoning systems in the future.
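To make the task format concrete, the following is a minimal, hypothetical sketch (in Python) of how an MMP-style training example could be constructed: a measurement mention in the text is replaced with mask tokens, and the target is the (number, unit) pair the model must jointly recover. The mask token, regular expression, and unit vocabulary below are illustrative assumptions, not the preprocessing described in the paper.

import re

# Illustrative choices only; the paper's actual mask token, unit
# inventory, and extraction pipeline may differ.
MASK = "[MASK]"
UNITS = ["km", "kg", "GHz", "mm", "ms"]

def make_mmp_example(text):
    """Mask the first '<number> <unit>' mention; return (masked_text, value, unit)."""
    unit_alt = "|".join(sorted(UNITS, key=len, reverse=True))
    pattern = re.compile(r"(\d+(?:\.\d+)?)\s*(" + unit_alt + r")\b")
    match = pattern.search(text)
    if match is None:
        return None
    value, unit = float(match.group(1)), match.group(2)
    masked = text[:match.start()] + MASK + " " + MASK + text[match.end():]
    return masked, value, unit

print(make_mmp_example("The bridge spans 2.4 km across the river."))
# -> ('The bridge spans [MASK] [MASK] across the river.', 2.4, 'km')

A model trained on examples like this is then scored on how well it recovers both the numeric value and the unit from the masked context, which is the joint prediction objective the abstract describes.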

Cite

APA:

Spokoyny, D., Lee, I., Jin, Z., & Berg-Kirkpatrick, T. (2022). Masked Measurement Prediction: Learning to Jointly Predict Quantities and Units from Textual Context. In Findings of the Association for Computational Linguistics: NAACL 2022 (pp. 17–29). Association for Computational Linguistics. https://doi.org/10.18653/v1/2022.findings-naacl.2
