A rubric for assessing mathematical modelling problems in a scientific-engineering context


Abstract

Mathematical modelling is a vital competency for students of all ages. In this study, we aim to fill the research gap concerning valid and reliable tools for assessing and grading mathematical modelling problems, particularly those reflecting multiple steps of the modelling cycle. We present the design of a reliable and valid assessment tool for gauging the level of mathematical modelling associated with real-world modelling problems in a scientific-engineering context. The study defines the central modelling processes and bases them on the proficiency levels identified in PISA Mathematics. A two-dimensional rubric was developed, reflecting the combined assessment of the type and level of a modelling process. We identified criteria that enable a clear comparison and differentiation among the levels across each of the modelling processes. These criteria provide concrete theoretical definitions for the various modelling processes, introducing a well-defined mathematical modelling framework from a didactical viewpoint, which can potentially contribute to promoting modelling competencies and to teachers' and students' understanding of modelling. Theoretical, methodological and practical implications are discussed.

Citation (APA)

Kohen, Z., & Gharra-Badran, Y. (2023). A rubric for assessing mathematical modelling problems in a scientific-engineering context. Teaching Mathematics and Its Applications, 42(3), 266–288. https://doi.org/10.1093/teamat/hrac018
