This study addresses a research gap concerning valid and reliable analytic rubrics for assessing students' performance on Model-Eliciting Activities (MEAs), an extensively researched category of well-structured modeling activities, with a specific focus on evaluating creativity, a fundamental element in engineering, in a consistent and transparent manner. In this empirical study, we present the design and validation process for an analytic rubric intended to assess the cybersecurity problem-solving skills and creativity of engineering students in computer science courses. To gauge the rubric's reliability, a statistical method was used to measure consistency and agreement among four raters who applied the analytic rubric to the performance of 28 undergraduates on the Cipher Algorithm MEA, scoring both cybersecurity problem solving and creativity. The results demonstrate a good overall level of inter-rater agreement across the evaluation criteria and show that the analytic rubric, paired with the MEA, can be used consistently and transparently to assess engineering students' cybersecurity problem-solving skills and creativity. Designed to address the challenges of assessing and grading modeling problems, the rubric offers a demonstration for instructors interested in adding MEAs to their toolkit; it aims to enhance students' conceptual understanding, problem-solving skills, and creativity, and to support formative assessments that give precise feedback for improvement across performance areas.
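The abstract does not name the specific agreement statistic used with the four raters. A common choice for this design (every rater scores every student on a continuous rubric scale) is the intraclass correlation coefficient. The sketch below, an illustration rather than the authors' method, computes a two-way random-effects, absolute-agreement, single-rater ICC(2,1) from a ratings matrix whose rows are students and columns are raters; the function name and data layout are assumptions for this example.

```python
import numpy as np

def icc2_1(ratings):
    """Illustrative ICC(2,1): two-way random effects, absolute agreement,
    single rater. `ratings` is an (n_subjects, k_raters) array of scores.
    (Assumed example; the study's actual statistic is not stated here.)"""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-student means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Sums of squares from the two-way ANOVA decomposition
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_total = ((ratings - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    # Shrout–Fleiss ICC(2,1) formula
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```

Because ICC(2,1) measures absolute agreement, a rater who scores consistently but systematically higher than the others lowers the coefficient; a consistency-type ICC(3,1) would ignore such offsets, which is why the choice of ICC form should match the intended use of the rubric scores.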
Citation:
Kim, Y. R., Yang, J., Lee, Y., & Earwood, B. (2024). Assessing Cybersecurity Problem-Solving Skills and Creativity of Engineering Students Through Model-Eliciting Activities Using an Analytic Rubric. IEEE Access, 12, 5743–5759. https://doi.org/10.1109/ACCESS.2023.3348554