The computing education community has long been interested in analyzing the Object-Oriented (OO) source code developed by students in order to provide them with useful formative feedback. Instructors need to understand students' difficulties so they can give precise feedback on the most frequent mistakes and can shape, design, and effectively steer the course. This paper proposes and evaluates an approach for analyzing students' source code and automatically generating feedback about the most common violations found in it. The approach is implemented in a cloud-based tool that monitors how students use language constructs by detecting the most common violations of the Object-Oriented paradigm in their source code. The tool also generates reports on students' mistakes and misconceptions that can be used to improve their education. The paper reports the results of a quasi-experiment conducted in a class of a CS1 course to investigate the effect of the provided reports on coding ability (in terms of the correctness and quality of the produced source code). The results show that, after the course, the treatment group obtained higher scores and produced better source code than the control group by following the feedback provided by the teachers.
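To make the idea of detecting OO-paradigm violations more concrete, the following is a minimal, hypothetical sketch (not the paper's actual tool or rule set): a single rule, written here in Java, that flags public non-final fields as encapsulation violations and produces feedback messages of the kind a report could aggregate. The class and pattern names are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

// Hypothetical, simplified checker: flags public non-final fields as
// encapsulation violations. Illustrative sketch only; the paper's tool
// is not described at this level of detail.
public class EncapsulationCheck {

    // Matches field declarations such as "public int count;" while
    // skipping "public final ..." / "public static final ..." constants
    // and method signatures (which contain parentheses, not a ';' here).
    private static final Pattern PUBLIC_FIELD = Pattern.compile(
        "^\\s*public\\s+(?!(?:static\\s+)?final\\b)[\\w<>\\[\\]]+\\s+\\w+\\s*(?:=[^;]*)?;");

    public static List<String> check(List<String> sourceLines) {
        List<String> findings = new ArrayList<>();
        for (int i = 0; i < sourceLines.size(); i++) {
            if (PUBLIC_FIELD.matcher(sourceLines.get(i)).find()) {
                findings.add("line " + (i + 1)
                    + ": public field breaks encapsulation; make it private and add an accessor");
            }
        }
        return findings;
    }

    public static void main(String[] args) {
        List<String> student = List.of(
            "class Account {",
            "    public double balance;",   // flagged
            "    private String owner;",    // fine
            "}");
        check(student).forEach(System.out::println);
    }
}
```

A production checker would of course work on a parsed syntax tree rather than raw lines, and combine many such rules; the sketch only illustrates how per-violation findings can be turned into student-facing feedback.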
Citation
Ardimento, P., Bernardi, M. L., & Cimitile, M. (2020). Software Analytics to Support Students in Object-Oriented Programming Tasks: An Empirical Study. IEEE Access, 8, 132171–132187. https://doi.org/10.1109/ACCESS.2020.3010172