Comparing Code Explanations Created by Students and Large Language Models

Citations: 70
Readers: 149 (Mendeley)

Abstract

Reasoning about code and explaining its purpose are fundamental skills for computer scientists. There has been extensive research in the field of computing education on the relationship between a student's ability to explain code and other skills such as writing and tracing code. In particular, the ability to describe at a high level of abstraction how code will behave over all possible inputs correlates strongly with code writing skills. However, developing the expertise to comprehend and explain code accurately and succinctly is a challenge for many students. Existing pedagogical approaches that scaffold the ability to explain code, such as producing exemplar code explanations on demand, do not currently scale well to large classrooms. The recent emergence of powerful large language models (LLMs) may offer a solution. In this paper, we explore the potential of LLMs to generate explanations that can serve as examples for scaffolding students' ability to understand and explain code. To evaluate LLM-created explanations, we compare them with explanations created by students in a large course (n ≈ 1000) with respect to accuracy, understandability, and length. We find that LLM-created explanations, which can be produced automatically on demand, are rated as significantly easier to understand and as more accurate summaries of code than student-created explanations. We discuss the significance of this finding and suggest how such models can be incorporated into introductory programming education.

Citation (APA)

Leinonen, J., Denny, P., MacNeil, S., Sarsa, S., Bernstein, S., Kim, J., … Hellas, A. (2023). Comparing Code Explanations Created by Students and Large Language Models. In Annual Conference on Innovation and Technology in Computer Science Education, ITiCSE (Vol. 1, pp. 124–130). Association for Computing Machinery. https://doi.org/10.1145/3587102.3588785
