Using Lexical Analysis Software to Assess Student Writing in Statistics

  • Kaplan, J. J.
  • Haudek, K. C.
  • Ha, M.
  • Rogness, N.
  • Fisher, D. G.
Citations: N/A
Mendeley readers: 21

Abstract

Meaningful assessments that reveal student thinking are vital to addressing the GAISE recommendation to use assessments to improve and evaluate student learning. Constructed-response questions, also known as open-response or short-answer questions, in which students must write an answer in their own words, have been shown to reveal students’ understanding better than multiple-choice questions, but they are much more time-consuming to grade for classroom use or to code for research purposes. This paper describes and illustrates the use of two different software packages to analyze open-response data collected from undergraduate students’ writing. The analysis and results produced by the two packages are contrasted with each other and with the results obtained from hand coding of the same data sets. The article concludes with a discussion of the advantages and limitations of the analysis options for statistics education research.
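
To give a sense of what lexical coding of constructed responses involves, the sketch below assigns categories to student answers from keyword lists and checks agreement with hand codes. It is a minimal illustration only: the category names, keyword lists, and sample responses are invented, and this is not the rubric or either of the software packages evaluated in the paper.

```python
# Hypothetical sketch of keyword-based lexical coding of constructed responses.
# Categories, keyword lists, and sample data are invented for illustration and
# do NOT reproduce the paper's rubric or the commercial packages it compares.
import re

# Keyword lists standing in for a lexical "category library".
CATEGORIES = {
    "variability": {"spread", "variation", "vary", "varies", "different", "deviation"},
    "center":      {"mean", "means", "average", "median", "center"},
}

def code_response(text):
    """Assign every category whose keyword list overlaps the response tokens."""
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    return {cat for cat, words in CATEGORIES.items() if tokens & words}

# Toy data: (student response, hand-assigned codes).
responses = [
    ("The sample means vary less as n grows.",      {"variability", "center"}),
    ("The average stays about the same.",           {"center"}),
    ("Each sample is different because of chance.", {"variability"}),
]

# Simple exact-match agreement between automated and hand coding.
matches = sum(code_response(text) == hand for text, hand in responses)
print(f"Exact agreement: {matches}/{len(responses)}")
```

In practice, lexical analysis software of the kind discussed in the paper uses far richer term libraries and statistical classification rather than simple keyword overlap, but the workflow is similar: extract lexical features from each response, map them to categories, and validate the automated codes against human coding.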

Citation (APA)

Kaplan, J. J., Haudek, K. C., Ha, M., Rogness, N., & Fisher, D. G. (2014). Using Lexical Analysis Software to Assess Student Writing in Statistics. Technology Innovations in Statistics Education, 8(1). https://doi.org/10.5070/t581020235
