Meaningful assessments that reveal student thinking are vital to addressing the GAISE recommendation to use assessments to improve and evaluate student learning. Constructed-response questions, also known as open-response or short-answer questions, in which students must write an answer in their own words, have been shown to reveal students' understanding better than multiple-choice questions, but they are much more time-consuming to grade for classroom use or to code for research purposes. This paper describes and illustrates the use of two different software packages to analyze open-response data collected from undergraduate students' writing. The analysis and results produced by the two packages are contrasted with each other and with the results obtained from hand coding of the same data sets. The article concludes with a discussion of the advantages and limitations of these analysis options for statistics education research.
The CAUSE Research Group is supported in part by a member initiative grant from the American Statistical Association's Section on Statistics and Data Science Education.