By Sangwon Hyun, P Burckhardt, P Elliott, C Evans, K Lin, A Luby, C P Makris, J Orellana, A Reinhart, J Wieczorek, R Yurko, G Weinberg, & R Nugent (Carnegie Mellon University)
Abstract
Data science and statistics are rapidly growing fields, bringing students from a wide variety of backgrounds to introductory courses at the university level. Teaching data science effectively to this new population of students, who often do not have extensive backgrounds in mathematics or computing, is challenging. Novel introductory courses aim to expose students to sophisticated statistical concepts while removing barriers such as mathematics and computing prerequisites.
These courses require carefully targeted instruction to reach their intended audiences. But how can we tell whether students with such varied backgrounds have understood the concepts we aim to teach? Our classes need assessments that measure students' conceptual understanding. Ideally, assessments should also help reveal student thinking and identify misconceptions that may not be obvious to the instructor.
Think-aloud interviews can provide assessment designers with insights into student thinking that may not be clear from test responses alone. We present results from preliminary rounds of think-aloud interviews with introductory students, describe surprising misconceptions we have identified, and share insights from our experience designing assessments and conducting think-aloud interviews.