Research

  • Instant messaging is a way of sending short messages in "real time" to other users who are currently online, and it is a rapidly growing medium by which many students are choosing to communicate with each other. A pilot study into the use of instant messaging was carried out with two large elementary statistics classes. This study reports on the virtual office-hours service, the advantages and disadvantages of online study groups, and reflections on how instant messaging could change help and support services for students studying statistics.

  • With the dual goal of providing individualized learning and assessment, while simultaneously preserving academic integrity, we have implemented a computerized testing system to generate, administer and grade quizzes in an introductory statistics course for graduate students. A fundamental reason for individualization is to permit each student to learn at his or her own pace. At the same time, administering individualized instruction must not increase the time involvement of the instructor. The ability of the computer to randomly select questions from a test bank, to randomly generate data for the questions, and to randomly order the answer choices makes it possible for learning and assessment to occur in accord with each student's individual needs while maintaining fairness for the students and the instructor.
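The randomization scheme described above (selecting questions from a bank, generating fresh data for each question, and shuffling the answer choices) can be sketched in a few lines. The template structure and function names below are illustrative assumptions, not the authors' actual system:

```python
import random

def make_quiz(test_bank, n_questions, seed=None):
    """Assemble an individualized quiz: pick questions at random,
    generate fresh data for each, and shuffle the answer choices."""
    rng = random.Random(seed)
    quiz = []
    for template in rng.sample(test_bank, n_questions):
        data = template["generate_data"](rng)     # fresh random data per student
        choices = template["make_choices"](data)  # correct answer + distractors
        rng.shuffle(choices)                      # randomize answer order
        quiz.append({"prompt": template["prompt"].format(**data),
                     "choices": choices})
    return quiz

# Hypothetical one-item bank: compute a sample mean from generated data.
bank = [{
    "prompt": "What is the mean of {values}?",
    "generate_data": lambda rng: {"values": [rng.randint(1, 9) for _ in range(4)]},
    "make_choices": lambda d: [sum(d["values"]) / 4, max(d["values"]),
                               min(d["values"]), sum(d["values"])],
}]

quiz = make_quiz(bank, n_questions=1, seed=42)
```

Because each student's data and answer order differ, two students sitting side by side see different quizzes drawn from the same bank, which is the fairness property the abstract describes.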

  • Statistical literacy, reasoning, and thinking may be the most prominent objectives of statistics education, yet they are unsatisfactorily defined and demarcated. They are therefore difficult to monitor and assess, and as a consequence impractical as educational goals. Instead, assessment could focus on those aspects of specific statistical knowledge that are indicative of different levels of understanding. Factual knowledge derived directly from sources of information indicates a superficial level of understanding; a comprehensive, coherent knowledge structure indicates a more profound level of understanding; and the ability to transfer knowledge (the ability to flexibly engage statistical knowledge in novel tasks) indicates an expert level of understanding. This classification of hierarchically related levels of statistical understanding may yield adequate means of measurement and assessment.

  • This report focuses on a research project concerning individual curricula regarding the instruction of statistics and of probability theory. Individual curricula will be described as belief systems which contain teachers' subjective knowledge and conceptions about mathematics, about learning and teaching mathematics, and particularly about statistics and probability. This report stresses two aspects: the theoretical settings and the methodological settings of the research. The theoretical settings concern central assumptions and theoretical constructs. The discussion of the methodological settings, which is illustrated by research results, includes the description of a five-step methodology used for investigating individual curricula.

  • Use of the internet to support instruction in general and the use of statistical software in particular provides instructors and students with an opportunity to improve learning while maintaining effective use of limited classroom time. We have developed a Web site (http://power.education.uconn.edu/) that encompasses instruction in power analysis issues and teaches students and others how to use the nQuery Advisor© software to establish sample size for research designs ranging from the simple to the complex. The evaluation results of our Power Project Web site and materials are promising, and the purpose of this paper is to share our approach and materials with other instructors of statistics and research design.
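For illustration, the core of an a-priori sample-size calculation of the kind such software performs can be written with the standard library alone. This is a textbook normal-approximation sketch for a two-sample t-test, not the nQuery Advisor© algorithm:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample
    t-test via the normal approximation:
        n = 2 * ((z_{1-alpha/2} + z_{power}) / d) ** 2
    where d is Cohen's standardized effect size."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # e.g. 1.96 for alpha = .05
    z_beta = z(power)            # e.g. 0.84 for power = .80
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Medium effect (d = 0.5), two-sided alpha = .05, power = .80:
n_per_group(0.5)  # -> 63 per group (the exact t-based answer is slightly larger)
```

Larger effect sizes need smaller samples, which is the qualitative relationship students are meant to internalize from such tools.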

  • Modeling and simulation with the software Fathom has become an important part of an introductory course on probability and statistics for future mathematics teachers at our institution. We describe our conception of the modeling and simulation competence that students are supposed to acquire. We use various means, such as modeling guidelines, a simulation plan, and a guidebook with examples of simulations, to support students' learning processes. We report on results of empirical studies that led us to change and extend our initial educational approach.

  • This paper presents an analysis of the meanings of sampling distribution as supplied by some undergraduate students in a dynamic statistics environment (Fathom). The paper identifies stages in the simulation process where multiple and dynamic representations were crucial to students' understanding of the relationships among sample size, the behavior of sampling distributions, and the probabilities of some sample results. One of the foremost difficulties observed in the simulation process was linked to the use of symbolic representations in the software, mainly at the stage of formulating the population model.
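The kind of simulation students build in such environments — repeatedly drawing samples from a population model and examining the distribution of a statistic — follows a standard pattern. A minimal stdlib sketch (illustrative, not the study's Fathom materials):

```python
import random
import statistics

def sample_means(population, n, reps, seed=0):
    """Simulate the sampling distribution of the mean: draw `reps`
    random samples of size `n` and record each sample mean."""
    rng = random.Random(seed)
    return [statistics.mean(rng.sample(population, n)) for _ in range(reps)]

population = list(range(1, 101))   # a simple uniform population model, mean 50.5

small = sample_means(population, n=5, reps=2000)
large = sample_means(population, n=50, reps=2000)

# The sampling distribution for the larger sample size is noticeably
# narrower, while both center on the population mean.
spread_small = statistics.stdev(small)
spread_large = statistics.stdev(large)
```

Comparing `spread_small` and `spread_large` makes the sample-size relationship in the abstract concrete: increasing n concentrates the sampling distribution around the population mean.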

  • We compare two methods of recording data and making graphic displays: a standard paper-and-pencil technique and a "data-cards" approach in which students record case information on individual cards which they then arrange to make displays. Students using the data cards produced displays that tended to be more complex and informative than displays made by those in the paper-and-pencil group. We explore plausible explanations for this difference by examining structural aspects of the two approaches, such as the saliency of the case and the use of space in organizing the information. Our results call into question the current practice of introducing young students to particular graph types, and the idea that they must master handling of univariate data before moving on to multivariate data.

  • The paper builds on design-research studies in the domain of probability and statistics. The integration of computers into classroom practice has been established as a complex process involving instrumental genesis (Verillon and Rabardel, 1995), whereby students and teachers need to construct potentialities for the tools as well as techniques for using those tools efficiently (Artigue, 2002). The difficulties of instrumental genesis can perhaps be eased by design methodologies that build the needs of the learner into the fabric of the product. We discuss our interpretation of design research methodology, which has over the last decade guided our own research agenda. Through reference to previous and ongoing studies, we argue that design research allows a sensitive phenomenalisation of a mathematical domain that can capture learners' needs by transforming powerful ideas into situated, meaningful and manipulable phenomena.

  • The paper builds on design-research studies in the domain of probability and statistics conducted in middle-school classrooms. The design, ProbLab (Probability Laboratory), which is part of Wilensky's 'Connected Probability' project, integrates: constructionist projects in traditional media; individual work in the NetLogo modeling-and-simulation environment; and networked participatory simulations in HubNet. An emergent theoretical model, 'learning axes and bridging tools,' frames both the design and the data analysis. The model is explicated by discussing a sample episode, in which a student reinvents sampling by connecting 'local' and 'global' perceptions of a population.
