Theory

  • Following a course in elementary statistics, students are able to demonstrate a basic knowledge of statistical concepts and ideas, but often fail to apply this knowledge to concrete problems. From research in cognitive psychology, we know that the organization of knowledge starts with the mental storage of initially isolated concepts and simple principles. A certain amount of conceptual understanding is reached when the student succeeds in forming relationships between these knowledge elements. The task faced by any teacher of statistics is to enable the student to form such integrated knowledge networks. Research has shown that the formation of such networks is stimulated when students, confronted with a statistical problem that requires the application of their basic knowledge, actively try to explain the solution of the problem to themselves. This paper discusses a didactic method that seeks to stimulate such self-explanatory activity in students.

  • This paper is an attempt to suggest features of future statistics education in the light of the emerging paradigms of science, education and mathematics. The paper consists of three sections: an introduction clarifying the context and limitations of the present paper, an explanation of the included concepts, and conclusions about some features of future statistics education in the light of the studied emerging paradigms. These features include the integration of statistics education through an "applied" approach to problems, rejecting linearity, concentrating on conceptual frameworks and "conditional prediction", and emphasizing the study of probability and the way to deal with the results of statistical analyses.

  • Theoretical frameworks for analyzing teacher subject-matter knowledge in specific mathematical domains are rare. In this paper we propose a theoretical framework for teacher subject-matter knowledge and understanding about probability. The framework comprises seven aspects: essential features, the strength of probability, different representations and models, alternative ways of approaching, basic repertoire, different forms of knowledge and understanding, and knowledge about mathematics. We explain the importance of each aspect for teacher knowledge of probability, discuss its possible nature and illustrate our claims with specific examples.

  • Models for statistical modes of thinking and problem solving have been developed, and continue to be developed, by teachers and researchers. The purposes of these models range from helping to understand how individual students solve problems to developing instruments for educational research. These models have arisen with particular perspectives and primary uses in mind. In this paper we compare and contrast some statistical thinking models originating from statistics education research (Ben-Zvi & Friedlander, 1997; Jones, Thornton, Langrall, Mooney, Perry & Putt, 2000) with some models arising from the discipline of statistics and its sub-disciplines (Wild & Pfannkuch, 1999; Hoerl & Snee, 2001). Drawing upon models from both these areas, we discuss issues that include their development and use, how they might illuminate one another, and what we can learn from them.

  • In this paper we describe the main ideas in a theoretical model that was developed for mathematics education research and is also applicable to statistics education. This model takes into account the three basic dimensions of teaching and learning processes: epistemic dimension (concerning the nature of statistical knowledge), cognitive dimension (concerning subjective knowledge) and instructional dimension (related to interaction patterns between the teacher and the students in the classroom). These theoretical notions are justified and applied to analyse a teaching process for the median in the introductory training of teachers.

  • Given the importance of instruction in promoting students' statistical literacy, a cohesive picture of the development of students' statistical thinking is needed to better inform classroom teachers and curriculum developers. With this in mind, one of the authors (Mooney) developed a framework to characterize middle school students' statistical thinking within four statistical processes, across four levels of thinking. A subsequent study (Langrall, Mooney, Hofbauer, & Johnson, 2001) addressed gaps in Mooney's framework. This paper describes how the findings of the Langrall et al. study were merged with the framework and reports on resulting modifications to the entire framework.

  • Psychology remains addicted to null hypothesis significance testing despite decades of effort by reformers. Extensive changes in statistical understanding and practices are needed. The authors propose a model of reform, the statistical re-education of psychology, by making an analogy with the conceptual change model of learning. Four diverse components of reform are identified and illustrated by brief examples of research. Reform is especially challenging because many statistics teachers in psychology first need to achieve conceptual change themselves. In relation to a highly desirable increase in the use of confidence intervals (CIs), it seems that many psychologists do not understand CIs well, and guidelines for CI use are lacking. The conceptual change model is offered to guide research needed on many aspects of reform, and the important and exciting task of the statistical re-education of psychology.

  • The cognitive-theoretic instructional and assessment approaches described here represent our efforts to develop and measure students' abilities to reason spontaneously and flexibly with statistics in the context of complex real-world activity. We report results from two instructional projects based on situated cognition, in which students were taught statistical reasoning through guided participation in simulations of authentic professional activities requiring presentation and critique of statistical arguments. Although students' statistical reasoning improved in selected ways, the approach was costly and difficult to implement and sustain. In search of more practical and powerful approaches, current experiments are investigating whether instruction based on video technologies and Cognitive Flexibility Theory can speed development of ability to think flexibly with statistics while seeing interacting themes in real-world situations.

  • Emerging evidence suggests that people do not have difficulty judging covariation per se but rather have difficulty decoding standard displays such as scatterplots. Using the data analysis software Tinkerplots, I demonstrate various alternative representations that students appear to be able to use quite effectively to make judgments about covariation. More generally, I argue that data analysis instruction in K-12 should be structured according to how statistical reasoning develops in young students and should, for the time being, not target specific graphical representations as objectives of instruction.

  • This paper contrasts two types of educational tools: a route-type series of so-called statistical minitools (Cobb et al., 1997) and a landscape-type construction tool, named Tinkerplots (Konold & Miller, 2001). The design of the minitools is based on a hypothetical learning trajectory (Simon, 1995). Tinkerplots is being designed in collaboration with five mathematics curricula and is open to different approaches. Citing experiences from classroom-based research with students aged ten to thirteen, I show how characteristics of the two types of tools influence the instructional decisions that software designers, curriculum authors, and teachers have to make.
