Journal Articles

  • The power of computing technology has increased at an astounding rate in the last decade. Today, the personal computer plays a key role in most introductory statistics courses, freeing students from "computational drudgery" as well as enabling a sharper instructional focus on data analysis and the interpretation of statistical results. Computers have also come to play an important role in teaching statistical concepts through simulations. Despite the increased popularity of computer-based statistical simulations, there have been few empirical evaluations of their effectiveness. In this paper, I describe and evaluate three computer-assisted simulations developed for use with SPSS and Microsoft Excel. The simulations are designed to reinforce and enhance students' understanding of sampling distributions, confidence intervals, and significance tests. Results of the evaluation reveal that these simulations can help improve students' comprehension of some of the most difficult material they encounter in the introductory social statistics course.
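
    A minimal Python sketch of the kind of exercise the abstract describes (the paper's simulations were built in SPSS and Microsoft Excel; the population, sample size, and replication count below are illustrative choices, not the paper's): draw many samples, build a nominal 95% confidence interval from each, and count how often the interval captures the true mean.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Deliberately non-normal population: exponential with mean 2.
    pop_mean, n, reps = 2.0, 30, 10_000

    covered = 0
    for _ in range(reps):
        sample = rng.exponential(scale=pop_mean, size=n)
        xbar = sample.mean()
        se = sample.std(ddof=1) / np.sqrt(n)
        lo, hi = xbar - 1.96 * se, xbar + 1.96 * se  # nominal 95% CI
        covered += lo <= pop_mean <= hi

    print(f"empirical coverage: {covered / reps:.3f} (nominal 0.95)")
    ```

    Watching the empirical coverage land near, but not exactly at, 0.95 is the kind of observation such simulations are meant to provoke.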

  • Econometrics is an intellectual game played by rules based on the sampling distribution concept. Most students in econometrics classes are uncomfortable because they do not know these rules and so do not understand what is going on in econometrics. This article contains some explanations for this phenomenon and suggestions for how this problem can be addressed. Instructors are encouraged to use explain-how-to-bootstrap exercises to promote student understanding of the rules of the game.
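
    A hedged sketch of the resampling step an "explain-how-to-bootstrap" exercise might ask students to narrate (the data and the choice of the mean as the statistic are invented for illustration; the article prescribes no particular software):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=10, scale=3, size=50)  # stand-in for an observed sample

    # The bootstrap idea: treat the sample as the population,
    # resample it with replacement many times, and recompute the
    # statistic each time to approximate its sampling distribution.
    boot_means = np.array([
        rng.choice(data, size=data.size, replace=True).mean()
        for _ in range(5_000)
    ])

    print("bootstrap SE of the mean:", boot_means.std(ddof=1))
    print("95% percentile interval:", np.percentile(boot_means, [2.5, 97.5]))
    ```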

  • A good representation can be crucial for finding the solution to a problem. Gigerenzer and Hoffrage (Psychol. Rev. 102 (1995) 684; Psychol. Rev. 106 (1999) 425) have shown that representations in terms of natural frequencies, rather than conditional probabilities, facilitate the computation of a cause's probability (or frequency) given an effect - a problem that is usually referred to as Bayesian reasoning. They have also shown that normalized frequencies - which are not natural frequencies - do not lead to computational facilitation, and consequently, do not enhance people's performance. Here, we correct two misconceptions propagated in recent work (Cognition 77 (2000) 197; Cognition 78 (2001) 247; Psychol. Rev. 106 (1999) 62; Organ. Behav. Hum. Decision Process. 82 (2000) 217): normalized frequencies have been mistaken for natural frequencies and, as a consequence, "nested sets" and the "subset principle" have been proposed as new explanations. These new terms, however, are nothing more than vague labels for the basic properties of natural frequencies.
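
    For readers unfamiliar with the distinction, a small worked example (the prevalence, sensitivity, and false-positive figures are illustrative stand-ins, not numbers from the papers cited): the same posterior computed once from conditional probabilities via Bayes' rule and once as natural-frequency counts.

    ```python
    # Illustrative figures: 1% prevalence, 80% sensitivity, 9.6% false positives.
    prevalence, sensitivity, false_pos = 0.01, 0.80, 0.096

    # Conditional-probability format: apply Bayes' rule.
    p_pos = sensitivity * prevalence + false_pos * (1 - prevalence)
    print(f"Bayes' rule: {sensitivity * prevalence / p_pos:.3f}")

    # Natural-frequency format: the same computation as raw counts
    # out of 1,000 people, which is what makes it doable by hand.
    sick_pos = 1_000 * prevalence * sensitivity          # 8 of 10 sick people
    healthy_pos = 1_000 * (1 - prevalence) * false_pos   # ~95 healthy positives
    print(f"counts: {sick_pos:.0f} / ({sick_pos:.0f} + {healthy_pos:.0f}) "
          f"= {sick_pos / (sick_pos + healthy_pos):.3f}")
    ```

    Both routes give about 0.078; the counts version is the one people can follow without a formula.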

  • We investigated the way experienced users interpret Null Hypothesis Significance Testing (NHST) outcomes. An empirical study was designed to compare the reactions of two populations of NHST users, psychological researchers and professional applied statisticians, when faced with contradictory situations. The subjects were presented with the results of an experiment designed to test the efficacy of a drug by comparing two groups (treatment/placebo). Four situations were constructed by combining the outcome of the t test (significant vs. nonsignificant) and the observed difference between the two means D (large vs. small). Two of these situations appeared as conflicting (t significant/D small and t nonsignificant/D large). Three fundamental aspects of statistical inference were investigated by means of open questions: drawing inductive conclusions about the magnitude of the true difference from the data in hand, making predictions for future data, and making decisions about stopping the experiment. The subjects were 25 statisticians from pharmaceutical companies in France, all well versed in statistics, and 20 psychological researchers from various laboratories in France, all with experience in processing and analyzing experimental data. On the whole, statisticians and psychologists reacted in a similar way and were very impressed by significant results. It should be noted that professional applied statisticians were not immune to misinterpretations, especially in the case of nonsignificance. However, the interpretations that experienced users attach to the outcome of NHST can vary from one individual to another, and it is hard to conceive that there could be a consensus in the face of seemingly conflicting situations. In fact, beyond the superficial report of "erroneous" interpretations, the misuses of NHST reveal intuitive judgmental "adjustments" that try to overcome its inherent shortcomings. These findings encourage the many recent attempts to improve the habitual ways of analyzing and reporting experimental data.
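
    To make the conflicting cells concrete, a hypothetical reconstruction in Python of two of the four situations (effect sizes, group sizes, and the seed are invented; the study itself presented vignettes rather than simulations):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # (a) small observed difference with a large n: t tends to be significant.
    # (b) large observed difference with a small n: t tends to be nonsignificant.
    for label, true_diff, n in [("small D, large n", 0.2, 500),
                                ("large D, small n", 1.0, 8)]:
        treatment = rng.normal(true_diff, 1.0, n)
        placebo = rng.normal(0.0, 1.0, n)
        t, p = stats.ttest_ind(treatment, placebo)
        d = treatment.mean() - placebo.mean()
        print(f"{label}: D = {d:+.2f}, t = {t:.2f}, p = {p:.4f}")
    ```

    The two printed rows mimic the vignettes' conflict: statistical significance tracks sample size as much as the size of the observed difference.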

  • Null hypothesis significance testing (NHST) is arguably the most widely used approach to hypothesis evaluation among behavioral and social scientists. It is also very controversial. A major concern expressed by critics is that such testing is misunderstood by many of those who use it. Several other objections to its use have also been raised. In this article the author reviews and comments on the claimed misunderstandings as well as on other criticisms of the approach, and he notes arguments that have been advanced in support of NHST. Alternatives and supplements to NHST are considered, as are several related recommendations regarding the interpretation of experimental data. The concluding opinion is that NHST is easily misunderstood and misused but that when applied with good judgment it can be an effective aid to the interpretation of experimental data.

  • This study considers the evolving influence of variation and expectation on the development of school students' appreciation of distribution as displayed in their construction of graphical representations of data sets. Three interview protocols are employed, presenting different contexts within which 109 students, ranging in age from 6 to 15 years, could display and interpret their understanding. Responses are analyzed within a hierarchical cognitive framework. It is hypothesized from the analysis that, contrary to the order in which expectation and variation are introduced in the school curriculum, the natural tendency for students is to acknowledge variation first and then expectation.

  • This paper reviews factors that contribute to the development of middle school students' interest in statistical literacy and its motivational influence on learning. To date, very little research has specifically examined the influence of positive affect such as interest on learning in the middle school statistics context. Two bodies of associated research are available: interest research in a mathematics education context and attitudinal research in a tertiary statistics context. A content analysis of this literature suggests that interest development in middle school statistics will be the result of a complex interplay of classroom influences and individual factors such as students' knowledge of statistics, their enjoyment of statistics, and their perceptions of competency in relation to the learning of statistics.

  • Informal inferential reasoning has shown some promise in developing students' deeper understanding of statistical processes. This paper presents a framework to think about three key principles of informal inference - generalizations 'beyond the data,' probabilistic language, and data as evidence. The authors use primary school classroom episodes and excerpts of interviews with the teachers to illustrate the framework and reiterate the importance of embedding statistical learning within the context of statistical inquiry. Implications for the teaching of more powerful statistical concepts at the primary school level are discussed.

  • This study considers the effectiveness of a "balanced amalgamated" approach to teaching graduate-level introductory statistics. Although some research stresses replacing traditional lectures with more active learning methods, the approach of this study is to combine effective lecturing with active learning and team projects. The results of this study indicate that such a balanced amalgamated approach not only improves student cognition of course material but also student morale. An instructional approach that combines mini-lectures with in-class active-learning activities appears to be better than traditional lecturing alone for teaching graduate-level students.

  • The writers describe how combining simulations with a discovery approach offers students a way to discover the concepts associated with sampling distributions. They outline one such approach that used statistical software and another that used a graphing calculator.
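
    A discovery-style sketch in the spirit the writers describe, here in Python rather than the statistical software or graphing calculator they used (the skewed population, sample sizes, and replication counts are arbitrary illustrative choices):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # A skewed "population" for students to sample from.
    population = rng.exponential(scale=1.0, size=100_000)

    # For each sample size, draw 5,000 samples and summarize the
    # sampling distribution of the mean: it tightens like sigma/sqrt(n)
    # and grows more symmetric as n increases.
    for n in (2, 10, 50):
        means = rng.choice(population, size=(5_000, n)).mean(axis=1)
        print(f"n = {n:2d}: SD of sample means = {means.std(ddof=1):.3f} "
              f"(theory: {population.std() / np.sqrt(n):.3f})")
    ```

    Students "discover" the square-root-of-n law by comparing the simulated and theoretical columns.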
