Research

  • Students in four statistics classes received different amounts of guidance and instruction in interpretive skills. Students who wrote press releases free of statistical jargon acquired better computational and interpretive skills than did students in a traditional class. Emphasis on interpretation was not associated with greater conceptual knowledge. Writing assignments appear to focus students' attention on the context and rationale for the statistics. This technique can also be used in other courses.

  • Ways of knowing statistical concepts are reviewed. A general three-category structure for knowing is proposed: (a) calculations, (b) propositions, and (c) conceptual understanding. Test items were developed that correspond to the first category and to a partitioning of the two latter categories into words and symbols. Thirty-one items covering the five types were administered to 57 graduate students. The correlation of student scores on the 10-item calculations subtest and the 10-item propositions subtest was .61, whereas the other two intercategory correlations were .40 (Calculations vs. Conceptual Understanding) and .37 (Propositions vs. Conceptual Understanding). The results suggest that students should be tested in more than one domain, and that instructors should expect students to develop conceptual understanding in addition to skills in computation.
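    The intercategory comparison amounts to computing Pearson correlations between per-student subtest totals. A minimal sketch in Python, using invented scores for a handful of students (the subtest names follow the abstract; the numbers are purely illustrative):

    ```python
    import numpy as np

    # Hypothetical subtest totals for a few students (the actual study had 57).
    calculations = np.array([8, 6, 9, 5, 7, 10, 4, 6])  # 10-item Calculations subtest
    propositions = np.array([7, 5, 9, 4, 6, 9, 5, 7])   # 10-item Propositions subtest
    conceptual   = np.array([6, 7, 5, 3, 8, 7, 2, 4])   # Conceptual Understanding items

    # Pairwise Pearson correlations; the study reported r = .61 (Calculations vs.
    # Propositions), .40 (Calculations vs. Conceptual), and .37 (Propositions vs.
    # Conceptual). These invented scores will not reproduce those values.
    print(np.corrcoef(calculations, propositions)[0, 1])
    print(np.corrcoef(calculations, conceptual)[0, 1])
    print(np.corrcoef(propositions, conceptual)[0, 1])
    ```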

  • This study attempted to provide evidence on two questions: (1) Can criteria for evaluating introductory statistics texts be developed from the statistical education and text evaluation literature? (2) Using the criteria from (1), how do several introductory statistics texts used in schools and colleges of education compare to one another?

  • TVERSKY and KAHNEMAN's availability heuristic, although originally intended to account only for frequency and probability judgments, has been used to explain almost all kinds of social judgments. On this account, the process of judgment formation is mediated by the availability of memorized information, that is, by the ease with which relevant material can be recalled at the time the judgment is made. Recall operations, whether pure retrieval or reconstructive recall, are regarded as the determining subprocess within judgment formation. In this article, empirical evidence is presented that is hardly compatible with such an account. A reaction-time experiment on the egocentric attribution phenomenon is described, suggesting that subjects' claim to contribute more than their partners to various social activities is not caused by a tendency to predominantly recall examples of their own activities. Within-judge correlations of recall and judgment latencies are rather low, and an analysis of the facilitating effect of prior judgments on subsequent recall latencies (for the same issues) does not reveal the kind of priming effect that would be expected if recall operations were already involved in the preceding judgments. Negative evidence from other experiments on illusory correlations is also mentioned. These results are discussed in the context of methodological problems inherent in testing so-called judgmental heuristics.

  • People everywhere have to deal with the problem of uncertainty. The main focus of this article is an investigation of how people represent their knowledge about the uncertainty of events or regularities, and how they process this knowledge in order to make decisions and to share it with others. A first experiment asks how people gather information about the frequencies of events and how this knowledge interacts with the response mode, numerical versus verbal estimates, that they are required to use. The least interference occurs when subjects are allowed to give verbal answers. From this it is concluded that processing knowledge about uncertainty by means of verbal expressions imposes less mental workload on the organism than numerical processing does. Possibility theory is used as a framework for modeling the individual usage of verbal categories (sketched below). The 'elastic' constraints on the verbal expressions of every single subject are determined in a further experiment by means of sequential testing. The results from this experiment are used to suggest a simple mechanism underlying the "availability" heuristic. Further experiments show that the superiority of verbal processing of knowledge about uncertainty quite generally reduces persistent biases reported in the literature: conservatism and neglect of regression. In a final experiment involving predictions about a real-life situation, it turns out that in a numerical forecasting task subjects restricted themselves to those parts of their knowledge that were numerical, whereas subjects in a verbal forecasting task accessed both verbally and numerically stated knowledge. The conjecture is made that the superiority of the verbal mode in representing and processing knowledge about uncertainty is due to "shareability" constraints (Freyd, in press) which have evolved in the history of humankind and might even be phylogenetically determined.
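    Possibility theory typically models a verbal expression of uncertainty as a fuzzy membership function over the probability scale, which fits the 'elastic constraints' the abstract describes. A minimal sketch, with trapezoidal membership parameters that are invented for illustration and not taken from the study:

    ```python
    def trapezoid(x, a, b, c, d):
        """Trapezoidal membership: 0 outside (a, d), 1 on [b, c], linear in between."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        if x < b:
            return (x - a) / (b - a)
        return (d - x) / (d - c)

    # Hypothetical 'elastic constraints' for two verbal probability expressions,
    # given as (a, b, c, d) corner points on the 0-1 probability scale.
    verbal_categories = {
        "doubtful": (0.00, 0.05, 0.25, 0.45),
        "likely":   (0.55, 0.70, 0.85, 0.95),
    }

    # Degree to which a numerical probability of 0.8 fits each expression.
    for phrase, corners in verbal_categories.items():
        print(phrase, trapezoid(0.8, *corners))  # doubtful: 0.0, likely: 1.0
    ```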

  • The probability concept is not simple: it can be decomposed into many components, and we shall describe some of them. In general, the student does not acquire all these components simultaneously, but rather crosses one threshold after the other. Hence, he or she always possesses only a partial understanding, and will be able to answer questions compatible with the thresholds already attained but will systematically fail on other tests. Moreover, the mere crossing of a threshold is not sufficient; fixation is indispensable to secure the knowledge acquired. Otherwise, the pupil can be asked diverting questions which reveal the instability of that knowledge. This paper is a personal view of a study by Jesus ALARCON (Mexico), who presented his doctoral thesis in Strasbourg in June 1982. The research is not conducted from a genetic (developmental) perspective, as that would require comparing the results of children from different age groups. The study used approximately 300 pupils aged 12 to 14, distributed across three samples of roughly equal size; that is, three questionnaires were presented, with slight alterations of the items to be tested. The results of only one of these samples (106 pupils) will be discussed here.

  • The establishment of relationships among variables is basic to prediction and scientific explanation. Correlational reasoning, the reasoning process one uses in determining the strength of a mutual or reciprocal relationship between variables, is therefore a fundamental aspect of scientific reasoning. Suppose, for instance, that a scientist is interested in finding out whether a correlation exists between the body weight of rats and the presence of a substance X in their blood. Establishing a correlation requires an initial recognition of the four possible associations: (a) heavy weight and presence of substance X; (b) heavy weight and absence of substance X; (c) light weight and presence of substance X; and (d) light weight and absence of substance X. When variables can be dichotomized in this way, one may construct a 2x2 association table of the sort used to compute simple contingencies (see the sketch below). In view of the fundamental role played by correlational reasoning in the investigative process, we asked ourselves the following question: How do high school science and mathematics students approach tasks that require correlational reasoning for successful solution? An answer to this question will indicate how students apply this important aspect of scientific reasoning and might suggest how this reasoning pattern could be enhanced through instruction.
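    To make the computation concrete: a standard strength-of-association measure for such a 2x2 table is the phi coefficient, computed from the four cell counts. A minimal sketch with invented counts (the cell labels follow the abstract; the numbers are hypothetical):

    ```python
    import math

    def phi_coefficient(a, b, c, d):
        """Phi for a 2x2 table, with cells labeled as in the abstract:
        a = heavy & X present, b = heavy & X absent,
        c = light & X present, d = light & X absent.
        """
        denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
        return (a * d - b * c) / denom if denom else 0.0

    # Hypothetical counts for the rat example: most heavy rats have substance X,
    # most light rats lack it, so phi comes out clearly positive (about 0.45).
    print(phi_coefficient(30, 10, 12, 28))
    ```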

  • Psychologists interested in such diverse areas as scientific reasoning, attribution theory, depression, and judgment have central to their theories the ability of people to judge the degree of covariation between two variables. We performed seven experiments to help determine what heuristics people use in estimating the contingency between two dichotomous variables. Assume that the two variables are Factor 1 and Factor 2, each of which may be present or absent. In Experiment 1 we hypothesized that people assess contingency solely on the basis of the number of instances in which both Factor 1 and Factor 2 are present. By manipulating the column and row totals of a 2x2 matrix, we were able to place various values in this "present-present" cell, also called Cell A (the contrast between this heuristic and a normative contingency measure is sketched below). If subjects do base their contingency estimate on Cell A, we would expect a monotonic relation between Cell A frequency and the contingency estimate. This test of the Cell A heuristic led us to conclude that it could not represent a complete explanation of contingency estimation. Although Experiment 2 resulted in the rejection of one possible explanation of the results of Experiment 1, Experiments 2 and 3 together provided an essential finding: very low cell frequencies are greatly overestimated. In Experiment 4, participants in a contingency estimation task involving no memory demands used rather complex heuristics in judging contingency. When the memory demands were increased in Experiment 5, the comparatively simple Cell A heuristic emerged as the modal strategy. Two factors, the use of simple heuristics by most subjects and the overestimation of small cell frequencies, combined to explain the results of Experiments 2 and 3. In Experiment 6 we showed that in a contingency estimation task, salience can augment the impact of one type of data but not another. In Experiment 7 we learned that the placement of data at the beginning versus at the end of the data stream can influence the final estimate. From this group of experiments we concluded that the "framing" of the task affects the contingency estimate; a number of factors that bear no logical relation to the contingency between two factors nevertheless influence one's perception of the contingency. Finally, we related our findings to a variety of analogous findings in the research areas of memory, attribution theory, clinical judgment, and depression.
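    A common normative baseline for such tasks is the delta-P statistic, which uses all four cells, whereas the Cell A heuristic ignores three of them. A minimal sketch contrasting the two on invented tables (the cell layout follows the abstract; the numbers are hypothetical):

    ```python
    def delta_p(a, b, c, d):
        """Normative contingency: P(Factor 2 present | Factor 1 present)
        minus P(Factor 2 present | Factor 1 absent)."""
        return a / (a + b) - c / (c + d)

    # Two hypothetical tables with the identical Cell A count of 20.
    print(delta_p(20, 5, 10, 40))  # 0.60: strong positive contingency
    print(delta_p(20, 5, 40, 10))  # 0.00: no contingency at all
    # A judge relying on Cell A alone would rate both tables the same.
    ```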

  • A simple stay-switch probability game demonstrates the importance of empirically testing our beliefs. Based on intuition, most undergraduate subjects believe that a stay strategy leads to a higher percentage of winning, and most faculty subjects believe that the staying and switching strategies yield equal probabilities of winning. However, a simple in-class experiment proves that switching wins twice as often as staying. Rather than teaching specific probability principles, this demonstration emphasizes reliance on empirically validating our beliefs. A follow-up questionnaire shows that participating in this experiment may increase students' trust in the empirical method.
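    The abstract does not name the game, but a stay-switch setup in which switching wins twice as often as staying matches the standard three-door Monty Hall problem. A minimal Monte Carlo sketch under that assumption:

    ```python
    import random

    def play(switch: bool) -> bool:
        """One round: prize hidden at random; the host opens a door that is
        neither the contestant's choice nor the prize."""
        doors = [0, 1, 2]
        prize = random.choice(doors)
        choice = random.choice(doors)
        opened = random.choice([d for d in doors if d != choice and d != prize])
        if switch:
            choice = next(d for d in doors if d != choice and d != opened)
        return choice == prize

    trials = 100_000
    stay = sum(play(switch=False) for _ in range(trials)) / trials
    switch = sum(play(switch=True) for _ in range(trials)) / trials
    print(f"stay: {stay:.3f}, switch: {switch:.3f}")  # approx 0.333 vs. 0.667
    ```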

  • Although few adults would be able to define probability with any precision, and in fact definitions of probability are a matter for dispute among logicians and mathematicians, most adults are able to behave effectively in probabilistic situations involving quantitative proportions of independent elements. Piaget has studied the behavior of children in a probabilistic situation and from their behavior has concluded that young children (say up to age 7) are unable to utilize a concept of probability. The present study is a demonstration that young children are able to behave in terms of the probability concept under appropriate conditions. It is an experiment in which Piaget's technique for assessing the probability concept in young children is compared with a decision-making technique.
