Literature Index

Displaying 661 - 670 of 3326
  • Author(s):
    Estrada Roca, A., & Batanero, C. D.
    Editors:
    Rossman, A., & Chance, B.
    Year:
    2006
    Abstract:
Being able to correctly read and interpret two-way tables is a basic component of statistical literacy for every citizen. Therefore, future teachers who will be responsible for teaching statistics to children at the school level should acquire these abilities during their training. However, this capacity is taken for granted in Spain, and its teaching is not usually included in the curriculum for training teachers. In this study we present the results of a small exploratory study that describes future teachers' semiotic conflicts in solving elementary probability problems when data are given in a two-way table.
  • Author(s):
    Luchini, S. R., Moncecchi, G. & Perelli D'argenzio, M. P.
    Editors:
    Phillips, B.
    Year:
    2002
    Abstract:
The present work describes the results of a study carried out in the 1999-2000 school year in primary schools of 5 Italian provinces, involving 145 teachers and more than 2000 pupils aged 6-10. Teaching units adopted by teachers were based on a Data Oriented Approach, following two distinct teaching strategies: one used the usual teaching model aimed at objectives, and the other concentrated on learning the relationships between concepts by using a conceptual map. All the teachers involved attended a preliminary training course on statistics, pedagogy, and theory of learning. Basic statistical concepts and their relationships were learnt through semi-structured interviews in class. Concept mapping gave interesting results, especially with regard to permanent acquisition of concepts. The concepts pupils held before and after the teaching of statistics in class were compared using entrance and exit cognitive maps.
  • Author(s):
    Pollatsek, A., Lima, S., & Well, A. D.
    Year:
    1981
    Abstract:
In statistics, and in everyday life as well, the arithmetic mean is a frequently used average. The present study reports data from interviews in which students attempted to solve problems involving the appropriate weighting and combining of means into an overall mean. While mathematically unsophisticated college students can easily compute the mean of a group of numbers, our results indicate that a surprisingly large proportion of them do not understand the concept of the weighted mean. When asked to calculate the overall mean, most subjects answered with the simple, or unweighted, mean of the two means given in the problem, even though these two means were from different-sized groups of scores. For many subjects, computing the simple mean was not merely the easiest or most obvious way to initially attack the problem; it was the only method they had available. Most did not seem to consider why the simple mean might or might not be the correct response, nor did they have any feeling for what their results represented. For many students, dealing with the mean is a computational rather than a conceptual act. Knowledge of the mean seems to begin and end with an impoverished computational formula. The pedagogical message is clear: Learning a computational formula is a poor substitute for gaining an understanding of the basic underlying concept.
  • Author(s):
    Giuliano, M., Nemirovsky, I., Concari, S., Pérez, S., Alvarez, M., Sacerdoti, A., & Pereziano, M.
    Editors:
    Rossman, A., & Chance, B.
    Year:
    2006
    Abstract:
Previous misconceptions about science may cause difficulties in the interpretation of scientific models. A Likert-scale test was constructed and administered to part of the population of students seeking an engineering degree at Universidad Nacional de La Matanza, in order to identify their beliefs about science and technology. Principal components analysis was performed to identify the testees' profile. We show the results concerning beliefs and conceptions about probability, margin of error, accuracy, certainty, truth, and validity. Although most of the people who answered the survey acknowledged the presence of probability in the results of a physical experiment, they also attributed to it accuracy and truth values which are not inherent. It is also remarkable that only a very low percentage holds a position coherent with the scientific vision of these terms.
  • Author(s):
    Daniel L. Canada
    Year:
    2008
    Abstract:
    This paper examines how preservice teachers and middle school students reason about distributions as they consider graphs of two data sets having identical means but different spreads. Results show that while both subject groups reasoned about the task using the aspects of average and variation, relatively more preservice teachers than middle school students combined both aspects to constitute an emerging form of distributional reasoning in their responses. Moreover, these emergent distributional reasoners were more likely to see the data sets as fundamentally different despite the identical means used in the task.
  • Author(s):
    Teresa González and Jesús Pinto
    Year:
    2008
    Abstract:
In this study we analyze the conceptions of future secondary school mathematics teachers about the teaching of statistics and their influence on how these teachers classify problems in which statistical graphs play a role. For this purpose we present a case study of four students taking the course 'Introduction to the Teaching of Mathematics', who responded to different data collection instruments and were interviewed afterwards.
  • Author(s):
    Konold, C. E.
    Year:
    1983
    Abstract:
This study was undertaken with the goal of inferring an informal approach to probability that would explain, among other things, why subject responses to problems involving uncertainty deviate from those prescribed by formal theories. On the basis of an initial set of interviews, such an informal approach was hypothesized and described as outcome-oriented. In a second set of interviews, the outcome approach was used to successfully predict the performance of subjects on a different set of problems. In this chapter I will elaborate on the importance of understanding that subjects' performance in situations involving uncertainty is based on a theoretical framework that is different in important respects from any formal theory of probability. Additionally, I will argue that the outcome approach is reasonable given the nature of the decisions people face in a natural environment. To this end, I will review research which suggests some reasons why causal as opposed to statistical explanations of events are salient and functionally adaptive.
  • Author(s):
    Saldanha, L. & Thompson, P.
    Year:
    2002
    Abstract:
    We distinguish two conceptions of sample and sampling that emerged in the context of a teaching experiment conducted in a high school statistics class. In one conception 'sample as a quasi-proportional, small-scale version of the population' is the encompassing image. This conception entails images of repeating the sampling process and an image of variability among its outcomes that supports reasoning about distributions. In contrast, a sample may be viewed simply as 'a subset of a population' - an encompassing image devoid of repeated sampling, and of ideas of variability that extend to distribution. We argue that the former conception is a powerful one to target for instruction.
  • Author(s):
    Meletiou, M.
    Year:
    2002
    Abstract:
There are two parts to this literature review. The first part includes bibliography directly focusing on variation: the meaning of variation, the role of variation in statistical reasoning, research on conceptions of variation, as well as literature discussing the neglect of variation. The second part lists references belonging to four bodies of literature which, although not having the study of intuitions about variation as their main object of study, do offer rich insights into people's thinking about variation: literature on sampling and centers, on intuitions about the stochastic, on the role of technology, and on the effect of the formalist mathematics tradition on statistics education.
  • Author(s):
    Scholz, R. W., & Waller, M.
    Editors:
    Scholz, R. W.
    Year:
    1983
    Abstract:
This article provides a critical review of psychological theories and research approaches on the ontogenesis of the probability concept. An analysis of the conceptualization and interpretation of probability within developmental research reveals that, with only one exception, an objectivistic interpretation of probability has been made. The reviewed (theoretical) research approaches — the cognitive developmental theory of Piaget & Inhelder, Fischbein's learning-developmental approach, and various information processing models — differ in two main aspects. First, there is the question of whether development should be considered a continuous or a discontinuous process; second, the role of conceptual versus strategic knowledge for coping with probability problems is disputed. The discussion tries to point out what progress could be gained by abandoning a one-sided objectivistic interpretation of the probability concept and turning to a conceptualization of probability encompassing both the objectivistic and the subjectivistic view. This integration might also lead to a deeper understanding of the individual's conceptualization of uncertainty.

The CAUSE Research Group is supported in part by a member initiative grant from the American Statistical Association’s Section on Statistics and Data Science Education