# Report

• ### Problem systems in stochastics teaching: Didactic concepts for material development and teacher in-service training in the LEDIS project

The LEDIS project aims at developing stochastics as an example of application-oriented school mathematics. Parallel activities on a theoretical model level and with real random phenomena are intended to develop applications hand in hand with theory, in the sense of contributing to mutual explanation rather than applying a previously developed mathematical theory to examples. To put this guiding thesis into practice in projects with teachers, the didactic concept of a (mathematical) problem system for building up the back-up text has been developed in the project. This report presents some of the results of the project, which includes the task of developing and testing models for the in-service training of teachers.

• ### ChancePlus: A computer-based curriculum for probability and statistics

ChancePlus is a three-year project to research and develop effective methods for teaching introductory probability and statistics, especially at the high-school level. In this report, we summarize our activities during the second year of the project. Our efforts this year were concentrated on developing and revising instructional units in probability and statistics and on the development of our statistical analysis program, DataScope. We also describe progress in our research on student understanding of probability and statistics and in the development of items to test for conceptual understanding.

• ### Probability estimation and the use and neglect of base-rate information

A number of studies have reported a strong tendency to ignore base-rate information in favor of individuating information, except when the former can readily be incorporated into a causal schema. In the present study, students in eight undergraduate classes were given problems in which the base-rate information was (1) either causal or noncausal and (2) either incongruent or congruent with the individuating information. In addition, twelve subjects were interviewed as they attempted to solve several versions of one of the problems. We found (1) strong individual differences in the perceived importance of base-rate information and even in how the probability estimation task itself was interpreted, (2) little if any effect of the causality manipulations employed by Ajzen (1977) and Tversky and Kahneman (1980), and (3) greater use of base-rate information congruent with the individuating information than of base-rate information which is incongruent. The interview data indicate that it is difficult to determine from the answer alone whether or not the subject thought that the base-rate information was relevant. These data also suggest that subjects have different strategies for dealing with probability estimation problems. One of these we characterize as not only non-Bayesian, but also non-probabilistic.
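The normative benchmark against which base-rate neglect is measured is Bayes' rule, which weights the individuating evidence by the prior base rate. A minimal sketch, using the illustrative numbers of the classic cab problem rather than data from this study:

```python
def posterior(base_rate, hit_rate, false_alarm_rate):
    """Bayes' rule: combine a base rate (prior) with individuating
    evidence (a witness report) to get P(hypothesis | evidence)."""
    marginal = base_rate * hit_rate + (1 - base_rate) * false_alarm_rate
    return base_rate * hit_rate / marginal

# Illustrative numbers: 15% of cabs are blue; the witness identifies
# cab color correctly 80% of the time.
p = posterior(0.15, 0.80, 0.20)   # roughly 0.41, not 0.80
```

Note that with an uninformative base rate of 0.5 the posterior equals the witness's accuracy of 0.80, which is exactly the answer subjects give when they neglect the base rate.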

• ### Data analysis: An adjunct to mathematical statistics at Oberlin College

Are SAT scores useful predictors of success in college? I led a group of mathematics majors at Oberlin College in an exploration of this question as the core of a one-credit course, entitled Data Analysis, last year. This course, MATH 337, is an adjunct to the standard, junior-level, two-semester sequence in probability and mathematical statistics that we offer each year at Oberlin. Unlike most statistics courses for mathematics majors, the Data Analysis course allows - indeed, it forces - students to "get their hands dirty" exploring real data and trying to answer real questions. Each year I select a set of data, such as the SAT data, to serve as the central focus of the course. I believe that it is imperative that students learn something of how statistical theory is applied in practice, and I try to show this side of statistics in the courses I teach at all levels. However, it is particularly difficult to cover much material on applied statistics while at the same time covering the many important topics in the mathematical statistics course - probability, random variables, functions of random variables, expectation, the central limit theorem, estimators, confidence intervals, hypothesis testing, and others - that are fundamental to the discipline and are an essential foundation for advanced (graduate) training in statistics. The Data Analysis course provides a workable solution to this problem.
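A first pass at a question like "do SAT scores predict college success" is typically a least-squares fit of a performance measure on the scores. A minimal sketch with hypothetical numbers (not the Oberlin data described above):

```python
import statistics

# Hypothetical (SAT score, first-year GPA) pairs -- illustrative only.
sat = [1050, 1180, 1260, 1310, 1400, 1480]
gpa = [2.6, 2.9, 3.1, 3.0, 3.4, 3.6]

# Ordinary least squares for a single predictor.
mean_x, mean_y = statistics.fmean(sat), statistics.fmean(gpa)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(sat, gpa))
         / sum((x - mean_x) ** 2 for x in sat))
intercept = mean_y - slope * mean_x

predicted = intercept + slope * 1200   # predicted GPA for a 1200 SAT
```

Whether such a slope is meaningfully different from zero is exactly where the adjunct course meets the mathematical statistics sequence (estimators, confidence intervals, hypothesis tests).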

• ### Forecasting/Time Series Analysis: An Introduction to Applied Statistics for Mathematics Students

A course in time series analysis offers a number of unique opportunities for introducing mathematically oriented students to the applications of statistics. Characteristics of such a course and its suitability as a "first course" in statistics are discussed.

• ### A rationale for teaching probability and statistics in primary and secondary schools: A report of the committee on probability and statistics

This document explores some reasons why statistics and probability are appropriate topics for primary and secondary schools.

• ### Technology and data: Models and analysis

This is the report of a working group on Technology and Statistics that explores the role of technology in improving how students model and analyze data to understand the world around them. The paper first addresses curricular goals for students at different grade levels. A section on modeling issues examines data analysis as an investigative process in which students construct, examine, and interpret models of the world. Other sections of the paper describe the unique capabilities technology offers for teaching and learning statistics, provide examples of existing technology designed to help students develop important concepts, and offer recommendations for technology to be developed over the next decade. Research issues and teacher preparation concerns are also addressed.

• ### Report on workshop on statistical education

Thirty-nine statisticians gathered for a workshop on statistical education in Iowa City, Iowa, June 18-20, 1990. These participants represented universities, colleges, consulting firms, business, and industry. As we prepared for the workshop, most of the 39 participants, and several others not attending, wrote papers on some aspect of statistical education, the majority of which concerned a first course in statistics. As a group, we recognized several shortcomings of science and mathematics education, including statistical education.

• ### ELASTIC and reasoning under uncertainty: Final Report to NSF

Our analysis identified problems both with the subject matter of statistics (e.g., multiple levels of abstraction, difficulty mapping statistical representations to real-world situations) and with its pedagogy (which typically does little to help concretize abstract concepts or illuminate the mapping process). Drawing on research in education, cognitive psychology, and statistical computing, we designed, implemented, and pilot-tested software (ELASTIC) and a curriculum (Reasoning Under Uncertainty) to address these problems. Our approach was successful in many of the problem areas identified above; in addition, our experiences in classrooms helped us better understand the difficulties students have in understanding and applying statistical reasoning.

• ### Hands on data: Direct-manipulation environments for data organization and analysis

Current curricular thinking in mathematics, science, and computing displays a recurring theme: the value of working with data and the importance of learning the skills and concepts associated with such work. Recent statements issued by the National Council of Teachers of Mathematics (NCTM, 1987) and the Mathematical Sciences Education Board (Ralston, 1988) advocate a sharply increased emphasis on data analysis in school mathematics at all levels. The concern with computer literacy of the early 1980s is maturing into a discussion of the kinds of "information studies" that are needed to prepare students for a society in which information technologies play an essential and ever-expanding role (White, 1987). The call for the use of real data in the natural and social sciences goes back considerably farther (Hawkins, 1964; Morrison, 1964; Taba, 1967), but has recently gained new impetus from technological advances which multiply the potential for powerful, realistic investigation by science students (Hawkins, Brunner, et al., 1987; Tinker, 1987). In support of curricular developments such as these, new technologies offer a potential that is largely untapped. The large bitmapped screens and fast processors which are available on today's new workstations, and will be available on the school computers of the middle to late 1990s, make possible a whole new class of tools for working with data: tools whose transparency and rich interactivity can support qualitatively new styles of inquiry and bring unprecedented analytic power to students of all ages. We have designed and partially prototyped an exemplar of this new class: a highly visual, highly interactive environment for creating, organizing, exploring, and analyzing "attribute data" -- the kinds of data that are used in statistics and many of the sciences, and which conventional database systems are designed to store. The environment achieves a striking combination of simplicity, directness, power, and flexibility.
We are truly excited by the potential for tools of this kind to support a new level of data analysis and theory building in mathematics and the sciences. More than tools are needed, however. Essential to all of these curricular trends, it seems, is a fundamental set of concepts and skills about which more needs to be known.