Literature Index

  • Author(s):
    Bakker, A.
    Year:
    2004
    Abstract:
    The present knowledge society requires statistical literacy: the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research presented in this thesis is a sequel to design research carried out by Cobb, McClain, and Gravemeijer (2003) and aims to contribute to an empirically grounded instruction theory for early statistics education in which educational computer tools are used. Computer software allows users to interact dynamically with large data sets and to explore different representations in ways that are impossible by hand. The computer tools used were the Statistical Minitools (Cobb et al., 1997), which were designed for middle school students. One important end goal of instruction was that students would gain understanding of sampling and learn to use 'shape' to reason about distributions. In line with the theory of Realistic Mathematics Education, a central tenet was that learning mathematics should be a meaningful activity. The research questions were: 1. How can students with little statistical background develop a notion of distribution? 2. How does the process of symbolizing evolve when students learn to reason about distribution? In the latter question, 'symbolizing' refers to the reflexive process of making symbols and mentally constructing the objects they represent. The design research consisted of five cycles of three phases: a design phase, a teaching experiment, and a retrospective analysis. Prior to these cycles, historical and didactical phenomenological analyses (Freudenthal, 1991) were carried out, as well as exploratory interviews. The historical study gave rise to hypotheses that were partially tested in the teaching experiments carried out in grades 7 and 8 (12 to 14 years old). In the design phase, a so-called hypothetical learning trajectory (Simon, 1995) was formulated, which was tested and revised in the subsequent cycles of design research. The recurring activity of discussing growing samples proved useful for supporting reasoning about distribution and sampling. For the analyses of students' learning, a method similar to the constant comparative method (Strauss & Corbin, 1998) was used. It turned out that students conceived distributions as 'bumps' consisting of a small group of low values, a large group of 'average' values, and a small group of high values. Peirce's semiotics proved useful for analyzing students' process of symbolizing, in particular his notions of hypostatic abstraction and diagrammatic reasoning. Hypostatic abstraction refers to the transition from a predicate to a new abstract object (from "the dots are spread out" to "the spread is large"). Diagrammatic reasoning consists of three steps: making a diagram, experimenting with it, and reflecting on the results. The research shows the importance of letting students make their own diagrams and discuss them. The computer tools seemed most useful during the experimentation phase. Remarkably, the best diagrammatic reasoning occurred only during class discussions without computers around. One recommendation is to invest in computer tools only if all educational factors, such as teaching, end goals, instructional activities, tools, and assessment, are tuned to each other.
  • Author(s):
    The Design-Based Research Collective
    Year:
    2003
    Abstract:
    The authors argue that design-based research, which blends empirical educational research with the theory-driven design of learning environments, is an important methodology for understanding how, when, and why educational innovations work in practice. Design-based researchers' innovations embody specific theoretical claims about teaching and learning, and help us understand the relationships among educational theory, designed artifact, and practice. Design is central in efforts to foster learning, create usable knowledge, and advance theories of learning and teaching in complex settings. Design-based research also may contribute to the growth of human capacity for subsequent educational reform.
  • Author(s):
    Konold, C.
    Editors:
    Lovett, M. C., & Shah, P.
    Year:
    2007
    Abstract:
    In this chapter, I describe ideas underlying the design of a software tool we developed for middle school students. The tool, TinkerPlots, allows students to organize data to help them see patterns and trends, much in the spirit of visualization tools such as Data Desk (Data Description, Inc.). But we also intend teachers and curriculum designers to use it to help students build a solid conceptual understanding of what statistics are and how we can use them.
  • Author(s):
    Anderson-Cook, C. M.
    Year:
    1998
    Abstract:
    A project suitable for use as a first and last assignment given in an introductory experimental design course is outlined, and its implementation is discussed.
  • Author(s):
    Cohen, S., Smith, G., Chechile, R., & Cook, R.
    Editors:
    Brunelli, L., & Cicchitelli, G.
    Year:
    1993
    Abstract:
    ConStatS is used in almost all the discipline-specific introductory statistics courses taught at Tufts. Even though most students seem able to use the software for an extended period of time without a great deal of direction, most instructors provide assignments that provoke open-ended use of the software. These assignments usually contain short essay questions that address the kinds of conceptual issues captured in ConStatS experiments. During the past year, ConStatS has been at the center of a large-scale, FIPSE-funded project for evaluating the effectiveness of curricular software. The goal of the evaluation was to assess whether ConStatS in particular, and curricular software in general, could help students develop a deeper conceptual understanding of statistics. The evaluation took place in the fall 1992 and spring 1993 semesters. Classes at three universities participated. In total, seven classes with 303 students participated as control groups.
  • Author(s):
    Konold, C.
    Year:
    1994
    Abstract:
    Few would question the assertion that the computer has changed the practice of statistics. Fewer still would argue with the claim that the computer, so far, has had little influence on the practice of teaching statistics; in fact, many still claim that the computer should play no significant role in introductory statistics courses. In this article, I describe principles that influenced the design of data analysis software we recently developed specifically for teaching students with little or no prior data analysis experience. I focus on software capabilities that should provide compelling reasons to abandon the argument, still heard, that introductions to statistics are only made more difficult by simultaneously introducing students to the computer and a new software tool. The microcomputer opens up to beginning students means of viewing and querying data that have the potential to get them quickly beyond the technical aspects of data analysis to the meat of the activity: asking interesting questions and constructing plausible answers. The software I describe, DataScope, was designed as part of a four-year project funded by the National Science Foundation to create materials and software for teaching data analysis at the high school and introductory college level. We designed DataScope to fill a gap we perceived between professional analysis tools (e.g., StatView and Data Desk), which were too complex for use in introductory courses, and existing educational software (e.g., Statistics Workshop, Data Insights), which was not quite powerful enough to support the kind of analyses we had in mind. Certainly, DataScope is not the educational tool we might dream of having (see Biehler, 1994), but it does give students easy access to considerable analysis power.
  • Author(s):
    Clements, D. H., & Battista, M. T.
    Editors:
    Kelly, A. E., & Lesh, R. A.
    Year:
    2000
    Abstract:
    The Handbook of Research Design in Mathematics and Science Education is based on results from an NSF-supported project (REC 9450510) aimed at clarifying the nature of principles that govern the effective use of emerging new research designs in mathematics and science education. A primary goal is to describe several of the most important types of research designs that:
    * have been pioneered recently by mathematics and science educators;
    * have distinctive characteristics when they are used in projects that focus on mathematics and science education; and
    * have proven to be especially productive for investigating the kinds of complex, interacting, and adapting systems that underlie the development of mathematics or science students and teachers, or the development, dissemination, and implementation of innovative programs of mathematics or science instruction.
    The volume emphasizes research designs that are intended to radically increase the relevance of research to practice, often by involving practitioners in the identification and formulation of the problems to be addressed or in other key roles in the research process. Examples of such research designs include teaching experiments, clinical interviews, analyses of videotapes, action research studies, ethnographic observations, software development studies (or, more generally, curriculum development studies), and computer modeling studies. The book's second goal is to begin discussions about the nature of appropriate and productive criteria for assessing (and increasing) the quality of research proposals, projects, or publications that are based on the preceding kinds of research designs. A final objective is to describe such guidelines in forms that will be useful to graduate students and others who are novices in the fields of mathematics or science education research. The NSF-supported project from which this book developed involved a series of mini-conferences in which leading researchers in mathematics and science education developed detailed specifications for the book and planned and revised the chapters to be included. Chapters were also field-tested and revised during a series of doctoral research seminars sponsored by the University of Wisconsin's OERI-supported National Center for Improving Student Learning and Achievement in Mathematics and Science. A Web site with additional resource materials related to this book can be found at http://www.soe.purdue.edu/smsc/lesh/
  • Author(s):
    Manor, H., & Ben-Zvi, D.
    Year:
    2016
  • Author(s):
    Broers, N. J.
    Year:
    2007
    Abstract:
    The theory of statistics is composed of highly abstract propositions that are linked in multiple ways. Both the abstraction level and the cumulative nature of the subject make statistics a difficult subject. A diversity of didactic methods has been devised to aid the student in the effort to master statistics, one of which is the method of propositional manipulation (MPM). Based on this didactic method, a corresponding assessment method has been developed. Basically, in using MPM for assessment purposes, the student is instructed to construct arguments using subsets of elementary propositions. In effect, the assessment procedure requires the student to display knowledge of the interrelationships between the propositions in a particular subset. Analysis of the student responses allows for scoring both purely propositional knowledge and conceptual understanding. In this paper we discuss research on the effectiveness of this assessment method relative to assessment of conceptual understanding using concept mapping.
  • Author(s):
    Bakker, A., Kent, P., Noss, R., &amp; Hoyles, C.
    Editors:
    Rossman, A., &amp; Chance, B.
    Year:
    2006
    Abstract:
    Our interest in the techno-mathematical literacies employees need in their jobs has led us to do case studies in different industrial sectors and to design learning opportunities for improving employees' techno-mathematical literacies. We conceptualise the learning opportunities not as training or transfer, but as forms of "boundary crossing" between employees from a company and us as researchers. Two examples are given in which packaging managers explore and discuss a realistic problem using TinkerPlots, an educational software tool. The results emphasise how important it is to allow managers to bring their ideas and concerns to the problem situation so they can connect these to statistical theory that is relevant in statistical process control (SPC). This approach is contrasted with SPC training we have observed in two industrial sectors.
