Literature Index

Displaying 1351 - 1360 of 3326
  • Author(s):
    Gilovich, T., Griffin, D.
    Year:
    2002
    Abstract:
In the late 1960s and early 1970s, a series of papers by Amos Tversky and Daniel Kahneman revolutionized academic research on human judgement. The central idea of the "heuristics and biases" program - that judgement under uncertainty often rests on a limited number of simplifying heuristics rather than extensive algorithmic processing - soon spread beyond academic psychology, affecting theory and research across a range of disciplines including economics, law, medicine, and political science. The message was revolutionary in that it simultaneously questioned the descriptive adequacy of ideal models of judgement and offered a cognitive alternative that explained human error without invoking motivated irrationality. The initial papers and a variety of related work were collected in a 1982 volume, Judgment under Uncertainty: Heuristics and Biases (Kahneman, Slovic, & Tversky, 1982). In the time since, research in the heuristics and biases tradition has prospered on a number of fronts, each represented by a section of the current volume. In this opening chapter, we wish to put the heuristics and biases approach in historical context and discuss some key issues that have been raised since the 1982 book appeared.
  • Author(s):
    Cox, M. D.
    Year:
    2004
    Abstract:
    Faculty learning communities create connections for isolated teachers, establish networks for those pursuing pedagogical issues, meet early-career faculty expectations for community, foster multidisciplinary curricula, and begin to bring community to higher education.
  • Author(s):
    Moore, D. S.
    Editors:
    Hoaglin, D. C., Moore, D. S.
    Year:
    1992
    Abstract:
    A wide gap separates statistics teaching from statistical practice. This volume presents a view of statistics that reflects statistical practice as well as changing emphases in statistics research and in the approach that statisticians take to their subject.
  • Author(s):
    Hoerl, R. W.
    Year:
    1997
    Abstract:
The bulk of this article was originally presented as a commentary on Neil Ullman's paper "Statistical or Quantitative Thinking as a Fundamental Intelligence", presented at the 1995 Joint Statistical Meetings in Orlando. (Editors' Note: A condensed version of N. Ullman's paper appeared in Vol. 2, No. 1-Winter 1996 of this Newsletter.) The purpose of including it in this publication is to suggest that introductory statistics courses must be radically redesigned, not incrementally improved, if statistics is to assume its rightful place in US society. It is further argued that how to implement this radical revision is basically known; it simply requires combining several suggestions which have been made previously by recognized researchers in the field. While these do not appear radical individually, combining them would result in an introductory course virtually unrecognizable by today's standards.
  • Author(s):
Reaburn, R.
    Year:
    2014
    Abstract:
This study aimed to gain knowledge of students’ beliefs and difficulties in understanding p-values, and to use this knowledge to develop improved teaching programs. The study took place over four consecutive teaching semesters of a one-semester unit; the findings from each semester were used to inform the instructional design for the following semester, which included the introduction of hypothetical probabilistic reasoning using a familiar context and the use of alternative representations. The students were also encouraged to write about their work. As the interventions progressed, a high proportion of students successfully defined and used p-values in Null Hypothesis Testing procedures.
  • Author(s):
    Roiter, K., & Petocz, P.
    Year:
    1996
    Abstract:
    This paper presents a framework for the design and analysis of introductory statistics courses. This framework logically precedes the usual process of putting together the syllabus for an introductory statistics course. Four approaches, or paradigms, of statistics teaching are put forward, together with tools for deciding which blend of approaches is most useful in any particular case. These approaches do not correspond to the two traditional schools of thought in statistics education -- probability-driven or data-driven -- but rather constitute a new approach.
  • Author(s):
Hall, M. R., & Rowell, G. H.
    Year:
    2008
    Abstract:
    This paper describes 27 National Science Foundation supported grant projects that have innovations designed to improve teaching and learning in introductory statistics courses. The characteristics of these projects are compared with the six recommendations given in the Guidelines for Assessment and Instruction in Statistics Education (GAISE) College Report 2005 for teaching an introductory course in statistics. Through this analysis, we are able to see how NSF-supported introductory statistics education projects during the last decade achieve the GAISE ideals. Thus, materials developed from many of these projects provide resources for first steps in implementing GAISE recommendations.
  • Author(s):
    Chance, B. L.
    Year:
    1999
    Abstract:
    The recent statistics education reform movement has advocated the adoption of many supplements to the introductory statistics course. These include hands-on activities, extensive use of technology, student projects, reflective writing, oral presentations, collaborative learning, and case studies. Combined with a full curriculum of topics for a variety of majors, this appears to be a daunting wish list. This paper offers some suggestions, based on experience at a small university, as to how to integrate many of these techniques, allowing them to build on and complement each other. Benefits and tradeoffs of implementing these techniques will be discussed, including issues of time commitment from the perspective of both students and instructors.
  • Author(s):
    Cobb, G. W.
    Year:
    1987
    Abstract:
    This review grows out of a strong conviction on three points: 1. Statistics is fundamentally and primarily concerned with analyzing real data. 2. Data analysis, including inference, is both intellectually challenging and intrinsically interesting. 3. Until recently, most authors of introductory statistics textbooks have managed to do a superb job of concealing from their readers the truth of the first two points. Fortunately, the last decade has seen the arrival of a number of innovative introductory textbooks, so I now find it much more reasonable than in the past to apply high standards in judging an elementary book. In preparing this review, I have tried to present these standards systematically; I use them as an organizing frame for comparing 11 new books (or new editions) with 5 favorites from the past 10 years.
  • Author(s):
    Suich, R. & Turek, R.
    Editors:
    Goodall, G.
    Year:
    2003
    Abstract:
The notion of independence between two nominal variables is typically introduced through the use of chi-square analysis of contingency tables, while the topic of prediction of one nominal variable from a second nominal variable using optimal prediction to the mode is often omitted. Through the use of a questionnaire, this article indicates that there is considerable confusion among students on the difference between the concepts of independence and prediction, and remedies are suggested.


The CAUSE Research Group is supported in part by a member initiative grant from the American Statistical Association’s Section on Statistics and Data Science Education.