Literature Index

Displaying 241–250 of 3326
  • Author(s):
    Weldon, K. L.
    Editors:
    Phillips, B.
    Year:
    2002
    Abstract:
    Statistical software has made traditional statistical calculations accessible to almost anyone, but it has also stimulated new methods that are usually reserved for advanced courses. In this paper I argue for inclusion in the first course of non-parametric smoothing, density estimation, coplots, simulation, the bootstrap, time series forecasting, and plots of multivariate data. It is argued that the logic underlying these techniques is simpler and more useful than the logic underlying the inference usually included in first service courses in statistics.
  • Author(s):
    Gatti, G. G., & Harwell, M.
    Year:
    1998
    Abstract:
    Statistics and research design textbooks routinely highlight the importance of a priori estimation of power in empirical studies. Unfortunately, many of these textbooks continue to rely on difficult-to-read charts to estimate power. That these charts can lead students to estimate power incorrectly will not surprise those who have used them, but what is surprising is that textbooks continue to employ these charts when computer software for this purpose is widely available and relatively easy to use. The use of power charts is explored, and computer software that can be used to teach students to estimate power is illustrated using the SPSS and SAS data analysis programs.
  • Author(s):
    Larsen, M. D.
    Editors:
    Stephenson, W. R.
    Year:
    2006
    Abstract:
Lecture is a common presentation style that gives instructors a lot of control over topics and time allocation, but can limit active student participation and learning. This article presents some ideas to increase the level of student involvement in lecture. The examples and suggestions are based on the author's experience as a senior lecturer for four years observing and mentoring graduate student instructors. The ideas can be used to modify or augment current plans and preparations to increase student participation. The ideas and examples will be useful as enhancements to current efforts to teach probability and statistics. Most suggestions will not take much class time and can be integrated smoothly into current preparations.
  • Author(s):
    Green, K. E.
    Year:
    1993
    Abstract:
This study approached the investigation of attitudes toward statistics from the perspective of Rosenberg and Hovland's (1960) hierarchical, multicomponent model. In this model, cognition, affect, and behavior are considered interrelated first-order factors, with attitude as a single second-order factor. In the present study, affective, cognitive, and behavioral components of attitude were assessed. Two other assessment tools were also administered for comparison purposes (Wise's ATA and a semantic differential measure). Data were gathered from 2 classes (n = 47) attended by master's and doctoral students in education, social work, and speech communication. Results suggest the strongest association is between the affective and cognitive components, but the greatest temporal stability was found for the behavioral component. Consistency in ratings of affect and cognition was not predictive of behavior, nor was locus of attitude formation.
  • Author(s):
White, P., & Gorard, S.
    Year:
    2017
    Abstract:
Recent concerns about a shortage of capacity for statistical and numerical analysis skills among social science students and researchers have prompted a range of initiatives aiming to improve teaching in this area. However, these projects have rarely re-evaluated the content of what is taught to students and have instead focussed primarily on delivery. The emphasis has generally been on increased use of complex techniques, specialist software and, most importantly in the context of this paper, a continued focus on inferential statistical tests, often at the expense of other types of analysis. We argue that this ‘business as usual’ approach to the content of statistics teaching is problematic for several reasons. First, the assumptions underlying inferential statistical tests are rarely met, meaning that students are being taught analyses that should only be used very rarely. Secondly, all of the most common outputs of inferential statistical tests – p-values, standard errors and confidence intervals – suffer from a similar logical problem that renders them at best useless and at worst misleading. Eliminating inferential statistical tests from statistics teaching (and practice) would avoid the creation of another generation of researchers who either do not understand, or knowingly misuse, these techniques. It would also have the benefit of removing one of the key barriers to students' understanding of statistical analysis.
  • Author(s):
Schwartz, D. L., Goldman, S. R., Vye, N. J., Barron, B. J., & Cognition and Technology Group at Vanderbilt
    Editors:
    Lajoie, S. P.
    Year:
    1998
    Abstract:
    In this chapter, we present results from three studies that examined and supported 5th- and 6th-grade children's evolving notions of sampling and statistical inference. Our primary finding has been that the context of a statistical problem exerts a profound influence on children's assumptions about the purpose and validity of a sample. A random sample in the context of drawing marbles, for example, is considered acceptable, whereas a random sample in the context of an opinion survey is not. In our design of instructional and assessment materials, we have tried to acknowledge and take advantage of the role that context plays in statistical understanding.
  • Author(s):
    Ben-Zvi, D.
    Year:
    1999
  • Author(s):
    Garfield, J. B., & delMas, R. C.
    Editors:
    Maher, C. A., Goldin, G. A., & Davis, R. B.
    Year:
    1989
    Abstract:
    Research on misconceptions of probability indicates that students' conceptions are difficult to change. A recent review of concept learning in science points to the role of contradiction in achieving conceptual change. A software program and evaluation activity were developed to challenge students' misconceptions of probability. Support was found for the effectiveness of the intervention, but results also indicate that some misconceptions are highly resistant to change.
  • Author(s):
    Lock, R. H.
    Editors:
    Vere-Jones, D., Carlyle, S., & Dawkins, B. P.
    Year:
    1991
    Abstract:
Brief descriptions of several model courses developed by participants in a series of Statistics in the Liberal Arts Workshops (SLAW).
  • Author(s):
Bakker, A., Kent, P., Noss, R., & Hoyles, C.
    Year:
    2009
    Abstract:
In manufacturing industry, many employees need to interpret and communicate statistical information to monitor and improve production processes. Often the information is reduced to the form of numerical measures, on the logic that numbers are a convenient and understandable type of information to pass among the diverse groups of employees that make up a manufacturing operation. We investigated by means of interviews and observation how several numerical measures, 'process capability indices', were used in an automotive factory and how employees were trained to use them. We found that the typical introduction to the measures deployed statistical and algebraic symbolism as well as laborious manual calculations that did not appear to support employees' understanding of the underlying mathematical relationships. These measures therefore failed to be 'boundary objects' - artifacts that inhabit different social worlds and satisfy the informational requirements of each. The goal of our subsequent design-based research was to design a representation of the process capability indices that would be easier to engage with than the existing formal symbolism used in shop floor calculations and in training. We did this by re-presenting relevant mathematical relationships in computer tools - technology-enhanced boundary objects (TEBOs) - developed in collaboration with company trainers. To evaluate our interaction with three trainers and 37 trainees in three courses in two factories, and the impact of the computer tools on practice, we followed the computer tools' trajectory from the stage of co-design with the original car factory through to the stage at which the tools were used by factories beyond this research project. The evaluation points to the importance of aligning statistical and workplace norms and meanings, and gives illustrations of how the tools facilitated communication between employees.

The CAUSE Research Group is supported in part by a member initiative grant from the American Statistical Association’s Section on Statistics and Data Science Education