Date of Award

5-2014

Embargo Period

1-26-2015

Degree Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Statistics

Advisor(s)

Joel Greenhouse

Abstract

The aim of the evidence based education movement is two-fold: (i) to determine the best practices from scientifically rigorous studies and (ii) to apply those best practices to educational decision making. To assist the adoption of evidence based practices by US educators and policy makers, Congress created the What Works Clearinghouse (WWC) with a mission to evaluate evidence about educational interventions and to disseminate information about best practices. The WWC synthesizes the results of education research and publishes these recommendations for use by educators and policy makers. Throughout its history, however, the evidence based education movement has struggled with the low quality of education research. For example, a common analytic error made by education researchers is that an experiment will be designed to randomize entire schools to treatment and control conditions, but then the experiment will be analyzed ignoring the grouped nature of the randomization. This error is well known to lead to invalid conclusions because it overstates the statistical significance of the treatment effect. The WWC chose to address this common error by attempting to remove the anti-conservative bias of these misspecified analyses by calculating a correction to the test statistic. In this thesis I investigate the properties of this correction and generalize it to a larger class of experimental designs. I find that: (i) there exists a correction that the WWC could feasibly calculate, (ii) the Hedges correction approaches this more general correction asymptotically, but (iii) for common experimental designs these corrections can be so conservative as to have nearly zero statistical power.
I illustrate this result with an example from the WWC's own recommendations, where a math curriculum was rated as having "Potentially Positive Effects" in 2004, "No Discernible Effects" in 2006, and "Potentially Positive Effects" in 2008, depending on whether a single experiment was (i) analyzed incorrectly, (ii) corrected with the WWC process, or (iii) re-analyzed with a reasonable model, respectively. I recommend that the WWC stop attempting to correct misspecified analyses, and instead work closely with researchers to improve the quality of analysis in education research.
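The misspecification described in the abstract can be sketched with a small simulation. This is not code from the thesis: the school counts, intraclass correlation (ICC), and the simple design-effect deflation of the t-statistic are illustrative assumptions, and the deflation is a simplified stand-in for the Hedges-style correction the WWC applies, not its exact formula.

```python
import numpy as np

def cluster_null_rejection_rates(n_schools=20, n_per_school=30, icc=0.2,
                                 n_sims=2000, seed=0):
    """Simulate no-effect trials that randomize whole schools, then analyze
    each one with a naive student-level t-test that ignores the clustering.
    Returns (naive rejection rate, design-effect-corrected rejection rate)."""
    rng = np.random.default_rng(seed)
    sigma_b = np.sqrt(icc)                  # between-school standard deviation
    sigma_w = np.sqrt(1.0 - icc)            # within-school standard deviation
    deff = 1.0 + (n_per_school - 1) * icc   # design effect (ICC assumed known)
    rej_naive = rej_corrected = 0
    for _ in range(n_sims):
        # each school contributes a shared random effect to its students
        school = rng.normal(0.0, sigma_b, n_schools)
        y = school[:, None] + rng.normal(0.0, sigma_w, (n_schools, n_per_school))
        treat = y[: n_schools // 2].ravel()   # students in "treated" schools
        ctrl = y[n_schools // 2 :].ravel()    # students in "control" schools
        n1, n2 = treat.size, ctrl.size
        # pooled two-sample t-statistic at the student level (the wrong model)
        sp2 = ((n1 - 1) * treat.var(ddof=1) +
               (n2 - 1) * ctrl.var(ddof=1)) / (n1 + n2 - 2)
        t = (treat.mean() - ctrl.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n2))
        rej_naive += abs(t) > 1.96            # large-sample 5% cutoff
        # simplified correction: deflate the t-statistic by sqrt(design effect)
        rej_corrected += abs(t) / np.sqrt(deff) > 1.96
    return rej_naive / n_sims, rej_corrected / n_sims
```

Under these assumptions the naive student-level test rejects the true null far more often than the nominal 5% (the design effect here is 1 + 29 × 0.2 ≈ 6.8), while deflating the statistic brings the Type I error back near its nominal level — the anti-conservative bias the WWC correction targets.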
