Update on the Milwaukee School Choice Evaluation Dust-Up
My post of April 1 criticizing Diane Ravitch has caused quite a stir. In that post and in this one, I defend and explain the work of my research team, but I want to be clear that, in doing so, I speak only for myself.
To briefly review, I admonished Ravitch for repeating factual inaccuracies regarding my team’s school voucher evaluations, relying on secondary sources for her information, and mischaracterizing our scientific research methodologies, which she apparently does not understand. Kevin Welner of the National Education Policy Center (NEPC) has been especially forceful in objecting to my post in text posted on Ravitch’s blog. Here I respond to his charges.
First, Welner argues that I owe Ravitch and NEPC an apology because the initial version of our Milwaukee Parental Choice Program (MPCP) educational attainment study was the source of one of Ravitch’s factual errors, and our error was merely repeated by the person NEPC hired to review our study. Since Ravitch used that review to source her claim, he argues, she (and NEPC) are not responsible for the mistake.
Specifically, we are discussing the claim that 75% of the students who started in the voucher program in 9th grade were not in the program four years later. That was an error in the initial draft of our report which, Welner points out, was quickly corrected to 56% in a second and final version of the report identified as “Updated and Corrected”. Welner claims that the initial version, with the incorrect figure, was the one sent to their reviewer of our study, Casey Cobb, and that “Nobody had thought to go back and see whether Wolf or his colleagues had changed important numbers in the SCDP report.”
Welner is obviously mistaken on that last point. Someone did think to go back and access the updated report. Casey Cobb did. We know this because, after mentioning the incorrect 75% figure in his executive summary and on page 2 of his review, on page 4 Cobb writes:
“Notably, more than half the students (56%) in the MPCP 9th grade sample were not in the MPCP four years later.”
Cobb could only have gotten the correct 56% figure from the updated and corrected report, which means that he knew the 75% figure was outdated and incorrect, but he mentioned that number as well, even though it clearly conflicted with the 56% figure. People make mistakes. We made a mistake in the form of the initial 75% program attrition figure. Welner made a mistake in claiming with certainty that “Nobody had thought to go back and see” whether our report had been updated. Cobb made a mistake in failing to delete the incorrect program attrition figure from his review after he had taken the correct 56% figure from the “Updated and Corrected” version of our report. And Welner and his colleagues made a further mistake in not catching the inconsistency between the 75% and 56% figures in Cobb’s review before they published and publicized it. The big question is whether people correct their mistakes after they recognize them. We did, because that’s what scholars do. I expect that the NEPC will issue an “Updated and Corrected” version of Cobb’s review promptly.
While Casey Cobb is correcting his review of our report, he should also revise his charge on page 4 that, “Curiously, it [meaning the report] fails to state how many program-switchers there were, when they switched and in which direction, and how many graduated.” True, we did not provide those details in the report, but we referred readers to yet another publication of ours that does. It is even called “Going Public: Who Leaves a Large, Longstanding, and Widely Available Urban Voucher Program?” It was published in the prestigious American Educational Research Journal, the flagship journal of the American Educational Research Association, more than a year ago. Its mere existence definitively refutes Diane Ravitch’s charge that “Nobody knows” what happened to the students in our study who left the voucher program. Not only do we know, we published an entire article about it that she and her colleagues really should read.
In a sense, the dust-up over the “75% versus 56%” number and the false charge that nobody knows what happened to students who left the MPCP during our study was both avoidable and immaterial. Obviously it could have been avoided if we hadn’t initially reported the incorrect percentage of attriters. It also could have been avoided if Diane Ravitch had actually read our updated report or, better yet, our published article on program leavers, before issuing the charge in her March 29 blog post. Instead, it is obvious that she relied solely on Cobb’s review and never read our report before criticizing it. My original point was that this is not something that serious scholars do.
The difference between the 75% and 56% figures is largely immaterial because our “intention-to-treat” analysis exclusively measures the effect of starting high school in the voucher program on future levels of educational attainment, regardless of how long a student stayed in the program. Okay, let’s all say this together: “Program attrition has no effect on the internal validity of intention-to-treat analyses of program effects.” None. Period. Anyone who doesn’t accept that doesn’t understand the basics of program evaluation and shouldn’t be discussing studies that employ such scientific methodologies.
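The logic of intention-to-treat can be shown in a few lines of code. The sketch below uses entirely made-up numbers (cohort sizes, graduation rates, and a 56% attrition share are illustrative only, not the study's actual data or analysis); its point is simply that every student is counted in the group where they started, so leaving the program cannot change which column a student lands in.

```python
import random

random.seed(0)

# Hypothetical cohort: 1,000 students start 9th grade in the voucher
# program and 1,000 in a comparison group. All rates are invented
# for illustration; this is not the Milwaukee study's data.
N = 1000
voucher_start = [{"graduated": random.random() < 0.70} for _ in range(N)]
comparison = [{"graduated": random.random() < 0.62} for _ in range(N)]

# Suppose 56% of the voucher starters leave the program before 12th grade.
for student in random.sample(voucher_start, int(0.56 * N)):
    student["left_program"] = True

# Intention-to-treat: classify every student by where they STARTED,
# regardless of whether (or when) they left. Attriters remain in the
# voucher column, so attrition does not move anyone between groups.
def itt_grad_rate(students):
    return sum(s["graduated"] for s in students) / len(students)

effect = itt_grad_rate(voucher_start) - itt_grad_rate(comparison)
print(f"ITT graduation-rate difference: {effect:+.3f}")

# Note that the denominator is still the full starting cohort:
assert len(voucher_start) == N
```

Because the comparison is always "everyone who started in group A" versus "everyone who started in group B," the estimate answers the question the study actually asked: what is the effect of starting high school in the voucher program?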
So, these are the facts: First, 56%, and not 75%, of MPCP 9th graders left the program before the end of 12th grade. Second, even in the face of substantial program attrition, students who were in the MPCP in 9th grade in 2006 graduated from high school, enrolled in college, and persisted in college at rates higher than similar students in Milwaukee Public Schools (MPS). Third, at the end of the study, students who started the study in the MPCP had higher reading scores than comparable MPS students. Fourth, the researchers carefully tracked the students who left the Milwaukee voucher program and even published an article in the top education journal about it. Unfortunately, I worry that some people are determined to avoid acknowledging these facts.