False Claim on Drill and Kill

The Gates Foundation is funding a $45 million project to improve measures of teacher effectiveness.  As part of that project, researchers are collecting information from two standardized tests as well as surveys administered to students and classroom observations captured by video cameras in the classrooms.  It’s a big project.

The initial round of results was reported last week, with information from the student survey and standardized tests.  In particular, the report described the relationship between classroom practices, as observed by students, and value-added on the standardized tests.

The New York Times reported on these findings Friday and repeated the following strong claim:

But now some 20 states are overhauling their evaluation systems, and many policymakers involved in those efforts have been asking the Gates Foundation for suggestions on what measures of teacher effectiveness to use, said Vicki L. Phillips, a director of education at the foundation.

One notable early finding, Ms. Phillips said, is that teachers who incessantly drill their students to prepare for standardized tests tend to have lower value-added learning gains than those who simply work their way methodically through the key concepts of literacy and mathematics. (emphasis added)

I looked through the report for evidence that supported this claim and could not find it.  Instead, the report actually shows a positive correlation between student reports of “test prep” and value-added on standardized tests, not a negative correlation as the statement above suggests.  (See, for example, Appendix 1 on p. 34.)

The statement “We spend a lot of time in this class practicing for [the state test]” has a correlation of 0.195 with the value-added math results.  That is about the same relationship as “My teacher asks questions to be sure we are following along when s/he is teaching,” which is 0.198.  And both are positive.

It’s true that the correlation for “Getting ready for [the state test] takes a lot of time in our class” is weaker (0.103) than for other items, but it is still positive.  That just means that test prep may contribute less to value-added than other practices; it does not support the claim that “teachers who incessantly drill their students to prepare for standardized tests tend to have lower value-added learning gains…”
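To make the arithmetic concrete, here is a minimal sketch in Python using simulated data (not the actual MET dataset) that builds two practice measures correlating positively with value-added at roughly the reported strengths.  Both correlations come out positive; the weaker one is merely weaker, not negative.

```python
# Minimal sketch with simulated data (NOT the MET dataset): two practice
# measures constructed to correlate positively with value-added at roughly
# the strengths reported in the appendix (about 0.195 and 0.103).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # large sample so the simulated correlations are stable

value_added = rng.normal(size=n)

# Loadings chosen so r is about 0.195 and 0.103 against unit-variance noise.
practicing = 0.199 * value_added + rng.normal(size=n)  # "practicing for the state test"
prep_time = 0.104 * value_added + rng.normal(size=n)   # "getting ready takes a lot of time"

r_practicing = np.corrcoef(practicing, value_added)[0, 1]
r_prep_time = np.corrcoef(prep_time, value_added)[0, 1]

print(f"practicing item: r = {r_practicing:+.3f}")  # approximately +0.195
print(f"prep-time item:  r = {r_prep_time:+.3f}")   # approximately +0.103

# Both signs are positive: the second item predicts value-added less
# strongly, but nothing here supports "lower value-added for teachers
# who drill."
```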

In fact, on page 24, the report clearly says that the relationship between test prep and value-added on standardized tests is weaker than other observed practices, but does not claim that the relationship is negative:

The five questions with the strongest pair-wise correlation with teacher value-added were: “Students in this class treat the teacher with respect.” (ρ=0.317), “My classmates behave the way my teacher wants them to.” (ρ=0.286), “Our class stays busy and doesn’t waste time.” (ρ=0.284), “In this class, we learn a lot almost every day.” (ρ=0.273), “In this class, we learn to correct our mistakes.” (ρ=0.264) These questions were part of the “control” and “challenge” indices. We also asked students about the amount of test preparation they did in the class. Ironically, reported test preparation was among the weakest predictors of gains on the state tests: “We spend a lot of time in this class practicing for the state test.” (ρ=0.195), “I have learned a lot this year about the state test.” (ρ=0.143), “Getting ready for the state test takes a lot of time in our class.” (ρ=0.103)

I don’t know whether something got lost in translation between the researchers and the Gates Foundation’s education chief, Vicki Phillips, or between her and Sam Dillon at the New York Times, but the article contains a false claim that needs to be corrected before it is used to push changes in education policy and practice.

UPDATE – The LA Times coverage of the report contains a similar misinterpretation: “But the study found that teachers whose students said they ‘taught to the test’ were, on average, lower performers on value-added measures than their peers, not higher.”

Try this thought experiment with another observed practice to illustrate my point about how the results are being misreported.  The correlation between student observations that “My teacher seems to know if something is bothering me” and value-added was 0.153, which was less than the 0.195 correlation for “We spend a lot of time in this class practicing for [the state test].”  According to the interpretation in the NYT and LA Times, it would be correct to say “teachers who care about student problems tend to have lower value-added learning gains than those who spend a lot of time on test prep.”

Of course, that’s not true.  Teachers caring about what is bothering students is positively associated with value-added, just as test prep is.  It is just that caring is a little less strongly related to value-added than test prep is.  Caring does not have a negative effect just because its correlation is lower than those of other observed behaviors.
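The same point can be stated mechanically.  In the short snippet below, the r values are the ones reported in the appendix; the sign check is my own illustration, not the report’s analysis.  Every item is positively associated with value-added, and ranking them only orders the strength of positive associations.

```python
# Reported correlations with value-added (from the MET appendix); the
# sign check below is my illustration, not the report's own analysis.
items = {
    "practicing for the state test": 0.195,
    "teacher knows if something is bothering me": 0.153,
    "getting ready for the state test takes a lot of time": 0.103,
}

for label, r in sorted(items.items(), key=lambda kv: -kv[1]):
    sign = "positive" if r > 0 else "negative"
    print(f"r = {r:+.3f} ({sign}) -- {label}")

# All three print "positive".  A lower rank on this list means a weaker
# positive association, not a negative one.
```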

-Jay P. Greene
