Value-Added: The Devil’s in the Details
(This post also appears on Rick Hess Straight Up.)
In response to the mail I’ve received since Monday’s column critiquing Aaron Pallas’s attack on the DCPS teacher firings, I think it’s useful for me to weigh in on the live-wire question of value-added systems. Monday’s column was not meant to be a simple-minded defense of value-added systems, but rather an attempt to defend a smart, careful effort against an unfair attack. The truth is that value-added systems (including DCPS’s IMPACT) are more an art than a science, and the slew of decisions involved in designing them can be reasonably questioned and ought to be subject to responsible public scrutiny.
“Responsible public scrutiny” doesn’t mean that all the intricate and proprietary details of value-added systems need to be posted online, but it does mean there should at least be a process for policymakers, educators, analysts, journalists, and researchers to obtain the technical details that determine just how the system works. On this score, I think it’s fair to say that most states and districts using value-added models–including DCPS–can and should do better.
After all, building any value-added system requires a vast number of judgment calls–regarding student characteristics, interactions, test scaling, and the rest. As Stanford University’s Rick Hanushek, who’s been a technical advisor on IMPACT, explained last week, DCPS “has developed a sophisticated statistical model to account for a variety of non-teacher factors that could affect gains, including student mobility, test error, family background, English language proficiency, and special education status.” The smart folks in the value-added community are well aware of how sensitive their models are to specification decisions and how much those decisions matter when put into practice.
When states, districts, or consultants won’t share critical details, it is appropriate to ask hard questions about their reliance upon complex adjustments and calculations understood only by insiders and assorted experts. For one thing, support for consequential evaluation systems will depend on establishing a foundation of understanding and a broad conviction that the system is fair and sensible.
Value-added calculations are imperfect and rest on assumptions, and states and districts can do better at explaining how their systems work. That said, it is also worth recognizing the seriousness and technical acumen that DCPS and Mathematica have brought to designing IMPACT. When tackling value-added, it’s crucial to proceed with this kind of thoughtful appreciation for the challenges at hand (something I found lacking, for instance, in Florida’s Senate Bill 6). I think DCPS has done an admirable job on this count, but too many quick-fix artists have not–and my defense of DCPS ought not be read as a blanket defense of overcaffeinated value-added enthusiasts.