Which Student Growth Method Should Policymakers Use to Evaluate Schools?

Contact:
Ashley Inman: ashley_inman@hks.harvard.edu, 707-332-1184, Education Next Communications Office
Cory Koedel: koedelc@missouri.edu, University of Missouri, Columbia

Measuring student performance correctly helps set the right expectations for students and teachers in both high-poverty and advantaged schools.

Increasingly, states and school districts use measures based on growth in individual students' test scores to evaluate which schools are performing well and how effectively educators are teaching. But how best to measure student test-score growth for evaluation purposes remains the subject of lively debate. New research published in Education Next concludes that, to send the most useful information to educators and local decision makers, growth measures should level the playing field by comparing the performance of schools and teachers operating in similar circumstances.

In “Choosing the Right Growth Measure,” authors Mark Ehlert, Cory Koedel, Eric Parsons, and Michael Podgursky examine three evaluation approaches that represent the range of options available to policymakers. Using data from the Missouri Assessment Program, they calculate growth in mathematics for 1,846 schools serving grades 4 to 8 over a 5-year period (2007-2011).

The approach the authors favor is based on a value-added technique carried out in two steps. In the first step, test scores are adjusted for differences in students' prior test scores and in the demographic characteristics of the students the schools serve. For example, if the schools attended by low-income students really are inferior, on average, to those attended by high-income students, that average gap in school quality is removed in this first step. In the second step, these adjusted test scores are used to construct a growth measure for each school. By comparing schools that serve low-income students with others that serve similar students, this approach ensures that there is essentially no relationship between growth measures and aggregate measures of student poverty. When the two-step approach is used, high- and low-poverty schools are roughly evenly represented among the schools identified as more and less effective.
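
To make the two steps concrete, here is a minimal sketch in Python. It assumes a hypothetical student-level dataset with columns for current and prior math scores, a student poverty indicator, and a school identifier; the authors' actual model includes additional controls and a more careful estimation of school effects, so this is an illustration of the logic rather than their implementation.

```python
import numpy as np
import pandas as pd

def two_step_school_growth(df: pd.DataFrame) -> pd.Series:
    """Illustrative two-step growth measure; column names are hypothetical."""
    df = df.copy()
    # School-level poverty share, so that average differences between
    # high- and low-poverty schools are absorbed in the first step.
    df["school_frl"] = df.groupby("school_id")["frl"].transform("mean")

    # Step 1: adjust current scores for prior achievement and demographics.
    X = np.column_stack([
        np.ones(len(df)),             # intercept
        df["prior_score"].to_numpy(), # prior test score
        df["frl"].to_numpy(),         # student poverty indicator (0/1)
        df["school_frl"].to_numpy(),  # school poverty share
    ])
    y = df["current_score"].to_numpy()
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    df["adjusted_score"] = y - X @ beta  # residual = adjusted score

    # Step 2: the school's growth measure is the average adjusted score
    # of its students; by construction it is roughly unrelated to poverty.
    return df.groupby("school_id")["adjusted_score"].mean().rename("growth")
```

The key design point is the ordering: differences explained by prior achievement and poverty are removed first, and only then are schools compared with one another.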

The authors identify several advantages to the two-step value-added technique:

• All educators are encouraged to work hard, since they are not compared to educators in very different settings.

• Instructional strategies, curricula, personnel policies, and day-to-day decisions that prove successful in high-poverty schools can be identified and used in other, less-effective high-poverty schools.

• The ability of high-poverty schools to recruit and retain teachers will not be diminished.  The authors say, “Policymakers should proceed cautiously with implementing an evaluation system that could worsen the working conditions in challenging education environments.”

The authors note the potential risks created by two other approaches that are more frequently used to assess school and teacher performance. The first, often referred to as Student Growth Percentiles, compares each student's test score with the scores of students who have similar test-score histories and then takes the median of these growth percentiles across all students in the school or classroom. Because this approach does not take into account student or school demographic characteristics, it is likely to penalize schools serving disadvantaged students. A second approach, and the one most commonly used by researchers, is a value-added model that adjusts test scores for students' prior test scores and demographic characteristics and calculates performance measures in a single step. The results of this approach may also be biased in favor of schools serving more advantaged students if the test-score growth of disadvantaged students differs in ways not captured by the value-added model.
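
For comparison, below is a simplified sketch of the Student Growth Percentile idea, again using the hypothetical dataset from the earlier example. Operational SGP models estimate conditional percentiles with quantile regression on several prior scores; grouping students by prior-score decile is only a rough stand-in, but it makes the key contrast visible: demographics never enter the calculation.

```python
import pandas as pd

def median_growth_percentile(df: pd.DataFrame) -> pd.Series:
    """Simplified median-SGP measure; column names are hypothetical."""
    df = df.copy()
    # Group students with similar test-score histories (here, by decile
    # of the prior score; real SGP models use quantile regression).
    df["prior_bin"] = pd.qcut(df["prior_score"], 10,
                              labels=False, duplicates="drop")
    # Each student's percentile rank among peers with similar histories.
    df["sgp"] = df.groupby("prior_bin")["current_score"].rank(pct=True) * 100
    # The school measure is the median growth percentile of its students.
    # No demographic adjustment appears anywhere in this calculation.
    return df.groupby("school_id")["sgp"].median().rename("median_sgp")
```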

“Choosing the Right Growth Measure: Methods should compare similar schools and teachers” is available now at https://www.educationnext.org and will appear in the Spring 2014 issue of Education Next.

About the Authors

Mark Ehlert is research associate professor of economics at the University of Missouri, Columbia, where Cory Koedel is assistant professor of economics, Eric Parsons is research assistant professor of economics, and Michael Podgursky is professor of economics. The authors are available for interviews.

About Education Next

Education Next is a scholarly journal published by the Hoover Institution that is committed to careful examination of evidence relating to school reform. Other sponsoring institutions are the Program on Education Policy and Governance at Harvard University, part of the Taubman Center for State and Local Government at the Harvard Kennedy School, and the Thomas B. Fordham Foundation. For more information about Education Next, please visit: https://www.educationnext.org.
