The 2017 NAEP Results: Nothing To See Here?

Well, the long-awaited (at least as far as the edu-pundit class is concerned) 2017 NAEP results have been released. Unlike 2015’s results, which landed with a thud, these landed with a “meh.” There have been very few meaningful changes since 2015, with national averages and the vast majority of states’ results changing barely, if at all, in 4th- and 8th-grade math and ELA.

What do I take away from these unexciting results?

First, I note that scores are still way up (in mathematics) and a little up (in ELA) relative to a couple decades ago. That is, 2015’s dip, while apparently real, isn’t even close to erasing the tremendous progress that all student groups made during the standards-based reform period of the last quarter century.

Second, there is no obvious pattern here with regard to Common Core versus non–Common Core states (as was the case in 2015). It’s a bit hard to say who’s a Common Core state and who’s not at this point, but if we take the average score change from 2015 to 2017, across both subjects, in the seven decidedly non-CCSS states (Alaska, Indiana, Nebraska, Oklahoma, South Carolina, Texas, and Virginia), we see that these states declined by about 1.4 points on average across tests. That is lower than the national average score change, a gain of 0.2 points, but does the difference mean anything? Probably not. Two of the three biggest losers from 2015 to 2017 were non-CCSS states (Alaska and South Carolina), but the third was Louisiana, which many people believe is doing a bang-up job on Common Core implementation.

Third, the California results in particular seem to corroborate recent research on the state. California was the third-highest-gaining state (behind Florida and New Jersey), though its gains were statistically significant on only some tests. This supports recent work suggesting that California’s new funding formula, which pours substantial money into low-income schools, is boosting achievement. California districts also fared particularly well on the Trial Urban District Assessment (TUDA): the top three gainers from 2015 to 2017 were, in order, San Diego, Los Angeles, and Fresno, all districts that benefited substantially from the new funding formula.

Fourth, we need more rigorous investigation of these results to understand whether they can really tell us anything about policy effects. Several groups are already working on this issue. For instance, I am co-principal investigator on the IES-funded Center for Standards, Alignment, Instruction, and Learning, and one of our main projects is a longitudinal study of NAEP data to understand the impact of “college- and career-readiness standards” on student achievement. Other work might explore substrand- or item-level results, as well as the restricted-use data, to see if those can tell us anything.

Overall, while this year’s results are not the sexiest, the NAEP data remain an important barometer. I conclude that educational progress is real but stalled, and that we may need new and sustained policy efforts to resume the gains we saw in the 1990s and 2000s.

—Morgan Polikoff

Morgan Polikoff is an associate professor at the Rossier School of Education at USC.

This post is part of a series from Education Next analyzing the 2017 results from the National Assessment of Educational Progress. Stay tuned throughout the week of April 10 for more analysis.
