Change offers the opportunity for transformation, and education is no stranger to the tectonic shifts currently underway in the U.S. political landscape. Betsy DeVos has been confirmed as Secretary of Education, while the Every Student Succeeds Act (ESSA) guides the federal role in education. For students to benefit from these changes, however, state education leaders must avoid tinkering with new initiatives and instead seek courageous, whole-system reforms. They must also avoid reductionist policy fixes and focus on the hard but necessary work of, as DeVos tweeted, improving “options and outcomes for all students.”
In his recent EdNext article, Tom Kane argues that the new evidence requirements in ESSA offer state leaders a powerful lever to improve how decisions on purchasing and practice are made in schools. Local outcome data will drive this decisionmaking, says Kane, and to deliver on these new requirements, he imagines a future in which a collection of “efficacy networks” provide local capacity to measure, evaluate, and share the impact of local interventions.
This network model emerges from Kane’s experience with the excellent Proving Ground project at the Center for Education Policy Research at Harvard University. As part of this project, he encourages states to incentivize participation in these networks by linking funding to it, and he believes change will come as local research shifts the opinions of local decisionmakers.
The call for the greater integration of evidence into decisionmaking is right and needed. A broad coalition of education leaders, ranging from the Gates Foundation to Digital Promise and the Office of Education Technology, has championed linking evidence to decisionmaking. Similarly, efficacy has been a chief focus of our work at Pearson since 2013. It impacts how we design products, measure effectiveness, leverage data and knowledge into product-level improvements, and report on the outcomes of those improvements to our customers. Scaling this shift in approach across a global company hasn’t been quick or easy, but it has allowed us to better serve teachers and learners.
Kane’s vision for the future of education research is intriguing, and state leaders would be well-served to reflect upon these policy options. Building on his ideas, here are three features of research that we hope to see in the future of education:
1. A Pillar of Whole System Reform
To imagine a great American education system, we must imagine a radically different approach to all parts of the system—including education research. Research cannot be purely academic or reserved for accountability measures. Evidence can’t just capture how things were or how they are. It needs to become deeply embedded in the formative process of improving outcomes. As Kane mentions, these requirements will mean little without state leadership. That leadership should include setting a bold vision for education improvement that emphasizes research and evidence.
If efficacy networks are put into place, they should be linked to a larger mandate to improve education using evidence-based methods. This mandate doesn’t have to come from Washington, D.C., but each pilot program should be seen as part of a larger cultural shift.
Overcoming the deficit of “efficacy literacy” will not happen overnight, especially given the shocking lack of attention paid to national evidence standards (as shown by Kane’s exhaustive study of school board minutes). Teachers and decisionmakers need to know how to interpret evidence, which means being able to navigate sample sizes, differences in impact, uncertainty, and reporting methods. The Data Wise Project at Harvard could give them an excellent start in understanding these concepts.
More evidence will not change outcomes without a strong focus on how to effectively spread and scale these new ideas and practices. We should learn from the work of the Gates Foundation, which utilizes teacher networks and influencers to help the importance of evidence “go viral.” We’ll know that a transformation is truly underway when efficacy and evidence start showing up on the agendas of school board and PTA meetings.
2. Evidence That Is Timely, Impactful, and Understandable
Bureaucracy chokes useful information. Unless data and research are fed back to teachers and administrators in a timely and understandable fashion, it will be hard to see improvements in products or decisionmaking. If these efficacy networks are to deliver change, they will need to inform classroom practice, product development, and parental choice. To do that, we will need a much more intuitive approach to data. We might even look to nutrition labels to find better ways of reporting evidence back to teachers and parents.
The work done by a company like LearnPlatform takes an important step toward lowering the barrier to sharing information on usage, results, and experience. Reports should not be about heading toward a final outcome or verdict (“here’s what works”) but about posing questions that lead to further improvement (“how can we use what the data tells us to improve teaching and learning?”).
3. Focused on “How” and “Why,” Not Just “What”
Kane notes that “practitioners want to know whether a given intervention will work in their own classroom.” As a result, Kane argues for more local pilots. But what those pilots capture and share deserves scrutiny. Instead of trying to mimic the pharmaceutical industry’s approach of certifying certain approved products, education research needs to dynamically capture and share information on implementation—the how, the why, and the who—instead of focusing solely on what works in the classroom.
We know that the efficacy of products is linked to their intended usage and context. Context and implementation are critical to evaluating efficacy, an insight surfaced in “Understand, Implement, and Evaluate” and supported by a wider review of education research.
The time is ripe for change, and Kane is right in saying that ESSA offers a unique chance for research to play a vital role in the transformation of education. Efficacy networks might help, but if evidence is to drive impact, it must be part of a larger, clearly communicated vision of research integrated with practice. This is an ambitious vision, but if we truly care about improving options and outcomes for all students, a vital one.
—Sir Michael Barber and Nathan Martin
Sir Michael Barber is Chief Education Advisor to Pearson. Nathan Martin is Efficacy and Innovation Manager at Pearson.
Last updated February 21, 2017