Stanford Summer Math Camp Researchers Defend Study

Critique of California math framework draws a response
Stanford University was the site of a summer math camp whose outcomes were studied.

To the Editors:

Tom Loveless’s analysis of the California Math Framework (“California’s New Math Framework Doesn’t Add Up”) was thick with criticism.

The framework was the product of a careful, considered, laborious, and collaborative legislative process, and it has widespread public support. In fact, the signatories to the support petition, drawn from STEM professionals, educators, and 66 organizations, outnumber those on the opposing petition Loveless cites.

We are particularly concerned about his critique of the study of Youcubed summer camps cited in the framework.

The first Youcubed summer camp was conducted at Stanford in 2015. This first session resulted in achievement gains equivalent to 2.8 years of school. A video of the students participating in that very first camp is available online.

As Loveless notes, a later study examined camps conducted in 10 districts across the U.S., which produced an average achievement gain equivalent to 1.6 years of school.

Participating school districts invested countless hours organizing and running their camps, giving students new ways of engaging with mathematics and new outlooks on the subject. As students progressed through each camp program, their learning and their enthusiasm for mathematics grew. These school districts supplied all of their research data to Youcubed at Stanford, and their efforts, as well as the immeasurable accomplishments of their students, should be lauded rather than torn to shreds.

The Loveless piece describes the study of these camps as an “in-house” study. A more accurate description is that Stanford University researchers studied camps conducted by others across the U.S. The statistics were vetted by external evaluators, and the resulting journal article was peer reviewed by Frontiers in Education, a scientific journal.

Loveless argues further that the study’s analysis of test-score outcomes had no control group. He fails to mention the main result of the study: the achievement of students attending the camps was evaluated through their math GPAs in the following school year, compared against a control group, and the camp attendees earned significantly higher math GPAs. This result rests on a quasi-experimental design in which students who attended the camps were statistically matched with students who demonstrated similar levels of prior achievement but did not attend. Comparison groups were drawn from administrative and demographic student-level data, matching on characteristics including socioeconomic status, gender, English-learner status, special-education status, and previous math grade point averages. The statisticians performed several sensitivity tests to ensure the robustness of the findings.
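For readers unfamiliar with matched-comparison designs, the idea can be sketched in a few lines of Python. Everything below is invented for illustration: the student records, the exact-matching rule, and the half-point GPA bands are assumptions made for the example, not the study’s actual data or procedure (which also included sensitivity tests).

```python
# Illustrative sketch of a matched-comparison design (hypothetical data).
from statistics import mean

# Each record: (attended_camp, ses, gender, english_learner,
#               prior_math_gpa, next_year_math_gpa) -- all invented.
students = [
    (True,  "low",  "F", True,  2.1, 3.0),
    (True,  "high", "M", False, 3.2, 3.6),
    (False, "low",  "F", True,  2.0, 2.3),
    (False, "high", "M", False, 3.2, 3.2),
    (False, "low",  "M", True,  2.2, 2.4),
]

def profile(s):
    # Match on demographics plus a coarse half-point band of prior GPA.
    _attended, ses, gender, el, prior_gpa, _outcome = s
    return (ses, gender, el, round(prior_gpa * 2) / 2)

camp = [s for s in students if s[0]]
comparison = [s for s in students if not s[0]]

# Pair each camp attendee with a non-attendee who has the same profile.
matched_pairs = []
for c in camp:
    candidates = [s for s in comparison if profile(s) == profile(c)]
    if candidates:
        matched_pairs.append((c, candidates[0]))

camp_gpa = mean(pair[0][-1] for pair in matched_pairs)
control_gpa = mean(pair[1][-1] for pair in matched_pairs)
print(f"camp mean GPA={camp_gpa:.2f}, matched comparison mean GPA={control_gpa:.2f}")
```

Real studies of this kind typically use more sophisticated matching (for example, propensity scores) and test how sensitive the result is to the matching choices; the sketch only conveys the logic of comparing outcomes across statistically similar groups.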

The Loveless piece critiques the use of Mathematics Assessment Resource Service (MARS) tasks as pre- and post-tests. These tasks were chosen because they are well-respected assessments scored by external evaluators. Loveless’s writeup omits the rich legacy of these assessments and their importance in measuring mathematical understanding.

Non-enrolled students did not take the MARS assessments because the tests required two 40-minute blocks of time and were typically given on the first day of camp. The students did not work on “similar problems” in camp; the questions were chosen to measure algebraic understanding. These assessments were used alongside math GPAs as a measure of change and were included in the statistical model as just one of many dimensions of mathematical understanding.

The Loveless piece further omits an important finding of the Stanford camp study: across the 10 Youcubed camp sites, the more hours students spent in their respective camps, the greater their improvement on the assessments.

Lastly, Loveless’s critique of the proposed California framework’s approach to “automaticity” overlooks the fact that the framework plainly highlights numerical understanding as an absolute necessity, positing that students can learn math facts and number bonds with deeper and more expansive understanding than through rote memorization.

Loveless closes his Youcubed takedown by saying “if the Youcubed gains are to be believed, all pandemic learning loss can be restored, and additional gains achieved, by two to four weeks of summer school.”

Youcubed stands by its conclusions. It is probably true that if students received concentrated, focused interventions targeting mathematical understanding, rather than the rote memorization that has been taught not just for decades but for centuries, any loss of mathematics achievement caused by the Covid pandemic could be reversed. Our camps provide just one example of how this can be done.

Youcubed is an open organization that encourages expansive thought and free-flowing collaboration. That same spirit infuses how we teach others to teach — interactively and cooperatively, with an eye always trained on what works, and what will stick with students over the long haul, as they traverse the earliest years of their schooling, and then transition to whatever future path may suit them best.

A response from Tom Loveless to this letter is available at “Stanford Summer Math Camp Defense Doesn’t Add Up, Either.”


Copyright © 2024 President & Fellows of Harvard College