A Response to “Texting Nudges Harm Degree Completion”

In March 2016, Jay Greene raised concerns in Education Next about the use of nudges (specifically, interactive text message reminders) to support low-income high school graduates in following through on their intentions to pursue postsecondary education. Greene observed that it is unclear whether nudging students into college is beneficial without longer-term evidence on whether the students persist in college and earn a degree. We agree with this point, and in late December 2018 we posted results on longer-run outcomes for the original texting campaign we conducted in 2012 to reduce “summer melt” among college-intending high school graduates, as well as for two other low-cost interventions to help such students follow through on their college plans: a counselor outreach campaign consisting of 2-3 hours of summer support per student, which included text-based communication between counselors and students, and a peer mentor intervention. We were able to observe students’ college enrollment and degree attainment for at least six years after high school for most of the experimental samples, and for four years for one of the 2012 texting samples.

On January 18th, Greene posted a new column on the EdNext Blog. As indicated by the post’s title, “Texting Nudges Harm Degree Completion,” Greene suggests that our latest results show that the summer melt texting campaign led to an overall reduction in the share of students earning a degree. As was the case with Greene’s 2016 article, we welcome critical review of our analyses and believe nudge strategies in education merit close scrutiny and debate, as does any new policy instrument. Yet Greene’s cursory and simplistic presentation of our follow-up results leaves the impression that his column may be motivated more by an ideological position than by a careful review of the full set of evidence.

A closer inspection of the results suggests a more nuanced interpretation: potential reductions in degree completion for certain samples and potential gains for others. Above all, these analyses alone offer too little precision to support strong conclusions about the long-term impacts of nudges in education. We are writing a working paper that will discuss our analyses comprehensively, but we share our initial responses to Greene’s column here. In particular, we highlight results that complicate and, in some cases, run counter to Greene’s broad assessment:

• The first set of results we present for the 2012 texting study is in Table 10, which shows estimated impacts on four-year degree attainment for a pooled sample of high school graduates from uAspire, a non-profit college affordability organization, and Dallas Independent School District (Dallas ISD). Four years is likely too short a time frame in which to observe impacts on degree attainment. Nationally, the median time to degree is six years, and attainment rates in our sample are much lower: only four percent of students earned an associate’s degree and seven percent earned a bachelor’s degree within four years. As Greene observes, we find a marginally significant one percentage point decline in associate’s degree attainment and a 1.7 percentage point decline in bachelor’s degree attainment for the treatment group of students who received the summer-melt texting intervention relative to the control group.

• In Table 12 we present six-year degree impacts for the uAspire sample; for Dallas ISD, we were able to obtain college enrollment and completion data for only four years after high school. The table’s bottom panel presents results of the uAspire summer 2012 texting campaign on whether students earned any degree over different time intervals. The point estimate for whether students earn any degree within six years is very close to zero, but the confidence interval is +/- about 4 percentage points, meaning we cannot rule out substantively meaningful positive or negative impacts; we simply have too little precision (see the illustrative sketch after this list). The top panels of the table decompose the “any degree” impact into effects on associate’s and bachelor’s degree attainment. The impact on earning an associate’s degree within six years is -2 percentage points (significant at the 10 percent level), while the impact on bachelor’s degree attainment is a non-significant one percentage point increase. This suggests the possibility that the treatment shifted students from pursuing two-year degrees to pursuing four-year degrees, though we lack sufficient precision to make definitive claims.

• In Table 17, we focus on a subset of the summer 2012 sample that had stronger observed intentions to pursue college (as indicated by their application for a Last Dollar Scholarship from uAspire). For this group, the impacts of the texting campaign are potentially large and positive. We find a non-significant four percentage point increase in overall degree attainment within six years, a non-significant six percentage point increase in bachelor’s degree attainment within six years, and a significant nine percentage point increase in bachelor’s degree attainment within five years. We view this analysis as exploratory, but the results suggest that there may be certain populations that benefit substantially from text nudges. Discussion of these potential positive impacts of nudge campaigns is noticeably absent from Greene’s post.

• In Table 19, we present six-year degree attainment impacts for students who received both summer melt nudges and nudges to renew financial aid during their first year in college. For the subset of students who started their postsecondary education at a community college, we find suggestive though imprecisely estimated evidence of a large positive impact on degree attainment within six years. Again, we view this as exploratory and, at best, suggestive, but this pattern of results strengthens the possibility that there are particular student populations, or particular educational settings, for which the longer-term impacts of text nudges are positive and substantial.

• We find similarly positive impacts on degree attainment for the other summer interventions we evaluate in our follow-up analyses: for all students in the case of the peer mentor outreach, and for low-income students in the case of the counselor outreach campaign. While more intensive than text nudges alone, these interventions were still low-touch, with students receiving at most 2-3 hours of support during the summer months, and they also involved text-based communication. Yet Greene does not mention the longer-term positive impacts of these interventions in his column.
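
To make the precision point above concrete, here is a minimal sketch of the standard normal-approximation confidence interval for a difference in proportions. The sample sizes and attainment rates below are hypothetical, chosen purely for illustration rather than drawn from our study; the point is simply that, at realistic scales, a point estimate near zero is consistent with effects of several percentage points in either direction.

```python
import math

def diff_in_proportions_ci(p_treat, n_treat, p_ctrl, n_ctrl, z=1.96):
    """95% confidence interval for a difference in proportions
    (treatment minus control), using the normal approximation."""
    diff = p_treat - p_ctrl
    se = math.sqrt(p_treat * (1 - p_treat) / n_treat
                   + p_ctrl * (1 - p_ctrl) / n_ctrl)
    return diff - z * se, diff + z * se

# Hypothetical numbers, chosen only to illustrate the precision point:
# with a degree-attainment rate near 20% and roughly 1,000 students
# per arm, the 95% confidence interval spans about +/- 3.5 percentage
# points even when the estimated difference is exactly zero.
lo, hi = diff_in_proportions_ci(p_treat=0.20, n_treat=1000,
                                p_ctrl=0.20, n_ctrl=1000)
print(f"95% CI: [{lo:+.3f}, {hi:+.3f}]")
```

Under these assumed numbers, the interval spans roughly +/- 3.5 percentage points, which is why we characterize a near-zero estimate as inconclusive rather than as evidence of no effect.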

Our goal in this response is not to argue the counterpoint to Greene’s claim that “Texting Nudges Harm Degree Completion,” which would be that low-touch summer interventions always benefit students. As is true with most education interventions, the evidence appears mixed. Some student populations in certain educational settings appear to be more likely to graduate from college as a result of summer outreach; other students may have a reduced probability of graduating on time, though we do not know whether this negative impact would persist over a longer time horizon. We do know that further follow-up with additional samples will allow us to estimate these effects with more precision.

So what do these results mean for nudges in education? There is enough rigorous scholarship on nudges to reveal patterns similar to those we have seen in other education interventions, from intensive college advising to charter schools to school vouchers: in some settings and for some students, interventions promote beneficial outcomes, but for other students, the same intervention has no effect or may impede their educational success. We believe the relevant questions to investigate at this stage are not “Do nudges work?” but rather “At which decision points, in which educational settings, and for whom are nudges effective?” and “Which design features of nudges are most strongly associated with positive impacts?” We also believe in the importance of understanding the limitations of nudges and their implications for educational policy and practice. Finally, we believe strongly in the importance of replicating results from studies that show early promise and of investing the time and resources to conduct follow-up studies. We welcome and encourage replication and extension of our summer melt interventions by other researchers. We have publicly posted the intervention materials we used in the summer interventions and would be happy to provide further guidance to any researchers seeking to replicate them.

We are eager for further discussion and debate of the research and policy implications of our follow-up study, particularly after we have had the chance to post our working paper. But we believe that educators, policymakers, and other researchers are poorly served by selective interpretation of results that supports an individual researcher’s convictions at the expense of a more nuanced and complete analysis.

— Ben Castleman and Lindsay Page

Ben Castleman is an assistant professor of education and public policy at the University of Virginia. Lindsay C. Page is an assistant professor of research methodology at the University of Pittsburgh School of Education.
