Colleges, particularly community colleges, have long relied upon remedial coursework to help academically underprepared students get ready for college-level work. Students assigned to remediation may be required to take three or more remedial courses – which cost money, but do not confer college credit – before they can enroll in college-level coursework. The best available national data (based on student transcripts) indicate that as of about 2009, about half of all college students and nearly 70 percent of community college entrants had taken at least one remedial course within six years of college entry.
But 2009 feels like eons ago when it comes to college remediation policy and practice. Between 2008 and 2012, a number of studies began to raise serious questions about the effectiveness of typical remedial placement and delivery practices. My own research indicated that community college students were being overdiagnosed as underprepared, and estimated that one-quarter to one-third of students assigned to remediation could have earned a B or better in college-level coursework, had they been given the chance. Around the same time, remediation rates appeared to decline for the first time in years. Figure 1 shows self-reported remediation rates rising steadily from 2000 to 2008, before dropping off between 2008 and 2012.
Figure 1: Percent of current undergraduates who report ever having taken a remedial course
Source: Author’s tabulations using NCES TrendStats with National Postsecondary Student Aid Surveys, 2000-2012. Note: Self-reported remediation rates are substantially lower than transcript-based estimates, such as those available in the Beginning Postsecondary Students database, because self-reports are not cumulative over students’ entire period of study and because students don’t always know which courses are remedial. Unfortunately, transcript-based measures are not currently available to track different cohorts over time. The trends are still informative as long as under-reporting is relatively stable over time.
This dip may be purely coincidental, as economic conditions and postsecondary enrollment patterns shifted over this period, and unfortunately more recent national data are not yet available. But since 2012, a number of large-scale system- and state-wide reforms have given students more options for completing remediation quickly, and more ways to avoid it altogether. Most recently, in October 2017 California joined Texas, Florida, and Connecticut in passing legislation intended to reduce the number of college students assigned to a traditional remedial course sequence. It thus seems likely that remediation rates are now significantly lower than they were a decade ago, at least in affected states.
As an education policy scholar whose research contributed to calls for reform, I find nothing more rewarding than seeing my own work have a real-world impact on policy. (But because there’s always the possibility of being wrong, it’s also more than a little terrifying!) Below, I review the key evidence that motivated calls for reform, take stock of the major changes that have occurred since then, and describe what we know so far about how well these reforms are working. Overall, the case of college remediation presents an encouraging example of productive, ongoing interaction between research and policy.
Data enables research which identifies a problem
A critical factor spurring the wave of research that began around 2008 was the increasing accessibility of de-identified state-level administrative datasets, which enabled institutions and analysts to systematically track course-taking patterns and outcomes for remediated students for the first time. For example, one study used such data to show that fewer than half of students assigned to remediation in English, and only a third in math, ever completed the course sequence to which they were assigned. Several studies estimated the causal effect of being assigned to remediation on future college outcomes by comparing students just above and below test score cutoffs for remedial placement. These studies generally found null to negative impacts of assignment to remediation on credit and degree completion for community college students, though a study of four-year college students using a different method found some positive effects.
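The logic of this cutoff-based (regression-discontinuity) comparison can be sketched with a small simulation. Everything here – the scores, the cutoff, the outcome, and the size of the remediation effect – is invented for illustration, not drawn from the studies’ actual data or estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical placement test scores (0-100); students below a cutoff
# of 60 are assigned to remediation.
n = 20_000
score = rng.uniform(0, 100, n)
remediated = score < 60

# Simulated outcome: credits earned rise smoothly with preparation,
# plus an assumed negative effect of remedial assignment itself.
credits = 20 + 0.3 * score - 4.0 * remediated + rng.normal(0, 8, n)

# Naive comparison of all non-remediated vs. remediated students
# conflates the assignment effect with underlying preparation.
naive = credits[~remediated].mean() - credits[remediated].mean()

# RD-style comparison: restrict to a narrow bandwidth around the
# cutoff, where students on either side are similar in preparation.
bw = 2.0
near = np.abs(score - 60) < bw
rd = credits[near & ~remediated].mean() - credits[near & remediated].mean()

print(f"naive gap: {naive:.1f} credits; RD estimate near cutoff: {rd:.1f} credits")
```

The naive gap is large because remediated students were less prepared to begin with; the near-cutoff comparison recovers something close to the (simulated) effect of assignment itself, which is why the studies above emphasize students scoring near the cutoff.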
A typical caveat in these studies is that they estimate effects of remedial assignment only for students scoring near the cutoff. Indeed, studies examining how these effects vary for different types of students suggested that the negative effects of remediation may be largest for those closest to college-ready, with less negative and even some positive effects for students who score at lower levels.
In retrospect, this pattern of findings does not seem altogether surprising. By requiring students to take remedial coursework before they can take college-level courses and complete their degree, institutions immediately place remediated students at a disadvantage (in terms of both time and money). The hope was that the remedial course treatment would improve subsequent performance enough to more than make up for this initial deficit. And for some students, it might, but others simply drop out before ever making it out of remediation.
Problem leads to more research, which identifies more problems
These discouraging findings regarding the effects of remedial assignment led to my interest in the effectiveness of the remedial assignment process itself. In 2009, the vast majority of community colleges relied upon high-stakes standardized exams to determine placement (primarily the COMPASS® and Accuplacer®, produced by ACT, Inc. and The College Board, respectively). Placement was (and still is) often based solely on whether a single test score fell above or below a fixed cutoff. But very little was known about the quality of these exams, beyond validity reports published by the test-makers themselves.
If a placement tool isn’t accurate or reliable, it can result in high error rates – truly prepared students being assigned to remediation, or truly unprepared students being assigned to college-level courses – which could undermine the effectiveness of the remedial “treatment.” My colleagues and I at the Community College Research Center (CCRC) analyzed the error rates of placement tests versus measures of high school performance, using data from tens of thousands of community college entrants in two large multi-institution systems using different placement tests. Incorporating rich information on students’ high school performance, placement test scores, and demographics, we developed statistical models to predict how remediated students would have performed had they been placed directly into college-level courses.
In both analyses, we estimated that misplacement into remediation was far more common than misplacement into college-level courses. In one urban system, we estimated that nearly a quarter of students assigned to remedial math and a third of students assigned to remedial English could have passed college-level courses with a B or better. In related work, we also showed that high school transcript information (courses taken and grades) generally predicted college performance better than one-off test scores, and that using multiple measures of preparation could substantially lower remediation rates while maintaining or increasing the probability of success for those assigned directly to college-level coursework.
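To make the multiple-measures idea concrete, a placement rule might combine a test score with high school GPA rather than relying on a single test-score cutoff. The weights and thresholds below are invented purely for illustration; they are not the actual CCRC models or any system’s real placement rule:

```python
# Hypothetical placement rules: a traditional single-test cutoff versus
# a multiple-measures index. All weights and thresholds are invented.

def single_measure_placement(test_score, cutoff=60):
    """Traditional rule: one test score against a fixed cutoff."""
    return "college-level" if test_score >= cutoff else "remedial"

def multiple_measures_placement(test_score, hs_gpa):
    """Weighted index of test score (0-100) and high school GPA (0-4)."""
    index = 0.4 * (test_score / 100) + 0.6 * (hs_gpa / 4)
    return "college-level" if index >= 0.55 else "remedial"

# A student with a weak test day but a strong high school record is
# placed into remediation by the test alone, but not by the index.
print(single_measure_placement(55))          # remedial
print(multiple_measures_placement(55, 3.6))  # college-level
```

The point of the sketch: a one-off test score can misclassify a student whose sustained high school performance signals readiness, which is exactly the kind of misplacement the error-rate analyses above documented.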
Research evidence enters the conversation and opens door for innovation and experimentation
These findings resonated with many professionals out in the field, some of whom were already raising their own questions and beginning to experiment with alternative models. For example, in 2012, Long Beach City College (LBCC) in California was one of the first to develop and pilot an alternative placement algorithm based on high school coursework and grades, which increased the proportion of students placing directly into college-level coursework by 21 percentage points in math and 56 percentage points in English, without significantly lowering the average performance of students in these courses.  Many other institutions had no choice but to innovate when ACT, Inc. discontinued the COMPASS® exam in 2015, stating that “A thorough analysis of customer feedback, empirical evidence and postsecondary trends led us to conclude that ACT Compass is not contributing as effectively to student placement and success as it had in the past.” 
Other institutions have experimented with a “co-requisite” model in which at least some students previously targeted for remediation instead have the option to start in college-level courses immediately, with required supplementary instruction provided alongside. The Community College of Baltimore County’s (CCBC’s) Accelerated Learning Program is one early example that produced large increases in college-level course completion.  Several states, including Tennessee and Virginia, have expanded the use of co-requisite models and/or redesigned, shorter remedial course sequences.  Still others, such as Texas, experimented with new math pathways focused on quantitative reasoning and statistics rather than traditional college algebra. 
A recent nationally representative institutional survey conducted by the Center for the Analysis of Postsecondary Readiness (or CAPR, a research partnership led by MDRC and CCRC) shows that these innovations are becoming widespread. In 2016, more than half of community colleges used multiple measures for remedial placement, more than double the share in 2011. While most community colleges continue to offer traditional remedial sequences, the majority now offer compressed courses in at least one remedial subject, more than a third offer co-requisite options in English, and 16 percent offer similar options in math.
Table 1, which catalogs the most consequential state-level policy reforms relating to college remediation over the past decade, suggests the reform movement is continuing to gain steam (the Education Commission of the States [ECS] State Policy Database provides additional details). Recently passed laws in California and Texas have yet to take full effect, and new bills have been introduced in additional states (e.g., Minnesota’s S.F. 302, introduced in 2017, seeks to minimize remediation via use of multiple measures and co-requisite supports).
Table 1 – Selected state-level reforms relating to college remediation

| State | Policy | Date | Summary |
| --- | --- | --- | --- |
| California | AB 705 | October 2017 | Requires institutions to use high school coursework and grades in placement and to maximize the likelihood that students enter college-level coursework directly |
| Texas | HB 2223 | June 2017 | Requires institutions to offer co-requisite model and mandates 75 percent of remediation be co-requisite by 2020 |
| Minnesota | HF 2749 | June 2016 | Prohibits institutions from requiring remediation for students who meet certain benchmarks on state HS exams, the ACT, or SAT |
| Tennessee | TN Board of Regents A-100 Guideline | 2015 | Board of Regents guideline makes all remediation co-requisite, based on the success of a 2014 pilot study |
| Oregon | HB 2681 | June 2015 | Creates a work group to study and recommend best practices for remedial course placement |
| Florida | SB 1720 | May 2013 | Prohibits institutions from requiring remediation for Florida HS diploma recipients; institutions must offer more flexible options for non-exempt students |
| Connecticut | PA 12-40 | July 2012 | By 2014, limits remediation to one semester plus embedded support for college-level courses; requires multiple measures for placement |
| Washington | SB 5712 | April 2013 | Encourages colleges to use multiple measures for remedial placement |
Source: Education Commission of the States (ECS) State Policy Database, retrieved March 23, 2018; Tennessee Board of Regents Policies and Guidelines (https://policies.tbr.edu/guidelines), retrieved March 23, 2018.
Experimentation leads to more research evidence
So what do we know about the consequences of all these reforms? So far, the evidence is limited to a handful of institutional studies, like the ones at LBCC and CCBC, as well as descriptive analyses of changes in aggregate state-level trends, such as in Florida and Tennessee. One study found that remediation rates in Florida fell by nearly half after the implementation of S.B. 1720 (from 38 to 22 percent in math, and from 21 to 10 percent in reading). The largest declines occurred among black students and students under age 25. While pass rates in the first college-level (“gateway”) course in each subject decreased modestly (from 75 to 73 percent in English composition, and from 65 to 58 percent in math), because more students attempted these gateway courses, the overall percentage of students successfully completing them rose markedly (by 7 percentage points in English and 4 percentage points in math), with larger increases for black students.
Early evidence regarding the effects of co-requisite supports (as opposed to the traditional pre-requisite model of remediation) is similarly encouraging. A study by the Tennessee Board of Regents found that after shifting to a co-requisite model in 2015, the percent of community college entrants completing a gateway course in their first year doubled in writing and quadrupled in math.  Meanwhile, a study at three community colleges in New York City found that students were far more likely to complete a college-level math course when they were randomly assigned directly to a college-level course with weekly supporting workshops, instead of to a traditional pre-requisite remedial course. 
These studies still leave plenty of open questions, including whether the estimated effects are robust to more rigorous methods of evaluation, and whether the reforms affect student outcomes beyond grades in the “gateway” college-level course. But additional research is already underway. For example, the Center for the Analysis of Postsecondary Readiness has two large randomized controlled trials in progress: one examining the impacts of using multiple measures of preparation for placement, and another examining the effectiveness of an alternative to traditional college math pathways. While both studies are ongoing, preliminary evidence suggests these reforms are having the positive effects that early work had hypothesized.
The success of these major policy shifts is not without caveat. A RAND study of the implementation of co-requisite models in Texas, for example, found that many participating institutions reported faculty resistance (most commonly from developmental education faculty), challenges with scheduling and advising, insufficient training and support, and anxiety over the fast pace of change.  Collecting and processing multiple measures of college preparation, including high school transcript information, can be a heavy lift for institutions, especially those that are not already engaged in close partnerships with local K-12 schools. Limiting remediation can also add to the financial strains of already under-resourced community colleges: college-level courses cost more to provide, and institutions often aren’t reimbursed by the state for providing co-requisite support, as they typically are for enrolling students in remedial courses. 
As more evidence is generated, new questions are sure to be raised. But as long as a virtuous cycle between research, policy, and practice continues, outcomes for students will hopefully keep moving in the right direction.
— Judith Scott-Clayton
This post originally appeared as part of Evidence Speaks, a weekly series of reports and notes by a standing panel of researchers under the editorship of Russ Whitehurst.
The author(s) were not paid by any entity outside of Brookings to write this particular article and did not receive financial support from or serve in a leadership position with any entity whose political or financial interests could be affected by this article.
2. Author’s tabulations using transcript data from the nationally representative Beginning Postsecondary Students: 2004-2009 dataset, via NCES QuickStats. Note that these estimates, which rely upon transcripts and are cumulative over 6 years post-entry, are preferable to (and substantially higher than) students’ self-reports of remedial coursework taken only in the first year of college, since many students are not aware of which courses are “remedial” and some may delay remedial course-taking beyond the first year.
3. Scott-Clayton, J. (2012). “Are College Students Overdiagnosed as Underprepared?” Economix, NYTimes.com, https://economix.blogs.nytimes.com/2012/04/20/are-college-entrants-overdiagnosed-as-underprepared/, based on Scott-Clayton, J. (2012). Do high stakes placement exams predict college success? (CCRC Working Paper No. 41). New York, NY: Columbia University, Teachers College, Community College Research Center.
4. Author’s tabulations using student self-reports from four waves of the National Postsecondary Student Aid Survey, 2000, 2004, 2008, and 2012, accessed via NCES TrendStats. Note that these self-reports are lower than transcript-based estimates since they are not cumulative over students’ entire period of study, and because students don’t always know which courses are remedial.
7. Martorell, P., & McFarlin, I. J. (2011). Help or hindrance? The effects of college remediation on academic and labor market outcomes. The Review of Economics and Statistics, 93(2), 436–454; Calcagno, J. C., & Long, B. T. (2008). The impact of postsecondary remediation using a regression discontinuity approach: Addressing endogenous sorting and noncompliance (NBER Working Paper No. 14194). Cambridge, MA: National Bureau of Economic Research; Scott-Clayton, J., & Rodriguez, O. (2015). Development, discouragement, or diversion? New evidence on the effects of college remediation policy. Education Finance and Policy, 10(1), 4-45; Dadgar, M. (2012). Essays on the Economics of Community College Students’ Academic and Labor Market Success. (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses. (Accession Order No. ); Bettinger, E. P., & Long, B. T. (2009). Addressing the needs of underprepared students in higher education: Does college remediation work? Journal of Human Resources, 44(3), 736–771.
8. Studies finding biggest negative effects for highest-prepared students include Martorell & McFarlin (2011) and Scott-Clayton & Rodriguez (2015); studies finding smaller negative or even some positive effects for lower-scoring students include: Boatman, A., & Long, B. T. (2010). Does remediation work for all students? How the effects of postsecondary remedial and developmental courses vary by level of academic preparation (NCPR Working Paper). New York, NY: National Center for Postsecondary Research; Hodara, M. (2012). Language Minority Students at Community College: How Do Developmental Education and English as a Second Language Affect Their Educational Outcomes? (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses. (Accession Order No. ).
10. Scott-Clayton, J. (2012). Do high stakes placement exams predict college success? (CCRC Working Paper No. 41). New York, NY: Columbia University, Teachers College, Community College Research Center; Belfield, C., & Crosta, P. M. (2012). Predicting success in college: The importance of placement tests and high school transcripts (CCRC Working Paper No. 42). New York, NY: Columbia University, Teachers College, Community College Research Center.
14. Edgecombe, N. (2016). The Redesign of Developmental Education in Virginia. New Directions for Community Colleges, vol. 2016, no. 176; Rutschow, E. Z., & Mayer, A. K. (2018). Early Findings from a National Survey of Developmental Education Practices. New York: Center for the Analysis of Postsecondary Readiness.
15. Rutschow, E.Z, Diamond, J. & Serna-Wallender, E. (2017). Math in the Real World: Early Findings from a Study of the Dana Center Mathematics Pathways. New York: Center for the Analysis of Postsecondary Readiness. https://postsecondaryreadiness.org/math-real-world-early-findings-dcmp/
17. Hu, S., Park, T. J., Woods, C. S., Tandberg, D. A., Richard, K., & Hankerson, D. (2016). Investigating Developmental and College-Level Course Enrollment and Passing before and after Florida’s Developmental Education Reform. REL 2017-203. Regional Educational Laboratory Southeast.
18. Denley, T. (2016). Co-requisite Remediation Full Implementation 2015-16. Tennessee Board of Regents Technical Brief No. 3. https://www.tbr.edu/sites/tbr.edu/files/media/2016/12/TBR%20CoRequisite%20Study%20-%20Full%20Implementation%202015-2016.pdf
20. Rutschow, Diamond, & Serna-Wallender (2017); Barnett, E., Kopko, E., & Ragucci, M. (2018). Student Assessment and Placement Systems: Initial Outcomes From an RCT. Presentation given at the annual conference of the League for Innovation in the Community College, March 18, 2018. https://ccrc.tc.columbia.edu/presentation/student-assessment-placement-systems-initial-outcomes-rct.html
21. Daugherty, L., Gomez, C. J., Carew, D. G., Mendoza-Graf, A., & Miller, T. (2018). Designing and Implementing Corequisite Models of Developmental Education: Findings from Texas Community Colleges. Santa Monica, CA: RAND Corporation. https://www.rand.org/pubs/research_reports/RR2337.html.
22. Smith, A. A. (2017, June 16). “Florida Colleges Take Hit on Remediation.” Inside Higher Ed. https://www.insidehighered.com/news/2017/06/16/florida-colleges-take-hit-remediation-veto-cap-ba-degrees.