Remember how the Wizard of Oz, once the curtain was drawn back, turned out to be an insignificant little blowhard? What if “college education” in America, especially the kind that culminates in a bachelor’s degree, is headed toward a similar revelation?
Once upon a time, it was determined by the Great and the Good (as they say in England) that almost everyone needs a college education—and that the country needs for everyone to have a college education—and that it’s discriminatory and evil to deny anyone such an education. Whereupon we started slowly but surely to dilute what we mean by it.
That was inevitable in part because we weren’t able to fix our K–12 system to get everyone ready for what we formerly meant by college. When you declare that everyone—or almost everyone—should graduate from high school and enter college, you come smack up on the reality that tons of young Americans haven’t learned enough in twelve or thirteen years of school even to qualify for what we once meant by a high school diploma, much less college admission.
So at the high school end, we tried to boost standards—and some places did a pretty good job of it—but even much-praised Massachusetts wasn’t able to raise its high school exit standard to equal college readiness as traditionally defined. Lots of other places eased off or deferred their exit standards, while still others hacked alternate paths to diplomas that circumvented their exit standards and/or devised ersatz “credit recovery” schemes whereby diplomas could be “earned” without even passing the classes dictated by the old Carnegie Unit rules.
Then there’s the college end. There, our egalitarian impulse led us to create—starting decades ago—thousands of open-admission institutions that have essentially no prerequisites, often not even a high-school diploma (a GED will suffice, and maybe not even that). Then we watched selective campuses do away with some of their own long-standing prerequisites, such as the expectation that entering students would have studied and become reasonably proficient in at least one foreign language.
Most colleges employed placement tests (and many still do) to determine whether an entering student was adequately prepared to undertake credit-bearing courses in core subjects like math and English; remedial classes were assigned to those who weren’t. But remediation was an ugly and discouraging term, so a decade or two ago it got relabeled “developmental,” and one of today’s hottest trends is to replace “developmental” with “co-requisite” courses, whereby you can actually earn credit toward a degree by completing a course whose passing norms may (or may not) fall somewhere between what we used to mean by remedial and what we meant by credit-worthy.
Today’s other hot trend is “dual credit” or “early college,” whereby high school kids can begin to pile up credits toward a college degree while they’re still working on their diplomas. In some places (Texas, for example), dual credit can start as early as ninth grade—and some community colleges now derive close to half of their state formula dollars from enrolling high-school kids. Well-wrought early-college programs can be fine. Yet college credit via dual credit in most places is automatic for anyone who gets a passing grade from the instructor, who is typically an “adjunct” assigned by the community college and not infrequently a regular high school teacher with the appropriate master’s degree. Quality control is uneven, to put it gently.
Nor should we forget grade inflation, in both high school and college, whereby the kind of student work that once earned a “C” now gets at least a “B+.”
Along the way, because there was so much oomph behind the goal of getting everyone into college—and so much aversion to anything that resembles “tracking”—we devalued and stigmatized what used to be called vocational education and are now having to reinvent it under the shiny new label of “career and technical education,” a.k.a. “CTE.” This stigmatizing of explicit workforce preparation had the further effect of wooing kids into college who, even by the degraded standards applied to them, were so ill-prepared that they were destined to falter, flunk, and drop out, often with a heavy debt burden. For even as we ushered everyone into college, we also made it far costlier to attend, which caused us to proffer easy credit to those who otherwise couldn’t swing it. If normal economic rules applied in this case, the student-debt “bubble” would make the tech and housing bubbles look like bubble tea.
Some kids got over-matched in the college where they found themselves, others under-matched. But the push to get more of them into college was relentless.
Everything seemed to make sense at the time. Many high school kids were bored, spinning their wheels during the last year or two, having completed their diploma requirements but not yet graduated. The economy needed a higher-skilled workforce. Many of tomorrow’s jobs appeared to demand college-level preparation. A college degree looked like the surest path to upward mobility. Everyone saw the urgency of increasing the enrollment of black and Latino students. And nobody, but nobody, dared take the risk of being called elitist, much less discriminatory.
There were, to be sure, efforts to hold the line on rigor, even to beef it up. Elementary-secondary academic standards rose—thanks mainly to the much-maligned Common Core—and many state tests improved, too. Some high school end-of-course exams are pretty solid (which doesn’t mean there aren’t paths around them). The Advanced Placement program has worked hard to justify its gold standard reputation, and the much smaller International Baccalaureate program does that, too. Large-scale assessments such as NAEP, TIMSS, and PISA have clung to their demanding norms and continue to speak the truth about actual performance in relation to those norms.
It’s those metrics, mainly, that reveal how little progress we’ve actually made, despite all the effort to give more kids a better education. That’s how we know that for every push to beef up standards, there were moves to define “proficiency” downward, to ease back on cut scores, to create alternate paths, and to confer exemptions.
Those metrics are mostly at the K–12 level because the higher education industry has successfully stonewalled any comparable outcome measures of student learning. Hence today our best source of evidence of what college accomplishes is the work of analysts like Raj Chetty and Mark Schneider, who have been able to link college degrees—and different kinds of degrees in different fields from different kinds of colleges—to subsequent earnings.
From their analyses, and those of Anthony Carnevale and others, we’ve recently learned many things about the value of a college degree. We’ve learned that many respectable, well-paid jobs don’t require such degrees. We’ve learned that associate’s degrees, and industry certifications in some fields, pay better than many bachelor’s degrees. We’ve learned that some college degrees are far more reliable tickets to upward mobility than others, whether because of selection effects and the sorting that takes place at their admissions offices or because of the superior educational experience they deliver.
We’ve certainly learned that aspiring to a four-year bachelor’s degree for everybody, without regard to institution or field of study, is a costly, frustrating, and ultimately feckless target. It’s hard to be sure how much of that sober conclusion can be attributed to diminished standards, inflated grades, waivers, exemptions, and eyewinks that take the place of true academic accomplishment. Certainly it’s no secret that the more widespread a credential becomes, the less comparative advantage it confers on those possessing it. But it’s also no secret that a growing number of employers who once treated the college degree as a passport to hiring by their organizations now demand other evidence that an individual can truly do the job expected of him or her.
Please understand that I am not here seeking a return to some halcyon yesteryear when only the children of those with college degrees were expected to get their own degrees. Access to the varied (if limited) benefits of such degrees ought not be determined by zip code, parentage, or race. But—a very important but—that doesn’t mean individuals or society benefit when we cavalierly hand out credentials that, in the end, signify very little—credentials that, like grades, have themselves been inflated beyond their true worth.
Higher education today gives analysts, policymakers, and critics so much to fret about—cost, free speech, leftward-lurching faculty, politically trendy majors—that we haven’t been paying nearly enough heed to the quality and value of the product itself. Some revisionism may be setting in as a few of the Great and the Good begin to weigh multiple pathways to prosperity and mobility. Some serious analysts and gutsy policymakers are pushing to catch up with Singapore, Switzerland, and other places that have long featured rigorous and respectable career preparation as well as university-style education. It’s important to note, however, that while they’re pushing for other pathways besides the traditional four-year college, they’re not paying much attention to the degraded state of the college degree itself. Practically nobody except the occasional cranky professor is doing that, and such critics are typically dismissed as academic snobs on a nostalgia trip. It’s impossible to get most policymakers beyond the smug assertion that “America has the best higher-education system in the world, the one that people from other countries clamor to get into.” What such claims usually refer to, of course, are the top hundred or so U.S. research universities—despite mounting evidence that they’re being gained on. And often it’s the graduate programs on those campuses, not their undergraduate colleges, that are the center of attention.
College itself is what needs a rethink, both in its own right and as a universal destination for young Americans. No, I’m not holding my breath, for many, many influential organizations, thinkers, and philanthropies remain wedded to “college for all,” and it’s too easy to get called nasty names if you cast doubt on that goal. The last thing these folks want is for anyone really to peel back the college curtain—as A Nation at Risk did for K–12 back in 1983. But we might do well to recall that scene where Toto darts behind the curtain in the Emerald City and finds a wimpy little non-wizard frantically working the controls of the machines he has been using to fool people into thinking that he’s mighty and magical.
— Chester E. Finn, Jr.
Chester E. Finn, Jr., is a Distinguished Senior Fellow and President Emeritus at the Thomas B. Fordham Institute. He is also a Senior Fellow at Stanford’s Hoover Institution.
This post originally appeared in Flypaper.