In January 2020, just days before the first case of Covid-19 was identified in the United States, Bryan Alexander, a scholar at Georgetown University known as a “futurist,” published a new book, Academia Next: The Futures of Higher Education. Alexander made no claim to clairvoyance, only to “trend analysis and scenario creation.” But one of his scenarios showed startling foresight:
Imagine a future academy after a major pandemic has struck the world…. Would distance learning grow rapidly as people fear face-to-face learning because of perceived contagion risk?… How would we take conferences and other forms of professional development online?… Would athletes refrain from practice and play for fear of contagion, or would both institutions and the general public demand more college sports as an inspirational sign of bodily vigor in the context of sickness and death?
By the spring of 2020, these questions were no longer hypothetical. Classrooms emptied as “distance learning” became almost universal. Conferences moved online. Some athletic programs canceled competition, while others kept up normal play and travel (and partying) despite the risk.
Although the pandemic is far from over, the business of predicting what universities will look like after it’s gone is now in high gear. As early as the fall of 2020, months before the first vaccines rolled out, The Chronicle of Higher Education published a booklet of essays titled The Post-Pandemic College, followed by (virtual) conferences on “The Post-Pandemic Campus” and “Higher Education and the Post-Pandemic Employer.”
Thanks to vaccine reluctance, the Delta and Omicron strains, and the specter of new variants, we’re still a long way from “post”—as of last fall only about half the jobs in higher education that were lost to layoffs or furloughs had been recovered, and student enrollment remains down by over one million since the fall of 2019. But while we wait, any effort to envision the future ought to begin with some facts about the recent past and the present.
Here are a few:
• Ninety-five percent of US colleges and universities have an endowment equivalent to less than 1 percent of Harvard’s.
• Some elite institutions spend more than $100,000 a year per student, which means that even students who pay the full “sticker price” (around $75,000 in the Ivies) are subsidized, while most community colleges can spend only $10,000 to $15,000 per student.
• Eighty percent of students enrolled in a community college—around 7 million, the majority from low-income, minority, or immigrant families—hope to earn a bachelor’s degree, but fewer than 15 percent succeed in doing so within six years.
• Low-income students with high grades and test scores in high school are nearly 20 percent less likely to enroll in college than affluent students with low grades and scores.
• Students from families in the top 1 percent income bracket are almost 80 times more likely to attend an Ivy League or other highly selective college than those from families in the bottom 20 percent.
• Between 2008 and 2015, average state appropriations per full-time student at public universities fell by more than 15 percent (adjusted for inflation); meanwhile, tuition and fees rose even faster, feeding the growth of personal debt that falls disproportionately on low- and middle-income families.
These are snapshots of an education system that is profoundly and increasingly stratified. If higher education once helped to reduce inequities in American life, it now too often sustains and fortifies them. Like our health care system, it delivers concierge services to the affluent while consigning low- and modest-income Americans to overcrowded or underfunded facilities. And the disparities are getting worse. In Unequal Colleges in the Age of Disparity—published five years ago—Charles Clotfelter documented this “re-sorting of customers” upward and downward: At elite private colleges, the average family income of students has “surged ahead of the national mean,” while students attending colleges with fewer resources lag ever farther behind. America’s top-ranked universities position their mostly affluent students to accrue yet more wealth, influence, and power, while far too many who attend institutions that stand lower in the hierarchy of prestige are burdened by debt and struggle to graduate.
In the public sector, many regional universities are in trouble. A study from 2019 classified a growing number of these institutions as “vulnerable,” meaning they could be headed for contraction or even closure. Institutions such as Central Michigan University, where I met impressive students from high-poverty cities like Flint, face the dual challenge of declining public subsidies (in Michigan over the past 20 years, state funding per student has fallen 40 percent) and a projected decline in enrollment. By contrast, the University of Michigan at Ann Arbor raises large sums for its endowment and current use (over $5 billion in its most recent campaign) and attracts international as well as out-of-state students at higher tuition rates than those charged to Michigan residents.
At private nonprofit colleges, which are often accused of charging exorbitant prices, fewer than one in six students pay the published amount. Net tuition and fees—the amount of revenue received after discounts in the form of full or partial scholarships—have therefore been virtually flat for decades, even while costs were rising. Over the coming decade, many small colleges with modest reputations will likely face enrollment shortfalls due to regional population shifts as well as the broad decline in birth rates that followed the Great Recession of 2008. In short, except for the most prominent institutions, and despite infusions of federal aid since Covid’s outbreak, many private and public colleges will find themselves in a battle to stay solvent.
There is a racial dimension to this story as well. In a valuable new book, Broke: The Racial Consequences of Underfunding Public Universities, Laura T. Hamilton and Kelly Nielsen write that “for most of the twentieth century, families of color, as part of the tax base, were paying for wealthy white students to attend universities where their own offspring were not welcome.” But over the past four decades, as the numbers of African American, Latinx, and Asian-origin students were rising, higher education funding was falling. It’s a suspicious symmetry. Under the old funding model, “affluent whites would need to help pay for the postsecondary education of Black and Brown youth, as well as the white working class. This did not happen.” In California between 1970 and 2014, the share of the state budget devoted to higher education fell by nearly a third, while the share for prisons more than doubled. The effect—if not necessarily the intent—has been to place a heavy burden on students with comparatively limited income and assets, of whom a disproportionate number are students of color.
All of these inequities have been made worse by the pandemic. At wealthy institutions like the Ivies, the problems proved temporary. Rather than start their college days on Zoom in the fall of 2020, some newly admitted students took a “gap year” and some already enrolled went on leave; meanwhile, foreign students found themselves unable to obtain visas. Top universities took hits as well on the expense side of the ledger in the form of unanticipated costs, such as administering Covid tests and enhancing digital technology to facilitate remote teaching. But the costs were manageable, enrollments have bounced back, and in 2021 large endowments generated spectacular returns, in some cases growing by billions of dollars.
Below the elites, the damage was much more severe and lasting. The number of 2021 high school graduates filling out the Free Application for Federal Student Aid dropped by 5 percent, which corresponds to roughly 100,000 low-income students who might have planned to attend college but have given up, at least for now. As for those already in college, Georgia State University, which serves a large population of Black, Latinx, and low-income students, is both a hopeful and a cautionary example. A national leader in using digital data to identify students who need timely academic help, financial assistance, or supportive counseling, Georgia State has made remarkable gains in academic performance and graduation rates over the past decade; but during the first pandemic year, the number of dropped courses exceeded that of the previous academic year by more than 30 percent. This means a longer road to graduation—a road from which many first-generation, low-income, and minority students are at risk of being blocked.
As the pandemic took hold, it drove down enrollments in community colleges by more than 10 percent. At the 10 community colleges in the City University of New York system, where dropout rates had been over 50 percent even before Covid, thousands of students—many of whom (or their family members) lost jobs when the restaurant and retail sectors imploded—had no access to an adequate Internet device or Wi-Fi connection. The political scientist Corey Robin, who teaches at CUNY’s Brooklyn College, wrote a pungent response to the president of Brown University, who had recommended that colleges control Covid outbreaks by deploying tracing technology and quarantining sick students in hotels. Brooklyn College, Robin pointed out, can’t afford contact tracers or hotel rooms; it doesn’t even have bathrooms where the water runs reliably hot. The typical student doesn’t live in a dorm, and if she falls sick, she “will, in all likelihood, end her day where it began: at home with her family.”
In other words, Covid has turned the gap between institutions serving mainly privileged students and those serving needy ones into a chasm. Some of the most acute problems—student anxiety, faculty fatigue—don’t appear as numbers on any balance sheet. But even for problems that can be quantified, most colleges have few and poor tools for addressing them. Except for the elite privates and flagship publics, most colleges can’t simply open the spigot to increase the inflow of tuition-paying students. They can’t boast about how many applicants they turn away, as the Ivies love to do. Instead, they struggle to attract enough students to cover operating costs. They often have no choice but to provide incentives for relatively affluent candidates by offering discounts (known as “merit aid”), while pulling back from recruiting those who could attend only if offered bigger discounts or a total waiver (“need-based aid”). Long before Covid, the diversion of financial aid from needy to less needy students was already a national trend, and there’s every reason to expect it to accelerate—with the result, as Martin Kurzweil and Josh Wyner put it, that “rich kids are eating up the financial aid pot.”
Other revenue-raising and cost-cutting strategies are being tested. In the hope of attracting students by lowering the published tuition price, some colleges have abandoned the high-tuition/high-discount financial model altogether. Others are trying to reduce duplicative hiring, for example, by sharing language instruction with nearby colleges. In order to conserve fellowship funds, some research universities temporarily suspended graduate student admissions in the humanities and social sciences. A few institutions—Mills College and Northeastern University; Marlboro College and Emerson College—have formally merged. Name-brand colleges are cashing in on the college admissions frenzy by offering high-priced summer “immersion” programs to affluent high school students seeking advantage in the scramble. (Before the pandemic, my own university, Columbia, was charging more than $10,000 for three weeks.) Still others are enticing older customers into master’s degree programs that charge scores of thousands of dollars for a credential of dubious worth.
The most effective strategy for balancing the books, however, is one that threatens to destroy the institutions it’s meant to save: namely, making deep cuts in the instructional budget. For many Americans, the word “professor” conjures up the image of celebrity scholars shuttling between Aspen and Davos while a squad of teaching assistants does the scut work with students back home. This is a grotesque distortion. In fact, roughly two-thirds of college professors work today as adjuncts on contingent contracts—at community colleges, the figure is at least 70 percent—not a few of whom teach five or more courses per semester, sometimes at more than one institution, in the hope of cobbling together a living wage. For many, the workload is overwhelming, the pay is meager, the benefits are minimal, and tenure is a pipe dream.
In a darkly prescient book, The Last Professors, published more than a decade ago, the Ohio State English professor Frank Donoghue noted that “the dismantling of the American professoriate is part and parcel of the casualization of labor in general.” In a national context of weakened unions, outsourcing, and layoffs as means to protect shareholder profits, the case that academics deserve singular job security is a tough sell. When the case is made, it’s usually on behalf of academic freedom, which has been the chief rationale for tenure since what the historian Walter Metzger called the “ideological conflict between academic social scientists and trustees of wealth” erupted more than a century ago.
The incendiary case arose in 1900 around Edward Ross, a Stanford University sociologist who favored public ownership of utilities, regulation of railroads (from which Stanford derived its wealth), and a ban on Asian immigration as a source of cheap labor. When the university’s president came under pressure from Mrs. Stanford to get rid of him, Ross resigned, and colleagues left in protest, among them the great intellectual historian Arthur O. Lovejoy, who later helped found the American Association of University Professors (AAUP) on two principles still pertinent today: that tenure is necessary to protect “freedom of teaching and research and of extramural activities” and that it provides “a sufficient degree of economic security to make the profession attractive to men and women of ability.”
In our own era—when some pundits and opportunistic politicians on the right are trying to dictate what can be taught, while some students and feckless faculty on the left are trying to police what can be said—the first rationale is more compelling than ever. As for the second, the distinguished chemist Holden Thorp—formerly chancellor of the University of North Carolina at Chapel Hill and provost of Washington University; now the editor of Science magazine—has a sharp retort for opponents of tenure who “lament the job security that they feel is exploited or not earned”:
Ask them if they can think of any other jobs that pay what an entry assistant professor in the humanities pays with 10 years of postbaccalaureate training and hundreds of applicants for every slot? And the so-called exploitation? For every senior faculty member phoning it in, 10 are serving on every committee, teaching extra courses, and still doing research. It’s [tenure that is] a bargain.
But because the job market for recent PhDs is saturated, especially in the humanities and social sciences, universities are often able to avoid the tenure system and hire contingent faculty at even lower pay. Under these conditions, tenure is a bargain to which more and more institutions are saying no. In some fields the market for stable, decently paid teaching positions has all but collapsed—one consequence of which is the growing push for unionization among graduate students who have scant hope of an academic career after putting in years of advanced study and “apprentice” teaching.
Meanwhile, for the shrinking fraction of young faculty who do manage to obtain tenure-track jobs—jobs, that is, leading to an “up or out” moment when their contract is either terminated or indefinitely extended—the criteria for promotion and retention typically have little to do with how well they are serving students. The idea of tenure is an artifact of the early 20th-century research university, where, in some cases (Johns Hopkins, Clark), the number of undergraduates was between negligible and zero. In 1900, in the entire United States, there were approximately 200,000 college students. Today there are around 16 million. Yet under the tenure system inherited from a century ago, college faculty—most of whom, as the economist Noah Smith has written, “have been essentially hired to be teachers”—are still compelled to “prove their suitability for the job by doing research.” As a result, many good teachers who do little research are denied tenure, while weak teachers who do lots of research achieve it.
Such an outcome may be justifiable at institutions whose primary function is the production of new knowledge. More broadly, however, it is not only unjustifiable but unjust. As the University of Wisconsin philosophy professor Harry Brighouse points out:
Instructional quality is the most neglected—and perhaps the most serious—equity issue in higher education. Good instruction benefits everyone, but it benefits students who attended lower-quality high schools, whose parents cannot pay for compensatory tutors, who lack the time to use tutors because they have to work, and who are less comfortable seeking help more than it benefits other students.
There is some reason to hope that tenure, or at least renewable extended contracts, may become less strictly tied to research productivity. For example, Worcester Polytechnic Institute recently announced the creation of 45 tenure lines for faculty who “specialize in teaching.” And following a number of “J’accuse” books published over the last 15 years, including Derek Bok’s Our Underachieving Colleges and Richard Arum and Josipa Roksa’s Academically Adrift: Limited Learning on College Campuses, there has been a growing effort to assess and improve college teaching, even at institutions whose core mission is research. Some teachers in the burgeoning fields of STEM (science, technology, engineering, and mathematics) are discarding hour-long lectures in favor of shorter segments on discrete topics, breakout groups, frequent quizzes, and digital feedback systems that tell the instructor whether students have grasped the material or need it repeated or presented in a different way. The Stanford physicist Carl Wieman believes that “university teaching is in the early stages of a historic transition, changing from an individual folk art to a field with established expertise, much as medicine did 150 years ago.”
It is heartening that the STEM fields—which attract many first-generation college students but then tend to discourage them—may be shifting their teaching culture from weed-them-out to help-them-learn. But it’s not clear that the recent explosion of work in the neuroscience of cognition—lucidly reviewed in Grasp: The Science Transforming How We Learn by Sanjay Sarma and Luke Yoquinto—has done much to modify the basic insight shared by all good teachers since Socrates: that to teach well is to ask questions or pose problems that prompt students to reflect and respond with words, numbers, or other expressive symbols, including the nondiscursive languages of the arts. A good teacher will meet each response with more questions, hoping to inspire students with the excitement of discovering that the chain of questions has no end.
The sociologist Steven Brint has proposed some steps that would help provide this experience to as many young people as possible. These include restoring robust funding for regional public institutions; doubling the maximum award amount of the federal Pell Grant for low-income students, which once covered three-quarters of the average tuition at public universities but now covers only 30 percent; making repayment of student loans contingent on post-college income; and extending eligibility for tenure to adjunct faculty based on the quality of their teaching. For its part, the AAUP is calling for “A New Deal for Higher Education,” which would provide federal tuition subsidies, student loan forgiveness, and support for staff and campus infrastructure. There have also been calls to restrict eligibility for federal funds to institutions that award tenure to some mandated percentage of their faculty.
Given the divisions within the Democratic Party and the battered reputation of higher education among large sectors of the public, these proposals face daunting odds. Still, despite the excision of free community college from the Biden administration’s stalled Build Back Better bill, the next iteration of the proposed legislation seems likely to increase the purchasing power of Pell Grants and to include funds for programs aimed at improving college retention and completion rates, as well as for infrastructure and financial aid at historically Black colleges and universities, tribal colleges, and Latinx-serving institutions.
Well before the pandemic hit, serious commentators—including some academic leaders—were predicting that colleges and universities as we’ve known them are destined for oblivion, and that any effort to stabilize or reform them is a proverbial case of rearranging the deck chairs on the Titanic.
An early version of this view was set forth in the mid-1990s by the late Harvard Business School professor Clayton Christensen, who advanced a theory of “disruptive innovation” in which new enterprises target customers who have limited resources and bring “to market a product or service that is not as good as the best traditional offerings but is more affordable and easier to use.” Gradually, this low-end service improves in quality and appeal until higher-end providers embrace and adapt it. For higher education, that disruptive innovation was online instruction, deployed by for-profit “universities” like the University of Phoenix and initially sneered at in established institutions as a tacky product not to be taken seriously.
Over the ensuing 25 years, “distance learning,” as it came to be known, made significant inroads into higher education—even before Covid, 40 percent of students had taken at least one online course—at first almost entirely through the private for-profits. Lightly regulated during the Clinton and Bush administrations despite their predatory recruitment practices, these for-profit “universities” played a disproportionate role in driving up student debt. As Tressie McMillan Cottom reports in her aptly titled book Lower Ed, by 2008 more low-income Black and Latinx women were attending for-profits than were enrolled in four-year private and public nonprofit institutions combined.
By the late 1990s, established universities were also taking tentative steps into online teaching. Columbia led a failed attempt to market online courses through an entity called Fathom.com, which folded in 2003. By 2012, two for-profit start-ups—Coursera and Udacity—had been launched by entrepreneurs at Stanford and Google, followed shortly by a nonprofit competitor, edX, a partnership between Harvard and MIT. Suddenly star professors were signing up as independent contractors to teach everything from astrophysics to lyric poetry through so-called massive open online courses, or MOOCs.
There was high-minded talk about how this new disruptive form would help democratize education by reaching anyone anywhere on the globe with an Internet connection and an appetite to learn. Stanford president John Hennessy predicted a “tsunami” of technological innovation that would sweep away all but a few super-wealthy colleges and universities. But online education has not brought about a new era of democratic higher education—at least not yet. Most consumers of MOOCs turned out to be not young people looking for a substitute for traditional college but professionals looking for career advancement. Coursera is increasingly a global service for businesses, governments, and credentialed individuals, though it continues to provide a platform through which universities offer courses and programs. It has raised hundreds of millions in venture capital from investors betting that “these new alternative education models are the future of how people will be trained up for the labor market.” This past summer, Harvard and MIT sold edX to the for-profit company 2U for $800 million (a tidy return on their initial $30 million investment), promising to use the proceeds for “transforming educational outcomes” and “tackling learning inequities”—whatever exactly that means.
Though the MOOCs have so far failed to shake up universities to anything like the extent predicted, online instruction in other forms is growing. The audaciously inventive president of Arizona State University, Michael Crow, speaks of a “fifth wave” of American higher education (the first four were colonial colleges, state universities founded in the early republic, land-grant institutions following the Civil War, and research universities in the 20th century) that will “redress the inequities associated with the hierarchical differentiation of sectors, or vertical institutional segmentation,” and “catalyze innovative knowledge production as well as its dissemination to an increasing proportion of citizens.” Buried in this technocratic prose is the news that previously excluded or underserved students will be reached mainly through “technologically enabled delivery”—a euphemism for online instruction.
Paul LeBlanc, the president of Southern New Hampshire University, speaks more plainly but with comparable ambition to extend the reach of higher education. Southern New Hampshire offers a curriculum taught almost entirely online, mainly by part-time faculty, to 135,000 students, many of whom are older than traditional college age, have limited resources, and juggle work and family responsibilities that don’t fit the constraints of conventional campus life, such as daytime in-person classes.
Most traditional institutions continued to ignore or condescend to these alternative forms of college until mid-March 2020, when the pandemic sent students and faculty rushing almost overnight into distance learning—or, as some prefer to call it, “emergency remote instruction.” People like me, for whom “Zoom” used to mean the sound of a motorcycle or a car with a punctured muffler, suddenly found ourselves, like it or not, teaching online. After a two-week hiatus between the shutdown of classes and their resumption on the Internet, my students and I felt like friends who had been scattered by a storm and reunited in a shelter.
With a variety of mask and vaccine mandates now in place, most colleges have nervously resumed in-person teaching, but a full return to the status quo ante seems unlikely. The Zoom experiment showed that even where physical space is scarce, it’s possible to find breakout rooms for small group discussions and to accommodate students with long commutes or mild illness who might otherwise be absent. Zoom made it easier to include guest speakers in class and harder for shy students to hide in the back of the room, because in a Zoom room there’s no front or back. On the other hand, the screen is just that: a filter or barrier that screens both teacher and student from the serendipitous effects of gesture, body language, and eye contact, making the relation less immediate and more impersonal. Though it’s hard to distinguish from the many forms of exhaustion induced by the pandemic, “Zoom fatigue” feels very real.
One of the unfulfilled hopes of online instruction has been its promise to slow the rise in college costs and thereby make higher education more accessible to students with few resources. But with specialized exceptions like the online master’s degree in computer science offered by Georgia Tech, the prospect of matching affordability with quality remains elusive. At the outset of the pandemic, after the shift to online classes, a few mega-rich institutions announced temporary tuition discounts (15 percent at Williams, 10 percent at Princeton). But prestigious institutions will also look for ways to use online technology to generate more revenue—perhaps by rotating cohorts within an expanded undergraduate class through cycles of on-campus and off-campus enrollment (this semester you’re living in a dorm; next semester you’ll be Zooming off-campus), thereby forgoing investment in new physical facilities while collecting more tuition dollars. The economists Michael McPherson and Sandy Baum suggest that some institutions may be able to “charge as much or more for an on-line course, with lower overhead at an increased scale, as for the on-campus equivalent.”
In the not-yet-aftermath of the pandemic, trying to predict the future of higher education is, even more than usual, a dicey business. Contingencies include future birth rates; regional population shifts driven by climate change; economic expansion or contraction; the needs and wants of students; the capabilities of future technologies; public opinion; the priorities of private philanthropy; and, most important, the scale and focus of state and federal government funding. Bryan Alexander, whose imagination caught an early glimpse of the pandemic, leads a group called the Future of Education Observatory—but despite the astronomical metaphor, predicting what will happen to our colleges and universities is less like tracking a planet than like playing with a Ouija board.
For now, as we wait and speculate, the question at hand is whether we are witnessing resourceful adaptations to an exceptional event, or the beginnings of deep change of which Covid is giving us a preview. In a book written at the height of the pandemic, The Great Upheaval: Higher Education’s Past, Present, and Uncertain Future, Arthur Levine, former president of Teachers College and the Woodrow Wilson Foundation, and Scott Van Pelt, who teaches at the Wharton School, argue that awarding credentials based on “seat time” spent in classes in hour-long segments crammed into a fraction of each day is a relic of the industrial era. Years before the pandemic, the formidable journalist Kevin Carey wrote a book, The End of College, in which he predicted that traditional colleges and universities would eventually give way to what he called the “University of Everywhere.” This will not be a fixed institution with certifying authority but rather an array of entities in the cloud that issue certificates or “badges” based on competency tests to prove the mastery of certain skills—credentials much more reliable than today’s diplomas. Students will benefit from what Jeffrey Selingo, an education writer now serving as the special advisor for innovation at Arizona State, has called “adaptive learning technologies” that “adjust to the speed at which an individual student learns.” In this imagined future, equity will be served, in Carey’s words, by “increasingly sophisticated artificial intelligence [that] will diagnose the strengths and weaknesses of each individual learner and customize his or her education accordingly, constantly challenging and motivating people to work harder and better without breaching the threshold of frustration and failure.” Liberated from the burdens of time and cost imposed by ossified institutions, people will become more capable, better informed, and even—because of diminished frustration—happier.
On the question of who the certifying authorities will be, Levine and Van Pelt bring us down to earth. “The awarding organization,” they write, “does not need to be a college: it could be an industry leader, such as Google or Microsoft, whose endorsement would mean more than that of most colleges.” In fact, the lines between for-profit businesses and nonprofit institutions have been blurring for some time as students seek “the same kind of relationship with their colleges that they have with their banks, supermarkets, and internet providers. They ask the same four things of each: (1) convenience, (2) service, (3) a quality product, and (4) low cost.”
For faculty, this brave new world won’t be a hospitable place. During the initial excitement over MOOCs, Stanford’s John Hennessy predicted that faculties would shrink as technologies grew. As AI takes over more and more human functions, a lot of college professors—much like radiologists or truck drivers—will be collateral damage. No doubt some traditional institutions will survive, with students in residence and faculty in classrooms on what Bryan Alexander calls the “Retro Campus”—a place “like a vinyl record store,” where eccentric customers go to indulge in remembrance of times past.
For anyone who cares about equity, or about education as something more than the transmission of marketable skills, this sci-fi vision of the future has at least three big problems. First, there is no reason to believe that technology will broaden opportunities and improve learning for unconfident students, many of whom are low-income, nonwhite, and from families with no previous college experience. As Thomas Bailey, the president of Teachers College, has written with his colleagues at the Community College Research Center, “online instruction…tends to reinforce…disconnectedness and isolation, while undermining many students’ success.” McPherson and Baum report that “moving coursework fully online increases gaps in success” and that “students who take online classes do worse in subsequent courses and are more likely to drop out of school. Males, students with lower prior GPAs, and Black students have particular difficulty adjusting to online learning. The performance gaps that exist for these subgroups in face-to-face courses become even more pronounced in online courses.”
None of this should be surprising. Students need human—and humane—teachers and mentors. They need recognition from adults alert to their capacities, aware of their limitations, with concern and time enough to counsel and guide them. Among the bitter realities of America’s current higher education system is the fact that the students who most need these supports are the least likely to get them.
A second problem with the entrepreneurial, global, and increasingly virtual university is that it offers too little to the local communities in which it is rooted. This is especially true of wealthy private institutions, which enjoy a nonprofit status that spares them from taxation on their real property and investment returns and confers tax deductibility on gifts from their donors—all of which represents revenue withheld from the public treasury. When pressed on what they are doing to meet their public responsibilities, most presidents and trustees will point to advances in medicine (if the institution has a medical school), or to technologies derived from research conducted on campus, or to the fact that their institution is a local employer.
These claims have merit. But private institutions, especially those with significant resources, should be doing more, as indeed some are. Many of these institutions, as the urbanist scholar Davarian L. Baldwin shows in grim detail in his book In the Shadow of the Ivory Tower, are islands of privilege in or near areas of high poverty and poor public schools—a reality made painfully apparent by the disparate effects of the pandemic. They should be partnering with public libraries on literacy and civic programs; providing legal and other services to the needy; pledging, as Yale has done recently, larger payments in lieu of taxes (or PILOTs) to their municipalities; subsidizing rents or otherwise supporting locally owned businesses; and making direct investments in community improvements. They should be opening their doors wider to qualified transfer students from local community colleges. They should be serving veterans, incarcerated people, and local public school students, whose prospects of attending college can be markedly improved by after-school tutoring and other “wrap-around” services of the kind that affluent families take for granted.
There are some encouraging examples of such efforts that could be replicated or adapted by many more institutions: the Clemente Course in the Humanities, a nationwide course for indigent adults taught by local academics and accredited by Bard College; the Warrior-Scholar Project, which brings veterans to study at eminent universities; the Netter Center for Community Partnerships at Penn and the Dornsife Center for Neighborhood Partnerships at Drexel, which connect undergraduates with schools and social services in West Philadelphia; Mount Tamalpais College, a liberal arts institution for students incarcerated at San Quentin, staffed by volunteer teachers from around the Bay Area; the Double Discovery Center at Columbia, which more than 50 years ago furnished the model for the federal Upward Bound program that helps middle and high school students prepare for college; the Knowledge for Freedom network (full disclosure: funded in part by the Teagle Foundation, where I am currently president), which brings low-income high school students onto college campuses for a “Great Books” seminar. Public institutions, too, should be ramping up efforts to serve students beyond their campus gates, as exemplified by the Newark City of Learning Collaborative, led by the Newark branch of Rutgers University and its farsighted chancellor, Nancy Cantor. As the prospect grows that the Supreme Court will disallow race-conscious admissions policies, such programs become all the more important for widening the pipeline for talented Black and Latinx students mired in poor schools. What all of these programs have in common is that they are reciprocal learning experiences: They can be life-changing for the teachers as well as for the taught.
Finally, there is a third—and perhaps the deepest—problem with the futuristic vision of education advanced by “technologically enabled delivery”: the debilitating fact that it rests on a narrow, positivistic conception of knowledge. In this view, all teaching is training, and all learning is a quest for competence: the mastery of some field whose practitioners can expect compensation for their proficiency or expertise. No one should dispute that colleges have a vital responsibility to prepare students for the world of work—to provide them with what the political scientist Benjamin Ginsberg calls “more or less sophisticated forms of vocational training to meet the needs of other established institutions in the public and private sectors.” In fact, preparation for economic productivity has been the main aim of universities since the decline of prescribed curricula in the 19th century, when the introduction of electives and, later, majors aligned what students chose to study in college with the work they planned to do after. Over the past 50 years, as students from economically insecure families entered college in growing numbers, this alignment has only become tighter, including at elite institutions that serve predominantly affluent students. “It is a shame,” Ginsberg writes, “when that is all that the university offers.” “All” is an exaggeration, but at more and more institutions it’s a fair approximation.
What’s increasingly rare in higher education, and almost entirely missing from writings about its future, is a more than nominal commitment to the value of learning undertaken in the hope of expanding the sympathetic imagination by opening the mind to contesting ideas about nature and history, the power of literature and art, and the value of dialectic in the pursuit of truth. These aspirations—traditionally gathered under the term “liberal education”—are in desperate need of revival. To advance them requires teachers and institutions committed to a more capacious vision of education than the prevailing idea of workforce training and economic self-advancement.
Dare we hope that the shock of the pandemic will confirm the urgency of this need? The kinds of questions the virus has forced upon us are not, after all, ultimately technical or empirical ones. They are political, ethical, and historical questions: How do we reconcile individual liberties with the public good? How do we account for savage inequities in health care and the quality of life? What should national sovereignty mean in a world where pathogens go from local to global in a flash? To debate such questions with rigor and candor requires habits of the heart and mind that are, to put it mildly, sorely lacking in our viciously polarized political culture. If higher education, along with the legislatures, philanthropies, and private donors who support it, does not recommit itself in act as well as word to the principles of pluralist democracy—equity, opportunity, tolerance, rationality—our republic doesn’t stand much of a chance. Certain habits of mind—distinguishing between arguments and opinions, admitting self-doubt, rethinking assumptions—are imperative for collective life. If these habits are not nurtured in the college classroom, where else will they be found?