The Three-Phase Story of American Higher Education

Andrew Delbanco, professor of American Studies at Columbia University and author of the recent book College: What It Was, Is, and Should Be, provides a short history of how American undergraduate education has conceived of its purpose:

Most American colleges before the Civil War (more than five hundred were founded, but barely one hundred survived) were, in Richard Hofstadter’s words, “precarious little institutions, denomination-ridden, poverty-stricken…in fact not colleges at all, but glorified high schools or academies that presumed to offer degrees.” A bachelor’s degree did not have much practical value in the labor market or as a means of entering the still-small managerial class. The antebellum college was typically an arm of the local church—an academy for ministers, missionaries, and, more generally, literate Christians—that remained true to the purpose of the oldest American college, Harvard, which had been founded in dread “lest the churches of New England be left with an illiterate ministry…when our present ministers shall lie in the dust.”

As sectarian fervor cooled, the colleges became less closely tied to the churches, though most retained a strong religious tone through the mid-1800s. Whatever the particular method or creed, there was consensus, in “an age of moral pedagogy,” that the primary purpose of a college education was the development of sound moral character. A senior-year course in moral philosophy, usually taught by the college president, was almost universal. As the grip of religion loosened further over the course of the century, and the impact of Darwin transformed intellectual life, colleges changed fundamentally, becoming largely secular institutions devoted less to moral education than to the production and transmission of practical knowledge.

By the mid-nineteenth century, the need for expert training in up-to-date agricultural and industrial methods was becoming an urgent matter in the expanding nation, and, with the 1862 Morrill Act, Congress provided federal land grants to the loyal states (30,000 acres for each of its senators and representatives) for the purpose of establishing colleges “where the leading object shall be, without excluding other scientific or classical studies, to teach such branches of learning as are related to agriculture and the mechanic arts.” Eventually these “land-grant” colleges evolved into the system of state universities.

At the same time, as the apprenticeship system shrank and some professional careers began to require advanced degrees, the impetus grew for the development of private universities. Some took shape around the core of a colonial college (Harvard, Yale, Columbia), while others (Chicago, Northwestern) came into existence without any preexisting foundation. Still others (Clark, Johns Hopkins) had at first few or no undergraduate students. In 1895, Andrew Dickson White, the first president of Cornell, whose private endowment was augmented by land granted to New York State under the Morrill Act, looked back at the godly era and declared himself well rid of “a system of control which, in selecting a Professor of Mathematics or Language or Rhetoric or Physics or Chemistry, asked first and above all to what sect or even to what wing or branch of a sect he belonged.”

The idea of practical and progressive truth to which the new universities were committed was, of course, not entirely novel. It had already been advanced in the eighteenth century by Enlightenment ameliorists such as Benjamin Franklin, who anticipated a new kind of institution, to be realized in the University of Pennsylvania, that would produce “discoveries…to the benefit of mankind.” Roughly one hundred years later, Charles W. Eliot, the president who turned Harvard College into Harvard University (and who was himself descended from Puritan clergy), explained that a modern university must “store up the accumulated knowledge of the race” so that “each successive generation of youth shall start with all the advantages which their predecessors have won.”

By 1900, professors, no less than physicians or attorneys, had become certified professionals, complete with a peer review system and standards for earning credentials—which one of Eliot’s faculty members, William James, referred to as the “Ph.D. octopus.” Faculty began to benefit from competitive recruitments in what was becoming a national system of linked campuses; and when some rival university came wooing, the first thing to bargain for was, of course, a reduced teaching load. Seven years after Eliot’s inauguration speech in 1869, the Harvard philologist Francis James Child was exempted from grading undergraduate papers in response to an offer of a job from Johns Hopkins.

By the end of the nineteenth century, the professionalized university had absorbed schools of medicine and law that had typically begun independently, and was acquiring teacher-training schools, along with schools of engineering, business, and other professions. It was on its way to becoming the loose network of activities that Clark Kerr, president of the University of California, famously called the “multiversity.” When Kerr coined that term in 1963, in The Uses of the University, he remarked on the “cruel paradox” that a “superior faculty results in an inferior concern for undergraduate teaching,” and he called this paradox “one of our most pressing problems.”

Since Kerr wrote, the problem has gotten worse. Today, as David Kirp points out in Shakespeare, Einstein, and the Bottom Line, New York University, which has lately made a big (and largely successful) push to join the academic front rank, employs “adjunct” faculty—part-time teachers who are not candidates for tenure—to teach 70 percent of its undergraduate courses. The fact that these scandalously underpaid teachers must carry the teaching burden—not just at NYU, but at many other institutions—speaks not to their talent or dedication, but to the meagerness of the institution’s commitment to the teaching mission. At exactly the time when the struggle to get into our leading universities has reached a point of “insane intensity” (James Fallows’s apt phrase), undergraduate education has been reduced to a distinctly subsidiary activity.

[. . .]

The history of American higher education amounts to a three-phase story: in the colonial period, colleges promoted belief at a time of established (or quasi-established) religion; in the nineteenth century, they retained something of their distinctive creeds while multiplying under the protection of an increasingly liberal, tolerationist state; in the twentieth century, they became essentially indistinguishable from one another (except in degrees of wealth and prestige), by turning into miniature liberal states themselves—prescribing nothing and allowing virtually everything. Anyone whose parents or grandparents were shut out from educational opportunity because of their race, ethnicity, or gender is thankful for the liberalizing trajectory of higher education—but as in every human story, there is loss as well as gain.

— “Colleges: An Endangered Species?” (The New York Review of Books)
