Jonathan R. Cole, former provost and current University Professor at Columbia University, addresses some of the biggest challenges facing the modern American university:
- developing effective admission policies
- creating the most meaningful examinations
- dealing with rising costs
- making undergraduate education central to the university’s mission
- exploring the role of the humanities
- facilitating new discoveries and innovation
- determining the place for professional schools
- developing the research campuses of the future
- assessing the role of sports
- designing leadership and governance
- combating intellectual and legal threats to academic freedom
IN LATE SEPTEMBER 1960, when I first walked through the 116th Street and Broadway gates as a freshman at Columbia College, I was in awe of McKim, Mead, and White’s architecture. Could there really be such a beautiful and inspiring campus in the heart of New York City? I already knew something of Columbia and was apprehensive about how I, a “jock” with exceptionally good grades from a New York City public school, would fare in this intense intellectual environment. My older brother, Stephen, was entering his junior year there, and it was where my mother had studied for her Ph.D. in English Literature until my father, a stage, radio, and early television actor, had been blacklisted during the McCarthy era and the responsibility for supporting our family suddenly fell on her shoulders. Traversing the tree-lined College Walk, I thought of George Santayana’s possibly apocryphal observation that when he walked the McKim campus he felt in the company of great minds. There was something inspiring about looking up at the facade of Butler Library and seeing carved in stone the names of Homer, Sophocles, Plato, Aristotle, Cicero, and Virgil, and those of Shakespeare, Milton, and Goethe, among many others whose work we would read in Columbia’s required Humanities course. I was one of around six hundred freshmen embarking on a journey at Columbia College.
Of course, the place was hardly as idyllic as I made it out to be. When I entered the college, there were no women (Barnard was the women’s college at Columbia), only two African Americans—both talented academics and athletes—and no Hispanics in my class. About 60 percent came from Jewish backgrounds. Many were first-generation Ivy League students, and perhaps 30 or 40 percent of them aspired to be either doctors or lawyers. So much for diversity. The first lecture I ever attended was for the entire freshman class and was given by Lionel Trilling, the literary scholar and public intellectual, on C. P. Snow’s recently published essay The Two Cultures and the Scientific Revolution.1
The college was conspicuously “local” compared to its more famous sister “cosmopolitan” Ivy League institutions, Harvard, Yale, and Princeton. It was not even a fully residential college at the time—capable of housing only perhaps 50 percent of its undergraduate students. The rest either found off-campus apartments or commuted to school. Columbia College had its own proud faculty and undergraduates, and its students rarely took courses or seminars with professors appointed to the graduate faculties. In fact, thick boundaries existed around each school. We all took roughly half of our total points required to graduate in the Core Curriculum—and we could concentrate, rather than major, in a particular discipline. Legend had it that we had the best curriculum in the country (we knew next to nothing about what existed at other schools), taught by dedicated professors and instructors. We believed in the legend. We were immersed in and committed to a liberal arts education despite an equal dedication to a career that would represent upward social mobility.
What I wanted, as Woodrow Wilson said when speaking of his own education at Princeton, was to have the experience of rubbing up against other interesting minds. I quickly became an “organized dilettante,” taking courses with some of the extraordinary Columbia faculty regardless of their discipline, including Trilling, historian Richard Hofstadter, art historian Meyer Schapiro, sociologists Robert K. Merton and Daniel Bell, Nobel physicist Polykarp Kusch, and many others who were the stars of Columbia during those days—to say nothing of the younger, as yet relatively unknown, exceptional minds.
At the outset of my undergraduate years, I would not have predicted (nor would most others) that I would become an academic. Some might have thought that I would follow my father into the theater or aspire to something in baseball or an occupation outside the academy, but I secretly yearned for a life in academia, in part because of my family’s values, my Columbia teachers, and my brother’s influence, as well as the fact that I knew too much about the unpredictable world of the stage to think of it as a career.2
Perhaps a turning point came at the beginning of my sophomore year, when my father suddenly died of a heart attack and my financial aid was insufficient for me to continue at the college. Harry Coleman, assistant dean, saved me. When he heard my story, Coleman assured me that I could continue my studies and immediately increased my scholarship by $500 (a significant percentage of my total tuition bill at the time) without asking any questions or requiring me to fill out a sheaf of forms. I’ve never left Morningside Heights. But when I arrived, I had almost no idea about the contours of the house that I would live in for the next fifty years. I was a student at Columbia, and that was all that mattered at the time.
These autobiographical snippets are intended to provide a crude backdrop of what Columbia was like in the early 1960s, against which we can examine the evolution of this “older” research university. As an undergraduate history major, Ph.D. sociology student, junior and senior faculty member, teacher, researcher, and for fifteen years an academic administrator, of which thirteen were as the university’s provost and dean of faculties, I witnessed profound changes at Columbia and the other great universities around the world. It seems popular these days to target all that is wrong with our universities rather than to focus on the improvements in the quality of study, students, faculty, and particularly the research conducted that has led to a plethora of discoveries and innovations as well as scholarly advances that have altered our world. There was, of course, great science and scholarship before the 1960s and 1970s, even if the headlines had more to do with student unrest or how universities had prostituted themselves to the federal government and intelligence agencies and thus had become just another part of the military-industrial complex that Eisenhower had warned the nation about in his farewell address.
In fact, the very character of these houses of higher learning was changing. They no longer fit neatly into the perceptive characterization of them by Clark Kerr, the first chancellor of the University of California, Berkeley, from 1958 to 1967, as “multiversities,” which had no center and were a bunch of disconnected parts held together by a common interest in parking. But in design and practice, they were slowly becoming more perfect universities: more open, more meritocratic, more influential on our nation’s social and economic growth and potential. Universities were increasingly open to women, minorities, and foreign scholars and students. They were being improved through technological innovation and the recruitment of more diverse and exceptional talent. They continued to struggle to adhere to the principles of academic freedom and free inquiry when they came under assault by the government and other powerful political interest groups. The way scholarship was conducted was moving from very Small Science to Big Science—and finally to very Big Science in some transdisciplinary projects. Multi-university collaborations were emerging; and the Internet, the World Wide Web, and the multitude of new computer technologies allowed for remarkable changes in how information could be found, stored, exchanged, and used. That technology changed how students could be taught as well. Scholarship without borders was slowly becoming a reality—and the hard boundaries that Kerr saw were disappearing. Some of the universities’ great buildings were becoming symbols and architectural ruins—reminders of our past.
The size of the academy was growing exponentially; state universities were playing an increasingly important role in educating the majority of undergraduate, doctoral, and professional school students. Universities were taking on new functions. Consider just a few of the ways in which growth in size and complexity made a qualitative as well as a quantitative difference in the structure and governance of great research universities. We might begin just with the chief policy instrument of any university—its budget. When I entered Columbia, the university’s budget was perhaps slightly more than $100 million, up from $50 million in the 1950s; today, it is rubbing up against $4 billion annually. The same can be said about all of the large research universities. There has been enormous growth in research and scholarly centers and institutes to accommodate new styles of research and learning; and new departments and schools have been formed. The curriculum has expanded to include many studies beyond those that were part of our Western tradition. Yes, there has been significant growth in the size of the faculties at the various schools, but that growth did not compare with the proliferation over time of the bureaucratic administrative offices of the university that were born in response to pressure from governments, students, faculty, and other constituencies of the institutions of higher learning.
The sheer number of government regulations (a new way to control academic institutions and limit their autonomy) related to students, experimental animals, conflicts of interest, and diversity, as well as other aspects of research, led to a plethora of unfunded mandates and increases in staff to respond to these governmental requirements and other pressures. From 1989 to 2011–2012, universities and colleges in the United States added more than 500,000 administrators and professional employees. Andrew Gillen, a researcher at the American Institutes for Research, said: “There is a mind-boggling amount of money per student that’s being spent on administration.”3 Mario Lalich, writing for the Washington Monthly, cites data from the Delta Project, which focused on university spending:
Between 1998 and 2008, America’s private colleges increased spending on instruction by 22 percent while increasing spending on administration and staff support by 36 percent. Parents who wonder why college tuition is so high and why it increases so much each year may be less than pleased to learn that their sons and daughters will have an opportunity to interact with more administrators and staffers—but no more professors. Well, you can’t have everything.4
The growth of administration is also linked to the preferences, if not the demands, of students, parents, and faculty when they consider attending or working at Columbia or another institution of higher learning. Today, there are extensive psychological counseling services, health services, advising and job placement services, and large admissions and financial aid offices. Compared with the past, current students and their parents are demanding relatively plush residence halls and dining arrangements; they want Olympic-quality recreational facilities, an elaborate student activities center that houses a variety of extracurricular activities and is staffed by dozens of professionals, remedial learning centers, and information technology centers. There is a growing group of staff members who work on raising money for the university. The legal departments of the university have grown rapidly over time, and they handle an increasing number of cases or threats of lawsuits.5
Many of these requests are for services and facilities that are essential, required, or provide worthwhile support for the members of the academic community. But they come at a price. Expensive layers of bureaucracy have been added to meet a variety of legal requirements and constituency demands—and to become more competitive. The open question is: Have universities produced too much of a good thing, so much that it becomes dysfunctional when they try to reduce costs?
The irony, perhaps, is that the resources spent on increasing the number of people actually teaching our students have not kept pace with the growth in administrative costs. Although less true at the most well-heeled universities, many colleges and universities have resorted to the unfortunate practice of reducing appointments of full-time faculty members while increasing the proportion of their faculty who are part-time adjunct professors with little or no job security and almost no fringe benefits.
The ways in which teaching and research are conducted at major research universities today are vastly different from the approaches used more than fifty years ago. The sheer size of the research enterprise—with much of the sponsored government- and foundation-financed research taking place on the institution’s medical school campus—would be unrecognizable. Research has become bigger and far more collaborative—and far more international in scope. Teaching has used the fruits of new technology and is less likely to be found in the traditional format of professorial lectures. At most top universities in this country, students and faculty come from every corner of the globe.
Although Columbia had world-renowned scholars and scientists when I was an undergraduate (and I was fortunate to study with many of them), virtually every aspect of the university today represents progress for students and faculty members. But, like many of our finest institutions of higher education, Columbia faces a plethora of challenges and criticisms—often by observers who can’t distinguish “fact” from “fiction.” I’ll consider many of these challenges in this book, but my central message is simple: As good as our system of higher learning has become over the past half-century, we still have a great deal to do if we are to approach the maximum potential of these institutions.
Over the decades that I’ve spent getting to know many of these top schools, I’ve come to see the ways in which these universities have served the nation well. I’ve also witnessed their limitations and what they don’t do so well. I know the stresses and strains that they confront every day and the ways in which the outside world is increasingly uncertain of the role that these top schools serve in our nation. Social scientists are notoriously poor predictors of the future, and I’m sure I would be no better than others in pursuing that fool’s errand. My objective is to analyze the threats to American preeminence in higher learning and research, but it is not to prognosticate what these universities will look like twenty or thirty years from now as they evolve in the ecosystem of American society.6 Rather, my aim is to suggest what a great research university ought to look like within several decades. It is a normative rather than a predictive effort—one that argues that substantial changes in this high-achieving system should be made if these institutions are to approach their full potential. If there is going to be an informed conversation, or debate, over what the American research university ought to look like, then I am trying here to stimulate one.
The University’s role is not based upon a conception of neutrality or indifference to society’s problems, but an approach to the problems through the only strength which a university is entitled to assert. It is a conservative role because it values cultures and ideas, and reaffirms the basic commitment to reason. It is revolutionary because of its compulsion to discover and to know. It is modest because it recognizes that the difficulties are great and the standards are demanding.
—EDWARD H. LEVI, PRESIDENT OF THE UNIVERSITY OF CHICAGO
I BEGIN WITH A PARADOX. As of 2015, the United States has by far the greatest system of higher education in the world. By most reckonings, we have roughly 80 percent of the top twenty universities, 70 percent of the top fifty, and 60 percent of the top one hundred. We win the majority of Nobel science and economics prizes and other internationally prestigious awards for scholarly and scientific achievements. Scholarship produced by our universities dominates most fields and has the greatest impact on discoveries in those disciplines. In fact, American universities have become the envy of the world. Because many of the brightest and most creative people in other nations want to attend them or work at them, they represent collectively perhaps the only American industry today with a favorable balance of trade.
There are, of course, many superb universities in other countries as well as highly distinguished institutes where research is carried out. One only need look to Oxford, Cambridge, or Imperial College London, to name just three in England, or to the Pasteur Institute in France, or to some of the Max Planck Institutes in Germany, or to a number of universities in Asia as evidence that exceptional institutions can be found elsewhere. Indeed, the advancement of knowledge would benefit if we had more great universities in other nations, as it would help to solve more rapidly the extremely complex social, economic, scientific, and health problems that confront us. But in the aggregate, there is no national system of higher learning that has come close to matching what the American universities have achieved.
Most of the educated American public thinks of our universities in terms of teaching and the transmission of knowledge, rather than the creation of new knowledge, and most critiques of higher education focus on undergraduate education. This point of view is understandable. Families are concerned primarily with the education of their children and grandchildren and its cost, and they relate to their own educational experience. Let me be emphatically clear: Excellent teaching of undergraduates and graduate students is critically important and an integral part of the mission of great universities. It is perhaps our first calling. But the fulfillment of this teaching mission is not what has made our research universities the best in the world. Rather, our ability to fulfill one of the other central missions of great universities—the production of new knowledge through discoveries that actually change the world—has produced virtual consensus about our preeminence.1
Most new industrial nations are striving to match the quality of our great universities. Unfortunately, they are trying to imitate what we were and what we are, not what we will be or should be.2
In fact, most educated Americans don’t realize that lasers, FM radio, magnetic resonance imaging, global positioning systems, barcodes, the algorithm for Google, the fetal monitor, the nicotine patch, antibiotics, and the Richter scale were all born at our universities. Nor do they know that the development of buckyballs and nanotechnology, the discovery of the insulin gene, the birth of computers, the origin of bioengineering through the discovery of DNA, the improvement of transistors, and innovations in treating diseases through work on the mind and the brain also took place at these institutions. Few are aware that improved weather forecasting, cures for childhood leukemia, the Pap smear, and scientific agriculture were also spawned there. In addition, not many know that social and behavioral science discoveries such as the method for surveying public opinion or for projecting election results, or the concepts of congestion pricing, human capital, behavioral economics, and the self-fulfilling prophecy came from work at these institutions. Even the electric toothbrush, Gatorade, the Heimlich maneuver and, yes, Viagra, had their start there. And these are just a few illustrations of the thousands of life-altering discoveries and ideas that have emerged from these great universities.
So, if we are so good, why is there so much criticism of our system of higher learning and such concern about its supposed failures? Today, while we can easily find stories about higher education and university life in books, news media, magazines, blog postings, or on the Internet, few Americans understand what makes these universities extraordinary places, despite the challenges they face and the inadequacies of their current structures and methods of operation. Consider just a few of these challenges: The sources of funding for building our universities, whether from state and federal governments or private foundations, especially in the fields of science, mathematics, and engineering, have been drying up. We face enormous “pipeline” problems with our K–12 education programs, which feed students into our colleges and universities. High-anxiety testing is the fashion of the day and the metric used to compare our primary and secondary educational system against those of other nations. It remains unclear whether educational reforms such as “No Child Left Behind” or the “Common Core,” which are built on assessing the nation’s readiness by relying on the test results of students, will bear any fruit. Also unclear is whether these competency tests even approach measures of creativity or curiosity. The humanities are under attack yet again for their supposed lack of marketability and extrinsic value. Further, despite an increase in college attendance, graduation rates are well below what we ought to expect, especially at community colleges. Instead of increasing social mobility through our educational system, we are diminishing young people’s ability to realize the American dream. Levels of upward social mobility in Europe now exceed what we find in the United States. The public and politicians view the cost of a university education as excessive and escalating—increasingly unaffordable for the middle class.
The economic and intellectual value of a college or graduate degree is questioned, yet rarely fully discussed. The unacceptable level of student debt continues to make headlines. Yet the costs of government mandates regulating the behavior of faculty and others at our universities are passed on to students in the form of tuition increases.
Within the colleges and universities, there is a belief that the traditional structures on which our best universities have been built are rapidly becoming ossified. People are calling for an end to academic tenure. Academic freedom and free inquiry continue to be under attack. Anti-intellectualism simmers at the surface of public attitudes. The public is questioning the universities’ commitment to diversity, to transparency and accountability—terms used to trigger discontent. There is a belief that professors do not like to teach; that more of the actual teaching of undergraduates is done by an “underclass” of adjunct professors and lecturers. And students, we are told, don’t study very much and place more value on the outcome—a job—than on the process of education, both in and out of the classroom. The organization and values surrounding intercollegiate athletics are in a sorry state.3
There is messianic hope that the system will be saved by new technology, represented most recently in the development of online educational courses. Preachers of “disruptive technologies” believe the cost and quality problems can be solved with more clicks and fewer bricks. People question the admissions system. They denounce the quality and exorbitant salaries of academic leaders.
Simultaneously, states are decreasing support for their flagship universities—including world-class institutions like the University of Wisconsin.4 State legislators and governors across the country continue to starve their universities while asking them to teach more students and do a better job of preparing them for meaningful careers. At the same time, states are directing universities to hold down tuition increases. And finally, among other discontents, there is concern with sexual assault and sexual harassment on campuses.
Beneath all of these particulars is a fundamental erosion of trust between the government—at all levels—and our finest universities. There is a growing belief that our system of higher learning may be incapable in the future of fulfilling its compact with the nation: to provide avenues of upward social mobility, a better stock of human capital, a labor force capable of handling an increasing number of technologically sophisticated jobs, a better-informed citizenry, and to remain a key engine of innovation and discovery in our society. Little effort has been made to distinguish fact from fiction in these matters, but there is no shortage of assertions of facts in books and articles critical of the current state of affairs. What ought to be done?
This book focuses on the top American research universities—perhaps 120—rather than on all of the 4,500 or so colleges and universities in the United States. Despite representing only a small proportion of the total, these distinguished educational institutions have a disproportionate impact on the nation and the world. Changes that they make often set the stage for transformations at other colleges and universities. They also produce the majority of advanced-degree recipients and surely an overwhelming proportion of the most significant scientific, engineering, and social and behavioral science discoveries and innovations.
The Idea of a Research University
Our nation’s Founding Fathers took a far more detailed interest in formulating a model for higher learning than do our current leaders. Thomas Jefferson described the “academical village” clustered around tree-lined lawns, where students and teachers lived adjacent to each other with the grand library at the focal point of the quadrangle,5 and Benjamin Franklin envisioned a new type of college in Philadelphia.
The leaders of our first set of exceptional research universities,6 beginning in 1876 with the opening of Johns Hopkins, participated in a debate about what the research university ought to look like and how it should differ from the best systems of higher learning in the world at the time, the German and British university systems. This debate filtered into a continuing discussion in the 1920s and early 1930s over the proper shape of a university. These leaders held remarkably disparate ideas7 about the appropriate contours of universities, but the conversation led to structures and values that moved us closer to preeminence. What emerged was the system that took shape in the 1930s and began its ascent following World War II. That model provided the United States and our citizens with enormous social and economic returns, as well as better-informed and effective citizens, for more than seventy-five years.
The American university’s rise to preeminence happened with remarkable speed.8 Consider a few of the key factors responsible for this rapid rise.
Influences on Our Rise to Preeminence
The United States adopted a hybrid system of higher education that combined the emphasis on research found in the German universities of the nineteenth and early twentieth centuries with the far older collegium system identified with Britain’s Oxbridge colleges, adding its own independent adaptations to the needs of a growing and increasingly technologically advanced society. The fundamental idea that we still live with was more or less settled by 1950.