The concept of higher education as the “great equalizer” may be the best outcome of the evolution of American colleges and universities in the 20th century. As education advanced and the needs of the workforce changed, Americans recognized with clearheaded pragmatism that education offered the most certain avenue “out and up.” They took advantage of the G.I. Bill to retrain to meet the demands of the mature industrial economy. For middle-class America, the expectation became even stronger as parents prepared their children for a college degree and sacrificed what was necessary to achieve it.
We should celebrate and appreciate what was accomplished.
At the same time, however, policies that opened access and supported choice also created an enormous higher education infrastructure with protocols that evolved from older higher education models.
Presidents became corporate managers, for example, who lived in two worlds. In the first, they operated as business leaders managing thriving enterprises, answerable to trustees hailing from Wall Street and Main Street, often with different expectations and expertise.
In the second world, presidents were often the equivalent of big-city ward bosses, using financial, social and political capital to deal with staff, faculty, students, parents and alumni in a complex and cumbersome shared governance model. Presidents became brokers between these two worlds, expected to be all things to all people at all times.
This tension between the broader world and life behind the college gates has now come to a head as financing models continue to break down. The fixed costs of labor, infrastructure and technology, together with the long, deep recession, have wreaked havoc upon even the most comprehensive financial models. A few universities and well-endowed liberal arts colleges can continue to innovate and hide in plain sight, but they are so few in number that the quiet crisis in how to finance a college education is now open, raw and a matter of heightened public debate.
These troubles are serious enough, but they underscore an even more damaging trend: a college education is no longer a right and expectation for most middle-class Americans.
The emerging conflict is in large part a product of the very success that American higher education fostered. Technology calls into question whether the expense of a residential learning experience, completed within a defined four-year timeframe, is justified. Despite countless billions annually in institutional aid, federal support and private philanthropy, the sticker-price debates intensify. Most ominously, Americans now question both the need for a college degree and, more significantly, its value.
The reality, of course, is that despite all of the programs put into place since the end of World War II, only a minority of Americans receive a college degree.
While the promise of a degree remains a widely accepted goal, most American families equate a degree with economic mobility. Even if they appreciate the value of an educated citizenry, they understand that a college education provides the credentials their children, and they themselves, need to advance. Americans embrace the link between higher education and workforce preparation, and they appreciate the breadth that higher education provides its graduates as added value. In the end, though, jobs trump breadth of learning.
In the 21st century, American colleges and universities face a difficult choice. Their governance models work well to provide the processes and procedures necessary to manage life behind the college gates. But they are not adaptable, flexible and creative enough to adjust to the likely coming battle between credentials and degrees. Indeed, on many campuses there is little understanding that the battle has already begun.
It would be disillusioning and deeply disappointing to see policymakers choose sides.
In effect, we must broker a resolution: American higher education must build, upon its rich traditions, a system flexible enough to address workforce preparation as a common policy goal. To do so, it must stake out its ground for the University as a public good.
It will not be sufficient to fall back on older arguments. It may be possible to state important arguments differently.
The liberal arts do prepare you, for instance, for lifelong learning. They also teach you how to think by training you to write, speak, use quantitative methods, apply technology and work in a collaborative setting. These are the critical skills students need to become workers who advance in their careers. An outstanding case can be made for explicit links between the breadth provided by liberal arts training and the needs of the American workforce. Choosing the right language to build that case will permit higher education to claim its position on the battlefield.
Higher education is part of a continuum of learning. Lifelong learning will include credentials and degrees.
If Americans appreciate the University as a public good, they will understand the difference. Behind all of the arguments about the need for training to fill the millions of unclaimed jobs that will require advanced training, there is an even deeper and more fundamental truth. As we prepare our workers, we must educate our citizens. The future of the Republic rests upon it.
A small aside: In most discussions about credentialing, little differentiation has been made between pre-baccalaureate and post-baccalaureate “credentialing.” And I think that this lack of a clear distinction between the two has often been part of a deliberate strategy by those promoting further corporatization of our universities. I would suggest that to prevent further erosion of the credit standard in favor of other more ambiguous measures such as “competencies,” pre-baccalaureate “credentialing” should simply be left to two-year institutions, where it has traditionally been.