What’s the “Value Added” of a College Degree?

In a recent release titled “Beyond College Rankings,” Jonathan Rothwell, a fellow at the Brookings Institution, examined the “value added” of two- and four-year colleges in the United States. In doing so, Mr. Rothwell assessed the difference between the expected economic success of alumni and the actual outcomes of graduates.

The conclusions are based on comparative data from each college and university surveyed, examining factors like academic performance, the demographics of the student body, and the types of courses offered at each school. Mr. Rothwell then compared these findings against the actual performance of graduates from these institutions, based on mid-career earnings, the value of job skills, and the ability to repay student loans.
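In rough terms (the notation here is mine, not Mr. Rothwell’s), a school’s value added can be sketched as the gap between what its graduates actually achieve and what their backgrounds alone would predict:

\[
\text{Value Added}_s = \bar{Y}^{\text{actual}}_s - \hat{Y}^{\text{predicted}}_s
\]

where \(\bar{Y}^{\text{actual}}_s\) is an observed outcome for school \(s\), such as mid-career earnings, and \(\hat{Y}^{\text{predicted}}_s\) is the outcome that student characteristics and program mix alone would predict.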

The results are not surprising. The schools scoring highest include California Institute of Technology, Colgate University, Massachusetts Institute of Technology, Rose-Hulman Institute of Technology, Carleton College, Washington and Lee University, SUNY Maritime College, Clarkson University, Manhattan College, and Stanford University. Mr. Rothwell noted that the value-added measures are based on factors “that best predict measurable economic outcomes” and “attempt to isolate the effect colleges themselves have on these outcomes, above and beyond what students’ backgrounds would predict.”

The study isolated five “quality” factors that predict how well students perform economically: the amount earned by people who hold a degree in a field offered by the college, the average labor-market value of skills listed on resumes, the share of graduates in STEM occupations, college completion rates, and the average financial support provided to each student.

Mr. Rothwell cautioned that his findings for nearly 7,000 colleges and universities are a “starting point” to assess “a college’s strengths and weaknesses with respect to career preparation.” He also noted that “due diligence is required by trustees and public officials to assess the direction of the college, its current leadership, its role in the community, and other factors.” He argued further that students “will need to consider likely outcomes against the cost of attendance, scholarship opportunities, the availability of degree programs, and other personal factors.”

Critics assailed the study because the findings skewed toward engineering schools with high starting salaries, institutions that over-enrolled students from well-connected, wealthy families, and colleges with strong alumni networks. These critics countered that location, and the job openings available in the region, made as much difference in how well alumni did economically as the preparation provided by their alma mater.

The Rothwell study is valuable because it establishes a link between education, whether two- or four-year, and alumni productivity. It also demonstrates that good measurement must be seen as a “cradle through career” continuum. In doing so, the study draws on research across data sets to present findings that “connect the dots” better than the college “scorecards” currently promoted by print and social media, state governments, and the US Department of Education.

Yet the weaknesses of the study are obvious. The findings are narrow and incomplete. They imply that the purpose of American higher education is to prepare a workforce for whom the best success metric is the salary that graduates earn. Is this really the best that American higher education can do?

Let me be clear. The accreditation agencies and other regulators are right to push for metrics to measure the success of American higher education. What we do best should be nurtured and supported. What we do less well should be abandoned in favor of more productive approaches. But the first step in a balanced measurement must be to understand the mission of the college, its purpose and function, its commitment to access, its ability to create opportunity, and its efficiency in providing a quality education delivered in a timely manner.

Most American undergraduates don’t go to Colgate or MIT. Almost half of them start their education in community colleges, many of which are not located in hot employment towns, are significantly underfunded, and have young alumni networks just starting to take hold. Many public and private four-year colleges lack the brand appeal of Washington and Lee and the endowment of Stanford. Yet almost all of them educate effectively, even if they are imperfect.

Perhaps the time has come to develop metrics that matter. Are we assessing whether a college or university is a “going concern,” measuring workforce development, or looking for productive citizens who can adapt their communities, however they define them, to the global economy?

If the metrics measure student outcomes, however, we should look first to mission and imagine student success as measured more broadly than by the salary earned. Do graduates enter the middle class? Are there contributing factors, like home ownership and community involvement, that present a more balanced and nuanced picture of a well-educated graduate? Are we meeting workforce commitments in non-STEM employment – the kind of work that fuels the American economy and builds out a comprehensive national employment picture?

How can we link inputs to outcomes to present a comprehensive value proposition on higher education’s contribution to 21st Century America?

In the end, it comes down to one fundamental question: What is the purpose of American higher education?

One thought on “What’s the ‘Value Added’ of a College Degree?”

  1. What is the purpose of higher education? Perhaps it is, in good measure, to help graduates lead more fulfilling and, well, happy lives, and not just more remunerative ones. As the husband of a career public defender and the proud father of a daughter at a prestigious law school who hopes to enter public service, I read the article “Public Defender Beats Partner on Happiness Scale for Lawyers, Study Finds” in the New York Times with more than a little satisfaction (see http://well.blogs.nytimes.com/2015/05/12/lawyers-with-lowest-pay-report-more-happiness). Here’s the gist:

    “Researchers who surveyed 6,200 lawyers about their jobs and health found that the factors most frequently associated with success in the legal field, such as high income or a partner-track job at a prestigious firm, had almost zero correlation with happiness and well-being. However, lawyers in public-service jobs who made the least money, like public defenders or Legal Aid attorneys, were most likely to report being happy. Lawyers in public-service jobs also drank less alcohol than their higher-income peers. And, despite the large gap in affluence, the two groups reported about equal overall satisfaction with their lives.”

    Perhaps we need to measure how well colleges and universities prepare their students for happiness and well-being.
