The DOE's "College Scorecard" Isn't Accurate

Thanks to Steve Filling of California State University, Stanislaus, who is Chair of the CSU Academic Senate, for alerting me to an October 9 article in The Hechinger Report entitled “There’s finally federal data on low-income college graduation rates—but it’s wrong.”  Here are some excerpts:

The U.S. Department of Education has released college-by-college graduation rates for low-income students receiving federal grants, for the first time ever—information advocates, and Congress, have long demanded.

But there turns out to be a problem: A lot of the numbers are wrong.

In comparison to reviews of the same data by independent organizations, the Department of Education figures—released in conjunction with the Obama administration’s long-awaited College Scorecard, which is meant to provide consumers with helpful information about universities and colleges—are off by an average of 10, and as much as 59, percentage points. . . .

The figures measure the success of students who get Pell grants, which provide up to $5,775 a year and typically go to Americans from low-income families to pay for tuition or other college expenses at a cost to the federal government of more than $31 billion a year. Knowing how many of these students ever graduate not only tells taxpayers what they’re getting for their money; it helps to measure the effectiveness of colleges and universities at helping all their undergraduates earn degrees.

Yet while schools are required by law to provide the graduation rates of Pell recipients to any applicants who ask, a loophole protects them from having to report the same figures to the government. So the Department of Education used something called the National Student Loan Database System, or NSLDS, to calculate the percentage of people with Pell grants who earn degrees in four, five and six years.

Trouble is, the NSLDS was designed to keep track of student loans, not to monitor the graduation rates of Pell recipients. A student who gets a Pell grant, but doesn’t get another federal loan, for instance, will likely be missed. . . .
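The selection problem described above can be sketched in a few lines. This is a purely hypothetical illustration with made-up students, not the Department's actual methodology: a graduation rate computed from a loan-tracking database quietly drops any Pell recipient who never took a federal loan, which biases the result.

```python
# Hypothetical illustration of the sampling problem described above:
# NSLDS tracks loans, so Pell recipients who never take a federal
# loan can be missing from the calculation entirely.

# Each student: (has_pell, has_federal_loan, graduated) -- made-up data
students = [
    (True,  True,  True),
    (True,  True,  False),
    (True,  False, True),   # Pell-only student: invisible to a loan database
    (True,  False, True),
    (True,  True,  True),
]

# True Pell graduation rate: all Pell recipients count
pell = [s for s in students if s[0]]
true_rate = sum(s[2] for s in pell) / len(pell)

# Loan-database view: only Pell recipients who also appear in the loan system
tracked = [s for s in pell if s[1]]
tracked_rate = sum(s[2] for s in tracked) / len(tracked)

print(f"true rate:    {true_rate:.0%}")     # 80%
print(f"tracked rate: {tracked_rate:.0%}")  # 67%
```

In this toy sample the loan-only view reports 67% instead of the true 80%, and the direction and size of the error depend entirely on who gets left out.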

Higher-education lobbyists have objected to the College Scorecard on the grounds the information it provides is inaccurate or can be misleading. And, at least in this case, they appear to be right.

A comparison of the figures provided by the government with a report released last month by The Education Trust using data from state education systems, colleges and universities themselves, and other sources shows that, on average, there’s a 10 percentage point difference between what the College Scorecard materials estimate to be the percentage of Pell recipients who graduate within six years and what The Education Trust found they were at the 1,088 four-year institutions for which both have results. . . .
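The comparison The Education Trust ran is, at bottom, a simple calculation: match institutions that appear in both datasets, take the absolute percentage-point gap at each one, and average. A minimal sketch, using made-up figures rather than either organization's real data:

```python
# Hypothetical sketch of the comparison described above: for institutions
# present in both datasets, average the absolute percentage-point gap
# between the Scorecard estimate and an independently reported rate.
# All figures below are invented for illustration.

scorecard = {"College A": 45.0, "College B": 60.0, "College C": 30.0}
ed_trust  = {"College A": 55.0, "College B": 52.0, "College C": 42.0}

# Only compare institutions for which both sources have results
common = scorecard.keys() & ed_trust.keys()
gaps = [abs(scorecard[c] - ed_trust[c]) for c in common]

avg_gap = sum(gaps) / len(gaps)
max_gap = max(gaps)

print(f"average gap: {avg_gap:.1f} percentage points")  # 10.0
print(f"largest gap: {max_gap:.1f} percentage points")  # 12.0
```

The real comparison covered 1,088 four-year institutions and found an average gap of 10 percentage points, with individual schools off by as much as 59.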

A separate Hechinger Report analysis of Pell graduation rates obtained directly from the nation’s 50 largest public and 32 biggest private colleges and universities—18 privates refused to provide them—also shows the same discrepancies with the Department of Education figures.

Here’s a chart that shows some examples of these discrepancies:

[Chart: Pell graduation rates, College Scorecard vs. independent figures]

As Professor Filling put it, “Perhaps it is timely that I’m watching a PBS documentary on Heisenberg’s Uncertainty Principle as I read the Hechinger Report this morning.  I wonder if quantum theory applies to IPEDS and DOE data.”

For another critique of the “fatally flawed” College Scorecard, see this article from The Huffington Post.
