The Brookings Institution thinks the local community college where I live is doing a great job.
At least I think it does.
But it’s really hard to tell.
Brookings – a Washington, D.C., think tank – evaluated data from hundreds of colleges and universities across the country and came up with Beyond College Rankings, a report only a statistician could love, as hinted at by the secondary headline of the report, “A Value-Added Approach to Assessing Two- and Four-Year Schools.”
One thing I have always been proud of as a reporter and editor is my ability to read any report, no matter how dense, understand it and translate the main points into plain English for the average reader.
But Brookings has stumped me.
I understand what “value-added” is. That part the report explains well enough: “the difference between actual alumni outcomes (like salaries) and the outcomes one would expect given a student’s characteristics and the type of institution.”
In other words, Brookings decided to look into how well graduates of different schools do, compare that to the cost, and see whether the students are getting a good deal. Fair enough.
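As best I can tell, the value-added arithmetic boils down to a simple subtraction. A minimal sketch, with made-up salary figures (mine, not the report's):

```python
# A hypothetical illustration of the "value-added" idea.
# The salary numbers are invented for this example, not taken from Brookings.
expected_mid_career_salary = 40_000  # what similar students at similar schools typically earn
actual_mid_career_salary = 44_000    # what this college's graduates actually earn

value_added = actual_mid_career_salary - expected_mid_career_salary
value_added_pct = 100 * value_added / expected_mid_career_salary

print(f"Value added: ${value_added:,} ({value_added_pct:.0f}% above expected)")
# prints: Value added: $4,000 (10% above expected)
```

The hard part, of course, is the “expected” number, which Brookings builds from student characteristics and institution type; the subtraction itself is the easy bit.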
But this is where things start to fly apart.
Caldwell Community College and Technical Institute earned a score of 92 on its graduates’ mid-career earnings, tied for 21st best in the country and the best of all colleges and universities in North Carolina. Awesome, right? Brookings explains that the number means CCC&TI students go on to earn about 10 percent more in the middle of their careers than would be expected based on their characteristics – including their “academic preparation,” ethnicity and family income – and the college’s location and level of degrees offered.
But the college scored only 35 for its graduates’ “occupational earning power,” the average salary of graduates as reported by the website LinkedIn.com, and only 24 for the percentage of graduates repaying their student loans.
So graduates’ mid-career earnings are higher than expected, their average salary is lower than expected, and the share of graduates repaying their loans is much lower than expected?
What does all that mean? The college’s graduates earn more but don’t really?
I downloaded the Excel spreadsheet to see whether I could make better sense of that than I could of the summary on the Brookings website.
I’ve never seen a spreadsheet like it. If you were to try to print the spreadsheet at a normal type size, just the width of each line would go clear across the average office desk and spill down to the floor. With the spreadsheet up on my computer screen, I kept scrolling right, scrolling right, scrolling right, and there were ever more fields. Scores and scores and scores, numbers, percentages, factors, and then I finally hit some fields that had the word “RANK” in the title. But there were so many ranks. Brookings measured and ranked everything, it seemed. And the rankings were all over the map. Good, bad, high, low.
Lex Menz, the reporter who covers CCC&TI, wants to write a story about the Brookings report, but it’s hard to know where to start when you don’t understand what you’re writing about. It’s hard even to come up with questions to help you figure it out.
She called Edward Terry, CCC&TI’s public information officer. He couldn’t translate the report either.
She hopes to interview someone at Brookings who can translate it.
It’s clear, especially from looking at the data in spreadsheet form, that a tremendous amount of work went into this study and the report. It’s equally clear that the report’s findings seem likely to be wasted if no one can drill through the numbers and put them into plain English.
The clearest sentence in the Brookings report may be this: “The choice of whether and where to attend college is among the most important investment decisions individuals and families make, yet people know little about how institutions of higher learning compare along important dimensions of quality.”
And if those people read this report, they will still know little.