September 25, 2006

Better college rankings

Choosing a school is a frustrating exercise in buying a pig in a poke. How can you tell how good an education you or your kid will get? Nobody really has a clue, and educators, public and private, like it that way.

The famous US News & World Report college ratings mostly measure how smart the incoming freshmen are, which in turn depends largely on how prestigious and wealthy the college already is. Whether the college does a good job of teaching is almost irrelevant.

For example, when I applied to Harvard many decades ago, the alumnus who interviewed me explained that he had taken classes from various superstars of the Harvard faculty such as, to the best of my recollection, John Kenneth Galbraith, David Riesman, Daniel Patrick Moynihan, and Henry Kissinger. "Wow, that must have been great!" I burbled.

"Nah," he said. "Most of them were awful teachers."

Similarly, Scott Turow's One L, his memoir of his first year at Harvard Law School around 1975, recounted how blatantly dysfunctional many of that famous institution's classroom traditions were.

But that's not the point. The point of going to Harvard is to show the world you can get into Harvard and to make friends with other people who can get into Harvard.

Thus, I'm glad to see that a think tank called Education Sector has put out a detailed report, "College Rankings Reformed," explaining how to create a better college ranking system using concepts like value-added: how much a school actually improves its students, rather than how impressive they were coming in. Most of the needed data already exists for scores of colleges, although it's public for only a handful. For example, U. of Texas-Permian Basin appears to do a better job of improving its students' ability to write an analytical essay than does U. of Texas-Austin.
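To make the value-added idea concrete, here is a minimal sketch of how such a ranking might be computed. The school names and numbers are made up, and this is not the Education Sector report's actual methodology; it just illustrates the principle of ranking schools by how much their students' measured outcomes exceed what their entering credentials would predict.

# A minimal sketch of "value-added" ranking, assuming (hypothetically) that each
# school reports its students' average entering test score and average score on
# a senior-year writing assessment. The idea: predict the exit score from the
# entering score, then rank schools by how far they beat (or miss) that prediction.

schools = {
    # name: (avg entering score, avg exit writing score) -- made-up numbers
    "Flagship U":     (1350, 78),
    "Regional State": (1050, 74),
    "Selective LAC":  (1400, 80),
    "Open Admission": ( 950, 70),
}

# Fit a simple least-squares line: exit = a + b * entering
n = len(schools)
xs = [entering for entering, _ in schools.values()]
ys = [exit_score for _, exit_score in schools.values()]
mean_x, mean_y = sum(xs) / n, sum(ys) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

# Value-added = actual exit score minus the score predicted from entering ability
value_added = {
    name: exit_score - (a + b * entering)
    for name, (entering, exit_score) in schools.items()
}

for name, va in sorted(value_added.items(), key=lambda kv: -kv[1]):
    print(f"{name}: value-added {va:+.1f}")

With these invented numbers, the less selective "Regional State" comes out on top: its graduates write better than their entering scores would predict, which is exactly the kind of comparison the Permian Basin vs. Austin example is making.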

Oh yeah? Well, if going to UT-Permian Basin makes you so smart, how come you didn't have Vince Young playing on your football team? Huh? Answer me that, Mr. Analysis Guys. Hook 'em Horns!

My published articles are archived at iSteve.com -- Steve Sailer
