Thursday, June 21, 2007

Choosing a College

There was an interesting discussion yesterday over at Mad Melancholic Feminista about a revolt by liberal arts colleges against U.S. News and World Report's national rankings. This is a big deal because incoming students and their families take these rankings very seriously. A school's ranking does have an effect on how many and which students apply and decide to attend. Part of the problem is that people selecting a college or university are often among the worst informed consumers anywhere. They really have no way of getting a real sense of what it is they are looking for. Maybe they've gone through college, maybe not, but they have experience with only one or two schools and no sense of what differentiates them or makes them effective. Surely, some are better (in some sense of that word) than others. Anecdotally, I've spoken to several families who chose one school over another exclusively because it was a couple of ranks higher on the list.

The reasoning seems to go: higher on the list = better school = better education = better job upon graduation = more money = happier life. This, of course, does not actually work on several counts, but it is the first two links that I'd like to discuss.

Because these rankings have real-life effects on a school, schools have had a love/hate relationship with them. They grumble a lot while doing everything possible to raise their ranking. The criteria used are here. The largest component comes from a peer survey. Some of the other factors seem relevant; class size, for example, is educationally meaningful. Then there are factors that are not relevant, and indeed may be negatively correlated with getting a good education. Faculty compensation, for example, is a poor measure of quality teaching. Faculty are generally better compensated for research, not teaching, and while there are some wonderful teachers who are also prolific publishers, those two roles are often in conflict; some of the teachers who will spend the time and give their all in the classroom are not the scholarly stars. Similarly, the use of adjuncts decreases the score on which the rankings are based, but adjuncts are often as good if not better in the classroom than tenured faculty. And, of course, there are questions not only about the factors employed, but about their weighting.

The question is whether this result is helpful to people who have little other information to make an important decision. There seem to be three possibilities here:

(1) The rankings are not actually informative, but are taken to be because of a lack of other information
(2) The rankings are informative, but schools don't want that information out there
(3) The rankings are informative, but not in the way that they are made use of by the families

I would argue that (3) is the case. There is no doubt that there are differences between institutions in terms of culture and expected level, say, how well students are expected to be able to write when they come in the door, and these are reflected to some degree in the rankings. There is a difference between Amherst and Gettysburg, as the rankings show, but the fine-grained distinctions that families take away from the rankings are not valid.

Further, getting a good education does not mean being higher on the list. Indeed, "a good education" is not a single thing, but something relative to the student's goals, learning style, and personality. Finding the best college is not a matter of seeing how high up the list you can go, but of finding the proper fit. The list can be one part of a larger set of criteria for finding the right fit for a given student, but it should not be read the way a ranking of sports teams is, with the better ones simply listed above the worse ones.