National Jurist decided to toss its hat into the law school rankings ring, and the result is something of a joke. Among other things, NJ decided to base 20 percent of each school’s score on the haphazard evaluation of its professors on RateMyProfessors.com. (No, really. I couldn’t make this stuff up if I tried.) As Brian Leiter notes, this ranks among the most ridiculous criteria ever used in a law school ranking. It would be methodologically absurd to base any portion of a school’s ranking on this “data,” but 20 percent? And someone got paid to put this together? As if that were not bad enough, some of Leiter’s readers appear to have discovered errors in the calculations, and that’s before raising questions about other aspects of these new rankings. It’s no wonder Above the Law calls these rankings “pure ridiculousness.”
I have no problem with law school rankings and greater law school transparency. Giving prospective law students more ways to evaluate their options is all to the good. No ranking is perfect. For instance, there are good arguments for placing greater weight on costs and outcomes than U.S. News does, and there has been an interesting debate about how best to measure faculty productivity and scholarly impact. It can be informative to consider why some schools perform better under one set of metrics than another. If the methodology is reasonably sound (and competently applied), it will reveal something, and readers can decide for themselves how much weight to give the results. But for a ranking to be worthwhile, it must represent a good-faith effort to measure something that matters. How anyone at NJ thought their new ranking satisfied this minimal criterion is beyond me.