Thanks to Eugene for inviting me to participate and talk about one of my favorite topics: legal education. Bill Henderson (Indiana) and I are dissecting the components of the U.S. News rankings in a series of papers, the first of which is now available on SSRN, and which we presented at a symposium at Indiana on rankings last spring. (There were a number of excellent papers presented there and anyone interested in the topic should read them all.) In this post I'll give a quick summary of our data and methodology. Later today I'll discuss the first of our findings, the strong evidence of market segmentation in legal education.
U.S. News began ranking law schools in 1987 with a "top 20" list based on a survey of law school deans, a method it repeated in 1988 and 1989. In 1990 and 1991 the magazine used a variety of data to generate a top 25 list. Starting in 1992, U.S. News began publishing data on all ABA-accredited law schools, although only the top 25 were ranked numerically.
It is hard to compare ranks across time because of a variety of data issues. One of the consistent inputs, and a particularly important one, is the median LSAT of entering law students. Bill and I decided to tackle this part of the rankings first, largely because it is something to which faculty members seem to pay a great deal of attention. (This past year U.S. News switched to using the 25th and 75th percentiles of entering class LSATs, in part because the magazine suspected schools were fudging the median numbers they reported. Since the 25/75th percentile numbers are reported to the American Bar Association, U.S. News thought that schools might be more honest in their reporting of those numbers.)
(A good discussion of some of these issues is in Alex Wellen's New York Times magazine story, which discusses our research, from this past Sunday.)
We used data on the law schools themselves (public/private status, religious/secular affiliation, the number of large law firms (those on the American Lawyer 200 list) interviewing on campus, the percentage of the entering class in the full-time program, average student loan debt, and so forth), data from the U.S. News rankings (academic reputation, lawyer-judge reputation), and data on the metropolitan statistical area (MSA) in which each school is located (how many law schools are in the same MSA, large-firm jobs in the MSA, demographic trends in the MSA).
There are huge problems with the U.S. News rankings, but there is no question that they are important. For example, a friend recently told me that she had been called by a law review about one of her manuscripts. The articles editor apologized for rejecting the manuscript and explained that the rejection had been made without reading the paper because the editors had misclassified my friend's school as a lower-tier law school. Now that they realized their error, the editor told her, they wanted to consider the article on the merits. I don't know how widespread this type of screening is, but that it occurred at a well-ranked, but not top, journal is at least moderately disturbing.
The U.S. News rankings also have consequences for students -- several admissions directors have told both Bill and me that they are noticing a compression in the LSAT scores of applicants, with strong clustering around the median scores reported in U.S. News. Whether applicants are relying on the overall rankings or, as Brian Leiter suggests, on individual components of the rankings, U.S. News is clearly having an impact on legal education.
The question I would like to throw out for Volokh Conspiracy readers is not whether the U.S. News rankings are good or bad in the aggregate, but how the competition for students with LSAT scores above each school's median (and, now, 25th and 75th percentiles) is affecting legal education.
Later today I will post a summary of our results on the segmentation of the market for legal education between the "top tier" and everyone else.