My post below led some commenters to ask whether there's been grade inflation in law school grades. A few related answers.
1. When I was a UCLA Law School student in 1989-92, our curve was 20% As, 40% Bs, and 40% Cs or below (the "below" grades were optional and very rare) in each course. In the mid-90s, we shifted to 20% As, 60% Bs, and 20% Cs or below. Recently, we shifted to 25-29% As, 41-52% B/B+s, 18-22% B-s, and 5-8% Cs or below for first-year classes, and 23-27% As, 50-60% B/B+s, 17-23% B-s, and 0-10% Cs or below for second- and third-year classes (basically a 3.2 median, slightly below a B+). So our median grades have been increasing, from B- to B to B+ish, and our Cs have been declining.
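To see how the current curve works out to "basically a 3.2 median," here is a rough arithmetic sketch. The post gives only the ranges for the upper-division curve (23-27% As, 50-60% B/B+s, 17-23% B-s, 0-10% Cs or below); the specific within-band splits and grade-point values below are my own assumptions, chosen as plausible midpoints, not official UCLA figures.

```python
# Hypothetical breakdown of the second/third-year curve into letter grades.
# The shares within each band (e.g. how the 50-60% B/B+ band splits between
# B+ and B) are assumptions for illustration only.
curve = {
    4.0: 0.25,   # A  -- midpoint of the 23-27% band
    3.3: 0.30,   # B+ -- assumed share of the 50-60% B/B+ band
    3.0: 0.25,   # B  -- assumed share of the same band
    2.7: 0.15,   # B- -- within the 17-23% band
    2.0: 0.05,   # C  -- within the 0-10% "Cs or below" band
}

# Class-wide average GPA implied by these assumed shares.
mean_gpa = sum(points * share for points, share in curve.items())
print(mean_gpa)  # roughly 3.2, i.e. slightly below a B+ (3.3)
```

Under these assumed splits the average comes out a little above 3.2, consistent with the post's description of the target as slightly below a B+.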
2. On the other hand, the quality of our incoming students, at least as measured by the LSAT (which to my knowledge has not had any score inflation of its own), has been increasing, too: In 1998, the first year in which U.S. News & World Report reported the 25th and 75th percentile LSAT scores, UCLA's range was 159-165; this year it was 162-169 (on a 120-180 scale). This is part of a broader trend that is seen even at higher-ranked schools; Columbia was ranked #4 both years, but its LSAT 25-75 range rose from 165-171 to 168-173. Should this justify a corresponding rise in law school grades? I don't know how to answer that.
3. Also, for whatever it's worth, incoming UCLA law students, on average, had an A- average at their undergraduate schools (the 25-75 percentile range reported this year in U.S. News was 3.51-3.82). Back when we gave lots of Cs, lots of students would get their first Cs of their lives at UCLA Law School. Is that the way things should be? Again, I don't know how to answer that.
4. As best I can tell, the increases in our grades have been driven by one main factor: The increases in grades at other schools. We shifted to a B median in the mid-90s because we noticed that most Top 20 schools had a B median. Our B- students were roughly comparable in class rank to B students at peer schools, but they looked worse to employers who weren't that familiar with the UCLA system. (An employer could of course look closely at the descriptions of the grading systems and figure out the difference, but we were afraid that many employers wouldn't look that closely.)
We shifted to a B+ median recently because we noticed that most Top 20 schools had done the same. I'm pretty confident that we were at the trailing edge of the change, not the leading edge. We didn't want to increase our grades beyond what others were doing, but we also didn't want our students to be at a disadvantage. This sort of behavior may be bad in some overall sense. But it is sensible for a school that's trying not to leave its students unfairly disadvantaged. If someone suggested some multi-law-school grading reform, I might endorse it (though I can't speak to any antitrust law questions this might or might not raise). But so long as each school has to make these decisions by itself, I think we did what we had to do.
5. Though I'm not wild about grade inflation, I should note that a B+ median still leaves plenty of gradations between students, especially when one averages together the grades in many classes. If everyone got A+s or As (which is more or less the system at Yale, where I'm told roughly 20-30% of each course gets Hs and the rest get Ps, except for a very few LPs and fails), that might pose more of a problem. But a system with plenty of A+s, As, A-s, B+s, Bs, and B-s, and occasional Cs (with some required in the first year), adequately conveys to employers which students tend to be better and which tend to be worse. And while the shift may cause some confusion when employers compare UCLA students graded under one system with UCLA students graded under another (a rare situation, since most students compete against others who graduate the same year or shortly before or after), maintaining the same median as other schools diminishes confusion when employers compare UCLA students with students from other schools.
6. One possible solution to this problem is to report class rank, something that I'm told virtually no schools systematically do these days (though when it comes to top graduates who are applying for clerkships or teaching jobs, many schools do in fact report informal ranks). But for complex reasons — stemming partly from excessive egalitarianism, and partly from a plausible (though not obviously right) concern that the difference between 60th percentile and 40th percentile probably looks bigger than it should, and that a GPA may do a better job of indicating how slight that difference is — there's been little move to return to the class rank system.