Last semester, for the first time, I recorded the length of each answer in my exam scoring spreadsheet. That let me calculate the correlation between answer length and grade.
Note that my exam had 13 multiple-choice questions (which counted for 1/3 of the grade) and one long essay (which counted for 2/3 of the grade, and for which the median answer was about 3750 words). Students had four hours to do the exam, and the exam was open-book and open-notes.
The correlation coefficient between the total score (which combined the essay score and the multiple-choice score) and the essay word count was 0.60, which is huge as correlations go. So longer is better, by a lot, right? Not quite: the correlation between the total score and the word count for exams longer than the median exam was basically zero.
In fact, I sorted the spreadsheet by word count and then added a column that, for each exam, gave the correlation between total score and essay word count over that exam and all longer ones (the Excel formula for the 5th shortest of the 81 exams, for instance, was =CORREL(B5:B$81,K5:K$81), where column B held the total score and column K the essay length). That column started at 0.60, shrank steadily up to the median, fell to basically 0 (-0.01, to be precise) immediately past the median exam, and pretty much stayed there as the exams got longer.
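For anyone who wants to replicate this analysis outside Excel, here's a minimal Python sketch of the same "this exam and longer" rolling correlation. The data below are simulated stand-ins (only the two-column layout of total score and word count comes from my spreadsheet), and the column names are my own invention.

```python
import numpy as np
import pandas as pd

# Simulated stand-in data: the real spreadsheet had 81 exams, with total
# score in column B and essay word count in column K. The relationship
# below is made up purely to exercise the code.
rng = np.random.default_rng(0)
n = 81
word_count = rng.normal(3750, 800, n)
noise = rng.normal(0, 1, n)
total_score = 0.6 * (word_count - word_count.mean()) / word_count.std() + noise
df = pd.DataFrame({"total_score": total_score, "word_count": word_count})

# Sort shortest to longest, then for each exam compute the correlation
# over that exam and every longer one (the pandas equivalent of dragging
# =CORREL(B5:B$81, K5:K$81) down the column).
df = df.sort_values("word_count").reset_index(drop=True)
tail_corr = [
    df["total_score"].iloc[i:].corr(df["word_count"].iloc[i:])
    for i in range(n - 1)  # the last "tail" is a single exam, so skip it
]

print([round(c, 2) for c in tail_corr])
```

Each entry of tail_corr corresponds to one cell of the dragged-down CORREL column; the first entry is the correlation over all exams, and later entries progressively drop the shortest essays.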
I also did the same with the correlation between the essay score (rather than the total score) and the word count. There, the inflection point didn't appear until exam 50 out of 81, rather than 42 out of 81. That makes sense: time spent on the essay is time not spent on the multiple choice, so the longer essays (past a certain length) tend to come with slightly lower multiple-choice scores.
Likewise, the correlation between essay word count and multiple-choice score was mildly positive over all exams (0.12), but fell to basically 0 once the 17 shortest exams were set aside; and once the 35 shortest exams were set aside, it got to about -0.10 and stayed pretty much there (with some fluctuations).
Is this of any use to students? I highly doubt it: it's hard to act on the advice "write at least as many words as your median classmate," and in any event simply trying to make your exam longer is unlikely to make it better (even if longer usually goes with better, up to a point). Still, it struck me as an interesting data point; and perhaps some students will be happy to know that, past a certain level, quantity and quality aren't even correlated.
In any case, this is just one set of data; in past years, I didn’t include the word counts in my spreadsheets, so I couldn’t do the same analysis. But I’d love to see what other law professors find.