Fostering Strong Comment Threads, and the Effort Rule of Comment Moderation

I sometimes blog on the process of moderating blog comments. I realize it’s a bit “inside baseball,” as most readers don’t comment or run blogs that allow them. But I see Internet comment threads as a new and relatively important kind of online discussion, and I’m very interested in the conditions in which comment threads tend to be useful or just noise. In my view, having a really good comment thread is a terrific asset to a blog: It allows the post to be the beginning of a conversation, with the rest of the conversation carried on in the thread. The interesting and new question is, what are the conditions of helpful comment threads? What kinds of comment policies and software lead to the best, most interesting comment threads, and which don’t?

In my experience, there are two basic conditions of strong comment threads. Here’s the first condition: Comments need to be relatively open and accessible to those using a pseudonym. If you make it too hard to comment, or you require real names, most will stay away. They won’t want to engage, for a range of personal and professional reasons.

And here’s the second condition: There needs to be some way to moderate threads to delete inappropriate comments or ban commenters who are out of line. For every one Internet commenter who is consistently thoughtful and interesting, there are X Internet commenters who are either inclined to be or can be coaxed into becoming abrasive and obnoxious. Consider the well-known “Greater Internet fuckwad theory” from the site Penny Arcade:

There’s a lot to that, with an important caveat: When the site is a popular blog with hundreds of commenters, some of the commenters will be “normal people” and some won’t. In any collection of that many people who can post at any time, there will be some stylistic quirks: There are the commenters who always bring up their pet topic, no matter the subject of the post; the commenters who see themselves as needing to wage constant battle with perceived ideological foes; the commenters who see criticism of their views as inherently objectionable, etc. When anyone can comment, everyone gets invited, and the quirky types join in with the rest.

This diversity of audience, combined with the Greater Internet Fuckwad Theory, means that unmoderated threads have a tendency to devolve into virtual food fights. That’s especially true if the topic is controversial and relies heavily on ideological priors, like current debates here at the VC on gay marriage or the individual mandate. Threads that devolve into food fights are entertaining for the subset of commenters who get a kick out of written sparring. But they come at a major cost: They tend to discourage readers and contributors interested in more thoughtful contributions. When the thread turns to muck, the readership drops dramatically: Few people want to wade through the accusations and hostility to find the few morsels of insight. So to maintain the quality of comments, there needs to be some sort of monitoring of threads.

These two conditions combine to produce what you might call the Effort Rule of commenting: Having consistently strong comment threads requires a significant effort moderating threads. Vibrant dialogue requires a relatively open door on the front end, and keeping it from devolving into a digital food fight requires significant attention to editing on the back end. But back-end moderation is always unpleasant, for two reasons. The first is that it’s work. It requires careful judgment as to where the line is, based on the editor’s necessarily limited exposure to the full range of comments. In a blog with thousands of comments a day, no blogger can be fully informed of the full history (sometimes going back several years) of exactly which commenter said what to whom. Judgments have to be made, but they necessarily have to be made based on exposure to a subset of the evidence.

Second, the combination of no front-end filter and back-end moderation invariably leads to accusations of bias and claims of censorship. Commenters are most hostile when the subject is deeply controversial, which means that posts on those subjects will trigger the most need for comment moderation. But these are precisely the contexts in which people with strong views tend to interpret the facts to be whatever reaffirms their priors. A great example is the forthcoming paper by Dan Kahan et al., “‘They Saw a Protest’: Cognitive Illiberalism and the Speech-Conduct Distinction”, which I blogged about here: When shown a video of a protest, people evaluated whether the protest was violent based on whether they supported the cause being protested.

The same basic reasoning applies to interpreting editorial decisions on a blog. The more passionate a commenter feels about the subject, the more likely they are to interpret editing or (in extreme cases) a ban on commenters as incredibly obvious evidence of bias against them based on their viewpoints. The “Joys of Anonimus” thread from a few days ago, now at 450+ comments and counting, has a lot of examples. Anonimus’s violations of the comment policy are flagrant, and he candidly admits he ignores the comment policy and says whatever he wants, but several commenters who agree with Anonimus on the issues are deeply persuaded that the real reason he has been banned is that I disagree with the merits and I’m trying to “silence” him.

Where do these points take us? First, to the conclusion that really good Internet comment threads are rare. Good comment threads require someone with the patience to do the editing work and deal with the inevitable bias accusations, efforts to circumvent bans, etc. On a group blog, each blogger need not do that kind of work; some bloggers can free ride on the efforts of others. But at least some amount of work must be put into an unpleasant task to maintain or even raise the quality of threads. That’s relatively hard to find, and that means that good comment threads will be rare.

Second, I suspect the future of Internet comment threads is a bifurcation into two sorts of threads on high-traffic sites: open and unmoderated threads, where anyone can say anything and few people read the threads; and sites with more moderation on the front end, such as requiring registration through a Facebook account. Neither of those is ideal, for the reasons stated above, but they are more stable forms of comment threads because they don’t require the same amount of work from the editor.

UPDATE: My apologies that comments were off initially; I had forgotten that the software seems to do this automatically when a post has been in draft form for more than a day or two. Comments are now open. As always, civil and relevant comments only.
