Dogpile No Longer Flags "Gun" Searches as Likely Yielding "Adult Content" To Be Filtered Out, But Related Search Engines Continue:

James Grant reports that Dogpile has stopped reporting a "likely to contain adult content" warning -- adult defined as "sexually explicit" -- for searches that use the word "gun." But MetaCrawler and WebCrawler, which, Grant reports, are owned by the same company (Infospace), continue to do the same thing Dogpile used to do.

There was a hilarious skit on the "Half Hour News Hour" on Fox last weekend. It featured an "interview" of a gun-control advocate claiming to have been a repeated victim of gun violence -- only to find out that he was a career criminal complaining about his victims actually being able to defend themselves. Ha!

The only problem is, this kind of thing makes it hard for Fox to deny any tilt to the right... but then again, they're clearly no further right than all the others are to the left.
6.8.2007 12:04am
Guest J:
I mentioned this last time, but you never replied to my comment: is there any chance that this is an algorithm based on the list of sites matched rather than the search terms used?

If the top sites matched include known adult sites that are on some list, Dogpile might be warning you about that, rather than trying to make a list of all terms that lead to adult content. In fact, that seems a lot easier to do. Your criticism would then be somewhat off-base, as it's a criticism of an algorithmic result rather than an intentional decision to categorize a word a certain way.

This could change if the sites indexed no longer match the search term, whether because of a change in text on the site, or some other reason. When I tried last time, I noted that rifle, pistol, and a few other related terms had no such warning.

Even if this is not the correct explanation, don't you think it's a possibility, and maybe worth considering and perhaps exploring? Shouldn't you ask the Dogpile people what's up rather than assuming that they are (or were) doing this filtering?
6.8.2007 1:48am
logicnazi (mail) (www):
Despite being open to fairly serious gun regulation (in the event I see suggestive evidence that banning guns would save significantly more third-party lives than banning other forms of recreation), I'm still appalled by the fact that Google won't list gun-related search buttons for the Google toolbar (go read the list of excluded content).

Then again, I don't think doing so is any WORSE than barring pornographic search buttons or the like (which they do as well). Warning people might be fine if the warning were a bit more specific, but I don't like the idea of big search companies tinging even their non-search results with their idea of moral purity or political correctness.
6.8.2007 3:47am
What difference does an algorithm make? The fact is that Infospace has the ability to turn the filter on and off. They turned it off on Dogpile. They have so far refused to disable the filter on MetaCrawler and WebCrawler.

Of course they use an algorithm. Type the words into their search engines and the answer will be evident.

You can go to their site and ask them why every search with the word "gun" is referred to an adult content questionnaire. Why don't they screen out "adultry" or "making herion"?

The ball is in your court.
6.8.2007 3:54am
Brian K (mail):

The ball is in your court.

The ball is in the court of those who jump to conclusions of malicious intent on the part of Infospace.

Given that only the word "gun" set off the adult filter, and not "rifle" or "pistol", it is not obvious that someone was or is deliberately playing with the search. The burden of proof is on you.

Also, don't you think it's likely that Dogpile, MetaCrawler and WebCrawler use different algorithms? Dogpile is a metasearch engine and likely uses a very different algorithm than the other two.
6.8.2007 4:27am
This is about InfoSpace questioning your tastes when you are simply looking for something as benign as a gun.

The "adult unfiltered" option on the adult content warning page does not seem to change the content of the search results. It appears that it merely re-ranks the search results.

That adult content questionnaire that they splash up is not at all about content. They are suggesting that you are some kind of pervert. The fact that they are doubting your integrity is malicious.
6.8.2007 5:53am
Brian K (mail):

That adult content questionnaire that they splash up is not at all about content. They are suggesting that you are some kind of pervert. The fact that they are doubting your integrity is malicious.

Who's "they?" What proof do you have that "they" exist? What information do you have that disproves more rational theories? What proof do you have that infospace is a part of some conspiracy to harass pro-gun people?

It appears that it merely re-ranks the search results.

Have you done an actual comparison of every search result? Or are you just guessing?

I have little sympathy for conservatives who complain about irrelevant sites getting caught up in adult content filters. A great many people have been complaining about the strictness of these filters for a while now, and the moral-values conservatives couldn't have cared less... they thought the war on porn was worth the price.

I'll also note that there is a subset of pornography that takes pictures of girls using guns as sexual objects...perhaps the filter was reacting to these sites? This is yet another possible explanation that doesn't veer off into the realm of conspiracy theories.
6.8.2007 6:29am
Brian K (mail):

The fact that they are doubting your integrity is malicious.

I wonder if you apply this line of reasoning to other topics. Are all adult filters malicious? Do you think we should do away with them altogether? Or does it only become malicious when you don't like the content being filtered out?
6.8.2007 6:31am
Guest J:
The fact is that the Infospace company has the ability to turn the filter on and off. They turned it off on Dogpile. They have so far refused to disable the filter on MetaCrawler and WebCrawler.

You have misunderstood my point. My point is that they may not have done anything -- the algorithm may have responded to a change in the index of the pages matching the search term. Search engines re-index web pages from time to time. If, in the past, the search for "gun" returned pages that were on a list of known porn pages (e.g., pages containing pornographic images), the engine might have flagged any search matching such pages as adult content. If a subsequent web crawl showed that those pages no longer matched the search term, the search may automatically have stopped being marked as porn.

If the other websites mentioned are not using the same index, or are using an older version of the index, this could give the results observed without anyone turning a filter on and off.

You can go to their site and ask them why every search with the word "gun" is referred to an adult content questionnaire. Why don't they screen out "adultry" or "making herion"?

My point is that if this is done by an algorithm based on search results, it's probably because those searches don't lead to known porn images that users have alerted them to. There need not be any plan to make it that way.
6.8.2007 8:52am
L. Halbrook:
Interesting. I can type in the names of people that write extensively about guns (ex. Dave Kopel, Stephen Halbrook, etc.) and that yields normal search results. But typing just "gun" or "Kopel and gun" gets the adult content warning.
6.8.2007 9:40am
Eugene Volokh (www):
Guest J: As I mentioned in my initial post, the search engines do let you bypass the filter; and when I do, I get gun-related sites (certainly among the top sites -- I haven't checked each of the 90-odd results).
6.8.2007 11:08am
Guest J:
L. Halbrook:

I already noted that you didn't get the warning for rifle and pistol -- just for gun. I really wonder if this may be a results-based filter.

Since the idea seems to be unclear, let me spell out how it might work:

Imagine that Dogpile and similar companies have a list of domains that are the URLs for known porn sites. Say there's one called, for the sake of argument, pornsite.com. Say someone creates a web page that discusses, say, promissory estoppel, but includes in that page an image from pornsite.com. OR someone on pornsite.com puts up a page on promissory estoppel. Suppose that Dogpile indexes that page, and uses its algorithms to compute that it's a good hit for the search "promissory estoppel".

If what I proposed is how things work, the next time someone does a search for "promissory estoppel", they'll get a warning about adult content, because of the presence of results from a porn site. In this case, no human being chose to put the term "promissory estoppel" on a list of adult search terms -- it was an automatic action based on the results actually found.

I do not know if this is how things work at Dogpile, but it's certainly possible. Before leaping to the conclusion that results reflect the judgment of human beings at Dogpile, wouldn't it make sense to ask them how they do their filtering of adult content to ensure that this is not a by-product of an algorithm like the one I'm describing? Until someone does, I can't see how Dogpile's filtering justifies any inferences about attitudes towards guns.
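To make the idea concrete, here's a minimal sketch (in Python, with entirely made-up domain and page names) of the kind of results-based check I have in mind -- a guess at the mechanism, not Dogpile's actual code, which none of us has seen:

```python
# Hypothetical results-based adult-content flag: the warning is driven
# by what the search RETURNS, not by a hand-made list of "adult" terms.
# All domains below are invented for illustration.

KNOWN_PORN_DOMAINS = {"pornsite.com"}  # hypothetical blocklist

def domain_of(url):
    # Crude domain extraction, good enough for the sketch.
    return url.split("/")[2] if "//" in url else url.split("/")[0]

def should_warn(results):
    """Flag the whole result set if ANY hit comes from a listed domain --
    a zero-tolerance policy applied regardless of the query itself."""
    return any(domain_of(url) in KNOWN_PORN_DOMAINS for url in results)

# A search for "promissory estoppel" whose results happen to include a
# page hosted on a listed domain triggers the warning, with no human
# ever classifying the term "promissory estoppel".
hits = ["http://law.example.edu/estoppel.html",
        "http://pornsite.com/estoppel-joke.html"]
print(should_warn(hits))  # True
```

Under this scheme, whether "gun" draws a warning depends entirely on which pages the crawler most recently matched, which is exactly why the flag could appear or disappear without anyone at the company touching a switch.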
6.8.2007 11:16am
I learned gun safety as a child in a rural area with no neighbors, but I don't see what the problem is with labeling info on guns, tobacco, and alcohol as adult content. Why not treat "adult content" as a literal term rather than a "sign of perversion"?
6.8.2007 11:22am
Guest J:
Prof. Volokh:
Right, I understand you noted that you can bypass the filter. I don't see what that means for the discussion, though.

As I understand the thrust of this discussion, it seems like you are noting this filtering because you think it may reflect the attitude toward guns at Dogpile. Do I have that right?

My point is that the fact that it is searches for "guns" and not "promissory estoppel", "Eugene Volokh", "apple pie and motherhood", or "goodness, sweetness and light" that produce this warning may be an accident of the content found by the Dogpile web crawler -- it may have absolutely nothing to do with a decision by any human being at Dogpile to categorize "gun" searches as potentially adult content. If so, it seems to me that the point you are making, or at least suggesting -- that the Dogpile folks have a somewhat nannying attitude toward guns -- is wrong.

If I have misread the point you were making, I apologize.
6.8.2007 11:22am
Guest J:
Prof. Volokh: Oh, on rereading your comment, I think I understand your point. You don't see any porn sites right at the top of the search results, so you guess that this is not a probable explanation.

But imagine if Dogpile is trying for the family-friendly slot and has a zero-tolerance policy toward passing through known porn sites in its results without a warning. The porn site might be way down in the search results, but it has still been detected, so Dogpile throws up a warning. Again, in my hypothetical case, this happens completely irrespective of whether the search logically seems like an adult search -- just based on the presence of porn in the results.

One tiny bit of evidence: when you bypassed the filter, you reported that the number of results was one or two sites greater, if I recall correctly, no? What would those one or two additional results be other than probable porn sites? Since you haven't reviewed all the results returned, and since you haven't asked anyone at Dogpile how this works, I'm left thinking there's certainly a strong possibility things are done this way, and I can't responsibly draw any inferences whatsoever about Dogpile's possible attitudes toward guns.
6.8.2007 11:29am
PatHMV (mail) (www):
Guest J... that's certainly a possible theory, but one which I would find very unlikely. Pornographers work very hard to make almost any search turn up one of their pages.

Take the word "adultery." Wouldn't you agree with me that there are many pornographic websites which would be returned with that keyword? Yet on Dogpile, no "adult content" warning pops up if you search for "adultery." Similarly, if you search for "humiliation," no warning pops up, but the first couple of sites are clearly adult in nature.

I conclude from those two examples that your theory is likely incorrect.
6.8.2007 12:17pm
PatHMV (mail) (www):
The word "fetish" also does not generate the "adult content" warning.

And I'm going to stop now before the IT people come and politely inquire just what the heck I'm doing with company resources...
6.8.2007 12:19pm
MarkH112 (mail):

Why don't they screen out "adultry" or "making herion"?

Probably because "adultry" is really "adultery" and I have no idea what "herion" is!!!
6.8.2007 12:21pm
Guest J:
Thank you for exploring this theory -- your search is the best evidence so far. However, my results do not agree with yours.

I just did a dogpile search for "adultery" and "fetish". For "fetish" I received an adult content warning, while for "adultery" I did not.
I very quickly reviewed the adultery result URLs, and it looked like perhaps four might have been links to porn sites or sex shops, while the rest were mainly very serious, with one or two borderline.

Obviously dogpile doesn't have an infallible algorithm for spotting porn, so its list of porn sites would have to be updated from time to time with new ones. It's not unbelievable that a couple of porn sites matching a search for "gun" (and we know that there were two sites filtered; there might have been others that slipped through) might be on the known porn sites list while, by coincidence, the four or five sites referencing "adultery" were not yet on Dogpile's list of porn sites.

By contrast, fetish was found and categorized. So, though your test is interesting, I find it inconclusive, as in two cases it gave a result we might expect under my hypothesis, and in one case it gave a result that seems a bit unlikely under my hypothesis.
6.8.2007 12:33pm
Guest J:
Oh, and let me add: if human beings were doing the categorization, wouldn't "adultery" be an obvious word to add to the list? And if they added "gun", wouldn't they be likely to add "rifle" or "pistol"? So it seems to me that my hypothesis -- automatic determination of "adult" searches based on results matching a known-porn-sites list -- is still at least as likely as a master list of "adult" search terms drawn from the minds of the people running Dogpile.

Probably the best thing to do, if someone wants to make a point about social attitudes based on this, is to ask Dogpile how they do it.
6.8.2007 12:39pm
Guest J:
I just looked back at the discussion and noticed that I hadn't tried the search term "humiliation". That one does seem to be strong evidence against the theory I proposed, as a very large proportion of the top results look like porn.

So I tentatively concede that PatHMV may have proven my idea wrong.

It's still possible that there could be some explanation sharing some of the characteristics of the two ideas -- for example, imagine if categorization of search results was based on user complaints about finding porn sites. Users searching for "humiliation" are unlikely to complain, whereas users searching for guns might be surprised and upset. Anyway, I still believe the wisest course would be to ask Dogpile what's up, rather than assuming that this reflects their judgment on the topic of guns.
6.8.2007 1:12pm
Tim Fowler (www):
Re: "I'm still appalled by the fact that google won't list gun related search buttons for the google toolbar (go read the list of excluded content). "

Where is the list of excluded content? I can't find it searching Google's pages about the toolbar.
6.8.2007 2:22pm