“That’s Not A Word!”
One of the things that happens to you all the time when you're a lexicographer is that people say something perfectly reasonable to you, such as "I am appalled by the current celebrification of journalists," and then stop themselves to ask, "Is that a word?"
Considering that the is-it-a-word? word is usually completely comprehensible, I always say, "It is now!"
When people say something "isn't a word," they aren't usually saying that the item in question is a piece of rotten fruit, or a shoe, or a phone number, or some other non-lexical object. What they are saying is something like "That's not standard English," or "I dislike that word and wish you wouldn't use it," or "I am not sure that this word is in common use," and so on. They may also want to call attention, obliquely, to the word as being their own coinage (whether or not that is true).
The ruler most people use to measure a word's word-ness is The Dictionary. Not any specific dictionary: for most people, if a word is in any standard-looking dictionary, that's good enough. (The Dictionary is a stand-in for "Any Dictionary I Happen To Have.")
But as a lexicographer, as someone who has seen how the word sausage is made, I think that assessing a word’s fitness for use by whether or not it is in The Dictionary is much too limiting. We’ve already seen that lexicographers can’t possibly register, much less describe, all the words that are used in English; how then, knowing that, can you still cleave to the idea that the words that are in The Dictionary are good to use, and the ones that aren’t, aren’t?
People use The Dictionary as the arbiter of a word's worth because they are understandably lazy. They want to make a quick appeal to an incontrovertible authority, win their argument (or their game of Scrabble), and get on with their day. Using The Dictionary this way probably worked a lot better in the pre-Google age, but when you can fire up your search engine and find 20,000 hits for celebrification, it's a bit harder to argue that it "isn't real."
Don't get me wrong: celebrification may still be ugly, it may still be awkward, it may be better expressed by a paraphrase rather than by a single (possibly over-suffixed) word, but it's real, all right, and an argument against its use based on "it's not a real word, because it's not in the dictionary" is an argument you're eventually going to lose. Anything that's used as a word, understood as a word, and that works like a word is an actual, living, breathing, honest-to-goodness word. Full stop.
Sometimes people use "that's not a real word" to mean "that's a mistake": that something is a misspelling, or is used incorrectly by traditional standards. The "it's not in the dictionary" argument doesn't work there, either. The Encarta dictionary famously listed common misspellings, right in the A-Z, with cross-references to the standard spellings. A facetious argument could then be made that those misspellings are "in the dictionary," and I wouldn't bet that some eighth-grader, somewhere, didn't try it. Dictionaries should list common spellings and meanings, even if traditionalists consider them errors (though they should also give a warning to that effect). Ignoring a problem never yet made it go away.
But while we're talking about errors and mistakes, I'm not sure anyone can announce with certainty just when an error, made by enough people over a long enough period of time, becomes the standard. I think that it takes at least three generations, and that it has to be something obscure enough to pass unnoticed by all but the most conscientious of copyeditors. For instance, even though confusing your and you're is certainly widespread, I don't see those two words becoming conflated any time soon; enough people still know and maintain the difference. But other terms, words we don't use as often or as surely, can sneak by while we're looking the other way (one that Ben Zimmer pointed out recently is the spelling of minuscule as miniscule).
Whenever a lexicographer starts discussing the natural tendency of words to mutate and transform, of not-words to become words, a great howl arises. It's only natural that people who have taken the trouble to internalize standard English and use it in generally accepted ways would be upset when others don't take that same trouble, or even, as it sometimes seems, any trouble at all. But the plain truth is that language changes, drifts, and evolves (transmutes, even), and it's very, very difficult to stop it from doing so.
If language change really annoys you, to the point where you find it no longer possible to enjoy your normal daily activities, you should become a copyeditor, and then you will have the exquisite privilege of fixing the usages that annoy you all day long. Otherwise, if a new usage bothers you, I can only say, "Don't use it, then."
The Dictionary is no longer the be-all and end-all of wordosity. If you want to be an educated word consumer, you'll have to do more than just check for in-or-out-ness. If your real question is "Should I use this word or not?" you'll have to do a little more analysis. Who is your audience? How are they likely to react to an unusual word? Would a more standard alternative make for a smoother communication of your message, or do you want and need the jolt that a new and striking term will give your listeners and readers? Will your new word be annoying (and if so, do you wish to annoy)? Or will it be playful and add a necessary shot of attitude?
Words aren’t like Bigfoot: a moment’s glimpse of a fabled creature isn’t sufficient proof for cryptozoology. But just one momentary use is perfectly fine for determining whether or not a word is “real.” The big question is what you can do with it, not whether it exists in the first place.