Dilige et quod vis fac. – St. Augustine
Did you know that probing the seamy underbelly of u.s. lexicography reveals ideological strife and controversy and intrigue and nastiness and fervor on a nearly hanging-chad scale? For instance, did you know that some modern dictionaries are notoriously liberal and others notoriously conservative, and that certain conservative dictionaries were actually conceived and designed as corrective responses to the “corruption” and “permissiveness” of certain liberal dictionaries? That the oligarchic device of having a special “Distinguished Usage Panel … of outstanding professional speakers and writers” is an attempted compromise between the forces of egalitarianism and traditionalism in English, but that most linguistic liberals dismiss the Usage Panel as mere sham-populism? Did you know that u.s. lexicography even had a seamy underbelly?
The occasion for this article is Oxford University Press’s semi-recent release of Bryan a. Garner’s A Dictionary of Modern American Usage. The fact of the matter is that Garner’s dictionary is extremely good, certainly the most comprehensive usage guide since e.w. Gilman’s Webster’s Dictionary of English Usage, now a decade out of date.[Note: With the advent of online databases, Garner has access to far more examples of actual usage than did Gilman, and he deploys them to great effect. (Fyi, Oxford’s 1996 New Fowler’s Modern English Usage is also extremely comprehensive and good, but its emphasis is on British usage.)] Its format, like that of Gilman and the handful of other great American usage guides of the last century, includes entries on individual words and phrases and expostulative small-cap Mini-Essays on any issue broad enough to warrant more general discussion. But the really distinctive and ingenious features of A Dictionary of Modern American Usage involve issues of rhetoric and ideology and style, and it is impossible to describe why these issues are important and why Garner’s management of them borders on genius without talking about the historical contexts[Note: Sorry about this phrase: I hate this phrase, too. This happens to be one of those very rare times when “historical context” is the phrase to use and there is no equivalent phrase that isn’t even worse. (I actually tried “lexico-temporal backdrop” in one of the middle drafts, which I think you’ll agree is not preferable.)] in which admau appears, and this context turns out to be a veritable hurricane of controversies involving everything from technical linguistics to public education to political ideology, and these controversies take a certain amount of time to unpack before their relation to what makes Garner’s usage guide so eminently worth your hard-earned reference-book dollar can even be established; and in fact there’s no way even to begin the whole harrowing polymeric discussion without taking a moment to establish and define the highly colloquial term snoot.
Interpolation
The above [paragraph] is motivated by the fact that this reviewer almost always sneers and/or winces when he sees “historical context” deployed in a piece of writing and thus hopes to head off any potential sneers/winces from the reader here, especially in an article about felicitous usage.
From one perspective, a certain irony attends the publication of any good new book on American usage. It is that the people who are going to be interested in such a book are also the people who are least going to need it, i.e., that offering counsel on the finer points of u.s. English is Preaching to the Choir. The relevant Choir here comprises that small percentage of American citizens who actually care about the current status of double modals and ergative verbs. The same sorts of people who watched Story of English on pbs (twice) and read w. Safire’s column with their half-caff every Sunday. The sorts of people who feel that special blend of wincing despair and sneering superiority when they see Express Lane — 10 Items Or Less or hear dialogue used as a verb or realize that the founders of the Super 8 motel chain must surely have been ignorant of the meaning of suppurate. There are lots of epithets for people like this — Grammar Nazis, Usage Nerds, Syntax Snobs, the Language Police. The term I was raised with is snoot.[Note: Snoot (n) (highly colloq) is this reviewer’s nuclear family’s nickname à clef for a really extreme usage fanatic, the sort of person whose idea of Sunday fun is to look for mistakes in Safire’s column’s prose itself. This reviewer’s family is roughly 70 percent snoot, which term itself derives from an acronym, with the big historical family joke being that whether s.n.o.o.t. stood for “Sprachgefühl Necessitates Our Ongoing Tendance” or “Syntax Nudniks of Our Time” depended on whether or not you were one.] The word might be slightly self-mocking, but those other terms are outright dysphemisms. A snoot can be defined as somebody who knows what dysphemism means and doesn’t mind letting you know it.
I submit that we snoots are just about the last remaining kind of truly elitist nerd. There are, granted, plenty of nerd-species in today’s America, and some of these are elitist within their own nerdy purview (e.g., the skinny, carbuncular, semi-autistic Computer Nerd moves instantly up on the totem pole of status when your screen freezes and now you need his help, and the bland condescension with which he performs the two occult keystrokes that unfreeze your screen is both elitist and situationally valid). But the snoot’s purview is interhuman social life itself. You don’t, after all (despite withering cultural pressure), have to use a computer, but you can’t escape language: Language is everything and everywhere; it’s what lets us have anything to do with one another; it’s what separates us from the animals; Genesis 11:7-10 and so on. And we snoots know when and how to hyphenate phrasal adjectives and to keep participles from dangling, and we know that we know, and we know how very few other Americans know this stuff or even care, and we judge them accordingly.
In ways that certain of us are uncomfortable about, snoots’ attitudes about contemporary usage resemble religious/political conservatives’ attitudes about contemporary culture:[Note: This is true in my own case at any rate — plus also the “uncomfortable” part. I teach college English part-time — mostly Lit, not Comp. But I am also so pathologically anal about* usage that every semester the same thing happens: The minute I have read my students’ first set of papers, we immediately abandon the regular Lit syllabus and have a three-week Emergency Remedial Usage Unit, during which my demeanor is basically that of somebody teaching hiv prevention to intravenous-drug users. When it emerges (as it does, every time) that 95 percent of these intelligent upscale college students have never been taught, e.g., what a clause is or why a misplaced only can make a sentence confusing, I all but pound my head on the blackboard; I exhort them to sue their hometown school boards. The kids end up scared, both of me and for me.]
*Editor’s Note: Author insisted this phrase replace “obsessed with” and took umbrage at the suggestion that this change clearly demonstrated the very quality he wished to denigrate.
We combine a missionary zeal and a near-neural faith in our beliefs’ importance with a curmudgeonly hell-in-a-handbasket despair at the way English is routinely manhandled and corrupted by supposedly educated people. The Evil is all around us: boners and clunkers and solecistic howlers and bursts of voguish linguistic methane that make any snoot’s cheek twitch and forehead darken. A fellow snoot I know likes to say that listening to most people’s English feels like watching somebody use a Stradivarius to pound nails. We[Note: Please note that the strategically repeated 1-p pronoun is meant to iterate and emphasize that this reviewer is very much one too, a snoot, plus to connote the nuclear family mentioned supra. snootitude runs in families. In admau’s Preface, Bryan Garner mentions both his father and grandfather and actually uses the word genetic, and it’s probably true: 95 percent of the snoots I know have at least one parent who is, by profession or temperament or both, a snoot. In my own case, my mom is a Comp teacher and has written remedial usage books and is a snoot of the most rabid and intractable sort. At least part of the reason I am a snoot is that for years Mom brainwashed us in all sorts of subtle ways. Here’s an example. Family suppers often involved a game: If one of us children made a usage error, Mom would pretend to have a coughing fit that would go on and on until the relevant child had identified the relevant error and corrected it. It was all very self-ironic and lighthearted; but still, looking back, it seems a bit excessive to pretend that your child is actually denying you oxygen by speaking incorrectly. But the really chilling thing is that I now sometimes find myself playing this same “game” with my own students, complete with pretend pertussion.]
Interpolation
As something I’m all but sure Harper’s will excise, I’ll also insert that we even had a lighthearted but retrospectively chilling little family song that Mom and we little snootlets would sing in the car on long trips while Dad silently rolled his eyes and drove (you have to remember the title theme of Underdog in order to follow the song):

When idiots in this world appear
And fail to be concise or clear
And solecisms rend the ear
The cry goes up both far and near
For Blunder Dog
Blunder Dog
Blunder Dog
Blunder Dog
[etc.]*
*(Since this’ll almost surely get cut, I’ll admit that, yes, I, as a kid, was the actual author of this song. But by this time I’d been thoroughly brainwashed. And just about the whole car sang along. It was sort of our family’s version of “100 Bottles … Wall.”)

are the Few, the Proud, the Appalled at Everyone Else.
Thesis Statement For Whole Article
Issues of tradition vs. egalitarianism in u.s. English are at root political issues and can be effectively addressed only in what this article hereby terms a “Democratic Spirit.” A Democratic Spirit is one that combines rigor and humility, i.e., passionate conviction plus sedulous respect for the convictions of others. As any American knows, this is a very difficult spirit to cultivate and maintain, particularly when it comes to issues you feel strongly about. Equally tough is a d.s.’s criterion of 100 percent intellectual integrity — you have to be willing to look honestly at yourself and your motives for believing what you believe, and to do it more or less continually.

This kind of stuff is advanced u.s. citizenship. A true Democratic Spirit is up there with religious faith and emotional maturity and all those other top-of-the-Maslow-Pyramid-type qualities people spend their whole lives working on. A Democratic Spirit’s constituent rigor and humility and honesty are in fact so hard to maintain on certain issues that it’s almost irresistibly tempting to fall in with some established dogmatic camp and to follow that camp’s line on the issue and to let your position harden within the camp and become inflexible and to believe that any other camp is either evil or insane and to spend all your time and energy trying to shout over them.
I submit, then, that it is indisputably easier to be dogmatic than Democratic, especially about issues that are both vexed and highly charged. I submit further that the issues surrounding “correctness” in contemporary American usage are both vexed and highly charged, and that the fundamental questions they involve are ones whose answers have to be “worked out” instead of simply found.
A distinctive feature of admau is that its author is willing to acknowledge that a usage dictionary is not a bible or even a textbook but rather just the record of one smart person’s attempts to work out answers to certain very difficult questions. This willingness appears to me to be informed by a Democratic Spirit. The big question is whether such a spirit compromises Garner’s ability to present himself as a genuine “authority” on issues of usage. Assessing Garner’s book, then, involves trying to trace out the very weird and complicated relationship between Authority and Democracy in what we as a culture have decided is English. That relationship is, as many educated Americans would say, still in process at this time.
A Dictionary of Modern American Usage has no Editorial Staff or Distinguished Panel. It’s conceived, researched, and written ab ovo usque ad mala by Bryan Garner. This is an interesting guy. He’s both a lawyer and a lexicographer (which seems a bit like being both a narcotics dealer and a dea agent). His 1987 A Dictionary of Modern Legal Usage is already a minor classic; now, instead of practicing law anymore, he goes around conducting writing seminars for j.d.’s and doing prose-consulting for various judicial bodies. Garner’s also the founder of something called the h.w. Fowler Society,[Note: If Samuel Johnson is the Shakespeare of English usage, think of Henry Watson Fowler as the Eliot or Joyce. His 1926 A Dictionary of Modern English Usage is the granddaddy of modern usage guides, and its dust-dry wit and blushless imperiousness have been models for every subsequent classic in the field, from Eric Partridge’s Usage and Abusage to Theodore Bernstein’s The Careful Writer to Wilson Follett’s Modern American Usage to Gilman’s ’89 Webster’s.] a worldwide group of usage-Trekkies who like to send one another linguistic boners clipped from different periodicals. You get the idea. This Garner is one serious and very hard-core snoot.
The lucid, engaging, and extremely sneaky Preface to admau serves to confirm Garner’s snootitude in fact while undercutting it in tone. For one thing, whereas the traditional usage pundit cultivates a sort of remote and imperial persona — the kind who uses one or we to refer to himself — Garner gives us an almost Waltonishly endearing sketch of his own background:
I realized early — at the age of 15[Note: Garner prescribes spelling out only numbers under ten. I was taught that this rule applies just to Business Writing and that in all other modes you spell out one through nineteen and start using cardinals at 20.* De gustibus non est disputandum.] — that my primary intellectual interest was the use of the English language… It became an all-consuming passion… I read everything I could find on the subject. Then, on a wintry evening while visiting New Mexico at the age of 16, I discovered Eric Partridge’s Usage and Abusage. I was enthralled. Never had I held a more exciting book… Suffice it to say that by the time I was 18, I had committed to memory most of Fowler, Partridge, and their successors…

*Editor’s Note: The Harper’s style manual prescribes spelling out all numbers up to 100.

Although this reviewer regrets the biosketch’s failure to mention the rather significant social costs of being an adolescent whose overriding passion is English usage,[Note: What follow in the Preface are “… the ten critical points that, after years of working on usage problems, I’ve settled on.” These points are too involved to treat separately, but a couple of them are slippery in the extreme — e.g., “10. Actual Usage. In the end, the actual usage of educated speakers and writers is the overarching criterion for correctness,” of which both “educated” and “actual” would require several pages of abstract clarification and qualification to shore up against Usage Wars-related attacks, but which Garner rather ingeniously elects to define and defend via their application in his dictionary itself.] the “unprecedented” and “full disclosure” here are actually good-natured digs at Garner’s Fowlerite predecessors, and a subtle nod to one camp in the wars that have raged in both lexicography and education ever since the notoriously liberal Webster’s Third New International Dictionary came out in 1961 and included such terms as heighth and irregardless without any monitory labels on them. You can think of Webster’s Third as sort of the Fort Sumter of the contemporary Usage Wars. These Wars are both the context and the target of a very subtle rhetorical strategy in A Dictionary of Modern American Usage, and without talking about them it’s impossible to explain why Garner’s book is both so good and so sneaky.
We regular citizens tend to go to The Dictionary for authoritative guidance. But who decides what’s OK[Note: Editor’s Note: The Harper’s style manual prescribes okay.] and what isn’t? Nobody elected them, after all. And simply appealing to precedent or tradition won’t work, because what’s considered correct changes over time. In the 1600s, for instance, the second-singular pronoun took a singular conjugation — “You is.” Earlier still, the standard 2-s pronoun wasn’t you but thou. Huge numbers of now acceptable words like clever, fun, banter, and prestigious entered English as what usage authorities considered errors or egregious slang. And not just usage conventions but English itself changes over time; if it didn’t, we’d all still be talking like Chaucer. Who’s to say which changes are natural and which are corruptions? And when Bryan Garner or e. Ward Gilman do in fact presume to say, why should we believe them?
These sorts of questions are not new, but they do now have a certain urgency. America is in the midst of a protracted Crisis of Authority in matters of language. In brief, the same sorts of political upheavals that produced everything from Kent State to Independent Counsels have produced an influential contra-snoot school for whom normative standards of English grammar and usage are functions of nothing but custom and superstition and the ovine docility of a populace that lets self-appointed language authorities boss them around. See for example mit’s Steven Pinker in a famous New Republic article — “Once introduced, a prescriptive rule is very hard to eradicate, no matter how ridiculous. Inside the writing establishment, the rules survive by the same dynamic that perpetuates ritual genital mutilations” — or, at a somewhat lower pitch, Bill Bryson in Mother Tongue: English and How It Got That Way:
Who sets down all those rules that we all know about from childhood — the idea that we must never end a sentence with a preposition or begin one with a conjunction, that we must use each other for two things and one another for more than two … ? The answer, surprisingly often, is that no one does, that when you look into the background of these “rules” there is often little basis for them.

In admau’s Preface, Garner himself addresses the Authority Question with a Trumanesque simplicity and candor that simultaneously disguise the author’s cunning and exemplify it:

As you might already suspect, I don’t shy away from making judgments. I can’t imagine that most readers would want me to. Linguists don’t like it, of course, because judgment involves subjectivity.[Note: This is a clever half-truth. Linguists compose only one part of the anti-judgment camp, and their objections to usage judgments involve way more than just “subjectivity.”] It isn’t scientific. But rhetoric and usage, in the view of most professional writers, aren’t scientific endeavors. You don’t want dispassionate descriptions; you want sound guidance. And that requires judgment.

Whole monographs could be written just on the masterful rhetoric of this passage. Note for example the ingenious equivocation of judgment in “I don’t shy away from making judgments” vs. “And that requires judgment.” Suffice it to say that Garner is at all times keenly aware of the Authority Crisis in modern usage; and his response to this crisis is, in the best Democratic Spirit, rhetorical.

So….
Corollary To Thesis Statement For Whole Article
The most salient and timely feature of Garner’s book is that it’s both lexicographical and rhetorical. Its main strategy involves what is known in classical rhetoric as the Ethical Appeal. Here the adjective, derived from the Greek ethos, doesn’t mean quite what we usually mean by ethical. But there are affinities. What the Ethical Appeal amounts to is a complex and sophisticated “Trust me.” It’s the boldest, most ambitious, and also most distinctively American of rhetorical Appeals, because it requires the rhetor to convince us not just of his intellectual acuity or technical competence but of his basic decency and fairness and sensitivity to the audience’s own hopes and fears.[Note: In this last respect, recall for example w.j. Clinton’s famous “I feel your pain,” which was a blatant if not particularly masterful Ethical Appeal.] These are not qualities one associates with the traditional snoot usage-authority, a figure who pretty much instantiates snobbishness and bow-tied anality, and one whose modern image is not improved by stuff like American Heritage Dictionary Distinguished Usage Panelist Morris Bishop’s “The arrant solecisms of the ignoramus are here often omitted entirely, ‘irregardless’ of how he may feel about this neglect” or critic John Simon’s “The English language is being treated nowadays exactly as slave traders once handled their merchandise…” Compare those lines’ authorial personas with Garner’s in, e.g., “English usage is so challenging that even experienced writers need guidance now and then.”
The thrust here is going to be that A Dictionary of Modern American Usage earns Garner pretty much all the trust his Ethical Appeal asks us for. The book’s “feel-good” spirit (in the very best sense of “feel-good”) marries rigor and humility in such a way as to allow Garner to be extremely prescriptive without any appearance of evangelism or elitist putdown. This is an extraordinary accomplishment. Understanding why it’s basically a rhetorical accomplishment, and why this is both historically significant and (in this reviewer’s opinion) politically redemptive, requires a more detailed look at the Usage Wars.
You’d sure know lexicography had an underbelly if you read the little introductory essays in modern dictionaries — pieces like Webster’s deu’s “A Brief History of English Usage” or Webster’s Third’s “Linguistic Advances and Lexicography” or ahd-3’s “Usage in the American Heritage Dictionary: The Place of Criticism.” But almost nobody ever bothers with these little intros, and it’s not just their six-point type or the fact that dictionaries tend to be hard on the lap. It’s that these intros aren’t actually written for you or me or the average citizen who goes to The Dictionary just to see how to spell (for instance) meringue. They’re written for other lexicographers and critics, and in fact they’re not really introductory at all but polemical. They’re salvos in the Usage Wars that have been under way ever since editor Philip Gove first sought to apply the value-neutral principles of structural linguistics to lexicography in Webster’s Third. Gove’s famous response to conservatives who howled[Note: Really, howled: blistering reviews and outraged editorials from across the country — from the Times and The New Yorker and good old Life, or q.v. this from the January ’62 Atlantic: “We have seen a novel dictionary formula improvised, in great part, out of snap judgments and the sort of theoretical improvement that in practice impairs; and we have seen the gates propped wide open in enthusiastic hospitality to miscellaneous confusions and corruptions. In fine, the anxiously awaited work that was to have crowned cisatlantic linguistic scholarship with a particular glory turns out to be a scandal and a disaster.”] when Webster’s Third endorsed OK and described ain’t as “used orally in most parts of the u.s. by many cultivated speakers [sic]” was this: “A dictionary should have no traffic with … artificial notions of correctness or superiority. 
It should be descriptive and not prescriptive.” These terms stuck and turned epithetic, and linguistic conservatives are now formally known as Prescriptivists and linguistic liberals as Descriptivists.
The former are far better known. When you read the columns of William Safire or Morton Freeman or books like Edwin Newman’s Strictly Speaking or John Simon’s Paradigms Lost, you’re actually reading Popular Prescriptivism, a genre sideline of certain journalists (mostly older ones, the vast majority of whom actually do wear bow ties) whose bemused irony often masks a Colonel Blimp’s rage at the way the beloved English of their youth is being trashed in the decadent present. The plutocratic tone and styptic wit of Safire and Newman and the best of the Prescriptivists is often modeled after the mandarin-Brit personas of Eric Partridge and h.w. Fowler, the same Twin Towers of scholarly Prescriptivism whom Garner talks about revering as a kid.[Note: Note for example the mordant pith (and royal we) of this random snippet from Partridge’s Usage and Abusage:

anxious of. ‘I am not hopeless of our future. But I am profoundly anxious of it’, Beverley Nichols, News of England, 1938: which made us profoundly anxious For (or about) — not of — Mr Nichols’s literary future.

Or see the near-Himalayan condescension of Fowler, here on some other people’s use of words to mean things the words don’t really mean:

slipshod extension … is especially likely to occur when some accident gives currency among the uneducated to words of learned origin, & the more if they are isolated or have few relatives in the vernacular… The original meaning of feasible is simply doable (L facere do); but to the unlearned it is a mere token, of which he has to infer the value from the contexts in which he hears it used, because such relatives as it has in English — feat, feature, faction, &c. — either fail to show the obvious family likeness to which he is accustomed among families of indigenous words, or are (like malfeasance) outside his range.]

Descriptivists, on the other hand, don’t have weekly columns in the Times. These guys tend to be hard-core academics, mostly linguists or Comp theorists. Loosely organized under the banner of structural (or “descriptive”) linguistics, they are doctrinaire positivists who have their intellectual roots in the work of Auguste Comte and Ferdinand de Saussure and their ideological roots firmly in the u.s. sixties. The brief explicit mention Garner’s Preface gives this crew—
Somewhere along the line, though, usage dictionaries got hijacked by the descriptive linguists,[Note: Utter bushwa: As admau’s body makes clear, Garner knows exactly when the Descriptivists started influencing language guides.] who observe language scientifically. For the pure descriptivist, it’s impermissible to say that one form of language is any better than another: as long as a native speaker says it, it’s OK—and anyone who takes a contrary stand is a dunderhead… Essentially, descriptivists and prescriptivists are approaching different problems. Descriptivists want to record language as it’s actually used, and they perform a useful function—though their audience is generally limited to those willing to pore through vast tomes of dry-as-dust research.

— is disingenuous in the extreme, especially the “approaching different problems” part, because it vastly underplays the Descriptivists’ influence on u.s. culture. For one thing, Descriptivism so quickly and thoroughly took over English education in this country that just about everybody who started junior high after c. 1970 has been taught to write Descriptively—via “freewriting,” “brainstorming,” “journaling,” a view of writing as self-exploratory and -expressive rather than as communicative, an abandonment of systematic grammar, usage, semantics, rhetoric, etymology. For another thing, the very language in which today’s socialist, feminist, minority, gay, and environmentalist movements frame their sides of political debates is informed by the Descriptivist belief that traditional English is conceived and perpetuated by Privileged wasp Males[Note: (which in fact is true)] and is thus inherently capitalist, sexist, racist, xenophobic, homophobic, elitist: unfair. Think Ebonics. Think of the involved contortions people undergo to avoid he as a generic pronoun, or of the tense deliberate way white males now adjust their vocabularies around non-w.m.’s. 
Think of today’s endless battles over just the names of things—“Affirmative Action” vs. “Reverse Discrimination,” “Pro-Life” vs. “Pro-Choice,” “Undercount” vs. “Vote Fraud,” etc.

The Descriptivist revolution takes a little time to unpack, but it’s worth it. The structural linguists’ rejection of conventional usage rules depends on two main arguments. The first is academic and methodological. In this age of technology, Descriptivists contend, it’s the Scientific Method — clinically objective, value-neutral, based on direct observation and demonstrable hypothesis — that should determine both the content of dictionaries and the standards of “correct” English. Because language is constantly evolving, such standards will always be fluid. Gove’s now classic introduction to Webster’s Third outlines this type of Descriptivism’s five basic edicts: “1–Language changes constantly; 2–Change is normal; 3–Spoken language is the language; 4–Correctness rests upon usage; 5–All usage is relative.”
These principles look prima facie OK — commonsensical and couched in the bland simple s.-v.-o. prose of dispassionate Science — but in fact they’re vague and muddled and it takes about three seconds to think of reasonable replies to each one of them, viz.:
- OK, but how much and how fast?
- Same thing. Is Heraclitean flux as normal or desirable as gradual change? Do some changes actually serve the language’s overall pizzazz better than others? And how many people have to deviate from how many conventions before we say the language has actually changed? Fifty percent? Ten percent?
- This is an old claim, at least as old as Plato’s Phaedrus. And it’s specious. If Derrida and the infamous Deconstructionists have done nothing else, they’ve debunked the idea that speech is language’s primary instantiation.[Note: Standard Written English (swe) is also sometimes called Standard English (se) or Educated English, but the inditement-emphasis is the same.]
Semi-Interpolation
Plus note that Garner’s Preface explicitly names admau’s intended audience as “writers and editors.” And even ads for the dictionary in such organs as The New York Review of Books are built around the slogan “If you like to write … Refer to us.”*
*(Yr. snoot rev. cannot help observing, w/r/t these ads, that the opening r in Refer here should not be capitalized after a dependent clause + ellipses — Quandoque bonus dormitat Homerus.)
- Fine, but whose usage? Gove’s (4) begs the whole question. What he wants to imply here, I think, is a reversal of the traditional entailment-relation between abstract rules and concrete usage: Instead of usage ideally corresponding to a rigid set of regulations, the regulations ought to correspond to the way real people are actually using the language. Again, fine, but which people? Urban Latinos? Boston Brahmins? Rural Midwesterners? Appalachian Neogaelics?
- Huh? If this means what it seems to mean, then it ends up biting Gove’s whole argument in the ass. (5) appears to imply that the correct answer to the above “which people?” is: “All of them!” And it’s easy to show why this will not stand up as a lexicographical principle. The most obvious problem with it is that not everything can go in The Dictionary. Why not? Because you can’t observe every last bit of every last native speaker’s “language behavior,” and even if you could, the resultant dictionary would weigh 4 million pounds and have to be updated hourly.[Note: True, some sort of 100 percent compendious real-time Mega-dictionary might be possible online, though it’d take a small army of lexical webmasters and a much larger army of in situ actual-use reporters and surveillance techs; plus it’d be gnp-level expensive.] The fact is that any lexicographer is going to have to make choices about what gets in and what doesn’t. And these choices are based on … what? And now we’re right back where we started.
It is true that, as a snoot, I am probably neurologically predisposed to look for flaws in Gove et al.’s methodological argument. But these flaws seem awfully easy to find. Probably the biggest one is that the Descriptivists’ “scientific lexicography” — under which, keep in mind, the ideal English dictionary is basically number-crunching; you somehow observe every linguistic act by every native/naturalized speaker of English and put the sum of all these acts between two covers and call it The Dictionary — involves an incredibly simplistic and outdated understanding of what scientific means. It requires a naive belief in scientific objectivity, for one thing. Even in the physical sciences, everything from quantum mechanics to Information Theory has shown that an act of observation is itself part of the phenomenon observed and is analytically inseparable from it.
If you remember your old college English classes, there’s an analogy here that points up the trouble scholars get into when they confuse observation with interpretation. Recall the New Critics.[Note: New Criticism refers to t.s. Eliot and i.a. Richards and f.r. Leavis and Cleanth Brooks and Wimsatt & Beardsley and the whole “close reading” school that dominated literary criticism from wwi well into the seventies.] They believed that literary criticism was best conceived as a “scientific” endeavor: The critic was a neutral, careful, unbiased, highly trained observer whose job was to find and objectively describe meanings that were right there — literally inside — pieces of literature. Whether you know what happened to the New Criticism’s reputation depends on whether you took college English after c. 1975; suffice it to say that its star has dimmed. The New Critics had the same basic problem as Gove’s Methodological Descriptivists: They believed that scientific meant the same thing as neutral or unbiased. And that linguistic meanings could exist “objectively,” separate from any interpretive act.
The point of the analogy is that claims to objectivity in language study are now the stuff of jokes and shudders. The epistemological assumptions that underlie Methodological Descriptivism have been thoroughly debunked and displaced — in Lit by the rise of post-structuralism, Reader-Response Criticism, and Jaussian Reception Theory; in linguistics by the rise of Pragmatics — and it’s now pretty much universally accepted that (a) meaning is inseparable from some act of interpretation and (b) an act of interpretation is always somewhat biased, i.e., informed by the interpreter’s particular ideology. And the consequence of (a) and (b) is that there’s no way around it — decisions about what to put in The Dictionary and what to exclude are going to be based on a lexicographer’s ideology. And every lexicographer’s got one. To presume that dictionary-making can somehow avoid or transcend ideology is simply to subscribe to a particular ideology, one that might aptly be called Unbelievably Naive Positivism.
There’s an even more important way Descriptivists are wrong in thinking that the Scientific Method is appropriate to the study of language:
Even if, as a thought experiment, we assume a kind of nineteenth-century scientific realism — in which, even though some scientists’ interpretations of natural phenomena might be biased[Note: (“Evidence Of Cancer Link Refuted By Tobacco Institute Researchers”)], the natural phenomena themselves can be supposed to exist wholly independent of either observation or interpretation — no such realist supposition can be made about “language behavior,” because this behavior is both human and fundamentally normative. To understand this, you have only to accept the proposition that language is by its very nature public — i.e., that there can be no such thing as a Private Language\footnote[23] This proposition is in fact true, as is interpolatively demonstrated below, and although the demonstration is extremely persuasive it is also, as you can see from the size of this FN, lengthy and involved and rather, umm, dense, so that again you’d probably be better off simply granting the truth of the proposition and forging on with the main text.
Interpolative Demonstration Of The Fact That There Is No Such Thing As A Private Language

It’s sometimes tempting to imagine that there can be such a thing as a Private Language. Many of us are prone to lay-philosophizing about the weird privacy of our own mental states, for example, and from the fact that when my knee hurts only I can feel it, it’s tempting to conclude that for me the word pain has a very subjective internal meaning that only I can truly understand. This line of thinking is sort of like the adolescent pot-smoker’s terror that his own inner experience is both private and unverifiable, a syndrome that is technically known as Cannabic Solipsism. Eating Chips Ahoy! and staring very intently at the television’s network pga event, for instance, the adolescent pot-smoker is struck by the ghastly possibility that, e.g., what he sees as the color green and what other people call “the color green” may in fact not be the same color experiences at all. The fact that both he and someone else call Pebble Beach’s fairways green and a stoplight’s GO signal green appears to guarantee only that there is a similar consistency in their color experience of fairways and GO lights, not that the actual subjective quality of those color experiences is the same; it could be that what the ad. pot-smoker experiences as green everyone else actually experiences as blue, and what we “mean” by the word blue is what he “means” by green, etc., etc., until the whole line of thinking gets so vexed and exhausting that the a.p.-s. ends up slumped crumb-strewn and paralyzed in his chair.
The point here is that the idea of a Private Language, like Private Colors and most of the other solipsistic conceits with which this particular reviewer has at various times been afflicted, is both deluded and demonstrably false.
In the case of Private Language, the delusion is usually based on the belief that a word such as pain has the meaning it does because it is somehow “connected” to a feeling in my knee. But as Mr. l. Wittgenstein’s Philosophical Investigations proved in the 1950s, words actually have the meanings they do because of certain rules and verification tests that are imposed on us from outside our own subjectivities, viz., by the community in which we have to get along and communicate with other people. Wittgenstein’s argument, which is admittedly very complex and gnomic and opaque, basically centers on the fact that a word like pain means what it does for me because of the way the community I’m part of has tacitly agreed to use pain.
If you’re thinking that all this seems not only abstract but also pretty irrelevant to the Usage Wars or to anything you have any real interest in at all, you are very much mistaken. If words’ meanings depend on transpersonal rules and these rules on community consensus, language is not only conceptually non-Private but also irreducibly public, political, and ideological. This means that questions about our national consensus on grammar and usage are actually bound up with every last social issue that millennial America’s about — class, race, gender, morality, tolerance, pluralism, cohesion, equality, fairness, money: You name it. It is instructive to keep this in mind and then to observe the way Methodological Descriptivists seem either ignorant of this fact or oblivious to its consequences, as in for example one Charles Fries’s introduction to an epigone of Webster’s Third called The American College Dictionary:
A dictionary can be an “authority” only in the sense in which a book of chemistry or of physics or of botany can be an “authority” by the accuracy and the completeness of its record of the observed facts of the field examined, in accord with the latest principles and techniques of the particular science.

This is so stupid it practically drools. An “authoritative” physics text presents the results of physicists’ observations and physicists’ theories about those observations. If a physics textbook operated on Descriptivist principles, the fact that some Americans believe that electricity flows better downhill (based on the observed fact that power lines tend to run high above the homes they serve) would require the Electricity Flows Better Downhill Theory to be included as a “valid” theory in the textbook — just as, for Dr. Fries, if some Americans use infer for imply, the use becomes an ipso facto “valid” part of the language. Structural linguists like Gove and Fries are not, finally, scientists but census-takers who happen to misconstrue the importance of “observed facts.” It isn’t scientific phenomena they’re tabulating but rather a set of human behaviors, and a lot of human behaviors are — to be blunt — moronic. Try, for instance, to imagine an “authoritative” ethics textbook whose principles were based on what most people actually do.

Norm-wise, let’s keep in mind that language didn’t come into being because our hairy ancestors were sitting around the veldt with nothing better to do. Language was invented to serve certain specific purposes:[Note: Norms, after all, are just practices people have agreed on as optimal ways of doing things for certain purposes. They’re not laws, but they’re not laissez-faire, either.] “That mushroom is poisonous”; “Knock these two rocks together and you can start a fire”; “This shelter is mine!” And so on.
Clearly, as linguistic communities evolve over time, they discover that some ways of using language are “better” than others — meaning better with respect to the community’s purposes. If we assume that one such purpose might be communicating which kinds of food are safe to eat, then you can see how, for example, a misplaced modifier might violate an important norm:
“People who eat that kind of mushroom often get sick” confuses the recipient about whether he’ll get sick only if he eats the mushroom frequently or whether he stands a good chance of getting sick the very first time he eats it. In other words, the community has a vested practical interest in excluding this kind of misplaced modifier from acceptable usage; and even if a certain percentage of tribesmen screw up and use them, this still doesn’t make m.m.’s a good idea.
Maybe now the analogy between usage and ethics is clearer. Just because people sometimes lie, cheat on their taxes, or scream at their kids, this doesn’t mean that they think those things are “good.” The whole point of norms is to help us evaluate our actions (including utterances) according to what we as a community have decided our real interests and purposes are. Granted, this analysis is oversimplified; in practice it’s incredibly hard to arrive at norms and to keep them at least minimally fair or sometimes even to agree on what they are (q.v. today’s Culture Wars). But the Descriptivists’ assumption that all usage norms are arbitrary and dispensable leads to — well, have a mushroom.
The connotations of arbitrary here are tricky, though, and this sort of segues into the second argument Descriptivists make. There is a sense in which specific linguistic conventions are arbitrary. For instance, there’s no particular metaphysical reason why our word for a four-legged mammal that gives milk and goes Moo is cow and not, say, prtlmpf. The uptown phrase for this is “the arbitrariness of the linguistic sign,” and it’s used, along with certain principles of cognitive science and generative grammar, in a more philosophically sophisticated version of Descriptivism that holds the conventions of swe to be more like the niceties of fashion than like actual norms. This “Philosophical Descriptivism” doesn’t care much about dictionaries or method; its target is the standard snoot claim supra — that prescriptive rules have their ultimate justification in the community’s need to make its language meaningful.
The argument goes like this. An English sentence’s being meaningful is not the same as its being grammatical. That is, such clearly ill-formed constructions as “Did you seen the car keys of me?” or “The show was looked by many people” are nevertheless comprehensible; the sentences do, more or less, communicate the information they’re trying to get across. Add to this the fact that nobody who isn’t damaged in some profound Oliver Sacksish way actually ever makes these sorts of very deep syntactic errors[Note: In his The Language Instinct: How the Mind Creates Language (1994), Steven Pinker puts it this way: “No one, not even a valley girl, has to be told not to say Apples the eat boy or The child seems sleeping or Who did you meet John and? or the vast, vast majority of the millions of trillions of mathematically possible combinations of words.”] and you get the basic proposition of Noam Chomsky’s generative linguistics, which is that there exists a Universal Grammar beneath and common to all languages, plus that there is probably an actual part of the human brain that’s imprinted with this Universal Grammar the same way birds’ brains are imprinted with Fly South and dogs’ with Sniff Genitals. There’s all kinds of compelling evidence and support for these ideas, not least of which are the advances that linguists and cognitive scientists and a.i. researchers have been able to make with them, and the theories have a lot of credibility, and they are adduced by the Philosophical Descriptivists to show that since the really important rules of language are at birth already hardwired into people’s neocortex, swe prescriptions against dangling participles or mixed metaphors are basically the linguistic equivalent of whalebone corsets and short forks for salad. As Descriptivist Steven Pinker puts it, “When a scientist considers all the high-tech mental machinery needed to order words into everyday sentences, prescriptive rules are, at best, inconsequential decorations.”
This argument is not the barrel of drugged trout that Methodological Descriptivism was, but it’s still vulnerable to some objections. The first one is easy. Even if it’s true that we’re all wired with a Universal Grammar, it simply doesn’t follow that all prescriptive rules are superfluous. Some of these rules really do seem to serve clarity and precision. The injunction against two-way adverbs (“People who eat this often get sick”) is an obvious example, as are rules about other kinds of misplaced modifiers (“There are many reasons why lawyers lie, some better than others”) and about relative pronouns’ proximity to the nouns they modify (“She’s the mother of an infant daughter who works twelve hours a day”).
Granted, the Philosophical Descriptivist can question just how absolutely necessary these rules are: it’s quite likely that a recipient of clauses like the above could figure out what the sentences mean from the sentences on either side or from the “overall context” or whatever. A listener can usually figure out what I really mean when I misuse infer for imply or say indicate for say, too. But many of these solecisms require at least a couple extra nanoseconds of cognitive effort, a kind of rapid sift-and-discard process, before the recipient gets it. Extra work. It’s debatable just how much extra work, but it seems indisputable that we put some extra neural burden on the recipient when we fail to follow certain conventions. W/r/t confusing clauses like the above, it simply seems more “considerate” to follow the rules of correct swe … just as it’s more “considerate” to de-slob your home before entertaining guests or to brush your teeth before picking up a date. Not just more considerate but more respectful somehow — both of your listener and of what you’re trying to get across. As we sometimes also say about elements of fashion and etiquette, the way you use English “Makes a Statement” or “Sends a Message” — even though these Statements/Messages often have nothing to do with the actual information you’re trying to transmit.
We’ve now sort of bled into a more serious rejoinder to Philosophical Descriptivism: From the fact that linguistic communication is not strictly dependent on usage and grammar it does not necessarily follow that the traditional rules of usage and grammar are nothing but “inconsequential decorations.” Another way to state the objection is that just because something is “decorative” does not necessarily make it “inconsequential.” Rhetorically, Pinker’s flip dismissal is bad tactics, for it invites the very question it begs: inconsequential to whom?
Take, for example, the Descriptivist claim that so-called correct English usages such as brought rather than brung and felt rather than feeled are arbitrary and restrictive and unfair and are supported only by custom and are (like irregular verbs in general) archaic and incommodious and an all-around pain in the ass. Let us concede for the moment that these objections are 100 percent reasonable. Then let’s talk about pants. Trousers, slacks. I suggest to you that having the “correct” subthoracic clothing for u.s. males be pants instead of skirts is arbitrary (lots of other cultures let men wear skirts), restrictive and unfair (u.s. females get to wear pants), based solely on archaic custom (I think it’s got something to do with certain traditions about gender and leg position, the same reasons girls’ bikes don’t have a crossbar), and in certain ways not only incommodious but illogical (skirts are more comfortable than pants; pants ride up; pants are hot; pants can squish the genitals and reduce fertility; over time pants chafe and erode irregular sections of men’s leg hair and give older men hideous half-denuded legs, etc. etc.). Let us grant — as a thought experiment if nothing else — that these are all reasonable and compelling objections to pants as an androsartorial norm. Let us in fact in our minds and hearts say yes — shout yes — to the skirt, the kilt, the toga, the sarong, the jupe. Let us dream of or even in our spare time work toward an America where nobody lays any arbitrary sumptuary prescriptions on anyone else and we can all go around as comfortable and aerated and unchafed and unsquished and motile as we want.
And yet the fact remains that, in the broad cultural mainstream of millennial America, men do not wear skirts. If you, the reader, are a u.s. male, and even if you share my personal objections to pants and dream as I do of a cool and genitally unsquishy American Tomorrow, the odds are still 99.9 percent that in 100 percent of public situations you wear pants/slacks/shorts/trunks. More to the point, if you are a u.s. male and also have a u.s. male child, and if that child were to come to you one evening and announce his desire/intention to wear a skirt rather than pants to school the next day, I am 100-percent confident that you are going to discourage him from doing so. Strongly discourage him. You could be a Molotov-tossing anti-pants radical or a kilt manufacturer or Steven Pinker himself — you’re going to stand over your kid and be prescriptive about an arbitrary, archaic, uncomfortable, and inconsequentially decorative piece of clothing. Why? Well, because in modern America any little boy who comes to school in a skirt (even, say, a modest all-season midi) is going to get stared at and shunned and beaten up and called a Total Geekoid by a whole lot of people whose approval and acceptance are important to him.[Note: In the case of Steve Pinker Jr., those people are the boy’s peers and teachers and crossing guards etc. In the case of adult cross-dressers and drag queens who have jobs in the Straight World and wear pants to those jobs, it’s coworkers and clients and people on the subway. For the die-hard slob who nevertheless wears a coat and a tie to work, it’s mostly his boss, who himself doesn’t want his employee’s clothes to send clients “the wrong message.” But of course it’s all basically the same thing.] In our culture, in other words, a boy who wears a skirt is Making a Statement that is going to have all kinds of gruesome social and emotional consequences.
You see where this is going. I’m going to describe the intended point of the pants analogy in terms I’m sure are simplistic — doubtless there are whole books in Pragmatics or psycholinguistics or something devoted to unpacking this point. The weird thing is that I’ve seen neither Descriptivists nor snoots deploy it in the Wars.[Note: In fact, the only time one ever hears the issue made explicit is in radio ads for tapes that promise to improve people’s vocabulary. These ads are extremely ominous and intimidating and always start out with “Did You Know People Judge You By The Words You Use?”]
When I say or write something, there are actually a whole lot of different things I am communicating. The propositional content (the actual information I’m trying to convey) is only one part of it. Another part is stuff about me, the communicator. Everyone knows this. It’s a function of the fact that there are uncountably many well-formed ways to say the same basic thing, from e.g. “I was attacked by a bear!” to “Goddamn bear tried to kill me!” to “That ursine juggernaut bethought to sup upon my person!” and so on. And different levels of diction and formality are only the simplest kinds of distinction; things get way more complicated in the sorts of interpersonal communication where social relations and feelings and moods come into play. Here’s a familiar sort of example. Suppose that you and I are acquaintances and we’re in my apartment having a conversation and that at some point I want to terminate the conversation and not have you be in my apartment anymore. Very delicate social moment. Think of all the different ways I can try to handle it: “Wow, look at the time”; “Could we finish this up later?”; “Could you please leave now?”; “Go”; “Get out”; “Get the hell out of here”; “Didn’t you say you had to be someplace?”; “Time for you to hit the dusty trail, my friend”; “Off you go then, love”; or that sly old telephone-conversation ender: “Well, I’m going to let you go now”; etc. And then think of all the different factors and implications of each option.
The point here is obvious. It concerns a phenomenon that snoots blindly reinforce and that Descriptivists badly underestimate and that scary vocab-tape ads try to exploit. People really do “judge” one another according to their use of language. Constantly. Of course, people judge one another on the basis of all kinds of things — weight, scent, physiognomy, occupation, make of vehicle[Note: (… not to mention color, gender, creed — you can see how fraught and charged all this is going to get)] — and, again, doubtless it’s all terribly complicated and occupies whole battalions of sociolinguists. But it’s clear that at least one component of all this interpersonal semantic judging involves acceptance, meaning not some touchy-feely emotional affirmation but actual acceptance or rejection of somebody’s bid to be regarded as a peer, a member of somebody else’s collective or community or Group. Another way to come at this is to acknowledge something that in the Usage Wars gets mentioned only in very abstract terms: “Correct” English usage is, as a practical matter, a function of whom you’re talking to and how you want that person to respond — not just to your utterance but also to you. In other words, a large part of the agenda of any communication is rhetorical and depends on what some rhet-scholars call “Audience” or “Discourse Community.”\footnote[29] Discourse Community is an example of that rare kind of academic jargon that’s actually a valuable addition to swe because it captures something at once very complex and very specific that no other English term quite can.*
*(The above is an obvious attempt to preempt readerly sneers/winces at the term’s continued deployment in this article.) And the United States obviously has a huge number of such Discourse Communities, many of them regional and/or cultural dialects of English: Black English, Latino English, Rural Southern, Urban Southern, Standard Upper-Midwest, Maine Yankee, East-Texas Bayou, Boston Blue-Collar, on and on. Everybody knows this. What not everyone