Regarding the purported rules of English syntax, we tend to divide into mutually hostile camps. Hip, open-minded types relish the never-ending transformations of the way we speak and write. They care about the integrity of our language only insofar as is needed to ensure that we can still roughly understand one another. In the opposite corner glower the curmudgeons. These joyless, uptight authoritarians are forever muttering about clunky concepts such as “the unreal conditional” that nobody’s ever heard of.
I’ve thrown in my lot with the pedants. Yes, language is a living tree, eternally sprouting new shoots as other branches wither . . . blah, blah, blah. But a poorly cultivated plant can readily gnarl from lush foliage to unsightly sticks. The internet has turbocharged lexical fads (such as “turbocharge”) and grammatical decay. Rather than infuse English with a new vitality, this degeneration spreads the blight of sheer ignorance. So this month we address a set of developments in the prevailing conventions of the English language whose only commonality is that they drive me crazy.
I long ago developed the habit of mentally correcting other people’s grammatical errors, and sometimes these chiding reproofs escape my lips (“You mean ‘Ask us Democrats’ ”). Marking up casual conversation with a red pencil doesn’t make me popular, and I should learn to control myself. Yet fellow philological conservatives will recognize the impulse to immediately regroove one’s neural pathways, the better to preserve one’s fragile ear for proper English. That ear is constantly under assault by widespread misusage that threatens by repetition to be—another on-trend verb—“normalized.”
For even we rigid, grumpy anachronisms are vulnerable (a blobby political catchall I now encounter dozens of times a day). I recently received what I pleasantly mistook for a fan letter, only to unfold the very sort of mortifying reprimand that I myself hurl at grammatical slackers. My last column in Britain’s Spectator had employed “laid” as the past tense of “lie.” The stern correspondent was understandably disappointed in me. Granted, I don’t envy second-language speakers obliged to memorize the perverse tense pairings “lie/lay” and “lay/laid,” but for me those conjugations were once second nature. My instincts have been contaminated. Proofreading that column, I’d sailed right past the mistake. Those prissy mental corrections my only protection from descent to barbarism, I resolved forthwith to be more of an asshole, if only in my head.
I had the good fortune to be raised by articulate parents who spoke in complete sentences. They didn’t talk down to their children; we imbibed vocabulary like “echelon” along with our strained peas. I had no idea at the time what a favor they were doing me. I owe my parents that ear.
Consequently, when my seventh-grade English teacher spent the whole year on grammar, punctuation, and sentence diagrams, I was contemptuous. I wanted to write stories. I didn’t need to learn the rules. I could hear when a usage was incorrect without resort to Fowler’s. Yet I later felt I owed that teacher an apology.
When I taught Freshman Composition as an adjunct in my twenties, knowing the rules facilitated passing them on to my charges. I hammered it home to hundreds of eighteen-year-olds that, aside from rare instances of extremely short sentences that effectively function as a list (“I came, I saw, I conquered”), you absolutely must not join complete sentences with a comma, which may constitute the only true altruism of my otherwise selfish life. Put it on my tombstone: “She battled the comma splice.”
As far as I can tell, most schools today downplay grammar and punctuation if they teach these subjects at all. (Last year, in Iowa, authorities banished S. Keyron McDermott as a high school substitute teacher for criticizing “second-grade” grammatical errors in students’ prose.) The neglect shows. I resist teaching creative writing if only because, on the few occasions that I’ve done so, the students have proved too creative. Young aspirant writers are working on novels but can’t produce comprehensible, error-free sentences. Whether they know it or not, today’s M.F.A. candidates are crying out for primitive instruction on the accusative case, which would readily clear up any confusion about “who” versus “whom” (a perfectly civilized distinction that the animals are now clamoring to revoke). Though what they want is tips on character development, what they need (and in my classes got) is a five-minute lowdown on the semicolon.
Absent such instruction, this endangered punctuation mark has given way willy-nilly to the em-dash, a crude demarcation that cannot imply relatedness or contrast, much less clearly separate list elements that contain commas. Capable of being inserted whimsically just about anywhere, the em-dash effectively has no rules, and is therefore horribly suited to an era of semantic anarchy.
Education’s having turned its back on teaching the technical aspects of composition is partially responsible for deteriorating standards in prose and speech. Lacking any familiarity with the structure of their language, people find linguistic rubrics arbitrary and unreasonable. Utter grammatical dereliction in English departments conveys that knowing the rudiments of one’s language is unimportant, in which case “correct” English is unimportant, too; it feeds the lazy, convenient, and therefore wildly popular view that there is no such thing as correct English.
Thus we witness the precipitous demise of the adverb, now that the very word “adverb” is lost on most people; mainstream newspapers now use “quicker” rather than “more quickly” to modify a verb. Many a subeditor suffers under the misguided impression that when the subject comprises a fair number of words, it is not only acceptable but mandatory to put a single comma between the subject and the verb (e.g., “The Jack and Jill who went up the hill to fetch a pail of water, fell down.” Anathema!). Comparative and superlative forms are no longer prescribed but a matter of mood; one of my favorite movies might be titled today Dumb and More Dumb. “Literally” now means “really,” or, worse, “figuratively.” (Anyone claiming that “my head literally exploded” would not have lived to tell the tale.) “Notorious” is employed with such abandon as a synonym for “famous” that when using it correctly one can never be certain that one’s pejorative intentions have been understood. The differentiation between quantity and number having been deep-sixed, “less” and “fewer” are now interchangeable. Thus on the rare occasions these adjectives are actually deployed accurately on TV, my husband and I will interject mischievously, “He means fewer water” or “She means less bottles.”
Just try explaining that “as” is used with clauses while “like” takes a direct object when your audience hasn’t the haziest idea what a clause or a direct object is, and don’t expect your average American to infer that a direct object will hence take the accusative case. In the absence of any structural grasp, even examples (“as I do” versus “like me”) won’t make a lasting impression, and meantime you’ve merely identified yourself as a pain in the butt. So forget the even more tortuous explanation of the restrictive and nonrestrictive uses of “that” and “which,” even though this distinction can have huge implications for the meaning of a sentence.
So when writing dialogue in fiction I often feel guilty. I’m supposed to make my characters speak as (not “like”) they would in real life. Yet rhetorical verisimilitude propagates the very errors that I revile. Now that the predicate nominative is dead and buried, I can’t have a character announce “It is I!” without also conveying that this person is unbearable, perhaps outright insane, or imported from a previous century through time travel.
Therefore I, too, contribute to semantic drift. In our digital age, online dictionaries are revised almost continually, whereas the issuance of a new print edition of Webster’s or the Oxford English Dictionary is the expensive labor of many years. In the analog world, then, official changes to meaning and usage were subject to considerable scrutiny, discouraging the institutionalization of commonplace mistakes. These days, what were once authoritative and inherently conservative reference sources easily acquiesce to mob rule. Misconceptions transform lickety-split into new conventions. We consolidate ignorance.
Although well-spoken, my parents nevertheless embraced two errors of usage, both of which my brother and I have struggled to rectify in our own speech, because misunderstandings instilled in childhood are tough to override. Hence when the copy editor on my first novel claimed that there was no such word as “jerry-rig,” I was incensed. Determined to prove her wrong, I went to my trusty, dusty-blue Webster’s Seventh (based on the august Webster’s Third), only to find she was right: “jerry-rig” wasn’t listed. Apparently I’d grown up with a garbled portmanteau of “gerrymander,” “jerry-build,” and the word I really wanted: “jury-rig.” The scales fell from my eyes.
A convert, I explained to my mother her lifelong mistake, but she was having none of it. “Oh, no,” she said gravely. “ ‘Jury-rig’ refers to rigging a jury, which is very serious.” Explaining the allusion to a “jury mast,” a makeshift sail, with no etymological relationship to a judicial “jury,” got me nowhere. It’s fascinating how ferociously people will cling to their abiding linguistic assumptions, however wrongheaded.
Although this is an argument I should have won in 1986, I’d lose it today. Dictionary.com informs us, “Jerry-rigged is a relatively new word. Many people consider it to be an incorrect version of jury-rigged, but it’s widely used in everyday speech.” With no such embarrassment, Merriam-Webster’s online dictionary now proudly lists “jerry-rigged” as meaning “organized or constructed in a crude or improvised manner.” The mob—and my mother—have won. So much for my precious filial condescension.
Or take “nonplussed,” which I was taught meant “blasé.” When another copy editor forced me to look this one up, it turned out to mean almost the opposite: “at a loss as to what to say, think, or do.” What I thought meant “unruffled” pretty much meant “ruffled.” But after laboriously internalizing the correct meaning of “nonplussed,” I find I needn’t have bothered. Enough other people have made my parents’ mistake that at the top of a Google search “nonplussed” is defined as “surprised and confused so much that they are unsure how to react,” and “Informal, North American: not disconcerted; unperturbed.” Great.
I ask you: What good is a word that now means both “perturbed” and “unperturbed”? This democratic inclusiveness of delusion effectively knocks “nonplussed” out of the language’s functional vocabulary. If it means two opposite things, it ceases to communicate. If I say I’m “nonplussed,” what do you know? I’m either dumbfounded or indifferent. I might as well have said nothing.
So, given the pervasive misunderstanding of “enervated,” any day now online dictionaries are bound to start listing an accepted meaning of the word as “excited and keyed up,” and that will be the end of “enervated.” If the adjective ever formally means both “energized” and “without energy,” we’ll have to chuck it on the trash heap.
We also find semantic drift in pronunciation, one instance of which has ruined a favorite party trick. I used to love submitting that “flaccid” is actually pronounced “flak-sid,” challenging my incredulous audience to look it up and sitting back to watch the consternation. (That un-onomatopoeic hard c in a word for “floppy” is counter-instinctual.) My defiant company always vowed to keep mispronouncing the word anyway. At last mass cluelessness has prevailed. According to Business Insider, “The standard pronunciation is ‘flak-sed,’ not ‘flas-sid.’ . . . Until recently, most dictionaries listed only the first pronunciation.” That “until recently” pours cold water on all my fun. The accepted pronunciation “flas-sid” has even slithered into the modern O.E.D.
Within the past couple of years, one misappropriation has spread like knotweed. In linguistics, “performative” has an interesting and specific definition. It describes a verb whose usage enacts its action, as in “I promise,” “I curse you,” “I apologize,” “bless you”: these are performative verbs. “I now pronounce you husband and wife” is a classic “performative utterance.” In my old print dictionaries, the word meaning “relating to performance” is “performatory”—an adjective that has failed to catch on—and the linguistic meaning of the now-fetishized word has been lost. For “performative” in the sense of “posturing and insincere” is everywhere, now that “virtue signaling” appears to have exhausted itself. As we went through “virtue signaling” like single-ply toilet paper—the term only took off after a Spectator piece in 2015—there must be a brisk market for descriptions of left-wingers vaunting their ethical credentials with self-serving theatricality. (Search “performative” and Google suggests “performative wokeness.”) Given such a hunger for words to capture it, moral flamboyance is clearly a mark of the age.
The steady decay of English syntax is a first-world problem par excellence, and tsk-tsking over sloppy grammar amounts to a haughty and rather geriatric form of entertainment. Besides, my own generation probably instigated this decline in the first place. For my erudite father, “decimate” can only mean “destroy a tenth of”; hypocritically, some semantic drift strikes me as sensible, and I happily employ the verb’s broader sense. My father decried Captain Kirk’s “to boldly go where no man has gone before!” though split infinitives leave me, if you will, nonplussed.
We let-it-all-hang-out boomers may have celebrated lingual creativity, but the dangling dependent clauses and modifiers that have grown rife, even in books, hardly qualify as inventive. Neither can “between you and I” pass for a form of self-expression. Honestly, English requires so little declension in comparison with most languages that expecting the declension of pronouns in compound objects isn’t asking for the moon.
However picayune and pitifully old-fashioned the bereavement may seem to most people, for me the erosion of style, clarity, and precision in everyday speech and prose is a loss. Call it a quality-of-life issue. A century ago, in diaries or letters to the editor, ordinary people wrote with astonishing elegance and correctness. Elegance is related to correctness.
In the fiction biz, of course, syntax is a matter of craft. Early in my career, I still had a blind, unjustified confidence in my semantic inner ear, often railing against the edicts of officious, nitpicking copy editors. I was always wrong. If nowadays I also wrangle with copy editors, that’s because the more recent crop’s acquaintance with English syntax is abysmal. Their poor grasp of the discretionary and nondiscretionary comma is not their fault. Never having been taught the rules in seventh grade, they don’t even have the vocabulary to cogently discuss our differences, because they don’t know a predicate nominative from a hole in the ground. But I want to be saved from myself, because I suffer from the same misconceptions as anyone else. (I’m still shaky on “may” versus “might.”) I want an expert, a stickler, a real whip-wielding dominatrix. Yet all the terrifying taskmasters bashing me over the head with Strunk and White appear to have died off.
It’s always dangerous to display hubris about one’s proper English, since pedants like nothing more than to catch out other pedants. Fellow curmudgeons will also recognize all of my bugbears as losing battles. Ultimately, the evolution of language is a story of mob rule. But surely there’s a tattered nobility to valiantly fighting wars that we know we can’t win.