The Years of Talking Dangerously
The pieces collected here are snapshots of the language during the final years of the Bush era. Of course labeling a stretch of time as an “era” tends to make it seem tidier and more coherent than it actually was. Islamo-fascism and intrapreneur, ownership and under the bus, marriage and macaca—what exactly do they have in common, other than that they were all airing in the same season? You think of a TV commercial for one of those Hits of the 1970s compilation CDs, with a montage of images of beanbag chairs, the fall of Saigon, streakers, Kent State, Bruce Lee, and Saturday Night Fever flashing across the screen against a soundtrack by Gloria Gaynor, Kenny Rogers, and The Clash.
That sense of disconnected jumble is inescapable when you try to pin down “the state of the language.” At any given time there are a lot of different things going on that leave their marks on the way we speak, each moshing to its own rhythm. There are changes in technology, which have not only added perky new words like blogosphere, splog, and twitter (a clear improvement on older, clunkier neologisms formed with prefixes like virtual and cyber) but pioneered new frontiers in English orthography, a glimpse of the way we’ll all write when the st8 hz withrd awA. (See “All Thumbs.”) There are the subtle shifts in sensibility that would be hard to pin down if the language didn’t give them away. Snarky was doing quiet duty as British slang for “ill-tempered” until a decade or so ago, when journalists and bloggers repurposed it as a vogue word for the territory between bitchy and cheeky. Whatever exactly precipitated the shift, it was the same spirit that led people to start using um to introduce sarcastic corrections of other people’s foolish mistakes, as in “Um . . . you might try plugging it in first” (“Pause for Thought”).
Some innovations seem to answer only to the caprices of fashion. I can come up with a story about why throw so-and-so under the bus suddenly caught on to describe a self-serving betrayal (“Under the Bus”). But I suspect it has the after-the-fact feel of most explanations of fads and vogues. (In retrospect, it seems historically inevitable that the 1970s should have seen the rise of the four-inch platform shoe, but it would probably have seemed just as inevitable if the decade had wound up shuffling around in flats.)
Then there are the evanescent items that linger for only as long as the news stories they were connected to. Macaca made its first and only appearance in the language in 2006, when Senator George Allen used it to refer to an Indian American at a campaign rally; the ensuing ruckus probably cost Allen a close election and tipped the Senate to the Democrats. The public’s interest in persistent vegetative state expired with Terri Schiavo. And dwarf planet swam into our ken for a month or so that same year when astronomers voted to strip Pluto of its full planetary citizenship (“Last Planet Standing”).
The Pluto affair showed how purely semantic squabbles could penetrate even the austere reaches of the hard sciences. In the end, of course, it makes no difference to astronomy whether Pluto is classified as a planet or a “trans-Neptunian object,” though apparently it mattered quite a bit to astronomers. And the planet’s demotion was embarrassingly ad hoc, since the new definition of a planet is to apply only to objects in our own solar system (you could think of it as an example of what legal scholars call “result-oriented jurisprudence,” like the Supreme Court’s decision in Bush v. Gore). But then, as the New York Times’ Clyde Haberman pointed out, most redefinitions are made for the sake of convenience, not to change anything in the world. The New York City Transit Authority’s adoption of a more elastic definition of on time didn’t change how long subway riders had to wait on the platform. The New York Knicks’ redefinition of sellout didn’t make the empty seats in Madison Square Garden any less visible. It’s all semantics—along with label and rhetoric, a word that naturally attracts dismissive modifiers like just and mere (“A Duck by Any Other Name”).
But that dismissal of “mere semantics” sits uneasily with the importance that people attach to names and labels and the lengths they’ll go to in bending definitions to their purposes. Take the administration’s reaction to charges of prisoner abuse at Guantanamo and elsewhere. Americans don’t torture, the president insisted indignantly, but the Justice Department was obliged to qualify the assertion with an intricately argued footnote explicating the fine distinction between out-and-out torture and what Bush and Cheney described as “enhanced interrogation techniques” (“The Language of Abuse”). It was an uncharacteristically crude Orwellism for an administration that usually managed language more deftly. That tin ear was also evident when the White House press secretary Tony Snow proposed sectarian violence aimed at expressing differences as an alternative to civil war to describe the Sunni-Shia conflict in Iraq.
The fact is that it’s impossible to talk about social or political values without wading knee-deep into questions about what ought to count as what. It’s hard to think of an important recent story or controversy that hasn’t forced us to reexamine the meanings of familiar words that we used to take for granted.
Sometimes the ethical issues surrounding a story are crystallized in an incidental semantic dilemma. Looking at the images from Katrina, we were forced to confront the subtleties of looting: was it right to use the same verb for carrying off a flat-screen TV from a Best Buy and taking diapers from a convenience store, particularly when the authorities had made such a mess of the relief efforts (“When Words Break Down”)? And sometimes our views on an issue are inextricably connected to what we decide to call things. In surveys, people are a lot more likely to support laws that allow doctors to help “end the lives” of terminally ill patients who request it than to “help them commit suicide.” Opponents of the laws criticize phrases like aid in dying as deceptive euphemisms, but supporters argue that suicide isn’t really appropriate in this case, pointing out that the media and authorities avoided using the word for the people who jumped from the World Trade Center (“The Language of Death”). The debate may be semantic, but it’s anything but “mere”—until you’ve decided what to call things, you don’t know where they fit in the moral scheme of the universe.
But when I chose the title of this book, I wasn’t thinking of the way people talked about planets, blogs, or even suicide. Prediction is a mug’s game, but my guess is that when people look back on the language of the early years of the twenty-first century, the first thing that will come to mind is the political vocabulary—well, that and the language of real estate—just as the sixties evoke the language of rock, drugs, and disaffection; the seventies evoke the language of disco and New Age; the eighties evoke management jargon and Valley Girl slang; and the nineties evoke techno-talk and fit-speak. That’s partly because the era has been more rancorously political than any period since the sixties, but it’s also because people have given so much attention to political language itself.
At the outset, the Bush administration and its allies seemed to conduct symbolic politics more adroitly than anyone since the early years of the New Deal. They didn’t simply package their messages succinctly and memorably; they coordinated the language of all the quarters of the right, from the think tanks to Congress to the radio and cable talk shows, and managed to cajole or intimidate much of the media into going along. The efforts seemed to be so successful that by the 2004 election, the Republicans’ dominance of the political discourse had become something of an idée fixe among the Democrats, many of whom blamed the party’s electoral eclipse on its problems with framing, messaging, and branding. How could the Republicans persuade so many voters to ignore their own best interests, if not by handing them a snappy line of patter?
But our faith in the power of words can be overdone. Not that language doesn’t cast a long shadow on truth, as Auden put it—embellishing, disguising, palliating, making things sound grander or baser than they really are. But language can’t keep reality at a distance indefinitely: when the dissonance becomes too great, something has to give. And by the end of the Bush presidency, the administration’s language lay in tatters, emblematic of its substantive failures. Bush himself conceded as much in an interview just after the 2008 election, when he said he regretted having said things like “wanted dead or alive” and “bring ’em on” and appearing beneath the “Mission Accomplished” banner on the USS Abraham Lincoln in May 2003. “I was trying to convey a message,” Bush said. “I could have conveyed it more artfully.”
But it wasn’t artlessness that undid the language; it was the stubborn disparity between words and things. By the 2006 midterm elections, as the public was losing patience with the Iraq war, the administration was already issuing slogan recalls. “We’ve never been stay the course,” Bush told George Stephanopoulos just before the election, as if he could unsay hundreds of repetitions of the phrase. The White House floated Islamo-fascism around the same time in an effort to equate the war on terror to the last great war against absolute evil, but no one but neoconservatives found the analogy compelling, and within a few months the phrase had disappeared from Bush’s and Cheney’s speeches (“Islamo-Creeps Would Be More Accurate”). The language of the administration’s domestic policy fared no better: the ownership society was quietly dropped after the failure of the administration’s Social Security privatization plan, and Clear Skies was dropped even earlier (“Even in English It’s Hard to Translate”).
Of course those catchphrases were bound to lose their power sooner or later. Once a euphemism becomes threadbare, either it’s abandoned, like most of the Bush-era slogans, or it loses its euphemistic character and becomes a neutral name. Recession, welfare, and affirmative action began their lives as euphemisms, and Social Security was disparaged as a “glittering title” when the program was first proposed in 1936. Or consider the recent fortunes of home equity loan. Banks introduced the term in the early 1980s as a consumer-friendly name for second mortgages, which were associated with finance companies that preyed on homeowners desperate for funds. But the connotations of prudent husbandry that home equity loan was supposed to evoke were quickly forgotten once the proportions of the mortgage crisis became evident in 2007.
By the end of the Bush era, though, one could sense a sea change more significant than the eclipse of a few catchphrases. The 2008 election demonstrated the enervation of the language with which the right had been prosecuting the culture wars since the Nixon years. Not that the Republicans weren’t still trying. Country first, palling around with terrorists, raising the white flag of surrender, the pro-America areas of this great nation, and finally, a little desperately, socialist—no campaign in recent memory has worked the language of patriotism and cultural populism as energetically and, in the end, as fruitlessly as McCain and Palin did. Not even the combined efforts of Joe the Plumber and the Joe Sixpack-identified vice presidential candidate could stanch the defections among the middle-American voters that Republicans had been tenaciously cultivating since the Nixon era (“Just a Thing Called Joe,” “The Ism Dismalest of All”).
Of course you could argue that economic issues simply trumped cultural ones in this election or that these Republican candidates weren’t the right messengers for charging Obama with being unpatriotic and subversive: Republicans suspected that McCain didn’t really believe that charge, and everybody else was scared that Palin really did. But whatever happens in the next few years, it’s hard to imagine the Republicans riding back into Pennsylvania and Ohio with the same bumper stickers the next time around. Their language suffers from a kind of structure fatigue, brought on by the strain of spanning the increasing distance between its literal and symbolic meanings.
Take the epithet elite, which worked so well for cultural populists in the past, precisely because it managed to marry its original sense of power and wealth with implications of upper-crust snootiness (that’s what made John Kerry so vulnerable to the charge). Once the word becomes simply a marker of attitude—once its members are defined, as Laura Ingraham says, “not so much by class or wealth or position as they are by a general outlook”—then it becomes just another way of calling somebody stuck up. You could hear that shift at the moment in the campaign when Lady Lynn Forester de Rothschild, an American multimillionaire who married an heir to the Rothschild banking fortune, announced on CNN that she was switching her support from Hillary Clinton to John McCain and charged that Barack Obama couldn’t connect with ordinary Americans because he was an elitist—and that so, by the way, was Wolf Blitzer. The remark was greeted with the derision it merited (she should ask her husband to explain to her the concept of chutzpah), but she really wasn’t taking any more liberties with the word than upper-middle-class Ivy Leaguers like Ingraham and Ann Coulter, who are only a foreign preposition and a zero or two short of de Rothschild’s station. Indeed, you get a sense of how condescending this ostensibly “classless” use of elite is when you realize that practically the only people who use the word that way are the ones who would qualify as elite in its traditional sense (“Where the Elites Meet”).
That isn’t to say the right is about to retire the vocabulary of populism. Some of it will hang around for a long time in the form of what linguists call a hearth language, the stage a dying language passes through when it is used only by familiars around the kitchen table (or what passes for the kitchen table in the age of blogs). But the language can be effective for Republicans only if it’s reinvigorated with some genuine economic content. McCain and Palin acknowledged as much toward the end of the 2008 campaign with their attacks on “predatory lenders” and “greed and corruption on Wall Street.” But as the Bush administration learned with ownership society and the rest, voters won’t respond to that language unless they perceive that the Republicans have a second shoe to drop.
But as it happens the Democrats are facing something of the same linguistic challenge, even if it’s a little less urgent. The striking thing about the collapse of the language of the right is that it was achieved without the Democrats making any real effort to neutralize it or replace it with some new language of their own. The Obama campaign made the perfunctory noises about patriotism, but there was nothing like the effort to co-opt the right’s language that the Democratic campaign had made in 2004, which was launched with Kerry’s “reporting for duty” and billed itself “a celebration of American values.” That was probably wise—words like values and traditional can’t simply be shorn of forty years of accumulated associations to be appropriated for the other side. And Obama did just fine confining himself to soaring evocations of hope and change. But while that has obviously been effective for him, you didn’t hear the stirrings of any new tunes that other Democrats could sing along with. The strength of the right’s language of cultural populism, after all, is that it could be effective even for singularly uncharismatic figures like Jon Kyl and Mitch McConnell.
It will take a while for the new languages of left and right to emerge—from very early on, it’s looking like the candidates for 2009’s word of the year will be something down-to-earth like shovel-ready or workout (of your mortgage, I mean, not of your abs). It’s hard to imagine what the language of politics will sound like a decade from now, on both sides. But it’s a good bet it will change more in that period than it has in the last quarter century. It won’t be long before the language of the last few years sounds very retro indeed.
One lesson of these exercises is that there’s no place you can’t get to when you take language as your starting point—which is to say that you’re almost sure to wind up in unfamiliar territory. So I’m thankful to all the guides I could turn to for help, particularly Leo Braudy, Rachel Brownstein, Paul Duguid, Kathleen Miller, Barbara Nunberg, Scott Parker, Tom Wasow, my co-bloggers at Language Log, and especially Bob Newsom. I’m grateful as always to Phyllis Myers, the Fresh Air producer who has worked with me companionably and patiently over the years, almost always under a deadline I’ve cut too close, and to Terry Gross, Danny Miller, and the others who created and sustained the program that I have been fortunate to be associated with since it was launched more than twenty years ago. (Hitch your wagon to a star, people tell you, but really, who knew?) Thanks, too, to my agent, Joe Spieler, and to Clive Priddle, my editor at PublicAffairs. And thanks, once more, to Sophie Nunberg, who has progressed over the years from being a source of material to being a source of advice.
These pieces appear here pretty much as they did on the radio or in the press, though I’ve taken the opportunity to make some edits, correct a few factual errors, and include material that had to be cut in the original versions (or on a couple of occasions, bleeped—rest assured that I didn’t say those words on the radio). In pieces that appeared in the New York Times, I’ve also eliminated some peculiarities of Times style, like the insistent use of “Mr.” and “Ms.”; it seemed odd to refer to the president as “Mr. Bush” on one page and simply as “Bush” on another. And with pieces that originally appeared in the press, I’ve occasionally restored my original title when I felt the title given by the editors was uninspired. But I resisted the temptation to revise pieces to make me look more prescient in retrospect.
WATCHING OUR WORDS
Fresh Air Commentary, June 20, 2005
A few years ago, Ruth Lilly, the heir to a pharmaceutical fortune, left a $100 million bequest to Poetry magazine. Armed with what have to be the deepest coffers of any literary publication in history, the foundation established by the magazine’s publishers recently joined with the National Endowment for the Arts to hold the first of a series of recitation competitions patterned after the national Spelling Bee.
I have to say I’m a little uneasy about that model. The national Spelling Bee is one of those odd competitions that turn an ordinary activity into a high-performance event, like extreme ironing. And when you think of poetry recitation contests, you might have the image of overachiever kids declaiming “The Boy Stood on the Burning Deck” with appropriate gestures, while their parents and elocution coaches watch nervously from the audience.
But that’s probably unfair. You have to welcome any program that might encourage more learning of poetry by heart, after a half century you could think of as the Great Forgetting. “In Xanadu did Kubla Khan . . . ”; “I wandered lonely as a cloud”; “The Assyrian came down like a wolf on the fold”—nowadays high-school graduates don’t recognize any of those lines, or the hundreds of others that used to paper the walls of the collective memory.
Only a few scraps remain. Students may know the first stanza of “The Highwayman,” which comes in handy for teaching about metaphor. (“Is the poet saying that the moon was really a ghostly galleon?”) They probably know Shelley’s “Ozymandias,” which makes for a good lesson about irony, not to mention the futility of big government. And they almost certainly know a bit of “Stopping by Woods on a Snowy Evening,” which is pretty much the last poem left in the American literary canon—well, that, and “Casey at the Bat.”
That obliteration was already well under way when I was in grade school, and I was spared some of its ravages only because I picked up the habit of memorizing poetry from my dad, who liked to recite to me when I was little—a mix of patriotic ballads like “Barbara Frietchie” and light verse by Don Marquis and the sadly forgotten Arthur Guiterman. I’ve tried to pass on some of these to my daughter, Sophie. She does an impressive job with the beginning of “The Cremation of Sam McGee,” though she usually loses the track just after “mushing our way over the Dawson Trail.”
But that’s normal. Unless you’re one of those freaks of nature who can soak this stuff up effortlessly, most of what you’ve got left of the poems you’ve learned is only snips and snatches—“My heart aches, and a something something pains my sense”; “I will arise and go now, and go to whatchamacallit”; “Ta tum ta tum, your mum and dad / They may not mean to but they do.” Yet the odd thing is that once you’ve memorized a poem you still own it, even after you’ve forgotten most of the words and have to google it up the same way everyone else does.
That’s reason enough for learning poems by heart, and there’s no need to sully the case for memorization by claiming that it’s good for mental discipline or cognitive development. Memorizing poetry does seem to make people a bit better at memorizing poetry, but there’s no evidence that the skill carries over to other tasks.
For that matter, it’s doubtful whether memorization makes you a better writer, either. Robert Pinsky once suggested that “anyone who has memorized a lot of poetry . . . [can’t] fail to write coherent sentences and paragraphs.” There’s probably some truth to that nowadays, since the only people who know a lot of poetry by heart are the ones who were drawn to it out of a love of language. But the Victorian schoolchildren who learned reams of verse at the end of their teachers’ canes grew up to write an awful lot of bad prose, most of it happily lost to literary memory.
It’s misguided to wax nostalgic for a time when students were required to memorize sentimental ballads and patriotic rousers in the name of character building, and when kids who misbehaved were given twenty lines of poetry to learn as punishment. Memorization back then was a kind of conscription—the whole world learned to nod to a four-beat singsong: “The BOY stood ON the BURNing DECK / Whence ALL but HE had FLED.”
The progressive educators of the twentieth century were right to want to sweep all that away. But they were wrong to dismiss memorization as mindless rote learning, as if the sounds alone communicated nothing by themselves. If you think you can understand poems without feeling them in your body, you’re apt to treat them as no more than decorative op-ed pieces—you wind up teaching kids to value “The Road Not Taken” as merely a piece of sage advice about making difficult decisions.
I was about seven or eight years old when I learned Burns’s “Scots wha’ hae’ wi’ Wallace bled” from my dad. I had absolutely no idea what the poem was about or even what half the words meant—“Let him on wi’ me”? But I learned something else—how verse can become a physical presence, in Robert Pinsky’s words, which “operates at the borderland of body and mind.”
That’s an experience that you can only live fully when the poem comes from within rather than from the page in front of you. I like the way the Victorianist Catherine Robson put it: “When we don’t learn by heart, the heart does not feel the rhythms of poetry as echoes of its own incessant beat.”
Fresh Air Commentary, June 14, 2005
The Washington DC Court of Appeals will be ruling soon on a case involving a petition brought to cancel the trademark of the Washington Redskins on the grounds that the trademark law forbids the registration of marks that are disparaging. As it happens, I served pro bono as the linguistics expert for the seven Indians who brought the petition and wrote a report documenting the word’s long history as an epithet, often a very nasty one.
One thing you won’t find in that report, though, is a story that you often hear nowadays about where redskin comes from. As some people have been telling it, the word originally referred not to skin color, but to the bloody Indian scalps that whites paid bounties for. It’s true that there’s no way to tell for sure, since the origins of the word are lost in the late seventeenth century. But as best I can tell there’s no historical record that connects redskin to the bounties for scalps, and in fact nobody seems to have mentioned the connection until about a dozen years ago. So it’s almost certain that the word was originally a reference to skin color—after all, people refer to Indians as the red man, too, and that couldn’t have anything to do with scalps. Not that Indians are really red, any more than people of other races are really white or black or yellow. But that Crayola theory of racial groupings runs very deep in our culture, and when kindergartners sit down at the play table, those are the crayons they reach for.
In a way, that story about redskin