Stop Being Reasonable
How We Really Change Our Minds
Formats and Prices
- Hardcover: $26.00 ($33.00 CAD)
- ebook: $14.99 ($19.99 CAD)
- Audiobook Download (Unabridged)
In Stop Being Reasonable, Eleanor Gordon-Smith weaves a narrative that illustrates the limits of human reason.
Excerpt
Introduction
Everything Was Protein Powder and Nothing Hurt
Somewhere in the technological belt of California, where the only thing more precisely engineered than the software is the people—or maybe the people’s teeth—lives an organization called the Center for Applied Rationality.1 For the low price of $3,900, the center will sell you a four-day workshop on reasoning, during which participants eat, sleep, and take part in nine hours of back-to-back activities together daily under one (presumably rationally designed) roof. This year, just like every other year, the center will receive hundreds of applications from people who want to attend because, as they put it, “Everyone I know is irrational, and I want to fix them.”2
These folks make for an easy punch line, a good group to laugh at. But it turns out many of us make a version of the same mistake when we think about persuasion. We think we know what it is to change our minds rationally, and the only question is why other people don’t do it more often. The ideal mind change is calm. It reacts to reasoned argument. It responds to facts, not to our sense of self or the people around us. It resists the siren song of emotion. People like to talk about the public sphere—if there is such a thing, then its convex edge reflects this idealized image back at us. Think of the number of programs dedicated to the mind-changing magic of two sides saying opposite things: Meet the Press, State of the Union, Face the Nation. The branding of these things often bakes in a little reward: how brave I am, for attending the Festival of Dangerous Ideas; how clever, for my subscription to the Intelligence Squared debates. The proper way to reason, at least according to our present ideal, is to discard ego and emotion and step into a kind of disinfected argumentative operating theater where the sealed air-conditioning vents stop any everyday fluff from floating down and infecting the sterilized truth.
Years ago I used to share this view. I’ll tell you why, even though it will rightly make you want to take my lunch money: when I was in school, I was a champion debater, which is another way of saying I spent my weekends wearing a blazer and telling people in precisely timed intervals exactly how wrong they were. My teammates and I constructed arguments for twenty hours a week, putting premises in the crosshairs with the unblinking accuracy of people whose whole egos were on the line. We weren’t bad, either. Eventually, we made it to the world championships in Qatar, where we wore blazers embroidered with the Australian coat of arms in gold and competed in what looked, in hindsight, like a scene in an apocalypse movie just before the purge begins: all of us in matching uniforms on fleets of white buses being shepherded through the desert haze to auditoriums where we would sit locked up together for an hour surrounded by stopwatch-wielding officials. Debating left me with an attitude toward persuasion that was as precise as Euclidean geometry: find the foundation; show why it’s wrong. Buttress analysis with evidence. Emotion is for decorative flourishes only—do not expect it to be load bearing. Of course, I knew you could change minds by appealing to things like emotion or your opponents’ sense of self, but doing that seemed kind of base. It felt nobly sportsmanlike to arm yourself with argument alone. It was the intellectual equivalent of turning up at dawn for your duel with your pistols shined and paces counted: it was how you were meant to fight.
This perspective began to change after I produced a piece for the radio show This American Life in 2016. The idea had seemed simple: turn around to my own catcallers—men who had wolf-whistled or made sexual comments to me on the street—and try to reason them out of doing it again. I spent hours giving these men all the evidence, all the reasoning, all the fancy footwork with premises. But after dozens of conversations, I walked away defeated. Over and over again, they walked away from our conversations as sure as they’d ever been that it was okay to grab, yell at, or follow women on the street.
These men didn’t seem fundamentally irrational or unstuck from reality—in fact, in a funny sort of way, I quite liked a few of them. One told me he modeled his courtship rituals on the animal kingdom: “I’m just another paradise bird, flaunting my shit,” he said triumphantly, as though this explained everything that needed to be explained. That’s a good line. He made me laugh. But I couldn’t change their minds. The experience deflated me not just as a person and as a woman but as someone who had always been optimistic about our ability to talk each other into better beliefs. We finished recording in November 2016, right after the US general election, which set a grim backdrop for a newly found pessimism toward rational debate and persuasion.
But when the piece aired, a strange thing happened. I was inundated with interview requests. Could I write a ten-step guide to changing minds? Would I accept an award for the successful use of rational persuasion in public? What advice did I have for talking people out of being workplace harassers? I was astonished. Large numbers of people had apparently listened to my conversations with catcallers on the radio—conversations that I had walked away from feeling dejected and defeated—and heard instead instances of persuasive success. I think the explanation is that these conversations bore a sort of Madame Tussauds–like resemblance to what we think good mind changes look like. I said one thing, my catcallers said the opposite thing, and each of us tried to explain why we were right. I had stayed calm; they had been prepared to hear me out. I had used statistics. It looked for all the world like a rational debate, and the fact that I had failed to change any minds with this approach disappeared under the shadow of the unquestioned assumption that I deserved congratulations for even trying. I started to smell a rat—a big one that lives in the sewers and never takes a shower.
Everywhere we look we see the gospel that reasoned argument is the currency of persuasion and that the “right” way to change our minds is by entering a sort of gladiatorial contest of ideas where we leave the personal behind. But what if our eagerness to congratulate each other for employing that ideal stops us from asking whether it is worth aspiring to at all?
Another reason I started to look at persuasion differently is that I started working in academic philosophy, where it takes about two minutes to hit a wall of unanswered questions about what reason actually is or what it asks of us. You can’t stay wedded to the importance of reasoned debate when you don’t even know what it is to be reasonable in the first place. Maybe you think it’s simple, that being reasonable just means believing things in proportion to the evidence, but if that was your first thought, then please accept my condolences as you plummet backward down the rabbit hole.
What counts as evidence? Are sensory perceptions evidence? Or feelings, like empathy? If not, what licenses your belief that other people’s suffering matters? When is there enough evidence to believe something? Do different beliefs require different amounts of evidence, and, if so, what sets them? Could anything else have a bearing on what we should believe, like the costs of error? And what sort of “should” are we using when we ask, “What should we believe?” Are we aiming at truth, or at morality, or are they in some way the same goal? Are the standards for believing mathematical or scientific truths different from moral or interpersonal ones, or is there no distinction? What’s the responsible way to respond to the news that someone as intelligent as you, in possession of as much evidence as you, believes a different conclusion? Should you downgrade your confidence in your own view? If so, why?
There is a bigger question underneath it all: When is it ever possible to know anything? More than a millennium before Descartes wondered what it was possible to know, the Greek philosopher Sextus Empiricus had already codified skepticism by answering, “Nothing.” This is a possibility so genuinely frightening that people prefer to parse it as a silly thought experiment about whether we’re in The Matrix than to engage with the awful specter it raises. One philosopher who took that specter seriously was Stanley Cavell, who spent years trying to answer the skeptic’s challenge and whose work is so captivating to a certain sort of reader that for years, two big East Coast university libraries in the United States refused to restock his books. There was no point—students just did not bring them back. “How do we stop?” Cavell wrote. “How do we learn that what we need is not more knowledge but the willingness to forgo knowing?”3
Over millennia these questions about what to believe, when, and why have pinballed back and forth between the most blisteringly intelligent people of their day, and still nobody has the settled answers. Not all that long ago, on philosophy’s timescale, Pittsburgh philosopher John McDowell wrote his seminal work Mind and World, which wonders among other things what sort of thing rationality could be. Reviewing it, Rutgers-by-way-of-MIT professor Jerry Fodor wrote that “we’re very close to the edge of what we know how to talk about at all sensibly.”4
I do not have the poetic instincts or vocabulary to be able to describe, against that backdrop, what hair-tearing frustration it is to see the concept of “rationality” bandied about in public without any acknowledgment of the longevity and complexity of these questions. Instead, pundits who take themselves to be the chief executors of rationality simply assert things about what it is to be reasonable by taking as bedrock the very things that still remain to be proved.
You will see people speak as though “being reasonable” is just being unemotional, as British member of Parliament Michael Gove did when he appeared on Good Morning Britain after the fire in Grenfell Tower, speaking in tones usually reserved for children who’ve had too much sugar. “We can put victims first [by] responding with calmness,” he told Piers Morgan. “It doesn’t help anyone, understandable though it is, to let emotion cloud reason.… And you, [Piers], have a responsibility to look coolly at this situation.”
The idea seems to be that being emotional precludes being rational and that we can select only one way of being at a time. Antonin Scalia, the former US Supreme Court justice, argued along these lines when he wrote, “Good judges pride themselves on the rationality of their rulings and the suppression of their personal proclivities, including most especially their emotions.… [O]vert appeal to emotion is likely to be regarded as an insult. (‘What does this lawyer think I am, an impressionable juror?’)”5 Where did we get this confidence that emotion has no place in reasoning? Who says emotions can’t themselves be rational? Many philosophers think they can be: at minimum, it’s an open question.
Or you will see people speak as though “being rational” is just the task of getting your behavior to match your goals, an inheritance from an economic model of the rational consumer. You want to save more money? Here’s an app. You want to exercise more? Here’s a morning routine. Geoff Sayre-McCord, an impossibly genial philosopher at the University of North Carolina at Chapel Hill who collects motorcycles and writes about belief, told a story at a Princeton workshop on the ethics of belief that illustrates how slippery this idea of “rationality” can be. An economist friend of his gives a speech at a conference, espousing the idea that “rationality” governs the space between our goals and how we act, but has nothing much to say about which goals we should pursue. Hours later, at the pub, the economist laments his teenage son’s failure to set the “proper” goals: “All he does is play drums and drive around with his friends! He doesn’t study; he doesn’t think about his future. He’s being so irrational.”
Or worse, you will see people speak as though being reasonable demands waiting perpetually for more evidence, as they do during any publicized allegation of sexual assault. “Innocent until proven guilty!” we hear, as dozens of women come forward. “We have to wait for the evidence!” If testimony doesn’t count as evidence, what does? Why is it that for any other criminal act, testimony and the balance of probability make it rational to believe “that guy did it,” but in this case they do not?
At the same time as all of these apparent confidences about what it means to be reasonable, we also seem firmly convinced that there is one issue that rationality cannot decide: the question of what morality is and what it demands of us. I cannot count the number of times I have told a dentist or train-seat companion that I work on rationality and ethics, only to hear the confident reply, “But morality’s just a matter of opinion!” That may well be true, but it also may not be: the best and brightest philosophical minds disagree over whether moral truths exist or can be reached by logic, yet everywhere I go I see it taken as obvious that moral questions fall outside the jurisdiction of reason—every time I teach Ethics 101, my students come into the class repeating the demure refrain “Everyone’s entitled to their opinion.”
Here’s my point: in our haste to congratulate ourselves for being reasonable, we accidentally untied the very notion of rationality from its rich philosophical ancestry and from the complexity of actual human minds, and now the idea of being reasonable that underpins our public discourse has little to do with helping us find our way back to the truth, or to each other, and altogether more to do with selling us $3,900 workshops and an anesthetized dream of an optimized future where everything is protein powder and nothing hurts.
The strange thing is that most of us are already suspicious of this image of “rational debate.” How many times have you seen a TV panel discussion in which the defender of one view turned to their opponent and said, “You know, actually, that’s a pretty good point”? Ever? And when you have changed your mind about something close to you, was it because of a rational argument, or was the process something stranger and more difficult to map, like a subterranean rumble you weren’t aware of until it was over, or a single moment in which the old facts cast a new shadow? Most of us learned long ago that changing our minds about something that matters—whether we were right to act the way we did, whether to believe what we are being told, whether we are in love—is far messier than any topiaried argument will allow. Those spaces aren’t debates. They are moments between people—messy, flawed, baggage-carrying people—and our words have to navigate a space where old hurts and concealed fears and calcified beliefs hang stretched between us like spun sugar, catching the light for only a second or two before floating out of view again.
In that space, reason and logic don’t work the way we’re trained to think they will. Language doesn’t even work. Where once stood your helpful little army of words marshaling themselves into formations that express yourself and your point, there now stands a mob of rogue mercenaries inflicting damage in ways you can’t even understand, let alone apologize for. You make the right phonetic noises for something like “I’m sorry,” but what appears in the common ground is closer to “You’re overreacting” or “I had a really good excuse,” and nothing seems further away than reason when your little “I’m sorry” looks to the other person like a weird molten Frankenstein of insult and shame. Wittgenstein said that if a lion could speak, we wouldn’t understand him. Sometimes I think it’s not just lions.
So why, when we know that changing our minds is as tangled and difficult and messy as we are, do we stay so wedded to the thought that rational debate is the best way to go about it? Why do we hold our ideal of rationality fixed and try to mold ourselves around it, instead of the other way around? Why do we still think the important question is a psychological one about how we do change our minds, instead of a philosophical one about how we should?
I now think that our typical views about how we should change minds are not just wrong but bad for all of us, and my hope is that this book will go some way to showing why. You are about to read a series of true stories about people who changed their minds while under the kind of pressure that gives a person the bends. Many of them tell their stories here for the first time.
Some of them are stories of revelation, like the moment when Susie discovered her husband had been telling a criminal lie since he was twelve years old and began to fear for herself and her young child, or when Peter opened his ailing mother’s mail for her and discovered she wasn’t who he’d thought she was, or when Dylan quit the strict apocalypse-heralding religious sect he’d been raised in after more than twenty years as a believer. Others are stories about not knowing what to believe, like the shifting cognitive sands Upper-Class English Gentleman Alex found himself in when he finished a stint on a reality TV program that had trained him as a London bouncer, only to realize he was no longer entirely sure which of those two identities he’d been faking. Or there’s the former navy pilot Nicole, who has spent years mired in confusion after alleging as a six-year-old that her mother had abused her, then reading an exposé about her own case many years later that argued the abuse may never have occurred.
Each of these people did something in their mind change that defies the public orthodoxies about how we “ought” to reason. These are not stories of neat deliberation. These people were influenced by the sorts of things our rational pundits tell us to check at the door: their sense of self, or what they’d been told, or how they felt, or the costs of uncertainty, or who (not what) they believed, or even whom they loved. These stories show us, in vivid detail, that sometimes it can be perfectly rational to change our minds on the basis of these things.
Part of why I tell these stories is because I hope they can give us a blueprint for our own most difficult persuasive projects.
When we set out to change people’s views, it’s easy to forget just how resistant minds are to changing. It’s easy to feel, as I did when I spoke to my catcallers, like we’ve spent a whole lot of conversational energy ticking all the boxes in the rational persuasion manual and been rewarded with nothing but a frustrating stalemate. It is a uniquely teeth-grinding moment: not just to fail to persuade, but to have no idea what went wrong. My hope is that in telling these stories here, we can learn to better diagnose these moments and, more optimistically, to see what happens when things do click. The people in these stories pulled off massive mind changes using altogether human tools of reasoning like trust, and credibility, and their sense of self, and their emotions, and their ways of avoiding the shame of having to reckon with the fact that they were wrong. If we can understand those tools a little better, and see them as rational as well as practical, then we may be able to use them in changes of mind we most want to accomplish.
That goes for our own changes of mind, not just other people’s. Though few of us will find ourselves in situations like the ones in this book, we very often find ourselves asking the questions these people had to: Who am I really? What if I’ve been wrong? Who am I without this belief? What should I think? Often it’s not at all clear how we get out of these tangles. I used to very seriously believe in God, for instance. I have the physical memory of carpet pressing into my bare knees at school as I prayed for the soul of Nguyen Tuong Van, who was due to be executed in Singapore that day for drug trafficking. But just as vivid as that memory is my certainty that I didn’t change my mind because of a rational argument. I just noticed one day that it had been a while since I’d believed in God. And often we could use a more definitive way out of those tangles: I also used to believe I was in love with someone who splintered furniture when I spoke to other men, and I’d have given two of my toes and probably an eyeball for the syllogism that could have changed my mind about that. But it doesn’t exist. Uprooting these foundational beliefs and working out what to do with the clods of selfhood they shook off on the way out was just too messy a job for a tidy argument. If we can find an ideal of rationality that doesn’t berate us for that messiness, we may find it easier to change our own minds when it matters most.
And partly I tell these stories because if our ideal of rational persuasion turns out to be wrong, we had better stop constructing our public discourse around it, and soon. We broadcast opinions grosser than the stuff they scrape out of clogged sewage pipes because we are so confident that rational debate can work, and we do this despite fairly good evidence that just entertaining a thought makes people more likely to believe it. Producers, newspapers, and event managers give airtime to racist, sexist, science-denying speakers by saying we need to Challenge These People, but seldom do those same producers and event managers seem to have researched how to make those challenges successful or sought advice from professionals who have spent years embedded with these ideologies studying how, if ever, they change. We risk all the harms that come from giving these people a platform simply because we’re so confident that we’re the ones who know how persuasion should work. Everyone I know is irrational, and I want to fix them.
It strikes me as a colossal tragedy that in deference to our rusted-out idea of persuasion, we structured our public discourse around regimented debates, leaving persuasive strategies built on how we actually
"I knew how piercingly smart Eleanor Gordon-Smith is, and what a curious and resolute interviewer. But I was unprepared for how entertainingly she writes! I read this with pleasure."
—Ira Glass, host of This American Life
- On Sale: Oct 22, 2019
- Page Count: 240 pages
- Publisher: PublicAffairs
- ISBN-13: 9781541730441