Why We're Wrong About Nearly Everything

A Theory of Human Misunderstanding


By Bobby Duffy


A leading social researcher explains why humans so consistently misunderstand the outside world

How often are women harassed? What percentage of the population are immigrants? How bad is unemployment? These questions are important, but most of us get the answers wrong. Research shows that people often wildly misunderstand the state of the world, regardless of age, sex, or education. And though the internet brings us unprecedented access to information, there’s little evidence we’re any better informed because of it.

We may blame cognitive bias or fake news, but neither tells the complete story. In Why We’re Wrong About Nearly Everything, Bobby Duffy draws on his research into public perception across more than forty countries, offering a sweeping account of the stubborn problem of human delusion: how society breeds it, why it will never go away, and what our misperceptions say about what we really believe.

We won’t always know the facts, but they still matter. Why We’re Wrong About Nearly Everything is mandatory reading for anyone interested in making humankind a little bit smarter.

Excerpt




INTRODUCTION

PERILS EVERYWHERE

I hated my psychology classes at college. As I remember them now, they were taught by a succession of super-smart, suave professors who all looked the same, closer to snake-hipped rock stars than fusty academics. They were all tall and slim, with haircuts that didn’t play by professorial rules. They wore all-black clothes or, rarely, paisley shirts, and shoes that were just that bit too pointy. (I admit, jealousy may be clouding my own perceptions a little; in fact, I think I’ve just described Russell Brand.) The students, of both genders, swooned—partly because of the professors’ rebellious looks, partly because they seemed to know so much about how we thought. There’s nothing more attractive to most confused young adults than someone who really understands them.

But I had a problem with that. I hated the cognitive tricks that demonstrated we nearly all fall into the same mistaken ways of thinking. They’d set us up with questions or experiments that were custom made to elicit a particular answer and show how typical our brains were. At that insecure but arrogant age, I wanted to be special and unpredictable—but my answers were just like everybody else’s.

Take this example, from a professor at the University of Maryland:

You have the opportunity to earn some extra credit on your final grade. Select whether you want two points or six points added onto your final paper grade. But there’s a small catch: if more than 10 per cent of the class selects six points, then no one gets any points, not even the people who chose two points.1

Here is a very direct and teachable moment, a lesson in the ‘tragedy of the commons’—where individuals try to obtain the greatest benefit from a particular resource, taking more than their equal or sustainable share, and therefore ruin it for everyone, including themselves. Of course, the class conformed to type, and failed. Around 20 per cent selected six points, so they all got nothing. In fact, only one class in one semester over the eight years the professor had been conducting his mildly cruel experiment had actually managed to get the extra points.
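
To make the rule of the experiment concrete, here is a minimal sketch in Python (my own illustration, not the professor’s actual grading code, and the function name is hypothetical): everyone keeps the points they asked for only if no more than 10 per cent of the class asks for six.

    def award_extra_credit(choices, threshold=0.10):
        # Sketch of the extra-credit rule described above (illustrative only).
        # choices: list of requested points per student, each 2 or 6.
        # If the share asking for 6 exceeds the threshold, nobody gets anything.
        share_asking_six = sum(1 for c in choices if c == 6) / len(choices)
        if share_asking_six > threshold:
            return [0] * len(choices)  # the commons collapses: everyone gets nothing
        return list(choices)           # otherwise everyone gets what they asked for

    # A class of 100 in which 20 per cent ask for six points, as in the example above:
    class_choices = [6] * 20 + [2] * 80
    print(award_extra_credit(class_choices))  # one hundred zeros, even for the modest askers

Run it with ten or fewer students asking for six and everyone keeps their points; the cliff edge at the 10 per cent threshold is the whole lesson.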

Given my lingering sensitivity to psychological tricks, it’s not without irony that a lot of my working life has been focused on running similar tests. I spent twenty years at opinion research firm Ipsos MORI, designing and dissecting research from around the world to help understand what people think and do, and why. I’m now a (definitely not snake-hipped) professor at King’s College London, focusing on the same challenges of public delusion, and what they mean for public policy. Across these roles, I’ve run hundreds of surveys on public misperceptions—what we call the ‘Perils of Perception’—investigating a range of social and political issues, from sexual behaviour to personal finance, across a large number of countries. We now have over 100,000 interviews, across forty countries on some questions, allowing us to weigh up our perceptions against reality. This is a unique and fascinating source of data on how we see the world, and why we’re often so wrong about it. Previous work has tended to focus on one issue or sphere of life, and few get beyond a handful of countries. You can dig into the full set of Ipsos studies at www.perils.ipsos.com

Across all the studies and in every country, people get a lot wrong on nearly every subject we’ve covered, including immigration levels, teen pregnancy, crime rates, obesity, trends in global poverty, and how many of us are on Facebook. But the key question is ‘Why?’

Let’s start off with a question that’s got very little to do with the sort of social and political realities we’ll look at later, but helps to highlight why there might be this gap between perceptions and reality: ‘Is the Great Wall of China visible from outer space?’ What do you think? If you’re anything like the population in general, there’s about a 50–50 chance that you answered ‘yes’, as surveys show that half of people say they believe the Great Wall is visible from space.2 They’re wrong—it’s not.

At its widest, the Great Wall is only nine metres across, about the size of a small house. It’s also built of rock that is similar in colour to the surrounding mountains, so it blends in with the landscape. When you take a bit of time to think about it, the idea that the Great Wall is visible from space is actually slightly ridiculous, but there are some very good reasons why you might have thought it is.

First, it’s not something you’ll have pondered on a lot. Unlike me, you probably haven’t looked up the width of the Wall or its distance from outer space (and then got caught up in endless forum discussions about the claim). You don’t have the pertinent facts readily available to you.

Second, you may have vaguely heard someone say it when you weren’t paying much attention. You may even have seen it in print or heard it on the television. For years, Trivial Pursuit had it as an (incorrect) answer. You’re less likely to have seen it in Chinese school textbooks, but it’s still noted as a fact in those. However, you’ve likely seen it somewhere, probably more than once, and haven’t seen anything to contradict the assertion, so it settled in your head.

Third, you almost certainly answered the question quickly, wanting to get on with the rest of the book—the sort of ‘fast thinking’ popularised by the Nobel Prize–winning behavioural scientist Daniel Kahneman that relies on mental shortcuts. You may therefore have confused different measures of scale. We know that the Great Wall of China is extremely ‘big’—in fact, it’s one of the largest man-made structures on earth. But that is mainly due to its length, which isn’t the property that will make it visible from outer space.

Most important, your answer was also perhaps more emotional than you might think for such a mundane trivia question. Spend some time researching the answer, and you’ll discover that even astronauts argue over it. (For the record, Neil Armstrong says it’s not visible, which is good enough for me.) You’ll even find photos from seemingly reliable sources purporting to show the Great Wall as seen from space. (In at least one case, the photo was of a canal.) With something as big as the Great Wall, we want to believe that astronauts, aliens, even gods, can see our handiwork. We want it to be true because it’s impressive—and this emotional response alters our perception of reality.

Drawing on faulty prior knowledge, answering a different question than the one we are asked, juggling comparisons across different scales, relying on fast thinking, and missing how our emotions shape what we see and think are just some of the perils of perception we face every day. The Great Wall of China is a real, physical thing, an object that can be measured. Imagine now how the same problems of perception wreak havoc when we are contemplating complex and disputed social and political realities.

But there is a final point. Now I’ve pointed out that the best evidence suggests that the Great Wall is not visible from space, you probably believe me, and if you had a vague idea it was, you’ve probably changed your mind. Of course, this is not a highly charged debate, tied up with your identity and tribal connections, so it is easier to shrug and update your view. But still the point remains that we have the ability to adapt our beliefs in the face of new facts.

Having started with (literally) a trivia question, it is worth emphasizing that this is firmly not the focus of the book, fascinating and satisfying as (other people’s) factual ignorance and belief in the absurd can be. We love to smirk at the one in ten French people who still believe the earth may be flat; the quarter of Australians who think that cavemen and dinosaurs existed at the same time; the one in nine Brits who think the 9/11 attacks were a US government conspiracy; or the 15 per cent of Americans who believe that the media or government adds secret mind-controlling signals to television transmissions.3 Our main interest is not niche stupidity or minority belief in conspiracies, but much more general and widespread delusions about individual, social, and political realities.

Let’s look at one very basic question about the state of society that is much closer to our focus: ‘What proportion of the population of your country is aged sixty-five or over?’ Think about it yourself. You may have heard that your country has an ageing population, or that it even faces a demographic ‘time bomb’, that the population of older people is getting too large for the younger people in your country to support in their retirement. The media frequently highlight the pressures on the economy of supporting a growing elderly population, particularly in countries such as Italy and Germany. There have even been stories on how, in Japan, adult nappy sales are set to overtake baby nappy sales. These stories may be apocryphal, but they provide such a vivid image that they stick with us.

So, what would you guess?

When we asked members of the public in fourteen countries, in every single country the average guess was much higher than the actual proportion. In Italy the actual figure is 21 per cent, whereas in Japan it’s 25 per cent. These are big numbers—one in five and one in four of the whole population, and roughly double the proportion compared with the actual figures a generation or two ago. Yet the average guesses were around twice the actual population figures. People in Italy thought 48 per cent of the population—about half—were sixty-five or older.

Figure 1. All countries hugely overestimated the proportion of their population aged 65 or over.

As you can see from this one, very simple example, our delusions are not just driven by the particularly febrile political moment we’re living through. There are no massive misinformation campaigns by automated bots on Facebook or Twitter trying to convince us that our populations are older than they really are, but we’re still very wrong. Our misperceptions are wide, deep, and long-standing. Political ignorance has been a concern from the very dawn of democracy, with Plato’s grousing that the general public were too ignorant to select a government or hold it to account.

It is hard to prove that delusions have been widespread for a long time, because measuring them requires representative surveys, and social scientists started conducting rigorous public opinion polls only relatively recently. In the middle of the twentieth century, surveys of people’s perception of social realities were rare, limited primarily to simple political facts—for example, which party was in power, what their policies were, and who the leaders were. But some of these early questions, first posed as far back as the forties, have been asked again in recent studies, and, as we’ll see, the responses suggest that nothing much has changed.4 People were as likely to be wrong back then as they are now, long before 2016, when ‘post-truth’ (the idea that objective facts are less influential in shaping public opinion than appeals to emotion and personal belief) was named ‘Word of the Year’ by Oxford Dictionaries.

That’s not to say that our current, ideologically driven discourse and the explosion of social technology have no effect on our perceptions of reality, or that we’re not living in particularly dangerous times. In fact, those technological shifts are particularly terrifying in their effect on our ability to see the world and key issues accurately—because the quantum leap in our ability to choose, and in others’ ability to push, ‘individual realities’ at us plays to some of our deepest biases: our preference for our existing worldview and our avoidance of conflicting information.

But that’s exactly the point—if we only focus on what’s out there, what we’re told, we’ll miss a key element of the problem: it’s partly how we think that causes us to misperceive the world.

This raises an important point about the findings of the Perils of Perception surveys—the focus of these studies is not primarily to root out ignorance so much as to discover delusions. It seems a fine distinction, and drawing a clear line between the two is often difficult in practice, but the principle is essential.

Ignorance means literally ‘to not know’ or to be unacquainted with. Delusions and misperceptions, however, are a positive misunderstanding of reality, or, as Brendan Nyhan, a professor of government at Dartmouth College in New Hampshire, and his colleagues put it, ‘misperceptions differ from ignorance insofar as people often hold them with a high degree of certainty… and consider themselves to be well informed’.5 Few of the people we’ve surveyed think of themselves as ignorant; they are answering what they believe to be true.

In practice, rather than a neat delineation, there is a spectrum of false belief from ignorance to delusion. People are moveable and unsure of their certainty in many cases. The distinction shows us how difficult it is to change people’s delusions simply by giving them more information, as though they are an empty vessel just waiting to be filled with facts that will fix their mindset and behaviour.

An investigation of delusion instead of ignorance shifts the focus from public opinion as a blank slate to be written on, to a sense of a range of people holding a range of opinions and beliefs motivated by many of the same, underlying ways of thinking. It raises the vital question of why we believe what we do—this is the real value in understanding the perils of perception. Our delusion can provide clues to what we’re most worried about—and where we’re not as worried as we should be. As we’ll see, attention-grabbing stories of teenage pregnancies or terrorist attacks make us think these phenomena are more common than they really are, whereas our own self-denial leads us to underestimate obesity levels in the population as a whole.

Our delusion also provides more subtle lessons. What we think others do and believe—that is, what we think the ‘social norm’ is—can have a profound effect on how we ourselves act, even when our understanding of that norm is hopelessly misguided. For example, many of us are saving too little into our pension pots to support a decent lifestyle when we retire—but we think this is more common than it actually is. Given we instinctively feel there is safety in being in the ‘herd’, this delusion that it’s normal not to save could negatively impact our own behaviour.

More than that, when we compare what we think others do with what we say we do, we get a hint of how we view those behaviours—for instance, what things we do that we’re ashamed of. Sometimes what we’re ashamed of is surprising—and enlightening. As we’ll see in the first chapter, it seems we’re more ashamed of overeating sugar than of not exercising. Realising that we’re more likely to lie to ourselves about how much sugar we consume is a vital step to improving our health—as individuals and as a society. There are lessons for each of us, even if we feel pretty well-informed about the world. Our errors aren’t about gross stupidity: we’re all subject to personal biases and external influences on our thinking that can distort our view of reality.

We can classify all the varied explanations of our misperceptions into two groups: how we think and what we’re told.

HOW WE THINK

We have to start with how our brains grapple with numbers, mathematics, and statistical concepts. Given that we’re often asked to quantify the world and our perceptions of it, numeracy plays a large part in how well we understand the world overall. The statistics about data growth are themselves impossible for us to fully grasp: incredibly, over 90 per cent of the data on the Internet was created in the last two years; 44 billion gigabytes of data were created on the Internet every day in 2016, but this is projected to grow to 463 billion gigabytes a day by 2025.6 With the exponential growth in data being created and communicated about many of the things that concern us, the issue of numeracy is ever more vital.
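
As a rough back-of-the-envelope check (my own arithmetic, not a calculation from the book), those two figures imply compound growth of roughly 30 per cent a year, or a doubling of daily data creation about every two and a half years:

    import math

    # Rough compound-growth check on the figures quoted above (illustrative only).
    daily_2016 = 44e9   # gigabytes of data created per day in 2016
    daily_2025 = 463e9  # projected gigabytes per day in 2025
    years = 2025 - 2016

    annual_growth = (daily_2025 / daily_2016) ** (1 / years) - 1
    doubling_time = math.log(2) / math.log(1 + annual_growth)

    print(f"Implied growth: {annual_growth:.0%} a year")        # roughly 30% a year
    print(f"Doubling roughly every {doubling_time:.1f} years")  # about 2.6 years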

Dealing with the types of calculations we now need to make doesn’t come completely naturally to many of us. MRI studies of the brains of humans (and monkeys!) indicate that we have a built-in ‘number sense’, but we are particularly attuned to the numbers one, two, and three, and, beyond that, to detecting large (not small) differences when comparing quantities of objects.7 We often fall back on these evolutionary number skills.

But much of life involves calculations that are more complex than comparing the relative size of small numbers. A century ago the great science fiction writer H. G. Wells said,

Endless social and political problems are only accessible and only thinkable to those who have had a sound training in mathematical analysis, and the time may not be very remote when… for complete initiation as an efficient citizen of one of the new great complex world-wide States that are now developing, it is as necessary to be able to compute, to think in averages and maxima and minima, as it is now to be able to read and write.8

Wells’s reference to how important mathematical understanding is to ‘endless social and political problems’ seems made for our times, but we’ve got a long way to go before we’ll completely satisfy his vision. Countless experiments show that around 10 per cent of the public don’t understand simple percentages.9 Many more of us have problems understanding probability. The French scholar Pierre-Simon Laplace called probabilities ‘common sense reduced to a calculus’, but that doesn’t make most of us any better at calculating them.10 For example, if you spin a coin twice, what’s the probability of getting two heads? The answer is 25 per cent, because there are four equal-probability outcomes: two heads, two tails, heads then tails, and tails then heads. Worryingly, only one in four people in a nationally representative survey got this right, even when they were prompted with multiple-choice answers.11 This may seem a rather abstract test of our ability to understand key facts about the world, but, as we’ll see, probabilistic thinking is the foundation for building an accurate sense of social realities.
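
To see where that 25 per cent comes from, here is a tiny enumeration in Python (my own illustration, not part of the survey), listing the four equally likely sequences of two spins and counting the one that gives two heads:

    from itertools import product

    # Enumerate the four equally likely outcomes of spinning a coin twice
    # (an illustrative check of the 25 per cent answer, not code from the survey).
    outcomes = list(product("HT", repeat=2))   # HH, HT, TH, TT
    two_heads = [o for o in outcomes if o == ("H", "H")]

    print(outcomes)                        # all four sequences
    print(len(two_heads) / len(outcomes))  # 0.25, i.e. 25 per cent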

So it is concerning that we don’t seem to be that bothered about our lack of basic mathematical fluency. In a study we conducted for the Royal Statistical Society, we found that, contrary to Wells’s vision, the public put much more importance on words than on numbers (which was a bit depressing, both for me and the Royal Statistical Society). When we asked people what would make them prouder of their kids, being good with words or being good with numbers, only 13 per cent said they would be most proud of their child’s mathematical ability, with 55 per cent saying they’d be most proud of their child’s reading and writing ability. (The other 32 per cent said they wouldn’t be proud about either, which seems particularly mean-spirited tiger parenting!)12

Our delusions are very far from all being about our less-than-perfect knowledge of probabilistic statistics. Over the past decades, pioneers in the fields of behavioural economics and social psychology have conducted thousands of experiments to identify and understand other mistakes and shortcuts commonly made by the human mind—what are called ‘biases’ and ‘heuristics’. They have explored our bias towards information that confirms what we already believe, our focus on negative information, our susceptibility to stereotyping, and how we like to imitate the majority. Daniel Kahneman and his long-time collaborator Amos Tversky hypothesized that our judgements and preferences are typically the result of so-called fast thinking, unless or until they are modified or overridden by slow, deliberate reasoning.13

One common mental error that is worth flagging up front, both because it may be less familiar and because it is so crucial to many delusions that we’ll discuss, is ‘emotional innumeracy’, a theory which proposes that when we’re wrong about a social reality, cause and effect may very well run in both directions: our concern about an issue leads us to overestimate its prevalence, just as our overestimate of its prevalence feeds our concern. For example, say that people overestimate the level of crime in their country. Do they overestimate crime because they are concerned about it, or are they concerned about it because they overestimate it? There are good reasons to think it’s a bit of both, creating a feedback loop of delusion that is very difficult to break.

Finally, there is the possibility that our delusions are almost entirely shaped by instinctive workings in our brain—an idea born out of the field of psychophysics (the study of our psychological reactions to physical stimuli). This has only just started to be applied to social issues, and analyses by David Landy and his graduate students Eleanor Brower and Brian Guay at Indiana University suggest that a significant portion of many of the errors we make in estimating social realities might be explained by the sorts of biases they see in how people report physical stimuli. For example, we underestimate loud sounds and very bright light, and overestimate quiet sounds and low lights, in a quite predictable way—a pattern we also see in the data about how we perceive the state of social and political realities. We hedge our bets towards the middle when we’re uncertain, which may mean that our underlying view of the world is not as biased as it might seem.

However, unlike sound and light, the realities we’ll look at are often socially mediated, and our explicit estimates carry meaning for us: we defend them, and they are bound up with our other attitudes. Despite this, I find psychophysics an encouraging addition to our understanding of our delusion: we may not always be as wrong as we think, or, rather, our errors may not represent such a biased view of the world.

WHAT WE’RE TOLD

The second group of factors influencing how and what we think about the world is external in origin.

First, there is the media. Whenever I present any findings from the Perils of Perception surveys at conferences, without fail the very first ‘question’ I get—sometimes shouted from the audience while I’m still speaking—is, ‘That’ll be the Daily Mail effect!’ (if I’m in the UK) or ‘That’ll be the Fox News effect!’ (if I’m in the United States) or ‘That’ll be the fake news effect!’ (when I’m presenting, well, anywhere).

‘Fake news’ as a concept quickly gained incredible traction in 2017, being named ‘Word of the Year’ by at least one dictionary publisher. But I think it’s a pretty unhelpful term, and it has only passing relevance to the types of delusion we’re interested in here, for a couple of reasons.

Properly defined, it’s way too small a concept. Our key delusions do not have their roots in entirely fabricated stories, created sometimes as clickbait to earn money for the creators and publishers or for more sinister reasons, as we’ll explore.

Even this limited use of the term has been undermined, mainly by the locus of many of the ‘real’ fake news stories, Donald Trump, as he has helped turn it into an attack phrase for both the media in general and individual reports that opponents do not agree with. The ‘2017 Fake News Awards’ hosted on the website of the Republican National Committee, for example, featured a perplexing array of ‘winners’, from actual errors in reporting, tweets from a journalist’s personal account that had been retracted and deleted, photographs that showed crowds as smaller than they really were, supposed faux pas on how to feed koi carp, rebuffed handshakes that turned out to be accepted—all the way up to a denial of collusion with Russia during the 2016 presidential election.

As we will see, our delusions are far from being just a ‘fake news effect’—although we will look at the incredible reach and frightening levels of belief in a few of the highest profile examples of actual fake news, to highlight the broader challenge of disinformation.

While there will be relatively little simplistic media-bashing in our explanations, the media is still a vital actor in the system that creates and reinforces delusion. It is influential, but it is not the most important root cause of our delusion: we get the media we deserve, or demand.

These days, information technology and social media present even more challenges to our understanding of the world around us, given the extent to which we can filter and tailor what we see online, and how it is increasingly done without us even noticing or knowing it. ‘Filter bubbles’ and ‘echo chambers’ incubate our delusion. Unseen algorithms and our own selection biases help create a personalized view of the world, tailored to our own individual realities. The pace of technological progress that is allowing this splintering is frightening, but also so apparently complex and unstoppable that it’s numbing. A very few years ago the suggestion that we would each be experiencing our own individual realities online would have seemed like a Black Mirror episode, but now it’s accepted with a shrug. That is dangerous because it plays to some of our deepest psychological quirks—our desire to have our already held views validated and our instinctive avoidance of anything that challenges them.

In 2018, Facebook was caught supplying the data of around 87 million of its users to political consulting firm Cambridge Analytica to target communications during the 2016 US presidential campaign and the EU Referendum vote in Britain. However, the signs are that even this shocking example did not lead to wholesale rejection of our ‘filtered world’: even at the height of coverage, and the #deletefacebook campaign, technology monitoring firms reported the worldwide usage of Facebook remained within normal, expected ranges.14

Politics and political culture also feed directly into our delusion. Few of us have regular, direct personal contact with serving politicians, so much of what we’re told by politicians and the government comes via the media, and the statements made by politicians gather a disproportionate amount of media coverage, particularly during key election campaigns. And in recent years we’ve had a glut of key campaigns. Both Donald Trump’s election in America and the Brexit vote in the UK were widely called out as the apogee of deceptive communications, giving birth to new phrases, such as ‘alternative facts’. Yet, of course, there has never been a golden age when political communications were 100 per cent accurate, in any country. For example, in France in the mid-1600s, during the civil wars of the Fronde, an infamous series of pamphlets provided an outlet for justified outrage at royal suppression, alongside entirely fake accusations that Louis XIV’s chief minister, Cardinal Mazarin, had committed a whole series of sexual transgressions, including incest.15

Praise

  • "Illuminating. Reading this book is like stepping from the shadows into the light. You will see the world anew."—Julia Gillard, 27th Prime Minister of Australia
  • "A first class book with hugely important and relevant analysis, really pertinent to the issues we're thinking about today... incredibly well-written and easy to read."—Dame Margaret Hodge, Labour Party MP
  • "A tour de force of delusion. In Why We're Wrong About Nearly Everything, master social researcher Bobby Duffy offers a thoroughly convincing account of how our false beliefs often tell us more about who we are than our true ones."—Seth Stephens-Davidowitz, author of Everybody Lies
  • "With wit and wisdom, Bobby Duffy reveals how the misperceptions we share shape the world we live in. Required reading for a post-truth era."—Jonah Berger, author of Invisible Influence
  • "Mandatory reading. This mind-altering book show how most of us are badly deluded about the state of the world."—Steven Pinker, Johnstone Professor of Psychology, Harvard University, and author of Enlightenment Now
  • "Illuminating and important. Duffy has spent a decade finding the gaps between our perceptions and reality. The result is this fascinating study."—Dan Gardner, co-author of Superforecasting
  • "A masterful overview of how our perceptions are repeatedly off the mark. Consequential and timely."—David Halpern, chief executive of the Behavioural Insights Team
  • "Simply indispensable. Marrying fascinating data with superb analysis, this is a unique book."—Matthew d'Ancona, author of Post-Truth and editor-in-chief of Drugstore Culture
  • "A great read that will help you get a better fix on reality. This book will help you understand why many of the things you think are probably wrong."—Hetan Shah, executive director of The Royal Statistical Society
  • "Fantastic: there are eye-opening and shocking statistics on every page. This book may force you to reconsider your most deeply held views."—Jamie Bartlett, author of The Dark NetDancyger

On Sale: Nov 26, 2019
Page Count: 304 pages
Publisher: Basic Books
ISBN-13: 9781541618091


About the Author

Bobby Duffy, one of the UK's most respected social researchers, is professor of public policy and director of the Policy Institute at King's College London. Duffy previously directed public affairs and global research at Ipsos MORI and the Ipsos Social Research Institute, which, among other initiatives, ran the world's largest study of public perception. He is the author of The Perils of Perception, which sold over 30,000 copies in the UK, and which we published as Why We're Wrong About Nearly Everything in the US. His research has been covered by the Washington Post, Economist, Financial Times, Quartz, NBC, BBC, and elsewhere. He lives in London.
