The Chaos Machine

The Inside Story of How Social Media Rewired Our Minds and Our World

By Max Fisher


Finalist for the Helen Bernstein Book Award for Excellence in Journalism


From a New York Times investigative reporter, this “authoritative and devastating account of the impacts of social media” (New York Times Book Review) tracks the high-stakes inside story of how Big Tech’s breakneck race to drive engagement—and profits—at all costs fractured the world. The Chaos Machine is “an essential book for our times” (Ezra Klein).

We all have a vague sense that social media is bad for our minds, for our children, and for our democracies. But the truth is that its reach and impact run far deeper than we have understood. Building on years of international reporting, Max Fisher tells the gripping and galling inside story of how Facebook, Twitter, YouTube, and other social networks preyed on psychological frailties to create the algorithms that drive everyday users to extreme opinions and, increasingly, extreme actions. As Fisher demonstrates, the companies’ founding tenets, combined with a blinkered focus on maximizing engagement, have led to a destabilized world for everyone.

Traversing the planet, Fisher tracks the ubiquity of hate speech and its spillover into violence, following ills that first festered in far-off locales to their dark culmination in America during the pandemic, the 2020 election, and the Capitol insurrection. Through it all, the social-media giants refused to intervene in any meaningful way, claiming to champion free speech when in fact what they most prized were limitless profits. The result, as Fisher shows, is a cultural shift toward a world in which people are polarized not by beliefs based on facts, but by misinformation, outrage, and fear.

His narrative is about more than the villains, however. Fisher also weaves together the stories of the heroic outsiders and Silicon Valley defectors who raised the alarm and revealed what was happening behind the closed doors of Big Tech. Both panoramic and intimate, The Chaos Machine is the definitive account of the meteoric rise and troubled legacy of the tech titans, as well as a rousing and hopeful call to arrest the havoc wreaked on our minds and our world before it’s too late.

Excerpt

Author’s Note

This book is based on interviews with hundreds of people who have studied, combated, exploited, or been affected by social media, as well as with workers and executives in Silicon Valley. In some cases, for readability, a source’s name and title may appear in the Notes section rather than in the narrative. All interviews were conducted on the record save one, with a third-party moderator who asked to remain pseudonymous. I reviewed pay stubs and corporate files that confirm his account.

The book also draws heavily on academic research, court records, and many other primary sources, which are listed in the Notes as supporting evidence for every figure or assertion presented in the narrative, as well as for any quote that I did not report myself. A handful of statements draw on scholarly research that has not yet been published outside this book. In these cases, a brief overview of the findings, methodology, and authorship appears in the Notes section.




One

Trapped in the Casino

1. The Sky Is Falling

RENÉE DIRESTA HAD her infant on her knee when she realized that social networks were bringing out something dangerous in people, something already reaching invisibly into her and her son’s lives. No one in her immediate circle had children, so she had joined online groups for new parents, looking for counsel on sleep training or teething. But the other users, though mostly friendly, she said, occasionally slipped into “flame wars” that were thousands of posts long, and all over a topic she’d rarely encountered off-line: vaccinations.

It was 2014, and DiResta had only recently arrived in Silicon Valley, there to scout startups for an investment firm. She was still an analyst at heart, from her years both on Wall Street and, before that, at an intelligence agency she hints was the CIA. To keep her mind agile, she filled her downtime with elaborate research projects, the way others might do a crossword in bed.

Her curiosity provoked, she began to investigate whether the anti-vaccine anger she’d seen online reflected something broader. Buried in the files of California’s public-health department, she realized, were student vaccination rates for nearly every school in the state—including the preschools she was considering for her son. What she found shocked her. At some schools, only 30 percent of students were vaccinated. “What on earth is going on?” she asked herself. She downloaded ten years’ worth of records. The trend during that period—a steady increase in opt-outs—was clear, she told me. “Holy shit,” she thought, “this is really bad.”

With rates so low, outbreaks of diseases like measles or whooping cough became a grave danger, putting everyone’s children at risk. She called her state senator’s office to ask if anything could be done to improve vaccination rates. It wasn’t going to happen, she was told. Were vaccines really so hated? she asked. No, the staffer said. Their polling showed 85 percent support for a bill that would tighten vaccine mandates in schools. But lawmakers feared the extraordinarily vocal anti-vaccine movement—comprising young California parents gripped by paranoia and rage—that seemed to be emerging from Twitter, YouTube, and Facebook.

“That was what really sent me down this rabbit hole,” DiResta said. “For six months, no joke, from 8:00 p.m. to 2:00 a.m., this was what I did.” In time, that rabbit hole led her not to any secret hand behind the anti-vaccine movement but, rather, to the very social networks on which it had arisen. Hoping to organize some of those 85 percent of Californians who supported the vaccination bill, she started a group—where else?—on Facebook. When she bought Facebook ads to solicit recruits, she noticed something curious. Whenever she typed “vaccine,” or anything tangentially connected to the topic, into the platform’s ad-targeting tool, it returned groups and topics that were overwhelmingly opposed to vaccines. And something else: when she targeted her ads to display for California moms, the users who received them responded with a flood of anti-vaccine invective. It was as if her real-life community’s pro-vaccine views had been inverted online.

Curious, she joined a handful of anti-vaccine Facebook groups. Their users seemed to live and breathe social media, circulating YouTube clips and coordinating Twitter hashtag campaigns. Most expressed genuine anguish over what they believed to be a vast conspiracy to pump their children’s arms with dangerous shots. But if they represented only 15 percent of Californians, why were they so dominant here? Soon, DiResta noticed Facebook doing something strange: pushing a stream of notifications urging her to follow other anti-vaccine pages. “If you joined the one anti-vaccine group,” she said, “it was transformative.” Nearly every vaccine-related recommendation promoted to her was for anti-vaccine content. “The recommendation engine would push them and push them and push them.”

Before long, the system prompted her to consider joining groups for unrelated conspiracies. Chemtrails. Flat Earth. And as she poked around, she found another way that the system boosted vaccine misinformation. Just as with the ad-targeting tool, typing “vaccines” in Facebook’s search bar returned a stream of anti-vaccine posts and groups. Even though mainstream health and parenting pages often had groups with many more members, those results showed up farther down.

DiResta had an inkling of what was going on. She’d held a fascination with computers since childhood, when her father, a biomedical engineer who worked in cancer research, had taught her to code at age nine. She’d owned an early-1980s Timex machine that played simple games. In high school in New York, engineering engaged her love of creative problem-solving as well as the clean absolutes of math. She interned at MRI labs, helping to program computers for processing brain-scan imagery.

“I really liked the idea that you could build your way to a solution,” she said. “I like the rigor. I like the logic.” And computers were fun. Freewheeling chat rooms on America Online, the dial-up internet service, provided thrillingly randomized connections. Forums for esoteric shared interests, like DiResta’s favorite band, Nine Inch Nails, felt like real communities. In college she majored in computer science but decided against a graduate degree, opting instead for work in intelligence and finance. Still, when the dust from the financial crisis settled, she got in touch with friends at Google. Come out West, they said.

Though her investment work in Silicon Valley focused on hardware, she’d picked up enough about social media to understand what she’d found in her Facebook searches. The reason the system pushed the conspiratorial outliers so hard, she came to realize, was engagement. Social media platforms surfaced whatever content their automated systems had concluded would maximize users’ activity online, thereby allowing the company to sell more ads. A mother who accepts that vaccines are safe has little reason to spend much time discussing the subject online. Like-minded parenting groups she joins, while large, might be relatively quiet. But a mother who suspects a vast medical conspiracy imperiling her children, DiResta saw, might spend hours researching the subject. She is also likely to seek out allies, sharing information and coordinating action to fight back. To the A.I. governing a social media platform, the conclusion is obvious: moms interested in health issues will come to spend vastly more time online if they join anti-vaccine groups. Therefore, promoting them, through whatever method wins those users’ notice, will boost engagement. If she was right, DiResta knew, then Facebook wasn’t just indulging anti-vaccine extremists. It was creating them.

“I felt like Chicken Little, telling people the sky was falling,” she said. “And they were looking at me like, ‘It’s just some social media post.’” But what DiResta had discerned was that there was something structurally wrong with the platform. Friends in the Valley got in touch with her to say they were noticing strangely similar disturbances across all sorts of online communities. She sensed a common set of dynamics at play, perhaps even a common origin point somewhere in the bowels of the social web. And if this was the effect on something narrow like school vaccination policy or video game discussions, what would happen when it reached politics or society more broadly?

“I was looking at it and saying, ‘This is going to be such a disaster,’” she recalled.

It was a journey that would eventually take her to the trails of the Islamic State and Russian military intelligence. To State Department meeting rooms and a congressional witness table. And to a set of shocking realizations about social media’s influence on us all. But it started in California, fighting an online fringe she did not yet realize represented something much deeper and more intractable.

Almost certainly, no one at Facebook or YouTube wanted to promote vaccine denial. The groups represented such a tiny slice of their empires that any ad money they brought in was likely trivial. Zuckerberg, in a tacit response to the problem, wrote in 2015 that “the science is completely clear: vaccinations work and are important for the health of everyone in our community.” But the technology building this fringe movement was driven by something even the company’s CEO could not overcome: the cultural and financial mores at the core of his entire industry.

2. American Galápagos

LESS THAN A century ago, the Santa Clara Valley, in Northern California, was a sleepy expanse of fruit orchards and canneries, specked by the occasional oil derrick. That began to change in 1941, when the Japanese navy struck Pearl Harbor, setting in motion a series of events that remade this backwater into one of the greatest concentrations of wealth the world has ever known.

The story of that transformation, which bears little resemblance to the hacker legends or dorm-room tales that pass for Silicon Valley’s mostly self-invented lore, instilled in the Valley cultural and economic traits that were built into the products that increasingly rule our world. And it began with a wave of pioneers who played a role as crucial as any of the engineers or CEOs who came after them: the military-industrial complex.

After Pearl Harbor, the Pentagon, preparing to push into the Pacific but fearing another surprise attack, dispersed military production and research across parts of the West Coast that still had a touch of the frontier to them. One such location was Moffett Field, a largely disused air base on a protected bay, shielded by the Santa Cruz Mountains. When the war ended, the war machine stayed, repurposed for the ever-escalating standoff with the Soviet Union. Planning for nuclear war, the Pentagon encouraged contractors to shift vital projects away from major population centers. The aerospace giant Lockheed complied, moving its missiles and space division to the quiet Santa Clara Valley, just behind hangar three on Moffett Field. Much of the Cold War arms race was conducted from its campus. Apple co-founder Steve Wozniak, like many of his era, grew up watching a parent head to Lockheed every morning.

Equally important was an unusual new academic research center, just a few miles away. Frederick Terman, the son of a psychology professor at then-unremarkable Stanford University, spent World War II at Harvard’s labs, overseeing joint military–academic research projects. He returned home with an idea: that this model continue into peacetime, with university scientists cooperating instead with private companies. He established the Stanford Research Park, where companies could work alongside academic researchers.

With Cold War contractors already next door, there were plenty of takers. The arrangement drew talented scientists and graduate students from back East, offering them the chance to get in on a lucrative patent or startup. University research departments usually toil, at least in theory, on behalf of the greater good. Stanford blurred the line between academic and for-profit work, a development that became core to the Silicon Valley worldview, absorbed and propagated by countless companies cycling through the Research Park. Hitting it big in the tech business and advancing human welfare, the thinking went, were not only compatible, they were one and the same.

These conditions made 1950s Santa Clara what Margaret O’Mara, a prominent historian of Silicon Valley, has called a silicon Galápagos. Much as those islands’ peculiar geology and extreme isolation produced one-of-a-kind bird and lizard species, the Valley’s peculiar conditions produced ways of doing business and of seeing the world that could not have flourished anywhere else—and led ultimately to Facebook, YouTube, and Twitter.

The chance migration that seeded much of the Valley’s technological DNA, like an adrift iguana landing on a Galápagos shore, was a cantankerous engineer named William Shockley. At Bell Labs, perhaps the most prestigious of the East Coast research firms, he’d shared the 1956 Nobel Prize for pioneering semiconductor transistors. The tiny devices, which direct or modify electrical signals, are the building blocks of modern electronics. Shockley became convinced he could beat Bell’s methods. When his mother’s health declined, he returned home, the same year as his Nobel, to care for her and start his own transistor company. His hometown just happened to be Palo Alto, five miles from Moffett Field. His transistor design called for replacing the conventionally used germanium with silicon.

Shockley, who had a reputation for being difficult and arrogant, struggled to convince Bell engineers to follow him. Besides, even with money flowing from the Pentagon, few scientists with any pedigree wanted to relocate to the backwater of San Jose. So he hired talented engineers with backgrounds that limited their opportunities in Boston: nongraduates, immigrants, Jews. Some, like Shockley, were brilliant but difficult to work with. It established Valley startups, forever after, as the domain of self-starter misfits rising on raw merit—a legacy that would lead its future generations to elevate misanthropic dropouts and to excuse toxic, Shockley-style corporate cultures as somehow essential to the model. Within a year of Shockley’s launch, though, his talent all quit. His “fondness for humiliating his employees,” his knee-jerk rejection of any idea not his own, and his inclination toward extremes—he later embraced eugenics and called Black people genetically inferior—were too much to endure.

For the defectors, the easy and expected thing would’ve been to bring their innovations back East, where the rest of the industry still lay. Instead, perhaps for no better reason than California weather, they got East Coast financing and stayed put. Because they happened to be based in the Santa Clara Valley, that was where future semiconductor investment and talent came as well. The little industry thrived thanks to the mass of engineers already in town for Lockheed, ensuring top-flight recruits for any promising startup. And the Stanford Research Park put cutting-edge research within easy reach.

That pool of talent, money, and technology—the three essential ingredients—would be kept in the Valley, and the rest of the world kept out, by an unusual funding practice: venture capitalism. Wall Street money mostly stayed away. The products were too esoteric and the market too opaque for outside financiers. Seemingly the only people able to identify promising ideas, the engineers themselves, provided startup funding. Someone who’d made some money on their own project would hear about a new widget getting designed across town and grant seed money—venture capital—for a percentage stake.

The arrangement went beyond money. An effective venture capitalist, to safeguard an investment, would often take a seat on the company’s board, help select the executive team, even personally mentor the founder. And venture capitalists tended to fund people whom they trusted—which meant people they knew personally or who looked and talked like them. This meant that each class of successful engineers reified their strengths, as well as their biases and blind spots, in the next, like an isolated species whose traits become more pronounced with each subsequent generation.

As semiconductors developed into the circuit board, then the computer, then the internet, and then social media, each technology produced a handful of breakout stars, who in turn funded and guided the next handful. Throughout, their community remained a commercial-cultural Galápagos, free to develop its own hyper-specific practices for how a business should work, what constitutes success, and what responsibilities a company has to its customers and the wider world.

The consequences of their model, in all its peculiarities, would not become apparent until Shockley’s successors assumed, in the form of social media giants, indirect control over us all. But the first indications were already emerging by the mid-2000s, as Silicon Valley began tinkering with a bit of hardware more complex than any semiconductor or computer: the human mind.

3. Against News Feed

IF YOU HAD to pinpoint the dawn of the social media era, you might pick September 2006, when the operators of a dorm-grown website, Facebook.com, made an accidental discovery while trying to solve a business problem. Since launching the site two and a half years earlier, they’d run a modestly successful entry in the modestly successful social media industry, in which users maintained personalized profile pages and did little else. At the time, Facebook had 8 million users, impressive for a bunch of kids barely old enough to drink, but not enough to guarantee survival. Even Friendster, already seen by then as a catastrophic failure, had about 10 million. So did LiveJournal. Orkut had 15 million. Myspace was nearing 100 million.

Facebook’s two competitive advantages were coming to look like liabilities. Its clean design made it visually appealing but less lucrative than ad-stuffed LiveJournal or Myspace. And its exclusivity to college campuses had won it a large share of a market that was both limited and cash-poor. The company had tried expanding to workplaces, but few workers signed up. What self-respecting adult would put their professional life on a website for college kids?

User growth had stalled when, that summer, a life raft appeared: Yahoo offered to buy Facebook for $1 billion. The internet giant was bringing in at least that much in revenue every quarter. But its web-portal business was growing obsolete and the company was casting around for new growth markets. Social media seemed promising. But to much of the industry’s surprise, after months of negotiation, Zuckerberg turned it down. He did not want to get off the startup roller coaster and, at age twenty-two, become a cog at ossifying, unhip Yahoo. However, denying employees pulling all-nighters a chance to retire rich in their twenties left Zuckerberg under tremendous pressure not only to turn Facebook around but to succeed so wildly that Yahoo’s billion would seem small.

Part two of his two-part plan was to eventually open up Facebook to anyone. But the failed expansion to workplaces made it uncertain whether that would succeed, and opening up might even prove counterproductive if it drove out college kids, which was why so much rested on part one. He would overhaul Facebook’s homepage to show each user a personalized feed of what their friends were up to on the site. Until then, you had to check each profile or group manually for any activity. Now, if one friend changed her relationship status, another posted about bad pizza in the cafeteria, and another signed up for an event, all of that would be reported on your homepage.

That stream of updates had a name: the news feed. It was presented as a never-ending party attended by everyone you knew. But to some users it felt like being forced into a panopticon, where everyone had total, unblinking visibility into the digital lives of everyone else. Facebook groups with names like “Students Against Facebook News Feed” cropped up. Nothing tangible happened in the groups. Joining signaled your agreement; that was it. But because of the site redesign, each time someone joined, all of that person’s friends got a notification on their feed alerting them. With a tap of the mouse, they could join, too, which would be broadcast in turn to their friends. Within a few hours, the groups were everywhere. One attracted 100,000 members on its first day and, by the end of the week, nearly a million.

In reality, only a minority of users ever joined. But proliferating updates made them look like an overwhelming majority. And the news feed rendered each lazy click of the “join” button as an impassioned shout: “Against News Feed” or “I HATE FACEBOOK.” The appearance of widespread anger, therefore, was an illusion. But human instincts to conform run deep. When people think something has become a matter of consensus, psychologists have found, they tend not only to go along, but to internalize that sentiment as their own.

Soon, outrage became action. Tens of thousands emailed Facebook customer service. By the next morning, satellite TV trucks besieged Facebook’s Palo Alto office, as did enough protesters that police asked the company to consider switching off whatever had caused such controversy. Some within Facebook agreed. The crisis was calmed externally with a testy public apology from Zuckerberg—“Calm down. Breathe. We hear you”—and, internally, with an ironic realization: the outrage was being ginned up by the very Facebook product that users were railing against.

That digital amplification had tricked Facebook’s users, and even its leadership, into misperceiving the platform’s loudest voices as representing everyone, growing a flicker of anger into a wildfire. But, crucially, it had also done something else: driven engagement up. Way up. In an industry where user engagement is the primary metric of success, and in a company eager to prove that turning down Yahoo’s billion-dollar overture had been more than hubris, the news feed’s distortions were not just tolerated, they were embraced. Facebook soon allowed anyone to register for the site. User growth rates, which had barely budged during the prior expansion round, exploded by 600 or 700 percent. The average amount of time each person spent online grew rapidly, too. Just thirteen months later, in the fall of 2007, the company was valued at $15 billion.

I have come to think of this as Silicon Valley’s monolith moment, akin to the scene at the beginning of Kubrick’s 2001: A Space Odyssey, when a black pillar appears before a clan of chimpanzees, who suddenly learn to wield tools. The breakthrough sent Facebook leaping ahead of competitors it had previously lagged far behind. Others went extinct as a new generation arose in their place.

When the news feed launched in 2006, 11 percent of Americans were on social media. Between 2 and 4 percent used Facebook. Less than a decade later, in 2014, nearly two thirds of Americans used social networking, among whom Facebook, YouTube, and Twitter were near-universal. That year, halfway through the second Obama term, a significant threshold was crossed in the human experience. For the first time, the 200 million Americans with an active Facebook account spent, on average, more time on the platform (forty minutes per day) than they did socializing in person (thirty-eight minutes). Just two years later, by the summer of 2016, nearly 70 percent of Americans used Facebook-owned platforms, averaging fifty minutes per day.

These systems hooked so many users so effectively that, by then, the market value of Facebook, a free-to-use web service with almost no physical products or consumer services, exceeded that of Wells Fargo, one of the world’s largest banks. That same year it also surpassed General Electric and JPMorgan Chase, then, by the end of 2017, ExxonMobil. Ever since, two of the world’s largest companies have been Facebook and Google, another mostly free web service that makes much of its money from ads, particularly on YouTube, its subsidiary.

Long after their technology’s potential for harm had been made clear, the companies would claim to merely serve, and never shape or manipulate, their users’ desires. But manipulation had been built into the products from the beginning.

4. The Casino Effect

“WHEN FACEBOOK WAS getting going, I had these people who would come up to me and they would say, ‘I’m not on social media,’” Sean Parker, who had become Facebook’s first president at age twenty-four, recalled years later. “And I would say, ‘Okay, you know, you will be.’ And then they would say, ‘No, no, no. I value my real-life interactions. I value the moment. I value presence. I value intimacy.’ And I would say, ‘We’ll get you eventually.’”

Parker prided himself on being a hacker, as did much of the Silicon Valley generation that arose in the 1990s, when the term still bespoke a kind of counterculture cool. Most actually built corporate software. But Parker had cofounded Napster, a file-sharing program whose users distributed so much pirated music that, by the time lawsuits shut it down two years after launching, it had irrevocably damaged the music business. Parker argued he’d forced the industry to evolve by exploiting its lethargy in moving online. Many of its artists and executives, however, saw him as a parasite.

Facebook’s strategy, as he described it, was not so different from Napster’s. But rather than exploiting weaknesses in the music industry, it would exploit them in the human mind. “The thought process that went into building these applications,” Parker told a media conference, “was all about, ‘How do we consume as much of your time and conscious attention as possible?’” To do that, he said, “We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you more likes and comments.” He termed this the “social-validation feedback loop,” calling it “exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.” He and Zuckerberg “understood this” from the beginning, he said, and “we did it anyway.”

Throughout the Valley, this exploitation, far from some dark secret, was openly discussed as an exciting tool for business growth. The term of art is “persuasion”: training consumers to alter their behavior in ways that serve the bottom line. Stanford University had operated a Persuasive Tech Lab since 1997. In 2007, a single semester’s worth of student projects generated $1 million in advertising revenue.

“How do companies, producing little more than bits of code displayed on a screen, seemingly control users’ minds?” Nir Eyal, a prominent Valley product consultant, asked in his 2014 book, Hooked: How to Build Habit-Forming Products. “Our actions have been engineered,” he explained. Services like Twitter and YouTube “habitually alter our everyday behavior, just as their designers intended.”

Praise

  • “The Chaos Machine is an authoritative and devastating account of the impacts of social media . . . The way the book connects the dots is utterly convincing and should obliterate any doubts about the significance of algorithmic intervention in human affairs.”—New York Times Book Review
  • “A stark warning about the extent to which Facebook et al distort our perception of reality.”—The Guardian
  • “The single most complete understanding of how social media has rewired our brains, our culture, and our politics that I have ever read. It’s outstanding.”—Jon Favreau, Offline
  • “A sobering investigation into the effects of platforms including Facebook, Twitter, and YouTube. Tracking political movements that spread over social media, both in America and worldwide, Fisher describes how algorithms . . . systematically promote extreme content that sparks moral outrage and forges group identities united by a sense of threat.”—The New Yorker
  • “Fisher has drawn together a chilling record of the cultural and commercial imperatives behind violent events, and how the perversity of machines and of people, working in tandem, have upended life around the globe.”—Air Mail
  • “Well argued, engaging, and often necessarily discomforting… The Chaos Machine’s greatest achievement is perhaps how skillfully it traces seemingly disparate phenomena back to the design of social media platforms that prioritize engagement at all costs… At a time when calls for the regulation of platforms is growing ever louder, Fisher makes an urgent, compelling case for change.”—Mary McGill, Irish Independent
  • “Social media isn’t just changing our lives. It’s changing the world, and even its creators and would-be overseers have only the foggiest ideas about how. In this meticulously reported, grippingly told account, Max Fisher chases the results across continents, and paints a disturbing picture of not just where we are, but where we’re going. The Chaos Machine is an essential book for our times.”—Ezra Klein, author of the New York Times bestseller Why We’re Polarized
  • “Well-researched and thoroughly unnerving… Fisher’s lucid, clear explanations and convincing arguments are bound to leave readers questioning their own use of social media.”—Booklist (starred review)
  • “A scathing account of the manifold ills wrought by social media… There’s no shortage of books lamenting the evils of social media, but what’s impressive here is how Fisher brings it all together: the breadth of information, covering everything from the intricacies of engagement-boosting algorithms to theories of sentimentalism, makes this a one-stop shop. It’s a well-researched, damning picture of just what happens online.”—Publishers Weekly
  • “An often riveting, disturbing examination of the social media labyrinth and the companies that created it… Fisher dives into the chaotic social media landscape, synthesizing dozens of interviews from a wide range of sources… he examines the rise of the social media giants and the dangers they have created for our society… Fisher is spot-on when he describes how the promotion and manufacture of moral outrage were not glitches in the system but inherent features.”—Kirkus Reviews
  • “In this timely book, Max Fisher reveals how powerful social-media giants set all of humanity on an alternative course to the future. The Chaos Machine boldly exposes how a few technology companies chose profit over people, helped spread salacious misinformation, and ultimately ripped the fabric of society apart. I hope everyone will read this important investigation with an open mind, because we must choose a different path forward, and fast.”—Amy Webb, author of The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity
  • “Max Fisher blends together deep reporting, riveting stories, and a global canvas in this gripping and definitive work on the damage wrought by social media. The Chaos Machine is essential reading if you want to understand a force that is reshaping the world and the very real consequences it is having on people everywhere.”—Ben Rhodes, author of the New York Times bestseller The World as It Is: A Memoir of the Obama White House

On Sale: Sep 6, 2022
Page Count: 320 pages
ISBN-13: 9780316703314

About the Author

Max Fisher is an international reporter for the New York Times, where he authors a column called “The Interpreter,” which explains global trends and major world events, and where he contributed to a series about social media that was a finalist for the Pulitzer Prize in 2019. Fisher previously covered international affairs at The Atlantic and the Washington Post. He lives in Los Angeles.
