Whiplash

How to Survive Our Faster Future

Contributors

By Joi Ito

By Jeff Howe


This “brilliant and provocative” (Walter Isaacson) guide, from the director of the MIT Media Lab and a veteran Wired journalist, shares nine principles for adapting to and surviving the technological changes shaping our future.

The world is more complex and volatile today than at any other time in our history. The tools of our modern existence are getting faster, cheaper, and smaller at an exponential rate, transforming every aspect of society, from business to culture and from the public sphere to our most private moments. The people who succeed will be the ones who learn to think differently.

In Whiplash, Joi Ito and Jeff Howe distill that logic into nine organizing principles for navigating and surviving this tumultuous period:

  • Emergence over Authority
  • Pull over Push
  • Compasses over Maps
  • Risk over Safety
  • Disobedience over Compliance
  • Practice over Theory
  • Diversity over Ability
  • Resilience over Strength
  • Systems over Objects


Filled with incredible case studies and cutting-edge research and philosophies from the MIT Media Lab and beyond, Whiplash will help you adapt and succeed in this unpredictable world.

Excerpt

"Our journey so far has been pleasant, the roads have been good, and food plentiful.… Indeed, if I do not experience something far worse than I have yet done, I shall say the trouble is all in getting started."

—Tamsen Donner, June 16, 1846




Introduction

On December 28, 1895, a crowd milled outside the Grand Café in Paris for a mysterious exhibition. For one franc, promoters promised, the audience would witness the first "living photographs" in the history of mankind. If that sounds a lot like a carnival sideshow to the modern ear, it wouldn't have deterred a Parisian of the late nineteenth century. It was an age of sensation—of séances, snake charmers, bear wrestlers, aboriginal warriors, magicians, cycloramas, and psychics. Such wonders shared headlines with the many legitimate scientific discoveries and engineering advances of the 1890s. In just the previous few years, Gustave Eiffel had erected the tallest man-made structure in the world, electricity had turned Paris into the City of Light, and automobiles had begun racing past carriages on the capital's broad boulevards. The Industrial Revolution had transformed daily life, filling it with novelty and rapid change, and a Parisian could be forgiven for thinking that anything might happen on any given night, because anything often did.

Eventually the first viewers of the first living photographs were ushered down a set of dark, narrow steps into the café's basement and into neat rows of folding chairs. In the middle of the room a man stood fiddling with a small wooden box on a raised platform. After a few awkward moments, light burst from this apparatus, illuminating a linen screen with a blurry image of women emerging from the shadows of a factory. This was an underwhelming spectacle; the patrons could see people leave a factory in half the districts of Paris. Then the image flickered strangely and sprang to life. The women onscreen began streaming out of the factory, in pairs or alone or in small, hurried clusters. The grainy footage is laughably primitive today, but in the Grand Café's basement in the middle of Paris that night the audience gasped and applauded and laughed. Some just sat dumbfounded at the sight. And then, exactly fifty seconds later, it was over. That was as much film—seventeen meters—as Auguste and Louis Lumière, the brothers responsible for the first movie screening in history, could fit inside their invention, the Cinématographe.

What was it like to be among the first people to see light transformed into a moving image, the first to look at a taut screen and see instead a skirt rustling in the breeze? "You had to have attended these thrilling screenings in order to understand just how far the excitement of the crowd could go," one of the first projectionists later recalled. "Each scene passes, accompanied by tempestuous applause; after the sixth scene, I return the hall to light. The audience is shaking. Cries ring out."1

Word of this most marvelous of sensations quickly spread. The crowds outside the Grand Café grew so chaotic that the police were required to maintain order.2 Within a month the Lumière brothers had doubled their repertoire, shooting several dozen new "views," as the fifty-second films were called. Savvy businessmen as well as inventors, by that spring they were holding exhibitions of their work across Europe and America. And yet the Lumières are remembered less as the inventors of the motion picture—others, including Thomas Edison, were right on their heels—than for a single film, L'Arrivée d'un Train. Or, to be more accurate, they are remembered for the riot the film incited when it was first screened.

You don't need to be fluent in French to guess that L'Arrivée d'un Train features a train in the act of arriving. No one warned the first audience, though. Convinced, supposedly, that the train was about to trundle off the screen and turn them into ripped sacks of lacerated flesh, the tightly packed audience stumbled over one another in a frantic dash for the exits. The lights came up on a mass of humanity jammed into the narrow stairwell. The extent of the tragedy depends on whose telling you believe, and modern scholars question whether it really happened at all.

True or false, the story quickly passed into film lore, becoming what the critic Martin Loiperdinger calls "cinema's founding myth."3 This urban folktale clearly served some kind of vital purpose: Perhaps it was the most accurate way to convey the sheer, uncanny strangeness of witnessing the impossible happen, right in front of your eyes. Simple facts were not audacious enough to describe the sensation—we had to invent a myth in order to tell the truth. Technology had exceeded our capacity to understand it, and not for the last time.

One might reasonably have expected the Lumières, with worldwide fame and a burgeoning catalog, to become fantastically rich and instrumental to the evolution of the medium. Yet by 1900 they were done. Auguste declared that "the cinema is an invention without a future," and the brothers devoted themselves to creating a reliable technique for developing color photographs.

What's amazing about this pronouncement isn't that two bright entrepreneurs made a mammoth miscalculation. What's amazing is that it surely seemed like a smart bet at the time. By the turn of the century, the Lumières occupied a crowded field, their films having inspired countless imitators. Up to that point the early films were single scenes shot from one perspective. There were no pans or cutaways or even plots beyond man steps on rake; rake snaps up to hit him in the nose; hilarity ensues. Small wonder that, like the other sensations of the day, once the novelty wore off films became little more than boardwalk amusements. The technology of film had been created, but not the medium. When we watch these early films, we see pictures that move, but not a movie.

In failing to comprehend the significance of their own invention, the Lumières put themselves in excellent company. Some of our most celebrated inventors, engineers, and technologists have failed to understand the potential of their own work. In fact, if history is any guide, it's those closest to a given technology who are least likely to predict its ultimate use. In May 1844, Samuel Morse unveiled the world's first commercial telecommunications system. Standing in the basement of the U.S. Capitol Building, he tapped out a message to a train station in Baltimore some thirty-eight miles away. It consisted of a quote from the Old Testament: "What hath God wrought." Within a few years every major American city enjoyed instantaneous communication. Within the decade the first transatlantic cable was being laid.

As it appears in the Bible (Num. 23:23), "What hath God wrought" is understood to be an expression of gratitude—"Look at everything your dad's done for you!" At the time, Morse said the intention was to "baptize the American Telegraph with the name of its author," by which he meant the Almighty, not himself. Yet later in the day, when he recorded the phrase onto a small strip of paper for posterity's sake, he added a question mark, which changes the meaning altogether.4 Morse had a reputation as something of a pious blowhard, but by introducing the question mark he emerges as a more thoughtful figure. For thousands of years information had never traveled faster than a horse, whether the messenger was the king or his cook. Now it would move with the speed of some cosmic force. How could he, or anyone else for that matter, really know what the world was in for?

He couldn't. Morse died in the secure belief that the next great step in telecommunications would not be the telephone—dismissed as "an electric toy" when Alexander Graham Bell first exhibited his invention—but instead telegraph wires capable of carrying multiple messages simultaneously. Decades later Thomas Edison showed scarcely more insight. He marketed the first phonograph, or his "talking machine," as he called it, as a device to allow businessmen to dictate their correspondence. He called it the "Ediphone," and for many years afterward insisted that few if any customers would want to use the thing to play music. It took a self-taught engineer named Eldridge Reeves Johnson to realize the phonograph's potential to bring music into every family parlor and saloon. Johnson founded Victor Records in 1901, and started hiring famous performers like Enrico Caruso to join his label. Edison may have invented the phonograph, but Johnson did something more significant: he invented the recording industry.5

It's easy to smirk at such strategic blunders, as if Edison were the stodgy straight man in a Buster Keaton movie, stumbling blindly into some historical pratfall, and as if we, with our instantaneous communications systems and our command of vast stores of information, were immune from such epic failures of foresight. But like Tarzan in the city, humans are perpetually failing to grasp the significance of their own creations. The machinery in late nineteenth-century factories was invariably arranged around the large central axle driven by the steam engine. As the economist Paul David discovered when conducting research into the first electrified factories, factory planners continued to needlessly cluster electric motors in a central location, even when starting from scratch in a new factory. As a result, an innovation that should have increased productivity seemed to have no effect at all. It took thirty years before managers exploited the flexibility electric motors allowed and organized factories according to work flow, doubling and sometimes even tripling productivity.6

Our own era isn't immune either. In 1977, Ken Olsen, the president of one of the world's largest and most successful computer companies, Digital Equipment Corporation, told an audience that there was "no reason for any individual to have a computer in his home."7 He stuck to this view throughout the 1980s, long after Microsoft and Apple had proved him wrong. Thirty years later, then Microsoft CEO Steve Ballmer told USA Today that there was "no chance that the iPhone is going to get any significant market share."8

These anecdotes, besides being funny and amazing in a nerdy kind of way, do have a point, and it's not to bring ridicule down on long-deceased American inventors. It's to recognize that we are all susceptible to misinterpreting the technological tea leaves, that we are all blinkered by prevailing systems of thought. As much as has changed—and our book is nothing if not a documentation of radical change—our brains, at least, remain largely the same organs that believed the automobile to be a passing fancy, or for that matter, that fire was just a technology for keeping us warm and producing interesting shadows on the cave wall.

Our book proceeds from the conviction that any given period of human development is characterized by a set of commonly held systems of assumptions and beliefs. We're not talking about opinions or ideologies. Beneath these lies another set of ideas, the assumptions that are unconscious, or more accurately, preconscious, in nature: Strength is better than weakness; knowledge is better than ignorance; individual talent is more desirable than difference. Imagine for a moment that your opinions, your political beliefs, and all your conscious ideas about the world and your place within it are the furniture inside a house. You acquire these quite consciously over a long period of time, discarding some, keeping others, and acquiring new pieces as the need arises. This book is about something else—the framework of joists, studs, and beams that supports all your conscious ideas. In other words this isn't a book about what you know; it's a book about what you don't know you know, and why it's important to question these problematic assumptions.

The French philosopher Michel Foucault believed that this matrix of beliefs, prejudices, norms, and conventions makes up a set of rules that guide our thinking and, ultimately, the decisions we make. He called it the "episteme," and believed certain historical periods could be identified by these systems of thought, just as the archaeologist identifies layers of history by the type of pottery in use at that time.9 In his classic work The Structure of Scientific Revolutions, the American philosopher of science Thomas Kuhn called such all-encompassing belief systems "paradigms."10

By carefully studying the evolution of scientific thought and practices over the preceding centuries, Kuhn identified patterns in how scientific disciplines like chemistry or physics accommodated new ideas. He observed that even the most careful scientists would regularly ignore or misinterpret data in order to maintain the "coherence" of the reigning paradigm and explain away the anomalies that are the first sign of fault lines in a scientific theory. For example, Newtonian physicists performed impressive feats of intellectual acrobatics in order to explain away the anomalies in astronomical observations that would eventually lead to Einstein's theory of relativity. Such upheavals—scientific revolutions, or what Kuhn called paradigm shifts—were followed by brief periods of chaos that led, in time, to stability as a new scientific consensus formed around a new paradigm.11

Our book—targeted squarely at anyone with a lively curiosity—sidesteps the debate over terminology altogether. Alexis de Tocqueville may have put it best way back in the 1830s. In trying to identify the source of the United States' singular strangeness and remarkable prosperity, he noted that Americans possessed unique "habits of the mind" (an earthy pragmatism, for example) that rendered them well qualified to play a leading role in the industrial revolution.

Our own habits of mind are different in content, but no less stubborn in character. And while our book deals with some complex subjects—cryptography, genetics, artificial intelligence—it has a simple premise: Our technologies have outpaced our ability, as a society, to understand them. Now we need to catch up.

We are blessed (or cursed) to live in interesting times, where high school students regularly use gene editing techniques to invent new life forms, and where advancements in artificial intelligence force policymakers to contemplate widespread, permanent unemployment. Small wonder our old habits of mind—forged in an era of coal, steel, and easy prosperity—fall short. The strong no longer necessarily survive; not all risk needs to be mitigated; and the firm is no longer the optimum organizational unit for our scarce resources.

The digital age has rendered such assumptions archaic and something worse than useless—actively counterproductive. The argument we develop in the following pages is that our current cognitive tool set leaves us ill-equipped to comprehend the profound implications posed by rapid advances in everything from communications to warfare. Our mission is to provide you with some new tools—principles, we call them, because one characteristic of the faster future is to demolish anything so rigid as a "rule."

It's no easy task. We can't tell you what to think, because the current disconnect between humans and their technologies lies at that deeper level—the paradigm, the basic assumptions behind our belief system. Instead our book is intended to help correct that incongruity by laying out nine principles that bring our brains into the modern era, and could be used to help individuals and institutions alike navigate a challenging and uncertain future.

One might think that such deeply held beliefs evolve gradually over time, just as a species of insect slowly develops attributes that help it compete in a given environment. But this doesn't seem to be how systems of belief change—in fact, it's not even how evolution among living organisms works. In both cases long periods of relative stability are followed by periods of violent upheaval triggered by a rapid change in external circumstance, be it political revolution, the rise of a disruptive new technology, or the arrival of a new predator into a previously stable ecosystem.12 These transitions—evolutionary biologists call them "periods of speciation"13—aren't pretty. There's a strong case to be made that we're going through a doozy of a transition right now, a dramatic change in our own ecosystem. It is, in short, a helluva time to be alive, assuming you don't get caught in one of the coming cataclysms.

Our principles aren't a recipe for how to start an Internet business, or an attempt to make you a better manager—though both endeavors might benefit from them. Envision the principles as pro tips for how to use the world's new operating system. This new OS is not a minor iteration from the one we've been using the last few centuries; it's a major new release. And like any totally new OS, we're going to have to get used to this one. It runs according to a different logic. There isn't going to be an instruction manual, because, honestly, even if the developers issued one it would be outdated by the time you got hold of it.

What we offer is, we hope, more useful. The principles are simple—but powerful—guidelines to that system's new logic. They can be understood individually, but their whole is greater than the sum of their parts. That is because at the root level this new operating system is based on two irreducible facts that make up the kernel—the code at the very heart of the machine—of the network age. The first is Moore's law. Everything digital gets faster, cheaper, and smaller at an exponential rate.14 The second is the Internet.

When these two revolutions—one in technology, the other in communications—joined together, an explosive force was unleashed that changed the very nature of innovation, relocating it from the center (governments and big companies) to the edges (a twenty-three-year-old punk rock musician and circuit-board geek living in Osaka, Japan). Imagine: Charles Darwin first conceived of natural selection while reviewing the specimens he had collected as the HMS Beagle's naturalist, a post he had accepted when he was twenty-three years old. He then spent more than thirty years gathering data to back up his claim, an act so patient and cautious that it strikes the modern mind as otherworldly in its monklike devotion to the scientific method.15

But then, it was another world. Reliant on the libraries of the Athenaeum Club, the British Museum, and professional organizations like the Royal Society, as well as shipments of books that could take months to arrive from abroad, he could only access a tiny fraction of the information available to the modern scientist. Without a phone, much less the Internet, collegial input was limited to that quintessentially Victorian communications network, the penny post. Research and discovery followed a glacier's pace, and real innovation required considerable sums of money, meaning family wealth or institutional patronage, and all the politics that went with it.16 Today a geneticist can extract enough DNA from an ice core sample to develop a portrait of an entire Neolithic ecosystem, then refine the results with a global community of colleagues, all over the course of a summer break. That's no mere change in degree. That's a violent change to the status quo.

So, what's next? It's the perennial question of our era. But if our predecessors—living in simpler, slower times—couldn't answer the question, what chance do we have? It's hard to say. Nuclear fission represents one of mankind's most impressive achievements. It simultaneously poses the greatest threat to survival our species has ever encountered. The Haber process led to synthetic fertilizers that increased crop yields. Its inventor, Fritz Haber, is credited with preventing the starvation of billions of people, and he won the Nobel Prize for his efforts. He also invented chemical warfare, personally overseeing the chlorine gas releases that resulted in sixty-seven thousand casualties in World War I.17 So it goes. Marc Goodman, a security expert and the founder of the Future Crimes Institute, points out that some cybersecurity technologies are used by hackers as well as the people trying to protect against them. Thus has it ever been, writes Goodman: "Fire, the original technology, could be used to keep us warm, cook our food, or burn down the village next door."18

The truth is that a technology means nothing, in and of itself. Zyklon B, another product of Haber's research, is just a gas—a useful insecticide that was also used to kill millions during the Holocaust.19 Nuclear fission is a common atomic reaction. The Internet is simply a way to disassemble information and reassemble it somewhere else. What technology actually does, the real impact it will eventually have on society, is often that which we least expect.

By the time you read this sentence, Oculus VR will have released a consumer version of its Oculus Rift, a virtual reality headset. How will we put it to use? Developers are already at work on video games that will take advantage of the intense immersion the Rift provides. Porn, that $100 billion industry, will not be far behind. It could allow doctors to perform remote surgical operations, or simply provide checkups for patients unable to get to a doctor's office. You'll visit Mars and Antarctica and that Denver apartment you might otherwise have to buy sight unseen. But the fact is that we don't have any idea how humans will use the second, third, or tenth generation of the technology. The advances—the ideas—will come from the least likely places. If you had been tasked with finding someone to invent the telephone, you probably wouldn't have canvassed schools for the deaf. And yet in hindsight, Professor Bell—son of a deaf mother, husband to a deaf wife, and pioneering student of sound waves and methods of using vibrating wires as a system of communicating sound to those who could not hear it—seems like the perfect choice.20

The shock of the new would become a common refrain in the century of marvels that followed the telegraph: From the sewing machine to the safety pin, from the elevator to the steam turbine, mankind hurtled forward, ever faster, the technology always outstripping our ability to understand it. Will genetic engineering eradicate cancer or become a cheap weapon of mass destruction? No one knows. As Moore's law demonstrates, technology lopes along according to power laws of one or another magnitude. Our brains—or at least the sum of our brains working together in the welter of institutions, companies, governments, and other forms of collective endeavor—plod along slowly in its wake, struggling to understand just what God, or man, hath wrought.

"The future," science-fiction writer William Gibson once said, "is already here. It's just not evenly distributed."21 This is less witty observation than indisputable truth. Even in Boston, the city both authors of this book call home, decades of progress seem to melt away in the time it takes you to drive from the humming laboratories of MIT to the cash-strapped public elementary schools just across the river.

Back to the Lumières for a moment, and their exhilarating but choppy stab at moving pictures. Things went along pretty much according to the status quo for nearly a decade. Then in 1903, George Albert Smith—hypnotist, psychic, and English entrepreneur quick to embrace the new medium—was filming two primly dressed children nursing a sick kitten. This was just the kind of domestic scene popular with Smith's middle-class Victorian audiences. But a viewer would have had difficulty seeing the detail of the girl spoon-feeding the swaddled kitten. So Smith did something radical. He scooted his camera closer to his subject until only the kitten and the girl's hand were in the frame. Up to that time conventional wisdom held that such a composition would throw the moviegoing public into an ontological quandary: What happened to the girl? Has she been sliced in two? Smith tempted fate and edited the shot into the final cut. Viewers responded positively, and like that, Smith had invented the close-up.22

Ponder this for a moment. It took eight years, hundreds of filmmakers, and thousands of films before someone conceived of the new technology as anything other than a play in two dimensions. This simple innovation helped jump-start a period of experimentation and progress in the cinema. Yet it would take another twelve years before a film appeared—D. W. Griffith's The Birth of a Nation—that a modern audience would recognize as such.23 Not because the technology didn't exist, but because in the end, technologies are just tools—useless, static objects until they are animated by human ideas.

For most of Earth's history, change has been in rare supply. Life appeared 4 billion years ago. It took another 2.8 billion years to discover sex, and another 700 million years before the first creature with a brain appeared. The first amphibian squirmed up onto land about 350 million years after that. In fact, complex life is a fairly recent phenomenon on this planet. If you were to condense Earth's history into a single year, land-dwelling animals come onstage around December 1, and the dinosaurs don't go extinct until the day after Christmas. Hominids start walking on two feet around 11:50 p.m. on New Year's Eve, and recorded history begins a few nanoseconds before midnight.

And even then, change moves at a glacial pace. Now let's pretend that those last ten minutes—the era of the "behaviorally modern" human—are a year. Not a thing happens until December. The Sumerians begin smelting bronze in the first week of December, the first recorded languages pop up around the middle of the month, and Christianity starts spreading on December 23. But for most people life is still nasty, brutish, and short. Just before dawn on December 31, the pace finally begins to pick up, as mass production ushers in the industrial age. That morning, train tracks blossom across the land, and humans finally start moving faster than a horse. The rest of the day is action-packed: Around 2:00 p.m., infant mortality and life expectancy—both largely unchanged since the exodus from Africa last January—are improved with the introduction of antibiotics. Planes are circling the earth by late afternoon, and well-heeled companies begin buying mainframe computers around dinnertime.

It took us 364 days before a billion humans walked the earth. By 7:00 p.m., there are three billion people on the planet, and we just uncorked the first bottle of champagne! That number doubles again before midnight, and at the rate we're going (roughly another billion people every eighty minutes) we'll reach Earth's expected capacity for humanity by 2:00 a.m. on New Year's Day.24 At some recent point—the geological equivalent of a hummingbird's single heartbeat—everything from the speed of travel to population growth to the sheer amount of information our species now possesses started to metastasize. In short, we entered an exponential age.
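The calendar analogy above is just a linear rescaling of deep time onto a 365-day year. For anyone who wants to check the arithmetic, the following is a minimal Python sketch added here for illustration, assuming rough, round-number ages for each event; the printed dates land near, though not exactly on, the dates quoted above.

    from datetime import datetime, timedelta

    # "Earth's history as a single calendar year": map each event's age
    # (years before present) onto a 365-day year by linear rescaling.
    # The ages below are approximate, round figures used for illustration.
    EARTH_AGE_YEARS = 4.54e9
    YEAR_SECONDS = 365 * 24 * 3600

    events = [
        ("First life", 4.0e9),
        ("First land animals", 400e6),          # roughly "December 1"
        ("Dinosaurs go extinct", 66e6),         # roughly "the day after Christmas"
        ("Behaviorally modern humans", 100e3),  # the "last ten minutes" of the year
        ("Recorded history begins", 5e3),
    ]

    start = datetime(2023, 1, 1)  # any non-leap year works as the model calendar

    for name, years_ago in events:
        # Fraction of Earth's history already elapsed when the event occurs.
        elapsed_fraction = (EARTH_AGE_YEARS - years_ago) / EARTH_AGE_YEARS
        moment = start + timedelta(seconds=elapsed_fraction * YEAR_SECONDS)
        print(f"{name:<28} {moment:%b %d, %H:%M:%S}")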

But the "big shift," as an influential 2009 Harvard Business Review article termed it,25


  • "WHIPLASH remarkably, entertainingly, and, most importantly, optimistically gives critical context for the exponential evolution in which we find ourselves, then provides a path toward adaptability and flexibility. In short, it's a badass read."

J. J. Abrams
  • "This book is a brilliant and provocative guide to reading the tea leaves of technological change. It's a great mix of stories, analysis, and practical tips. Read it or get left behind."—Walter Isaacson, bestselling author of Steve Jobs and president and CEO, Aspen Institute
  • "Earnestly followed, their nine principles will promote fluidity, exalt hybridity, make our brains receptive and elastic, in other words prepare us to navigate our present and grow our future."—Paola Antonelli, senior curator, Museum of Modern Art
  • "This book is an essential and pragmatic guide to navigating our technologically accelerating future. Joi and Jeff have distilled nine excellent principles that you can ignore only at your peril. Read this book to accelerate your thinking and become part of the future instead of part of the past."—Reid Hoffman, cofounder, LinkedIn
  • "A vitally important book, one worth reading and then rereading. Joi and Jeff will open your eyes, inspire you, and help you teach the others. Every three pages you'll find another reason to reset what you thought you knew about the world. A must read."—Seth Godin, author of Survival Is Not Enough
  • "Readers interested in technology, science history, futurism, innovation, and entrepreneurship will find this book to be very fascinating, thought provoking, and focused."—Booklist
  • "This provocative gem is a must-read for anyone interested in the cutting-edge research and exploration happening at MIT's Media Lab, innovation at countless universities and companies worldwide, or futuristic thinking in general."—Publisher's Weekly
  • "This exhilarating and authoritative book actually makes sense of our incredibly fast-paced, high-tech society. A standout among titles on technology and innovation, it will repay reading--and rereading--by leaders in all fields."—Kirkus (starred review)

On Sale: Dec 6, 2016
Page Count: 288 pages
ISBN-13: 9781455544585

About the Authors

Joichi “Joi” Ito has been recognized for his work as an activist, entrepreneur, venture capitalist, and advocate of emergent democracy, privacy, and Internet freedom. As director of the MIT Media Lab, he is currently exploring how radical new approaches to science and technology can transform society in substantial and positive ways. Ito has served as both board chair and CEO of Creative Commons, and sits on the boards of Sony Corporation, Knight Foundation, the John D. and Catherine T. MacArthur Foundation, The New York Times Company, and The Mozilla Foundation. Ito’s honors include Time magazine’s “Cyber-Elite” listing in 1997 (at age 31) and selection as one of the “Global Leaders for Tomorrow” by the World Economic Forum (2001). In 2008, BusinessWeek named him one of the 25 Most Influential People on the Web. In 2011, he received the Lifetime Achievement Award from the Oxford Internet Institute. In 2013, he received an honorary D.Litt. from The New School in New York City, and in 2015 an honorary Doctor of Humane Letters degree from Tufts University. In 2014, he was inducted into the SXSW Interactive Hall of Fame; also in 2014, he was one of the recipients of the Golden Plate award from the Academy of Achievement.

Jeff Howe is an assistant professor and the program coordinator for Media Innovation at Northeastern University. A longtime contributing editor at Wired magazine, he coined the term crowdsourcing in a 2006 article for that magazine. In 2008 he published a book with Random House that looked more deeply at the phenomenon of massive online collaboration. Called Crowdsourcing: How the Power of the Crowd is Driving the Future of Business, it has been translated into ten languages. He was a Nieman Fellow at Harvard University during the 2009-2010 academic year, and is currently a visiting scholar at the MIT Media Lab. He has written for the Washington Post, the New Yorker, the New York Times, Time, Newsweek, and many other publications. He currently lives in Cambridge with his wife and two children.
