Bit by Bit
How Video Games Transformed Our World
By Andrew Ervin
Formats and Prices
- ebook: $19.99 ($25.99 CAD)
- Hardcover: $30.00 ($38.00 CAD)
Video games have seemingly taken over our lives. Whereas gamers once constituted a small and largely male subculture, today 67 percent of American households play video games. The average gamer is now thirty-four years old and spends eight hours each week playing — and there is a 40 percent chance this person is a woman.
In Bit by Bit, Andrew Ervin sets out to understand the explosive popularity of video games. He travels to government laboratories, junk shops, and arcades. He interviews scientists and game designers, both old and young. In charting the material and technological history of video games, from the 1950s to the present, he suggests that their appeal starts and ends with the sense of creativity they instill in gamers. As Ervin argues, games are art because they are beautiful, moving, and even political — and because they turn players into artists themselves.
Excerpt
Midway upon the journey of our life
I found myself within a forest dark,
For the straightforward pathway had been lost.
Ah me! how hard a thing it is to say
What was this forest savage, rough, and stern,
Which in the very thought renews the fear.
So bitter is it, death is little more;
But of the good to treat, which there I found,
Speak will I of the other things I saw there.
—Dante Alighieri, Inferno (c. 1320)
Who gives a shit about these old video games? You know, that's a question that some people might ask.
—Warren Robinett (2015)
INTRODUCTION
the purpose of playing
The massive HDTV loomed in front of the windows and blocked the view of the beach. Two of my nephews, John and Logan, were watching cartoons at a volume that came to find me even after I retreated three rooms away. It was Christmas Day at my parents' house on the Jersey Shore. I heard sound effects but no dialogue to speak of. The background music seemed cheery at first, but the steady repetition of some sort of crunching noise called to mind a deranged celery-eating contest. The soundtrack was dominated by a Philip Glass-esque exercise in serialism: rapid crashes interrupted by bleating sheep that went on, full blast, for an hour. I was trying to read, but my noise-canceling headphones proved useless.
The headache kettle-drumming at my temples shot pinpricks into the backs of my eyes. My face burned bright red because I was wearing a new cashmere sweater the color of reflective bibs worn by highway work crews, and I'm allergic to wool. In the kitchen, my father was loading his Crock-Pot with the ingredients for his broccoli surprise, but it sounded like he was performing "Flight of the Bumblebee" using every metal pan and piece of flatware in the house. My wife Elivi, the smart one in the family, had gone out for a long run. Unfortunately, my sneakers were in the nephew-colonized living room. I closed the Inferno and then my eyes. The microwave started beeping and no one made it stop, so I took a breath and ventured out to the kitchen.
I found my running shoes buried in the living room amid the discarded wrapping paper and Legos and cardboard boxes. I planned to make a hasty exit and catch up with Elivi, but the TV grabbed my attention. The picture quality looked terrible, like a chunky 8-bit Atari game but in 3D. I sat on the rug between my nephews, who didn't notice my presence, and saw that John was tethered to the screen. In his hands was no simple joystick, but rather a plastic device shaped like a bat that had been squashed, taxidermied, and shellacked. The game was called Minecraft (2011) and I had never seen anything like it. I soon learned that it had sold over 100 million copies worldwide.
There existed no narrative structure—no story—to Minecraft, at least as far as I could tell. John toggled between pop-up boxes when he was not making his way among abstract trees and along poorly rendered bodies of water. Block-like animals lumbered around. Rudimentary as it all was, the action on screen was enthralling. What I would have thought of as the camera didn't adhere to a stable or situated point of view. Instead, it flew around to capture the action from different angles. It was incomprehensible, and yet I knew deep in my bones that it was a work of absolute genius, albeit one that I didn't understand.
Maybe every generation believes as much about their childhoods—or at least I hope they do—but the 1970s were the best imaginable time to grow up in America. We had the works of J. R. R. Tolkien, the first Star Wars movies, Dungeons & Dragons games, Monty Python's Flying Circus, and I, Claudius reruns on PBS—and emergent video game technology that every year matured and grew in sophistication right alongside us. I played my share of arcade and computer games growing up, and worked briefly as a video game designer in the 1990s for a Budapest-based startup. Later, I spent more hours of my life than might be reasonable playing World of Warcraft (2004). And yet, for some reason, I never thought of myself as a gamer.
I don't remember ever looking down on video games and the people who played them, but as a voracious reader, and eventually a graduate student in fiction writing, I naturally spent far more time with books than with video games. My interests leaned more toward the literary than the technological, but on that day down the shore I began to see how intertwined those two concerns could be. Although essential differences remain, and always will, literature and video games have more in common than I could have predicted.
Having been buried nose-first in books for so long, I had missed a fascinating cultural sea change. In my lifetime, video games have expanded from a small, geeky diversion to a mainstream phenomenon as popular in many regards as professional sports. I set out to write this book in order to learn what I had been missing. In doing so, I discovered that video games are not just games, but constitute a powerful storytelling medium, one that has provided startling new ways to think about my own life and the world in which I live.
The ways we interact with others have changed dramatically since Dr. William A. "Willy" Higinbotham first set up Tennis for Two on an oscilloscope in 1958. "Since 1959, we have come to live among flows of data more vast than anything the world has seen," Thomas Pynchon wrote in 1984, of all years, a full decade before Yahoo! and Geocities invited us online. Whereas computer scientists and video or computer gamers once constituted small subcultures of our society, relegated to government institutions and university labs, today a full 67% of American homes have video games of one sort or another. Many universities now offer MFA degrees in video game design, and the Museum of Modern Art has added games to its permanent collection. Instead of herding into noisy arcades or gluing ourselves to a single boxy TV in the den, we now hide our heads in mobile games like Angry Birds (2009), Canabalt (2009), and Pokémon Go (2016). The flows of data that Pynchon mentioned have indeed augmented our reality.
But who exactly is playing all these video games? Reliable statistics are tough to come by. As of 2010, the average gamer was thirty-four years old and spent eight hours each week gaming. That is equivalent, of course, to a full day of work. I find it particularly fascinating that at that time 40% of all gamers were women. A 2015 study revealed that 49% of American adults play video games; among those between the ages of 18 and 29, 77% of men and 57% of women played. It was also recently found that women over 35 made up half of the video-gaming demographic. Yet the gaming community is still widely—and unfortunately—perceived as a boys' club.
In many respects, video games have surpassed movies in terms of gross income and popularity. The Warcraft franchise (1994–present) has reportedly earned in excess of $10 billion for its developer, Blizzard Entertainment. In 2014 in the United States alone, sales of video games on disc reached $5.47 billion. That does not include downloaded software, and that staggering sum was actually down 14% from the previous year; we're downloading more and therefore relying less on physical storage objects like CD-ROMs or DVDs. In that same timeframe, sales of downloaded digital games (such as from Apple's App Store and Valve's Steam service) increased by 11% and reached $1.2 billion. In 2015, American consumers spent $23.5 billion on video gaming, and more than half of those sales—56%—were digital downloads. Baseball might be America's national pastime, but video games have become a global obsession. To call games like Call of Duty: Advanced Warfare (2014) or Madden NFL 16 (2015) big business would be the understatement of the millennium. Of course, video games are big business, but they are also more than that—or they can be.
Many more of us than ever before are now spending our commutes, our cubicle hours, and our free time enraptured by games, from the minor and mindless but irresistible (Bubble Breaker (2003) and Peggle Blast (2014)) to the magisterial and totalizing (Fallout 4 (2015) and No Man's Sky (2016)). Whereas once we might have visited with friends, written a letter, or gone to the movies, now we find entertainment—and genuine human connection—through our TVs, our computers, and our phones. That reality alone shows the impact of video games on our lives and culture.
Less remarked upon are the benefits games can offer. Witness the documented effect Minecraft has on children's problem-solving abilities. Or what Jane McGonigal, in Reality Is Broken, calls "stronger social connectivity." Or even the ability of Pokémon Go to prompt us to get outside and explore the natural (albeit digitally augmented) world around us. And yet it remains easy to think of video games as childish and the playing of them beneath the dignity of reasonable, mature adults, who should be spending their free hours on more socially acceptable leisure activities, like getting drunk on cheap beer and watching football. Many persistent prejudices about the medium derive from the—inaccurate, as it turns out—term "video game" itself. Yet even if video games were mere playthings, they would hold value. In his 1938 Homo Ludens: A Study of the Play Element in Culture, the Dutch historian Johan Huizinga likened play to sacred or religious rituals of the past: "Formally speaking, there is no distinction whatever between marking out a space for a sacred purpose and marking it out for purposes of sheer play," he wrote. "The turf, the tennis-court, the chessboard and pavement-hopscotch cannot formally be distinguished from the temple or the magic circle."
Playtime can serve a purpose beyond entertainment, distraction, and edification; it is a valuable and necessary element of being human. Or, as Huizinga put it, "culture arises in the form of play, … it is played from the very beginning. Even those activities which aim at the immediate satisfaction of vital needs—hunting, for instance—tend, in archaic society, to take on the play-form." We are an innately play-oriented species. Play serves a valuable, ritualistic function. For us today, video games can provide a return to the rituals of that magic circle. The sociologist Roger Caillois's 1958 Man, Play and Games spoke to a "pure space" that we enter during playtime. "In effect, play is essentially a separate occupation, carefully isolated from the rest of life, and generally is engaged in with precise limits of time and place." The truly great video games, the ones of interest to this book, evoke that pure space and help us return to the magic circle of our human past. As Drexel University digital media scholar Frank J. Lee told me, "text-based games and current games are part of the larger storytelling and play that people have been doing since the beginning of time."
Video games are intended to be fun, but it does not follow that they are frivolous. Every new style of art has inspired the previous generation to bellyache, perhaps in part because change reminds us of our mortality. I've heard many laments about the time children and adults waste on video games, but since that day sitting agog in front of the TV with my nephews, I have understood that new technologies always meet with resistance and that games are an important subject for cultural and artistic analysis. After all, the early Sumerian poets who once recited their epics at great length and from memory likely bemoaned the newfangled clay tablets upon which the stories of Gilgamesh were suddenly being preserved.
In preparation for a recent transatlantic flight, I chose two video games to add to my iPad. I confirmed my password to pay for the downloads, which transferred a series of 1s and 0s representing funds I had never seen from my bank account to Apple's App Store. I waited all of thirty seconds for the games to download over my lousy Comcast home Wi-Fi network, and then there they were. One of my new apps—if it really belonged to me—allowed me to play Warren Robinett's Adventure (1979), or at least a version of it.
Not to be confused with the text-based Colossal Cave Adventure, Robinett's game was published by Atari, Inc. in 1979 for play on what we now think of as the Atari 2600 home console. It sold over one million copies, and remains one of the most important and iconic video games ever made. I was overjoyed to see it included in the Atari's Greatest Hits app and looked forward to mastering it 30,000 feet above Greenland. When the pilot finally announced that we were free to use electronic devices, I pulled out my iPad and discovered that the game looked great: the lines clean and straight, the colors bright, the movement responsive to the faintest swipe of my finger. Just seeing Adventure again after all those years—the geometric mazes and that pixel-dragon that looked a bit like a duck—brought back a flood of pleasant memories.
After five minutes I turned it off. By the time I arrived in Stockholm, I had deleted the app. Fortunately, the other game I had downloaded, Monument Valley (2014), turned out to be an incredible piece of digital storytelling and, as I discuss later in the book, one of the most affecting video games I have played. The re-developers of Adventure had created a lovely reproduction of the original, but the "aura," as Walter Benjamin would have put it, had vanished. It felt wrong to play such a brilliant game on such a mundane device, to see Adventure so polished and shiny but also sanitized and voided of personality.
That sense of being transported back to the 1970s lingered, however, so when I got home to Philadelphia a few weeks later I dusted off my fake wood-grained Atari 2600 and purchased a copy of Adventure on eBay so I could play the game as it was meant to be played. Instead of waiting thirty seconds, I had an entire week to contemplate the extent to which I've come to expect instant gratification. Then the game showed up, a strip of old circuitry embedded in a casing of mass-produced, molded plastic: a perfect combination of digital and analog technologies. I powered up the console, popped in the cartridge, and the game looked awful—which was exactly what I had hoped for.
The original version of Adventure—as it appeared on my old Dell-manufactured flat screen TV, after extensive finagling with different adaptors and wires—was blurry and clunky and not remotely as polished as the app. The resolution was hideous yet beautiful, in much the same way that hearing the pops and cracks of an old LP is beautiful. I loved playing Adventure that way like I love hearing Ben Webster on a scratchy old record instead of on remastered and lifeless CDs. The scratches and crackle, or in this case the pixelated blurs and wonky motion, are essential to my enjoyment.
In writing this book, I have whenever feasible played the original incarnation of each game under discussion. I dropped quarters into arcade cabinets, loaded 5¼-inch floppy diskettes into buzzing C=64 drives, and spun tiny 3-inch discs in a Nintendo GameCube. Without a single peep of complaint from Elivi, I installed a full-sized and obscenely loud Donkey Kong (1981) machine in our basement. However, with video games, and digital media in general, locating an original of a program can be futile. Newer games in particular are infinitely reproducible. It makes little sense to cling to some false notion, rooted in nostalgia, of an authenticity that never existed. For that reason, I did not lose sleep over my inability to play Spacewar! (1962) on a PDP-1 computer or Pong (1972) on an original arcade cabinet. I tried to be respectful of Benjamin's aura without being inexorably tied to it.
I have also focused my attention on the games that advanced the medium in creative ways. No work of history can be truly comprehensive, and this one is not. What you hold in your hands consists of a selective and ultimately subjective—and hopefully still corrective—survey. The contributions of women to video game history have been overlooked for too long, for instance. Margaret Atwood has written that, "In most conventional histories, women simply aren't there. Or they're there as footnotes. Their absence is like the shadowy corner in a painting where there's something going on that you can't quite see." Make no mistake: this is no conventional history. I have tried to celebrate some of the many brilliant contributions, otherwise often ignored, that women programmers and designers have made to the field of video games.
When I set off to write the book, I quickly and unexpectedly found myself learning about the origins of video games in the time of World War II and their parallel growth alongside and within the military-industrial complex. My research took me to a government laboratory and dusty junk shops, to design studios and arcades, to universities, and to museums that had recently added games to their collections. I met professors and professional gamers, scientists and hobbyists, critics and game makers themselves. I designed a university course about video games and taught it online, holding virtual student conferences inside World of Warcraft. And, yes, with the help of my nephews I even learned how to play Minecraft. In fact, I played enough to suffer from Minecraft Syndrome, the effect of seeing the objects around me in real life as composed of component blocks.
As Warren Robinett, the mind behind Adventure, has put it, "If you accept the idea that a new artform is emerging, then interviewing the genre-creators is equivalent, for a Classicist, to interviewing Homer; for an English professor, to interviewing Shakespeare." Self-aggrandizement aside, I could not agree more. I was fortunate, in a way, that many of the most important people behind games are still alive; my conversations with Robinett and Tim Schafer and other video game designers enlivened and complicated this book. Few of the people I met would agree on what makes video games important, or if they are art or not, but every last one shared an abiding—and contagious—passion for playing games.
And what is it exactly that I came to love about video games? The works of art I value the most—Inferno and Moby-Dick, Magritte's "The Treachery of Images" and Bartók's Concerto for Orchestra—always and necessarily resist definition; they allow me to question my own tastes and beliefs and values. Art cannot be pinned down like a butterfly to a taxonomist's board. Art helps me enter Huizinga's magic circle of ritual, where cordoned off from the workaday world I can find the mental and emotional bandwidth to better mediate my relationship between the real and the mystical or extra-real. My favorite video games do precisely that.
Before we hit START, I would like to thank you, the reader who has put down the controller long enough to pick up this book. You might find that I have ignored some popular games that you're excited about and that I've devoted what may feel like excessive attention to lesser-known games that I love. Please bear with me. Fortunately, as I discovered, there are enough video games, and enough kinds of video games, for everyone. The diversity of video games—and of video gamers—can be the medium's greatest strength. The gaming community we build together is entirely up to us. Let's play, shall we?
CHAPTER 1
epic origins
Every existent ounce of the glassine substance known as Trinitite originated at one moment and at one place on our planet. On July 16, 1945, as part of the Manhattan Project, scientists at Los Alamos National Laboratory in New Mexico successfully detonated the first nuclear bomb. The heat of the blast melted some of the desert's quartz sand and feldspar into glassy green chunks. Though it is now illegal to gather, a small amount made its way to collectors and curiosity seekers in the years immediately following World War II. I keep a sample of Trinitite on a bookshelf in my Philadelphia row house. It was a gift from my friend Hans, and I understand that it retains some amount of radioactivity.
Hans wrote, on the inside of the box in which it sits: "fused sand ('glass') from the first man-made atomic bomb." I have held in my hand one of the rarest and most obscenely frightening substances known to humankind. The nickel-sized object is smooth and ever so slightly pockmarked on one side and as abrasive as sidewalk cement on the other. It is lustrous and beautiful. When I attempted to chip off a small piece it scratched my thumbnail, but it did eventually crack. It possessed no noticeable smell or taste.
The name of the substance is derived from the codename for that first atomic detonation: Trinity. The physicist J. Robert Oppenheimer, head of Los Alamos, famously said that the test brought to mind the apocalyptic line from the Bhagavad Gita: "Now I am become Death, the destroyer of worlds." Another witness to the detonation was the physicist William A. "Willy" Higinbotham. As group leader of the electronics division at Los Alamos, Higinbotham was responsible for creating the timing circuits for the Manhattan Project. A little over a decade later, he invented the first video game.
The famous lines that Orson Welles added to Graham Greene's screenplay for The Third Man, released in 1949, addressed the historical correlation of art to war: "In Italy, for thirty years under the Borgias, they had warfare, terror, murder, and bloodshed, but they produced Michelangelo, Leonardo da Vinci, and the Renaissance. In Switzerland they had brotherly love, they had five hundred years of democracy and peace. And what did that produce? The cuckoo clock." For all its ravages, warfare has inspired innumerable artistic triumphs. So perhaps I should not have been surprised to learn that the invention of video games can be traced directly to World War II.
Higinbotham was born on October 25, 1910 in Bridgeport, Connecticut. His father, a Presbyterian minister, encouraged his interest in the sciences, and at age fourteen Willy began tinkering with radios. He graduated from Williams College in 1932 with a degree in physics and, unable to find a job during the Great Depression, he pursued his graduate studies at Cornell University. In 1941, as the United States appeared likely to enter the war, he went to work with the experimental physicist Robert Fox Bacher at the Massachusetts Institute of Technology, where he helped adapt early, analog computing technology for military use, including a high-altitude bombing system. A few years later, it was Bacher who recruited Higinbotham to Los Alamos.
One of the figures responsible for initiating the Manhattan Project was the engineer Vannevar Bush, who helmed the US Office of Scientific Research and Development. In July 1945, The Atlantic published Bush's essay "As We May Think," in which he attempted to plot a new, peacetime course for scientific discovery. "Machines with interchangeable parts can now be constructed with great economy of effort," he wrote. "The world has arrived at an age of cheap complex devices of great reliability; and something is bound to come of it." Several weeks after the publication date of that essay, the United States unleashed atomic devastation on Hiroshima and Nagasaki.
Higinbotham was one of the physicists who, as Bush wrote, was "thrown most violently off stride, who [had] left academic pursuits for the making of destructive gadgets, who [had] had to devise new methods for their unanticipated assignments." Higinbotham lost his brothers Philip and Frederick in the war and suffered terrible pangs of conscience about his role in the Manhattan Project. For the rest of his career, in fact, he threw himself into nuclear-nonproliferation efforts. He moved to Washington, D.C. to become the first executive director of the Federation of American Scientists, which sought to call attention to the humanitarian consequences of scientific research. Soon thereafter, he accepted a position at Brookhaven National Laboratory, founded by the Atomic Energy Commission in 1947 on the grounds of a former US Army base in east-central Long Island. Higinbotham took the job, he said, because he "wanted to be involved in instruments and also be at an institution that wouldn't complain if [he] continued to be active in arms control."
Beginning in 1958, at the height of the Cold War, Brookhaven instituted annual Visitors' Day celebrations so the public could learn more about the scientific discoveries their tax dollars were funding. By that time, Higinbotham had become head of the Instrumentation Division, the official purpose of which was to "develop state-of-the-art electronic instruments to help scientists collect and analyze data that would come from the big machines built for smashing the atom." For Visitors' Day, he created an interactive exhibition, one that would demonstrate the application of the government's technological innovations in a way that the general public could understand. Using a Donner Model 30 analog computer—and basing his work upon the calculation of ballistic-missile trajectories—Higinbotham created a two-player tennis simulation. Today we would immediately recognize what he called Tennis for Two as a precursor to Pong.
The era's analog computers differed from our current digital machines in many ways, and the distinctions would be the cause of debate among video game historians for decades to come. In a digital computer, such as the ones eventually used to program and play Spacewar! and every subsequent video game, a central processing unit (CPU) converts numbers to graphic representations on a display monitor. An analog computer, by contrast, does not get programmed so much as assembled: the organization and arrangement of the physical components themselves create the gameplay. Some gamers continue to believe, for that reason, that Tennis for Two was not a video game at all.
Higinbotham's blueprint-sized diagram for Tennis for Two depicted how a complicated network of physical circuits and relays would employ electrical and mechanical processes to simulate the flight of a tennis ball back and forth over a simulated net. With the schematic in hand, Higinbotham enlisted the engineer Bob Dvorak to assemble the computer and connect it to an oscilloscope, the globular screen of which was more commonly used to display the jagged lines of signal voltages. "By all accounts," according to a pamphlet published by the Instrumentation Division, "the game was a huge success. Willy Higinbotham was amazed that people were lined up completely around the gym to wait their turn to play." To appreciate the technological marvel of Tennis for Two, imagine trying today to create a playable version of Minecraft on a contraption made out of Lincoln Logs, wires, and a few 9-volt batteries hooked up to an Etch A Sketch.
In 1997, for the fiftieth anniversary of Brookhaven's founding, the human resources department organized a celebration for the lab's staff and their families. Like those early Visitors' Days in Higinbotham's time, it was also open to the public. To commemorate the occasion, Dr. Peter Z. Takacs, director of the Optical Metrology Lab in the Instrumentation Division, agreed to recreate Tennis for Two.
Praise
- "Believe it or not, all those hours playing Super Mario Bros. or Sonic the Hedgehog really meant something. Andrew Ervin takes a brilliant look at the effects of those games--which did not, in fact, rot our brains."—Rolling Stone
- "A fun and insightful analysis of the cultural, educational, and historical value of video games. Ervin deftly traces the evolution of our most interactive art form from Adventure to Minecraft, while offering riveting first-hand accounts from many of the men and women who made it all happen. Bit by Bit is an essential addition to every video game lover's library."—Ernest Cline, author of Ready Player One and Armada
- "Not many books about video games allow Denis Johnson to rub shoulders with Monkey Island or Vladimir Nabokov with Peter Molyneux. Ervin's taste in games is excellent, his points are thought-provoking, and his cultural omnivorousness (take note, aspiring game journalists) is thrilling. A terrific book."—Tom Bissell, author of Extra Lives and Apostle
- "Ervin brings a literary sensibility to his study... [he] makes an affable guide through the history of the medium... For me, the book's key statement is this: 'Today, if there is in fact a distinction between mass entertainment and the fine arts, it gets complicated more effectively by video games than any other medium.' Bit by Bit plumbs these complications with welcomed intelligence."—Washington Post
- "Andrew Ervin slaloms through their cultural and technological history, from physicist William Higinbotham's 1958 analog simulation Tennis for Two to Atari classics, arcade stalwart Pac-Man and the Warcraft franchise. Ervin even plays the original games, research that involves the installation of vintage computer drives and an 'obscenely loud' Donkey Kong machine. A vivid foray into alternative worlds." —Nature
- "An engrossing and necessary read."—Electric Literature
- "Literary and playful... Bit by Bit provides a fascinating exploration of the world of video games, their history and importance to modern culture." —Winnipeg Free Press
- "[Bit by Bit] is a contemplative ode to electronic entertainment...It's a personal journey that speaks volumes on how video games have grown, evolved, and multiplied to fill myriad roles over the years."—Publishers Weekly
- "An urbane, witty, passionate, and eminently literate history of video games from their infancy in the 1950s to today... Ervin, who gives equally satisfying treatment to game sounds, special effects, and music, is a terrific storyteller, and he provides profiles of dozens of game developers and fanatics."—Philadelphia Inquirer
- "A brisk, thoughtful tour of video game history. Ervin is an ideal guide... Bit by Bit might persuade holdouts just how awesome video games are."—Games World of Puzzles
- "It's unusual for a history of video games to feature multiple quotes from Rilke, references to philosophy and Zen Buddhism, and comparisons to great works of art. But that's exactly what Ervin serves up to support his compelling argument: video games can be art."—Booklist
- "Bit by Bit is the perfect video game book: it's part gamers' history, part history of games, and by a writer inclined to philosophical insight and literary reference. Extra hearts for a history that actually includes the contributions of women, too!"—Amber Sparks, author of The Unfinished World: And Other Stories and May We Shed These Human Bodies
- "Like spaceships or skyscrapers, video games are a collaboration of humans and machines, of art and commerce. One part flesh, one part metal, one part markets, one part truth. Andrew Ervin composes a winsome but measured portrait of games from all these pieces, bit by bit."—Ian Bogost, author of Play Anything: The Pleasure of Limits, the Uses of Boredom, and the Secret of Games
- On Sale: May 2, 2017
- Page Count: 304 pages
- Publisher: Basic Books
- ISBN-13: 9780465096589