Too Much of a Good Thing

How Four Key Survival Traits Are Now Killing Us


By Lee Goldman


The dean of Columbia University’s medical school explains why our bodies are out of sync with today’s environment and how we can correct this to save our health.

Over the past 200 years, human life expectancy has approximately doubled. Yet we face soaring worldwide rates of obesity, diabetes, high blood pressure, mental illness, heart disease, and stroke. In his fascinating new book, Dr. Lee Goldman presents a radical explanation: The key protective traits that once ensured our species' survival are now the leading global causes of illness and death.

Our capacity to store food, for example, lures us into overeating, and a clotting system designed to protect us from bleeding to death now directly contributes to heart attacks and strokes. A deeply compelling narrative that puts a new spin on evolutionary biology, Too Much of a Good Thing also provides a roadmap for getting back in sync with the modern world.



Copyright Page

In accordance with the U.S. Copyright Act of 1976, the scanning, uploading, and electronic sharing of any part of this book without the permission of the publisher constitute unlawful piracy and theft of the author's intellectual property. If you would like to use material from the book (other than for review purposes), prior written permission must be obtained by contacting the publisher. Thank you for your support of the author's rights.


Ever since I began practicing medicine, my family and friends have inevitably asked me about real and perceived health problems. Common questions always include: "Why is it so hard for me to lose weight?" "Do I really need to take my blood pressure medicine even if I feel okay?" "Should I take an aspirin every day?"

In thinking about these questions and others, I began to appreciate what seems to be a peculiar conundrum. Some of the key protective strategies our bodies have used to assure the survival of our human species for tens of thousands of years now cause many of the major diseases of modern industrialized societies.

In writing this book, my first goal is to emphasize the historical fragility of the human species and why we wouldn't be here today, let alone dominate the world, if it were not for fundamental survival traits such as hunger, thirst, fear, and our blood's ability to clot. My second goal is to explain why these hardwired survival traits are often now "too good"—not only more powerful than we need them to be to survive in the modern world but also so strong that, paradoxically, they've become major causes of human disease and death. My third and perhaps most important goal is to explain how the future may play out, including how we can continue to use our brains to influence that outcome.

Too Much of a Good Thing focuses on the four key human survival traits, without which we wouldn't be here today:

Appetite and the imperative for calories. Early humans avoided starvation by being able to gorge themselves whenever food was available. Now that same tendency to eat more than our bodies really need explains why 35 percent of Americans are obese and have an increased risk of developing diabetes, heart disease, and even cancer.

Our need for water and salt. Our ancestors continually faced the possibility of fatal dehydration, especially if they exercised and sweated, so their bodies had to crave and conserve both water and salt. Today, many Americans consume far more salt than they need, and this excess salt, combined with the same internal hormones that conserve salt and water, is the reason why 30 percent of us have high blood pressure—significantly increasing our risks of heart disease, stroke, and kidney failure.

Knowing when to fight, when to flee, and when to be submissive. In prehistoric societies, up to 25 percent of deaths were caused by violence, so it was critical to be hypervigilant, always worrying about potentially getting killed. But as the world got safer, violence declined. Suicide is now much more common in the United States than either murder or fatal animal attacks. Why? Our hypervigilance, fears, and worrying contribute to a growing epidemic of anxiety, depression, and post-traumatic stress—and the suicides that can result.

The ability to form blood clots so we won't bleed to death. Because of their considerable risk of bleeding from trauma and childbirth, early humans needed to be able to clot quickly and efficiently. Now, with the advent of everything from bandages to blood transfusions, blood clots are more likely to kill us than excessive bleeding. Most heart attacks and strokes—the leading causes of death in today's society—are a direct result of blood clots that block the flow of arterial blood to our hearts and brains. And long car rides and plane trips, unknown to our distant ancestors, can cause dangerous and sometimes fatal clots in our veins.

Well into the nineteenth century, each of these four traits specifically helped our ancestors survive as they tried to avoid starvation, dehydration, violence, and bleeding—which have been among the leading causes of death throughout human history. But now, amazingly, these same four traits collectively explain more than 40 percent of deaths in the United States, including four of the eight leading causes of mortality, and are directly responsible for more than six times the number of deaths they prevent. How could the very same attributes that helped humans not only survive but also dominate the earth now be so counterproductive?

This paradox is the essence of Too Much of a Good Thing. For more than 200,000 years and perhaps 10,000 generations, the world of our ancestors changed very gradually. Our genes, which define who we are, also evolved more or less on a parallel path, so our ancestors could adapt and thrive. Then, barely 200 years ago, human brainpower began to change our world dramatically, as the Industrial Revolution signaled the beginning of an entirely new era of transportation, electricity, supermarkets, and medical care. The good news is that life expectancy, which was little different in the early 1800s from what it was tens of thousands of years ago, has just about doubled since then, including more than a six-year gain just since 1990. But the bad news is that our bodies have had only about 10 generations to try to adapt to this new world. Our genes simply can't change that fast, and, as a result, our bodies are lagging behind our environment. Instead of dying from challenges against which our bodies were designed to protect us, we're now more likely to die from the protective traits themselves.

So what will happen next? There are three major possibilities. One is that everything gets worse—more obesity, diabetes, high blood pressure, anxiety, depression, suicides, heart attacks, and strokes—until those of us with these diseases don't live long enough to have children. Although this possibility may seem far-fetched, we already see obese children becoming diabetic adolescents who will be less likely to have children—let alone healthy children—than their nondiabetic counterparts. But somehow, we should be more than smart enough to avoid this doomsday scenario.

A second possibility is that we collectively will devote more time and effort to being healthier. We'll all eat better, exercise more, and embrace other virtuous lifestyle changes. Unfortunately, these self-help approaches, though sometimes successful on an individual basis, are notoriously unsuccessful across populations. The term "yo-yo dieting"—a reference to the fact that most people who lose weight usually gain it right back—is just one example of the tendency for many short-term successes to be counterbalanced by long-term failures.

The third possibility is that we'll take advantage of modern science—not in isolation but as a key complement to continued attempts to improve our lifestyles. High blood pressure requires medication; stomach surgery is the most successful way to deal with refractory extreme obesity; the chemical imbalances that cause depression often respond best to antidepressant medications; and an aspirin a day really does make sense for some of us. Ongoing scientific advances also hold great promise for future medications that could control our appetites for food and salt or that could safely reset our clotting systems. And with the decoding of the human genome, we're entering an era in which the specific genetic causes of modern diseases may be treated with medicines that target only the responsible gene. These advances augur a new era of personalized precision medicine: treatment specifically designed to address each individual's needs. This growing reliance on medications or even surgery shouldn't be dismissed as moral weakness but rather recognized as the sometimes necessary way to do what we can't do on our own—because our genes simply aren't built that way and can't change fast enough.

Too Much of a Good Thing addresses all these possible outcomes and potential solutions to our present predicament. Each of the first five chapters begins with a story that helps frame the modern situation, and then explains how our current health challenges are the direct result of historically successful human attributes. The final three chapters provide a blueprint for how we can and must continue to use our brains to get our genes and our bodies back into sync with the environment we've created.




How Our Bodies Became What They Are

Timothy Ray Brown, who was originally labeled the "Berlin patient" by doctors who tried to protect his anonymity, was in his twenties when he became infected with the human immunodeficiency virus (HIV)—the virus that causes AIDS—in the 1990s. Doctors successfully treated him with the usual HIV medications of that era, and he did well until he developed acute leukemia. His leukemia was initially treated with aggressive chemotherapy, the side effects of which required that he temporarily stop the anti-HIV medicines. Without these drugs, his HIV blood levels soared, an indication of persistent infection and susceptibility to full-blown AIDS. His doctors had no choice but to reduce his chemotherapy and restart the anti-HIV drugs. The only remaining option for treating his leukemia was a bone marrow transplant, which had about a fifty-fifty chance of curing it. Fortunately, his doctors found a compatible bone marrow donor, and Brown beat the odds—the bone marrow transplant cured his leukemia, and he remains leukemia-free more than five years later.

But the story doesn't end there. His doctors also knew, from studies of unusual people who were infected with HIV but never progressed to severe infection or full-blown AIDS, that just one mutation in a single gene, if inherited from both parents, can prevent HIV from entering cells that it otherwise has no trouble infecting. Unfortunately, less than 1 percent of northern Europeans and even fewer of the rest of us have this fully protective mutation. Remarkably, Brown's doctors found a compatible bone marrow donor who also had this unusual mutation. And the successful transplantation of those bone marrow cells didn't just cure Brown's leukemia—it also made him the world's first person ever to be fully cured of advanced HIV infection.

The first case of AIDS was reported in San Francisco in 1980. What followed was a series of remarkable scientific feats: HIV was confirmed to be the cause of AIDS by 1984; tests to screen for the virus were available by 1985; the first medication to treat it was formulated by 1987; and effective multidrug therapy began to turn HIV infection into a commonly treatable chronic disease by the mid-1990s.

Scientists also looked back and documented HIV in human blood as early as 1959, including in a small number of people who had died of undiagnosed AIDS in the decades before the first reported case. Based on genetic studies of the virus itself, scientists postulate that what we now call HIV originated as a slightly different virus in monkeys in central Africa and probably was first transmitted to humans sometime in the 1920s.

But let's imagine a different scenario: What would have happened if HIV had spread from monkeys to humans before the late-twentieth century advances in virology and pharmacology? HIV infection likely would have spread gradually but persistently throughout the world, especially because infected people are highly contagious for months or years before they develop symptoms. Of course, before the era of modern transportation, the spread initially would have been slower, but we still would have expected the virus to spread wherever humans had intimate contact with one another. If everybody infected with HIV subsequently got AIDS and died, the only remaining humans would either have been those rare people who had this protective mutation or people who were somehow sufficiently isolated or far enough from central Africa to avoid exposure.

The reality, though, is that nothing—no famine, infectious disease, or environmental catastrophe—has ever wiped out the human race. To understand how we've survived, we need to understand where we came from and how we got here. We'll start by going back in time to look at our ancestors.


My parents always told me that they named me after my maternal great-grandfather Louis Cramer. But when my wife decided to trace both our families' genealogy, census records showed that his name really was Leib Chramoi. The Internet, the most transformative human invention of my lifetime, shed new light on my genealogy and emphasized how much the world has changed in just three generations.

But how about our collective genealogy over thousands of generations? How could our ancestors have survived the myriad threats that antedated HIV but that potentially could have wiped out 99 percent or more of us long before modern medications were available? And going back even further, why did we survive while most species didn't?

If we really want to understand how we got from the first anatomically modern humans—known as Homo sapiens, based on our genus (Homo) and our species (sapiens)—to my great-grandfather Leib, we'll have to understand how our genes define who we are. And in telling that story, we'll also learn how those same attributes are now a mixed blessing—sometimes still saving our lives but increasingly causing modern maladies.

The earliest archaeological evidence for Homo sapiens begins about 200,000 years ago in Africa. Some adventurous Homo sapiens apparently first moved out of Africa and into the Arabian Peninsula about 90,000 years ago before dying or retreating back to Africa during the Ice Age because of the cold climate and advancing glaciers. But another group of Homo sapiens tried again around 60,000 years ago, migrating first to Eurasia, then spreading to Australasia about 50,000 years ago and later to the Americas about 15,000 years ago.

Archaeological remains also tell us that we weren't the only Homo species ever to walk the earth. Based on the dating of fossils, paleontologists estimate that the earliest member of the Homo genus, called Homo habilis, dates back to around 2.3 million years ago. By 1.8 million years ago, Homo ergaster in Africa, then Homo erectus in Africa and Eurasia, were as tall as modern humans, although with somewhat smaller brains. Between 800,000 and 500,000 years ago, other Homo species in Africa, Europe, and China had skull sizes and presumably brains that were approximately as big as ours. Some of their descendants, including Neanderthals and Denisovans, appeared about 350,000 years ago, followed by the first Homo sapiens roughly 150,000 years later.

Although paleontologists used to think that each successive Homo species replaced the prior, "less human" one and that each more modern species inherited the earth from its predecessor, the reality isn't so simple. Neanderthals and humans, for example, probably overlapped for more than 150,000 years, and we know from fossil remains that humans, such as the cave-painting Cro-Magnons in France and Spain, lived near or even with Neanderthals beginning about 50,000 years ago and continuing for perhaps 10,000 years. Genetic evidence suggests that at least some interbreeding occurred pretty quickly and that Neanderthal-derived genes now comprise about 2 percent of the DNA of modern Europeans and Asians. And since we don't all have the same Neanderthal genes, it's clear that the interbreeding wasn't a one-time phenomenon. We don't know the functions of most of these Neanderthal genes, but as we'll show in chapter 2, they seem to have helped our ancestors adapt better to their new, out-of-Africa environments.

The Denisovans probably lived in Siberia as recently as 50,000 years ago. Based on DNA recovered from a Denisovan girl's tiny pinkie finger bone, scientists conclude that Denisovans interbred with Neanderthals as well as with humans. For example, today's Tibetans, Australian aboriginals, and Melanesians have some Denisovan DNA. By comparison, Africans have substantially less Neanderthal DNA and no detectable Denisovan DNA—but about 2 percent of their DNA apparently comes from yet another unknown prehistoric Homo group that lived in Africa about 35,000 years ago.

Scientists estimate that the Neanderthals across Europe and Asia never exceeded a population of about 70,000 and more likely peaked at about 25,000. And the Denisovan population likely was no larger. As the glaciers receded and the world warmed up, humans eventually outcompeted the brawnier but slower-footed Neanderthals as well as the Denisovans to become the only surviving Homo species by around 40,000 years ago.

On the one hand, it seems easy to assume that humans survived simply because we're smarter than all other animals. But on the other hand, it's pretty remarkable. It's true that our brains are big, but dinosaurs once dominated the earth despite their proportionally tiny brains. To put things into perspective, let's look at it another way: What percentage of species that ever inhabited the earth is still in existence today? The answer: about one out of 500—just two-tenths of 1 percent.

Yes, we're here today because of the ultimate triumph of our brainpower, but our survival has always been contingent on our brawn—not so much our raw muscular strength but rather all the inherent traits that helped our ancestors survive and even flourish. Our brain's capacity for art, science, philosophy, and technology never could have been realized if our bodies hadn't been vigorous enough to survive in challenging and even hostile environments. Our ancestors had to be able to gather nuts and berries, hunt and kill wild animals that were often bigger and faster than they were, and survive the Ice Age, not to mention droughts and infectious plagues.

In the digital age, a nonathletic, antisocial nerd with poor eyesight and bad asthma can become a billionaire. But for most of human existence, we required robust physical attributes just to survive, not to mention compete for the mate who would help pass our genes on to future generations.

We survived against these daunting odds because of the attributes and resilience of the human species—what we're made of and how our bodies have evolved and adapted over the course of about 200,000 years and 10,000 generations. We're here because no prehistoric predator, climate disaster, or HIV-like infection ever wiped out our ancestors.


Year in and year out, men and women probably always competed for mates mostly in terms of their physical attributes—not unlike the prom king and queen or the football captain and the head cheerleader. But survival of the fittest doesn't mean the survival of the biggest, fastest, or prettiest. Although these physical traits help, they're just the beginning of the story.

We're here rather than the Neanderthals, the Denisovans, or the never-born offspring of humans who failed to procreate because our own ancestors were fitter—they had the wherewithal to meet the earth's environmental challenges and to outcompete others for food, water, and mates. Survival of the fittest explains how natural selection works: over the course of generations, people with the "fittest" genetic attributes live long enough to have the most children, who themselves survive to have more children, and so on. Simply put, if you and your descendants survive and procreate, your DNA will be perpetuated. And if you don't, your DNA, just like your family name, won't.

Our DNA resides on 23 pairs of chromosomes—one member of each pair inherited from each parent. In 22 of these pairs, each of the two paired chromosomes has similar—but not absolutely identical—DNA. The 23rd pair defines our sex—girls inherit an X chromosome from each parent, whereas boys get an X chromosome from their mothers and a corresponding (but smaller and very different) Y chromosome from their fathers.

Each of our chromosomes is composed of a varying amount of DNA, which in aggregate comprises about 6.4 billion pairs of nucleotides, with about 3.2 billion pairs coming from each parent. These 6.4 billion pairs, commonly called base pairs, can be thought of as the letters that make up the words (genes) that define who we are.

Our chromosomal DNA collectively contains about 21,000 genes that code for corresponding RNA, which then codes for each specific protein that is critical for our bodies to function normally. Surprisingly, these 21,000 genes comprise only about 2 percent of our chromosomal DNA. Another 75 percent or so of our DNA contains about 18,000 more—and generally larger—genes that don't tell RNA to code for proteins but instead send signals that activate, suppress, or otherwise regulate the protein-coding genes or the functions of their proteins. The remainder of our DNA, about 20 percent, doesn't code for any currently known proteins or signals and is sometimes called junk DNA, although scientists may well discover important roles for it in the future.

Since we have two sets of chromosomes, we normally might expect to have two identical versions of each of our 21,000 genes, one inherited from each parent. But although each half of a child's genes—and his or her DNA—should be an exact replica of the DNA of the parent from whom it came, even the best-programmed computer can make a rare typographical error. When DNA replicates, an imperfection—or mutation—occurs only once every 100 million or so times. As a result, each child's DNA has about 65 base pairs that differ from those of her or his parents. Over time, these mutations add up. All humans are about 99.6–99.9 percent identical to one another, but that still means we each have at least six million base pairs that can differ from other humans. These mutations define us as individuals and also show how our human species has evolved over generations.
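The round numbers above can be checked with a few lines of arithmetic. A quick sketch, using only the chapter's own figures (6.4 billion base pairs, roughly one copying error per 100 million):

```python
# Rough check of the mutation arithmetic quoted above.
# All inputs are the chapter's round numbers, not precise measurements.

base_pairs = 6.4e9          # total base pairs per person (both chromosome sets)
error_rate = 1 / 100e6      # ~1 replication error per 100 million base pairs copied

# New mutations per child: every base pair is copied once, each with a tiny
# chance of a "typo," so the expected count is just the product.
new_mutations_per_child = base_pairs * error_rate
print(f"new mutations per child: ~{new_mutations_per_child:.0f}")  # ~64

# Even 99.9 percent identity leaves 0.1 percent of 6.4 billion base pairs free to differ:
differing_pairs = base_pairs * (1 - 0.999)
print(f"base pairs that can differ: ~{differing_pairs / 1e6:.1f} million")  # ~6.4 million
```

The ~64 expected new mutations per child matches the "about 65" in the text, and the ~6.4 million differing base pairs matches the "at least six million."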

Mutations themselves are random events. But their fate—whether they spread, persist, or disappear—in future generations is determined by whether they're good (advantageous), bad (disadvantageous), or indifferent (neutral). In natural selection, an initially random genetic mutation is perpetuated if it's sufficiently beneficial to you and your children. By comparison, disadvantageous mutations are usually eliminated quickly from the population even if the child survives. Slowly but surely—across 10,000 or so generations of humans—our genome has changed substantially because of the random appearance but selective perpetuation and spread of beneficial mutations.

Neutral mutations don't spread as widely as beneficial mutations, but they can hang around and accumulate over time. Some neutral mutations are totally unimportant and invisible—our DNA code changes, but we still make the identical protein. It's the same principle as if we were to spell a single word two different ways—such as gray and grey or disk and disc—without any alteration in meaning. Some neutral mutations change us—determining, for example, whether a person's chin is dimpled or not—but have no survival impact. But some currently neutral mutations might theoretically become advantageous or disadvantageous in the future. Imagine, for example, a mutation that could make us more or less sensitive to high-dose radioactivity, which isn't a problem now but would become critical if there were a thermonuclear war.

Every now and then, a new advantageous trait might be so absolutely critical for the ability to survive an otherwise fatal challenge—such as an epidemic of infectious disease—that it will become the new normal almost immediately, because everyone without it will die. But most beneficial genetic changes simply confer a relative competitive advantage, perhaps only in limited geographical locations or in specific situations.

Not surprisingly, the rate at which a genetic mutation spreads depends on how much relative advantage it brings. The spread will be slow at first and then speed up as it disperses more broadly within a closed population. It's like letting a winning bet ride: one chip becomes two, two become four, and the stack doubles with every round. Once something "goes viral," it increases rapidly because every previous carrier becomes a transmitter—whether we're talking about infectious diseases or a favorable mutation.

Imagine, for example, a mutation that carries a 25 percent survival advantage for you and your children if you get it from both parents and half that advantage if you get it from one parent. This mutation will spread to less than 5 percent of the population in about 50 generations (circa 1,000 years) but to more than 90 percent of the population in 100 generations (2,000 years) and to essentially the entire population by 150 generations (3,000 years). By comparison, if a single new mutation carries just a 1 percent advantage, it will spread more slowly—but it will still spread to nearly 100 percent of a population of a million people in about 3,000 generations, or around 60,000 years. Interestingly, these calculations don't depend much on the size of the population. For example, it will only take about 1.5 times as long for a mutation to spread throughout an entire population of 100 million people as it would take for it to spread to a population of 10,000 people. Slowly but steadily and inexorably, natural selection has defined which genes—and, as a result, which people—inhabit the earth.
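The S-shaped spread described above can be sketched with a textbook selection model. This is a deliberate simplification of my own, not the author's calculation: it assumes the advantage is additive (two copies of the mutation give the full advantage s, one copy gives s/2, matching the "half that advantage" in the text), random mating, and no random drift.

```python
# Simplified, deterministic model of how a beneficial mutation spreads.
# Assumptions (mine, for illustration): additive fitness, random mating,
# no drift. The point is the S-shaped curve: slow at first, then fast.

def next_freq(p: float, s: float) -> float:
    """One generation of natural selection on mutant-allele frequency p."""
    w_aa = 1 + s        # fitness with two copies of the mutation
    w_ab = 1 + s / 2    # fitness with one copy
    w_bb = 1.0          # fitness with no copies
    mean_w = p * p * w_aa + 2 * p * (1 - p) * w_ab + (1 - p) * (1 - p) * w_bb
    # Marginal fitness of the mutant allele, weighted by its possible partners:
    w_a = p * w_aa + (1 - p) * w_ab
    return p * w_a / mean_w

def generations_to_reach(target: float, s: float, pop_size: int) -> int:
    """Generations for one new mutant copy to reach a target frequency."""
    p = 1 / (2 * pop_size)   # one new copy among 2N allele slots
    gens = 0
    while p < target:
        p = next_freq(p, s)
        gens += 1
    return gens

# A 25 percent advantage in a population of 10,000: it takes tens of
# generations to reach even 5 percent, but not many more to sweep to 90 percent.
print(generations_to_reach(0.05, 0.25, 10_000))
print(generations_to_reach(0.90, 0.25, 10_000))
```

Run with different values of s and you can see the chapter's other point as well: a 1 percent advantage spreads far more slowly than a 25 percent one, yet still sweeps through the population eventually, and the answer depends only weakly on population size.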


If every one of us shares a particular gene, it's hard to know what less advantageous gene it may have replaced. But we can get a good appreciation for natural selection by looking at genes that are common or even ubiquitous in some parts of the world, where they presumably have a strong advantage, and rare or even nonexistent in other parts of the world, where they have no advantage or even are deleterious. Two examples are (1) worldwide variations of skin color and their link to our need for strong bones and (2) lactose tolerance and its link to the domestication of animals.

Strong Bones

Adult humans have 206 discrete bones, somewhat fewer than children have, because some bones fuse together as we grow. In aggregate, our bones make up about 13 percent of our body weight. In order to have healthy bones, we need to have their key building block in our diets: calcium. We also need an activated form of vitamin D to help us absorb calcium in our intestines and to regulate its incorporation into our bones.


  • "In this highly original and profound book, Lee Goldman describes how the same physical traits that evolved to ensure our survival are now working against us. For anyone interested in their own and their family's well-being, Too Much of a Good Thing is a must read!"—Winner of the 2000 Nobel Prize in Physiology or Medicine, University Professor, Department of Neuroscience, Columbia University, author of The Age of Insight and In Search of Memory
  • "A fascinating look at the health problems that plague us, illuminating why they happen and what to do about them."—Jerome Groopman, M.D., and Pamela Hartzband, M.D., Harvard Medical School, authors of Your Medical Mind: How to Decide What Is Right for You
  • "This book, written from a deeply expert yet broad medical viewpoint, sets current medical challenges into their larger contexts of our human history and biological pre-history, to provide a crisply related and refreshingly clear-eyed perspective on much that ails us these days. And throughout the book, I also enjoyed the fascinating snippets on topics ranging from platelets to percentages of paleolithic food components to polyandry to presidential obesity."—Elizabeth Blackburn, winner of the 2009 Nobel Prize in Physiology or Medicine

On Sale: December 8, 2015
Page Count: 288 pages
Publisher: Little, Brown Spark

Lee Goldman

About the Author

Dr. Lee Goldman is dean of the medical school at Columbia University. An internationally renowned cardiologist, he developed the Goldman Criteria (a set of guidelines for healthcare professionals to determine which patients with chest pain require hospital admission) and the Goldman Index (which predicts which patients will have heart problems after surgery). He’s the author of more than 480 medical articles and also the lead editor of Goldman-Cecil Medicine, the oldest continuously published medical textbook in the U.S.

Learn more about this author