The Longevity Revolution
The Benefits and Challenges of Living a Long Life
The U.S. has not made a research investment in aging. Only eleven medical schools out of 145 have geriatrics departments, compared to England, where geriatrics is the number two specialty. We have not solidified private pension plans or strengthened Social Security to ensure that people do not outlive their resources. In this urgent and ultimately optimistic book, Dr. Butler shows why and how we must re-examine our personal and societal approach to aging right now, so that the boomers and the generations that follow may have a financially secure, vigorous, and healthy final chapter of life.
Books by the Author
Birren, J.E., R.N. Butler, S.W. Greenhouse, L. Sokoloff, and M.R. Yarrow. Human Aging, 1963.
Butler, R.N. and M.I. Lewis. Aging and Mental Health, 1973; with Sunderland, T. (5th ed.), 2003.
Butler, R.N. Why Survive? Being Old in America, Pulitzer Prize, 1976.
Butler, R.N. and M.I. Lewis. Love and Sex after Sixty, 4th ed., 2002.
Butler, R.N. and A.G. Bearn (eds.). The Aging Process, 1985.
Butler, R.N. and H.P. Gleason (eds.). Productive Aging, Enhancing Vitality in Later Life, 1985.
Warner, H., R.N. Butler, E. Schneider, and R. Sprott (eds.). Modern Biological Theories of Aging, 1987.
Kent, B. and R.N. Butler (eds.). Human Aging Research, 1987.
Bianchi, L., P. Holt, O.F.W. James, and R.N. Butler (eds.). Aging in Liver and Gastrointestinal Tract, 1987.
Butler, R.N., M. Schechter, and M. Oberlink (eds.). The Promise of Productive Aging, 1990.
Butler, R.N. and K. Kiikuni (eds.). Who Is Responsible for My Old Age? 1993.
Butler, R.N. and J.A. Brody (eds.). Delaying the Onset of Late Life Dysfunction, 1995.
Fillit, H.M. and R.N. Butler (eds.). Cognitive Decline: Strategies for Prevention, 1997.
Butler, R.N., L.K. Grossman, and M. Oberlink (eds.). Life in an Older America, 1999.
Butler, R.N. and C. Jasmin (eds.). Longevity and Quality of Life, 2000.
To the Memory of My Late Wife
Myrna Irene Lewis
To Our Daughter
Alexandra Nicole Butler
“Man can will his future.”
René Dubos (20th Century Biologist, Scientist, and Author)
(Greek Poet, 5th Century)
In fewer than one hundred years, human beings made greater gains in life expectancy than in the preceding fifty centuries. From the Bronze Age to the end of the nineteenth century, life expectancy grew by only an estimated twenty-nine years—from about twenty to just under fifty years. But since the beginning of the twentieth century in the industrialized world, there has been an unprecedented gain of more than thirty years of average life expectancy from birth, to over seventy-seven years of age. The aging of populations is occurring rapidly throughout the world. By 2025, the number of people in the United States age sixty-five and older will nearly double. Old age, once seen in 2 to 3 percent of a population, is now common and favors women, who outlive men by over five years in America today. Moreover, nearly 20 percent of the gain in life expectancy now applies to those sixty-five years of age and older.
I call this unprecedented demographic transformation the Longevity Revolution.1 Once the privilege of the few, longevity has rapidly become the destiny of the many in both the developed and developing worlds. How did it come about that the fulfillment of ancient hopes to extend life—with genuine possibilities of more to come—has not been welcomed with total enthusiasm?
People of good will and deep concern over the future of our country, and other nations, have asked tough questions about the impact of the aging of our population and its advancing longevity. Can we afford old age as a society? Will Social Security and Medicare collapse under the pressure of growing numbers of retirees? It is altogether good that corporate leaders, economists, politicians, and commentators are raising these crucial issues. However, the unfortunate outcome of such criticism has been the spread of fear and gloom among the general public.
Despite this pervasive ambivalence, I remain optimistic. We have the tools to take advantage of this exceptional demographic shift. But it will require nothing less than a total transformation of both the personal experience of aging and of cultural attitudes.
I wrote The Longevity Revolution for the thoughtful public. Its central purpose is to describe the origins, challenges, and adjustments to advancing longevity and the aging of populations and to question contemporary assumptions about late life. Because I believe in the activism of an enlightened citizenry, the general thrust of this book is toward an agenda for action and the presentation of a body of knowledge to support it.
The book is divided into six sections. The first part describes the origins of the Longevity Revolution. Part II outlines the major challenges of a longer life, including ageism, the changing nature of the family, and the various disorders of longevity.
Part III offers a detailed overview of biomedical science related to aging. But doing science takes money, lots of money, so we will begin with a discussion of the pressing need for funding. The devastating reality of Alzheimer’s disease is one of the greatest challenges facing scientists today. Despite advances in neuroscience, we do not know the causes of this disease nor have we discovered how to curb its progression in any meaningful way. We must launch a colossal public-private research initiative to combat Alzheimer’s disease if we are to realize the full potential of a longer life. Finally, this section covers the basic biology of aging and longevity, from the evolutionary theories of aging to the discoveries that appear so promising.
Part IV is the heart of the book. Here I will offer a range of solutions to the challenges raised in previous chapters. I will discuss how societies can improve health promotion and health care as well as how they can finance these added years of life. The United States has a long way to go. Although Americans are living longer than ever, having reached 77.9 years on average, American life expectancy has slipped over the last two decades from eleventh place to forty-second. This puts the richest country in the world behind Jordan, Singapore, and the Cayman Islands, among other countries. The United States has not made a research investment in aging. Only eleven American medical schools out of 145 have geriatrics departments; in England geriatrics is the number two specialty. Americans need a health program rather than a disease program.
All of this, to be sure, is an enormous undertaking and will require the development of a vigorous politics of aging and longevity. In particular, the baby boomers must exercise their political power to force changes in policy.
In Part V, I discuss some necessary cautions. There is a staggering inequality of longevity around the world, which is a serious obstacle to globalization. In Sierra Leone, a child can expect to live to be about thirty-four years old. In Japan, which has the world’s highest life expectancy, a person will likely live to be eighty-two. Moreover, unless we deal with significant threats—from industrial pollution to diseases ranging from tuberculosis to AIDS to the epidemic of obesity—we could lose the longevity we have gained.
The final part of the book discusses quality-of-life issues and the future of longevity. If we continue to add years to our life span, will we make good use of them? Will the added years bring about greater maturity in our personal relationships and our relationship to the world?
Through this book, I hope to contribute to cultural, scientific, and social thought and, most of all, to encourage a national discussion about the challenges posed by the Longevity Revolution. My intentions are immediate and practical. The so-called baby boomers—the seventy-six million Americans born between 1946 and 1964—are, by some counts, the largest generation in American history. Their numbers surprised everyone, and consequently there were not enough diapers for them as babies, not enough schools for them as children, and not always enough good jobs for them as adults. In 2011, the oldest of this generation will turn sixty-five, and many worry that there will not be enough resources for them in old age.
I believe that the baby boomer generation is both a generation at risk for unhappy old age and the key to transforming the character of old age in America. The baby boomers are discovering old age through their parents, and they want a financially secure, vigorous, and healthy final chapter to life. They want to age better than older people do today.2 And, since the fields of medicine and biological science, among others, helped create the Longevity Revolution, they must contribute to and make the adjustments necessary to accommodate and take advantage of this new phenomenon.3
The baby boomers are an enormously influential interest group. They can transform what it means to live a long life. The American penchant for crisis management rather than foresight and design will be challenged, I hope, by a thoughtful analysis of the risks and potential rewards of meeting longevity head on, and sooner rather than later. The baby boomers should not have to turn gray of head before we notice that we haven’t made room for them in America’s land of old age. At the same time, society must also prepare for generations X and Y, who in turn must do their part. (See Table 1.)
If we are to successfully meet the challenges of the Longevity Revolution we must have the audacity to question conventional wisdom. This book refutes many long-held beliefs about the effects of an aging population on society. It is not true, for example, that:
• Decreased birthrates are disadvantageous.
• Welfare state-type social protections are unsustainable.
• The aging population accounts for rising health costs.
• Excessive medical costs are associated with the end of life.
• The AARP is the most powerful lobby in Washington.
• Age prejudices have been ended by laws and legal actions.
• Older workers are unproductive.
• Old people receive more public and private support than children and youth.
While there are radical social transformations to be made if we are to make good use of the extra thirty years granted us, there has also been a healthy growth in science. It is what got us here, and it promises still greater longevity. Scientific advancements should and will add vigor and health throughout life, and not just at its end. The aging population increasingly consists of active, vigorous, robust older people. This trend should not be taken for granted, but it can continue, and it should be celebrated. Above all, I hope that this book will help convince people that our increased longevity constitutes a supreme achievement.
PART I: INTRODUCTION
WHAT IS THE LONGEVITY REVOLUTION?
no Society; and which is worst of all,
continual fear, and danger of violent death;
and the life of man, solitary, poor, nasty,
brutish, and short.
Leviathan, Part I, Chap. XIII (1651)
Life through much of human history was indeed brutish and short—humans lived barely long enough to reproduce themselves. For the mere survival of the human race, a proportion of individuals had to live long enough to give birth and rear their young (Figure 1.1). Yet, it is sobering to note that, according to archeological estimates, half of all Neanderthals (the archetypal caveman, living one hundred thousand to thirty-five thousand years ago) and Upper Paleolithic Homo sapiens (beginning forty thousand years ago and including Cro-Magnon man) died by the time they were twenty, with only a few living beyond age fifty.
The Cro-Magnon era brought longer life expectancy for some, with the rare individual living beyond age sixty. This fledgling longevity, sporadic as it was, became possible as humans began to work together to create a better standard of living. Although they were less muscular, Cro-Magnons eventually replaced the intellectually outpaced Neanderthals. However, like their predecessors, the vast majority of Cro-Magnons continued to die at an average age of eighteen to twenty.
Before I continue, it is important to define the terms life expectancy and life span and distinguish between them. Life expectancy is the average number of years that each sex can expect to live, under specific conditions. Life span is the genetically determined length of life of a specific animal species under the best of environmental circumstances. It probably increased during early hominid development but has not, in all likelihood, increased since that time. Life expectancy is more malleable and is dependent upon a variety of factors that can change quickly, such as the conquest of diseases. For example, between 1900 and 2000, life expectancy in the United States increased by over thirty years.1
Figure 1.1—Average Length of Life from Ancient to Modern Times
Although early humans had bodily defenses, their immunities were probably specific to local pathogens, so when our forebears traveled they were exposed to new infectious diseases to which they lacked immunity. Illness was largely a mystery to be accounted for by spirits, gods, evil, and retribution, or, as Euripides wrote in Medea, “A throw of chance—and there goes Death bearing off your child into the unknown.” Nature or the gods were blamed for accidents, plagues, pestilence, famine, and even wars.
The prospects for health and survival brightened a bit as hunting, fishing, and food gathering progressed to include the cultivation of plants and the domestication of animals during the Neolithic era. The Fertile Crescent was the site of the first technology essential to public health. Reliable fire-making techniques made cooked food and heated water possible. The development of pottery in the Neolithic period advanced more hygienic and convenient storage of food and water and the disposal of waste and garbage. Permanent population sites became common, and the overall number of human beings greatly increased. Nonetheless, the length of life that could be expected by any individual changed little, even though many more people were alive.
The Bronze Age opened the door to the first real improvements in human longevity, such as the manufacture of metal tools and weapons made of bronze, the rise of urbanization, the specialization of labor, the exploration and colonization of new territories in search of raw materials, and undoubtedly the production of surplus food. These developments provided the social and environmental supports for increases in life expectancy. The Iron Age (around 1200 BC) continued the trend toward permanent settlements, laying the foundation for social organization and advancing agriculture through the use of iron implements.
Imperial Rome at its height brought a relatively high standard of living and health to its more than one million inhabitants. Life expectancy was twenty-five years; however, it is important to note that this number includes a very high infant mortality rate. Those who survived beyond childhood had an average life expectancy of forty.2 But by AD 180 Roman culture began its decline, and nothing remotely like it appeared again until the eighteenth and nineteenth centuries in Europe. Scholars believe there was a dip in life expectancy after the decline of the empire.
As humankind moved from prehistory to the early modern era of Western civilization,3 life continued to be fragile. Infant and childhood mortality and the random and frequent deaths of adults—especially of women during childbirth—from infections, disease, and accidents were the norm. These commonplace and expectable deaths were punctuated by the devastations of plagues, failed harvests, famines, and scourges like scurvy and beriberi that seemingly sprang from nowhere. Typically, epidemics broke out as people became so weakened by hunger and malnutrition that they were predisposed to disease.
Although plague was first recorded in Athens in 430 BC, and five thousand died daily in Rome in an epidemic in the third century AD, the most infamous plague occurred in the late Middle Ages. The Black Death, as the bubonic form of the plague was known, began in Constantinople in 1334 and spread throughout Europe from 1348 until 1351. In fewer than twenty years, it is estimated to have killed perhaps three-quarters of the population of Europe and Asia.
It was not until Shibasaburo Kitasato and Alexander Yersin discovered the plague bacterium in 1894 that it became clear that the disease was carried by fleas from rats to humans and proliferated in dirty, garbage-strewn living conditions. Plague continued into the nineteenth century in Europe; the last major epidemic was in India in the early twentieth century, resulting in ten million deaths.
Epidemics continued into the twentieth century. Tuberculosis, known variously as consumption or the white plague, was particularly rampant and deadly. As late as 1930, eighty-eight thousand people in the United States died of tuberculosis. Although Robert Koch, a German physician and bacteriologist, discovered the tubercle bacillus in 1882, the disease was not controlled until the 1940s when streptomycin and other drugs became available.
Many scholars consider smallpox to have been more significant in its effect on populations and political developments than even the Black Death, because it struck all classes of society. Edward Jenner, an English physician, demonstrated how it could be prevented by a vaccination with cowpox virus. His discovery laid the foundation for the sciences of modern immunology and virology as well as the eventual elimination of smallpox. The last outbreak occurred in the 1970s in Somalia, where it was quickly suppressed. In May 1980 the World Health Organization officially declared its global eradication, marking perhaps the world’s greatest public health achievement. Smallpox, a scourge throughout history, is now extinct in nature.
Polio could be the second disease of epidemic proportions to become extinct worldwide. As with smallpox, humans are its only known natural host. During its peak, from 1943 to 1956, polio infected some four hundred thousand Americans and killed about twenty-two thousand. The disease began to decline rapidly in the United States in 1955, after a mass immunization program with the Salk vaccine, followed by the Sabin vaccine in 1961.
Influenza, an old enemy, made its most spectacular showing between 1917 and 1919, killing at least twenty-one million worldwide and infecting half the world’s population. Twice as many people perished in a few months’ time as had been killed in World War I, with people dying faster than from any other disease. Half a million died in the United States. Eventually, the pandemic ended, probably because most survivors developed antibodies, producing what might be called a herd immunity and leaving few people for the virus to attack.
In assessing the impact of disease over the centuries, or even over the first half of the twentieth century, it is important to remember that death was not the only consequence. Permanent disability was widespread in children who survived the so-called children’s diseases, such as whooping cough and German measles, and disability hastened death. In addition to the physical trauma, one can only imagine the emotional anxiety and fear that families felt, especially for their children, when diseases of mysterious origin, and for which no treatment was known, struck at random or with chilling predictability during epidemics. The life of a child was precarious; that of an adult only somewhat less so.
Nonetheless, the Industrial Revolution and the wealth it generated brought significant increases in longevity.4 Beginning in the middle of the eighteenth century in England, Europe was transformed from a rural, agricultural, and handicraft economy to one dominated by mass production of manufactured goods, improved agriculture, and wider distribution. It became possible to feed a much larger population, many of whom were now working in urban areas. The significance of this transformation cannot be overstated. Robert Fogel, economist and Nobel Prize winner, estimates that prior to the Industrial Revolution in France and England, about one-fifth of working class people had a calorie intake that was inadequate to sustain them and that during the eighteenth and even the nineteenth century there was widespread and chronic malnutrition.5 Fogel notes that the increase of longevity and stature6 over the past two hundred years was due to the availability of more food, and he introduced the concept of “technophysio evolution” to describe these changes.7
In response to epidemics of yellow fever, cholera, smallpox, typhoid, and typhus, communities began to recognize the benefits of organized efforts to address health issues. Predating the germ theory of disease, social reformers, motivated by moral concern, contributed critically to public health measures.8 In 1866, New York created the first state health department, with local boards in each town mandated to monitor serious health problems and attend to unsanitary living conditions. Other states followed, and organized public health efforts began.
Another element that contributed to the Longevity Revolution really began in the bedrooms of Europe (first, notably, in France) in the nineteenth century, when couples started to limit the number of children they conceived by purposefully abstaining from sexual intercourse in order to save the women from dying in childbirth. The resulting decline in birthrates produced two very important changes: the proportion of older persons in the population increased relative to other age groups, and the longer intervals between births, as well as fewer pregnancies per woman, contributed to the improved health and survival of both infants and mothers.
Meanwhile, by the beginning of the nineteenth century, another major element of the Longevity Revolution was taking shape, namely a revolution in medical science. By 1846, the general structure of the human body was almost fully known, and in that year an American surgeon, William Morton, opened the way to the field of surgery by using ether as a general anesthetic. At the same time, it was becoming clear that specific organisms caused infections. John Snow, a physician, unraveled the basis of contagion when he demonstrated that contaminated water flowing through a Broad Street pump was the cause of the 1854 London cholera epidemic. Subsequent improvements in water supply and sewage systems reduced both water-borne and food-borne diseases.
The real breakthrough came in the late nineteenth century when Louis Pasteur and Robert Koch developed and demonstrated the germ theory of disease, beginning a dramatic decline in death rates. With this discovery, health professionals and the general public finally understood how some diseases and infections were communicated. The work of Pasteur, a French chemist, led directly to pasteurization and the protection of millions of children and adults from disease transmitted through milk. At about the same time, Koch developed ingenious techniques for the study of bacteria that are still in use today. He established criteria, referred to as Koch’s postulates, for proving the bacterial cause of a disease. In the process he discovered the microorganism causing tuberculosis as well as those causing wound infections and Asiatic cholera.
By the end of the nineteenth century, proof of the germ theory of disease began to transform medical care and hospital practices. Decades earlier, Hungarian physician Ignaz Semmelweis was driven to insanity and suicide after his peers ridiculed his pioneering belief that midwives and other medical personnel delivering babies should thoroughly wash their hands and wear clean clothes to prevent “childbirth fever” (puerperal fever),9 a major cause of death among women giving birth. Skeptics were unconvinced even by the evidence of greatly reduced deaths from infection in the Viennese hospital where Semmelweis worked.10 In 1865, the very year of Semmelweis’s death, English surgeon Joseph Lister demonstrated that heat sterilization of surgical instruments and the use of antiseptic agents on wounds could dramatically reduce infection. Cleanliness during childbirth and in medical care in general was adopted by the 1890s, unfortunately too late for Semmelweis to know that he had been vindicated.
The developing science of endocrinology, the study of the body’s hormonal system, also brought dramatic changes, exemplified by the important discovery of insulin by Frederick Banting and Charles Best, both Canadians, in 1921. Practically overnight, diabetics were saved from almost certain death and given the prospect of reasonably long and healthy lives.
Other drug discoveries led to seemingly miraculous cures, and Alexander Fleming’s discovery of penicillin in 1928 was perhaps the most dramatic of all. Prior to penicillin, even minor cuts and bruises could have dire consequences. Many Americans past the age of sixty still clearly recall the days of childhood ear infections and other miseries that quickly became curable when penicillin came into general use in the 1940s. Young military personnel in World War II were among the first to benefit from its lifesaving effects.
- On Sale
- Sep 8, 2009
- Page Count
- 576 pages