The Biological Mind

How Brain, Body, and Environment Collaborate to Make Us Who We Are


By Alan Jasanoff


A pioneering neuroscientist argues that we are more than our brains

To many, the brain is the seat of personal identity and autonomy. But the way we talk about the brain is often rooted more in mystical conceptions of the soul than in scientific fact. This blinds us to the physical realities of mental function. We ignore bodily influences on our psychology, from chemicals in the blood to bacteria in the gut, and overlook the ways that the environment affects our behavior, via factors varying from subconscious sights and sounds to the weather. As a result, we alternately overestimate our capacity for free will and equate brains to inorganic machines like computers. But a brain is neither a soul nor an electrical network: it is a bodily organ, and it cannot be separated from its surroundings. Our selves aren’t just inside our heads — they’re spread throughout our bodies and beyond. Only once we come to terms with this can we grasp the true nature of our humanity.




Wherever you come from and whatever you believe about yourself, chances are that to some extent you know your brain is the heart of the matter. Although it is said that there are no atheists in foxholes, there are also few people who will not duck when the shooting starts—nobody wants a bullet in their brain. If you trip and fall forward on a concrete sidewalk, your arms rise instinctively to protect your head. If you are a cyclist, the only protective gear you probably wear is your helmet. You know something important is under there, and you will do what it takes to keep it safe.

Your concern for your brain probably does not end there. If you are smart or successful, you pride yourself on your brainpower. If you are an athlete, you prize your coordination and stamina, likewise products (at least in part) of your brain. If you are a parent, you worry about your child’s brain health, development, and training. If you are a grandparent, you may worry about your own aging brain and the consequences of brain atrophy. If you had to swap body parts with someone else, your brain would probably be the last part you would consider exchanging. You identify with your brain.

How complete should this identification be? Is it possible that everything truly significant about you is in your brain—that in effect, you are your brain? A famous philosophical thought experiment asks you to consider just this possibility. In the experiment, you imagine that an evil genius has secretly removed the brain from your body and placed it in a vat of chemicals that keeps it alive. The brain’s loose ends are connected to a computer that simulates your experiences as if everything were normal. Although this scenario seems like nothing more than science fiction, serious scholars use it to consider the possibility that the things you perceive may not in fact represent an objective reality outside your brain. Regardless of the outcome, the premise of the thought experiment itself is that being a brain in a vat violates no physical principles and that it is at least theoretically conceivable. If scientific advances eventually made it possible to maintain your disembodied brain, the scenario implies that the irreducible you would indeed be in there.

For some, the idea that people can be reduced to their brains sounds a powerful call to action. A young woman named Kim Suozzi heard that call. At just twenty-three years old, Suozzi was dying of cancer, but she refused to go gentle into that good night. She and her boyfriend decided to raise $80,000 in order to fund the preservation of her brain after she died. Suozzi believed that technology might one day enable her to be brought back to life, either physically or digitally, through structural analysis of her frozen organ. Science is nowhere near up to the task right now, but that did not deter her. To Suozzi in her final days, the brain became everything. Others have taken Suozzi’s path as well. I myself have had a related experience, which I will describe later in this book.

When we are confronted with mounting evidence that the brain is central to all we once associated with our selves, our spirits, and our souls, it is not surprising that some of us react dramatically. In our brave new neuroscientifically informed world, the brain bears the legacy of several millennia of existential angst. Our ultimate hopes and fears can come to revolve around this organ, and in it we may seek answers to eternal questions about life and death, virtue and sin, justice and punishment. There is no mental function for which researchers have not succeeded in finding corresponding activity patterns in the brain, using either imaging techniques in people or more invasive measurements in animals. We see brain data increasingly entering courtrooms, the risk of brain injury newly affecting our pastimes, and brain-targeted medicines prescribed to alter a gamut of behavior from school performance to social graces. A lesson from the legendary Greek physician Hippocrates is penetrating the public consciousness: “Men ought to know that from nothing else but the brain come joys, delights, laughter and sports, and sorrows, griefs, despondency, and lamentations.”

Everything important about us seems to boil down to our brains. This is a stark claim, and my aim in this book is to show that it sends us in the wrong direction, by masking the true nature of our biological minds. I argue that the perception that the brain is all that matters arises from a false idealization of this organ and its singular significance—a phenomenon I call the cerebral mystique. This mystique protects age-old conceptions about the differences between mind and body, free will, and the nature of human individuality. It is expressed in multiple forms, ranging from ubiquitous depictions of supernatural, ultrasophisticated brains in fiction and media to more sober scientifically supported conceptions of cognitive function that emphasize inorganic qualities or confine mental processes within neural structures. Idealization of the brain infects laypeople and scientists alike (including myself), and it is compatible with both spiritual and materialist worldviews.

A positive consequence of the cerebral mystique is that exalting the brain can help drive public interest in neurobiological research, a tremendous and worthy goal. On the other hand, the apotheosis of the brain ironically obscures consequences of the most fundamental discovery of neuroscience: that our minds are biologically based, rooted in banal physiological processes, and subject to all the laws of nature. By mythologizing the brain, we divorce it from the body and the environment, and we lose sight of the interdependent nature of our world. These are the problems I want to address.

In the first part of this book, I will describe the cerebral mystique as it exists today. I will do this by considering themes in today’s neuroscience and its public interpretation that underemphasize the brain’s organic, integrated characteristics. I argue that these themes promote a brain-body distinction that recapitulates the well-known mind-body dualism that dominated Western philosophy and religion for hundreds of years. By perceiving virtual barriers between our brains and our bodies—and by extension between our brains and the rest of the world—we see people as more independent and self-motivated than they truly are, and we minimize the connections that bind us to each other and to the environment around us. The disconnected brain acts as a stand-in for the ethereal soul, inspiring people like Kim Suozzi to preserve their brains upon death in the hope of attaining a form of immortality. In upholding the brain-body distinction, the cerebral mystique also contributes to chauvinistic attitudes about our brains, minds, and selves, such as the egotism of successful leaders and professionals and the “us versus them” attitudes of war and politics.

In the individual chapters of Part 1, I will introduce five specific themes that give rise to the brain-body distinction and that tend to elevate the brain above the rest of the natural realm. By walking through alternative, scientifically grounded perspectives, I will try to bring the brain back down to earth. The first theme I will address is abstraction, a tendency for people to view the brain as an abiotic machine based on fundamentally different principles from other living entities. This is best exemplified by the familiar analogy of the brain to a computer, a solid-state device that can be perfected and propagated in ways that evoke a disembodied spirit. The second theme is complexification, a vision of the brain as so vastly complicated as to defy analysis or understanding. The inscrutably complex brain is a convenient hiding place for mental capabilities we want to possess but cannot explain, like free will. The third theme is compartmentalization, a view that stresses the localization of cognitive functions without offering deeper explanations. Supported largely by the kinds of brain imaging studies we often see in the media, the compartmentalized view often facilitates shallow interpretations of how the brain helps us think and act. The fourth theme is bodily isolation, a tendency to see the brain as piloting the body on its own, with minimal influence from biological processes outside the skull. The fifth and final theme is autonomy, the view of the brain as self-governing, receptive to the environment but always in control. These last two themes allow us to see ourselves as cut off from impersonal driving forces both inside and outside our bodies that nevertheless dramatically affect our behavior.

In Part 2, I will explain why a more biologically realistic view of our brains and minds is important, and how it could improve our world. I consider three areas that today are heavily influenced by the cerebral mystique: psychology, medicine, and technology. In psychology, the mystique fosters a view that the brain is the prime mover of our thoughts and actions. As we seek to understand human conduct, we often think first of brain-related causes and pay less attention to factors outside the head. This leads us to overemphasize the role of individuals and underemphasize the role of contexts in a range of cultural phenomena, from criminal justice to creative innovation. An updated view that moves beyond idealizations must accept that the body’s physiological milieu, encompassing but not bounded by the brain, provides an unequivocal meeting point for influences both internal and external to every person. Our brains seen in this way are complex relay points for innumerable inputs, rather than command centers endowed with true self-determination. Whenever I have an idea, my idea is the product of all of these inputs converging at once around my head, rather than mine alone. When I steal or kill, whatever happens in my criminal brain is the product of my physiology and environment, my history, and my society, including you.

In medicine, a grave consequence of the cerebral mystique is to perpetuate the stigma of psychiatric disease. Accepting that our minds have a physical basis relieves us of the traditional tendency to view mental illnesses as moral failings, but recasting psychiatric conditions as brain disorders can be almost as damning to the patients affected. Society tends to view “broken brains” as less curable than moral flaws, and people thought to have problems with their brains can be subject to greater suspicion as a result. Equating mental disorders with brain dysfunction also skews the treatments people seek, leading to greater reliance on medications and less interest in behavioral interventions such as talk therapy. And seeing mental illnesses purely as brain diseases overlooks an even deeper issue—the fact that mental pathologies themselves are often subjectively defined and culturally relative. We cannot properly grapple with these complexities if we reduce problems of the mind to problems of the brain alone.

For some people, the cerebral mystique inspires technological visions for the future. Many of these revolve around science fiction and the idea of “hacking the brain” to improve intelligence or even eventually upload our minds and preserve them for eternity. But the reality of brain hacking is less glamorous than its image. Invasive brain procedures have historically incurred high risk of injury and helped only the most debilitated patients. The neurotechnological innovations that meet society’s needs might best remain outside our heads; indeed, such peripheral tech is already turning us into transhumans armed with portable and wearable electronics. Both hopes and fears about neurotechnology are distorted by artificial distinctions between improvements that work directly and those that work indirectly on our central nervous systems. By demystifying the brain we will be better able to enhance our lives while solving the scientific and ethical challenges that arise along the way.

Before getting into my argument, I want to say a few words about what this book does not try to do. First, it does not explain how the brain works. Unlike many other authors, I am concerned more with what the brain is than what it does. Although several of my chapters include examples of specific brain mechanisms, my purpose in introducing them is largely to illustrate modes of action that depart from widespread stereotypes about the brain. Just as many artists strive to give emotional and psychological depth to flat figures from history and legend, I hope in a humble way to add dimensionality and nuance to an organ that popular writing often depicts as a dry computing machine rather than a thing of flesh and blood.

Second, this book does not challenge the fact that the brain is essential to human behavior. Functions of the mind all require the brain, even if they do not reduce to the brain. Many of these functions are almost as poorly understood now as they were fifty or a hundred years ago, and basic neuroscientific explorations of phenomena such as memory, perception, language, and consciousness are the best way to advance our knowledge. I will illustrate how traditional ways of looking at the brain can be complemented by alternative and broadened views, but neuroscience and the brain remain at the center of the picture.

Third and most important, this book in no way aims to reject objective neurobiological findings. The perspectives I offer will foster a view of our minds and selves as more interconnected than Old Age culture traditionally views them, but this is no invitation to slip into ungrounded New Age spirituality. It is hard scientific research itself that paints a picture of the brain as biologically grounded and integrated into our bodies and environments. Conversely, it is the cerebral mystique and its emphasis on the extraordinary features of brains that drive people to doubt the power of science to illuminate human thought and behavior—a view that I, like most neuroscientists, emphatically reject. The cerebral mystique limits the impact of neuroscience in society today by presenting the brain as a self-contained embodiment of the mind or soul. This view makes it easier to “black-box” the nervous system, to treat what happens in the brain as confined to the brain, and to ignore what neuroscience might have to say about real-world problems. This is a view I mean to set aside, and I hope that this book will convince you to agree.

part I




When I first touched a brain, it was braised and enveloped in a blanket of beaten eggs. That brain had started its life in the head of a calf, but ended in my mouth, accompanied by some potatoes and a beverage at an economical eatery in Seville. Seville is a Spanish city famous for its tapas, and tortilla de sesos, along with other brain preparations, is an occasional offering. On my brain-eating trip to Seville, I was too poor to afford sophisticated gastronomic experiences. Indeed, some of my most vivid recollections of the trip involve scrounging around supermarkets for rather less satisfying food, while the delectable tapas remained out of reach, only for the ogling. The brain omelet was certainly one of the better meals I had.

My next encounter with sesos came many years later in a laboratory at MIT, in a crash course on neuroanatomy whose highlight was certainly the handling and dissection of a real sheep’s brain. At that time, I was drawn to the class and to the sheep’s brain by a diffuse set of concerns that motivate many of my fellow humans to follow and even embed themselves in neuroscience. The brain is the seat of the soul, the mechanism of the mind, I thought; by studying it, we can learn the secrets of cognition, perception, and motivation. Above all, we can gain an understanding of ourselves.

The experience of handling a brain can be awesome, in the classical sense of the word. Is this lump of putty really the control center of a highly developed organism? Is this where the magic happens? Animals have had brains or brain-like structures for nearly five hundred million years; for over 80 percent of that time, the ancestors of sheep were also our ancestors, and their brains were one and the same. Reflecting that extensive shared heritage, the shape, color, and texture of the sheep’s brain are quite like our own, and it is not hard to imagine that the sheep’s brain is endowed with transcendent capabilities analogous to ours. The internal complexity of the sheep’s organ is indeed almost as astounding as that of the human brain, with its billions of cells, trillions of connections between cells, and ability to learn and coordinate flexible behaviors that carry us across lifespans more convoluted than the cerebral cortex. The sheep’s brain bears witness to years of ovine toil, longing, passion, and caprice that are easily anthropomorphized. And that brain, removed from the rest of its body and everything the ex-sheep once felt or knew, is as powerful a memento mori as one can find.

But the sheep’s brain, like ours, is also a material highly similar to other biological tissues and organs. Live brains have a jellylike consistency that can be characterized by a quantity called an elastic modulus, a measure of a tissue’s capacity to jiggle without losing its form. The human brain has an elastic modulus of about 0.5–1.0 kilopascal (kPa), similar to that of Jell-O (1 kPa) but much lower than that of stiffer biological substances such as muscle or bone. Brains can also be characterized by their density. Like many other biological materials, brains have a density close to that of water; given its size, an adult human brain therefore weighs about as much as a large eggplant. A typical brain is roughly 80 percent water, 10 percent fat, and 10 percent protein by weight, leaner than many meats. A quarter pound of beef brain contains 180 percent of the US recommended daily value of vitamin B12, 20 percent of the niacin and vitamin C, 16 percent of the iron and copper, 41 percent of the phosphorus, and over 1,000 percent of the cholesterol—a profile somewhat resembling an egg yolk. Risk of clogged arteries aside, why not eat the brain rather than study it?

About two million years ago, near what is now the southeastern shore of Lake Victoria in Kenya, ancient hominins were doing just that. Lake Victoria itself, the largest in Africa and source of the White Nile, is less than half a million years old and was then not even a glimmer in the eye of Mother Nature. Instead, the area was an expansive prairie, roamed by our foraging forebears, who subsisted on grassland plants and the flesh of prehistoric grazing mammals that shared the terrain. Archeological findings at this site, known as Kanjera South, document the accumulation of small and midsize animal skulls at specific locations over several thousand years. The number of skulls recovered, particularly from larger animals, substantially exceeds the corresponding numbers of other bones. This indicates that animal heads were separated from the rest of their carcasses and preferentially gathered at each site. Some skulls bear the marks of human tool use, thought to reflect efforts to break open the cranial cavities and consume their contents. Brains were apparently an important part of the diet of these early people.

Why brains? In evolutionary terms, the Kanjera humans were relatively new to meat eating; carnivory in Homo is documented as beginning only at about 2.5 million years ago (Mya), though it is believed to have been a major factor in our subsequent development as a species. Nonhuman carnivorous families on the scene at 2 Mya had been established meat eaters for many millions of years already. The biting jaws and catching claws of the great Pleistocene cats, the giant hyenas, and the ancestral wild dogs were better adapted to slaying, flaying, and devouring their prey than anything in the contemporary hominin body plan. But early humans had advantages of their own: already the bipedal stance, the storied opposable thumb, and a nascent ability to form and apply artificial implements all conferred special benefits. If a primordial person stumbled across the carcass of a slain deer, pungent and already picked to the bone by tigers, she could raise a stone, bring it crashing down on the cranium, and break into a reservoir of unmolested edible matter. Or if she brought down an animal herself, she could pry off the head and carry it back for sharing with her clan, even if the rest of the animal was too heavy to drag. In such fashion, the hominins demonstrated their ability to carve out an ecological niche inaccessible to quadrupedal hunters. Although other carnivores competed vigorously with humans for most cuts of meat, brains may have been uniquely humankind’s for the taking.

Synchronicity on a geologic time scale may explain the coincidence of early hominin brain eating and the emergence of massive, powerful brains in our genus, but the two phenomena are connected in other ways as well. Highly evolved human civilizations and their corresponding cuisines across the world have produced edible brain preparations that range from simple, everyday dishes to splendid delicacies. Celebrity chef Mario Batali brings us calf brain ravioli straight from his grandmother, needing about one hour of preparation and cooking time. Traditional forms of the hearty Mexican hominy stew called posole are somewhat more involved: an entire pig’s head is boiled for about six hours until the meat falls off the bone. Unkosher, but perhaps appetizing all the same! Truly festive brain dishes are prepared across much of the Muslim world on the feast of sacrifice, Eid al-Adha, which celebrates Abraham’s offering of his son Ishmael to God. These recipes—brain masala, brains in preserved lemon sauce, steamed lamb’s head, and others—leverage the glut of ritually slaughtered animals generated on the holiday, as well as a cultural reluctance to let good food go to waste. And who could forget the highlight of Indiana Jones’s Himalayan banquet on the threshold of the Temple of Doom—a dessert of chilled brains cheerfully scooped out of grimacing monkey heads? Although it is a myth that monkey brains are eaten on the Indian subcontinent, they are a bona fide, if rare, component of the proverbially catholic Chinese cuisine to the east.

Even to the hardened cultural relativist, there is something slightly savage about the idea of consuming brains as food. “It’s like eating your mind!” my little girl said to me at the dinner table, a scowl on her face. Eating monkey brains seems most definitively savage because of the resemblance of monkeys to ourselves, and eating human brains is so far beyond the pale that on at least one occasion it has invited the wrath of God himself. The unhappy victims of that almighty vengeance were the Fore people of New Guinea, discovered by colonists only in the 1930s and decimated by an epidemic of kuru, sometimes called “laughing sickness.” Kuru is a disease we now believe to be transmitted by direct contact with the brains of deceased kuru sufferers; it is closely related to mad cow disease. The Fore were susceptible to kuru because of their practice of endocannibalism—the eating of their own kind—as Carleton Gajdusek discovered in epidemiological studies that later won him a Nobel Prize. “To see whole groups of well nourished healthy young adults dancing about, with athetoid tremors which look far more hysterical than organic, is a real sight,” Gajdusek wrote. “And to see them, however, regularly progress to neurological degeneration… and to death is another matter and cannot be shrugged off.”

Fore people were surprisingly nonchalant about their cannibalism. The bodies of naturally deceased relatives were dismembered outside in the garden, and all parts were taken except the gallbladder, which was considered too bitter. The anthropologist Shirley Lindenbaum writes that brains were extracted from cleaved heads and then “squeezed into a pulp and steamed in bamboo cylinders” before eating. Fore cannibalism was not a ritual; it was a meal. The body was viewed as a source of protein and an alternative to pork in a society for which meat was scarce. The pleasure of eating dead people (as well as frogs and insects) generally went to women and children, because the more prestigious pig products were preferentially awarded to the adult males. The brain of a dead man was eaten by his sister, daughter-in-law, or maternal aunts and uncles, while the brain of a dead woman was eaten by her sister-in-law or daughter-in-law. There was no spiritual significance to this pattern, but it did closely parallel the spread of kuru along gender and kinship lines until Fore cannibalism was eliminated in the 1970s.

There are many reasons not to eat brains, from ethical objections to eating meat in general, to the sheer difficulty of the butchery, to the danger of disease; but all activities come with some difficulties and dangers. One can’t help thinking that the real reason our culture doesn’t eat brains is more closely related to the awesomeness of holding a sheep’s brain in one’s hand: brains are sacred to us, and it takes an exercise of willpower to think of them as just meat. Eating someone else’s brain, even an animal’s, is too much like eating our own brain, and eating our own brain—as my daughter asserted—is like eating our mind, and perhaps our very soul.

Some of us arrive at this conclusion through introspection. Even in the sixth century BCE, the Pythagoreans apparently avoided eating brains and hearts because of their belief that these organs were associated with the soul and its transmigration. But can we find objective data to demonstrate a modern disinclination to eat brains? Consumption of offal of all sorts, at least in Europe and the United States, has dropped precipitously since the beginning of the twentieth century, but it seems that brains in fact are particularly out of favor. A recent search of a popular online recipe database uncovered seventy-three liver recipes, twenty-eight stomach recipes, nine tongue recipes, four kidney recipes (not including beans), and two brain recipes. If we suppose somewhat crudely that the number of recipes reflects the prevalence of these ingredients in actual cooking, there appears to be a distinct bias against brains. Some of the bias may be related to “bioavailability”—a cow’s brain weighs roughly a pound, compared with two to three pounds for a tongue or ten pounds for a liver—but a difference in popularity plausibly explains much of the trend. A 1990 study of food preferences surveyed from a sample set of English consumers also supports this point. The results showed that dislike for various forms of offal was ranked in ascending order from heart, kidney, tripe, tongue, and pancreas to brain. This study is notable partly because it was performed before the mad cow outbreak of the mid-1990s, so the surveyed preferences are not easily explained by health concerns related to brain eating. The participants’ tendency to “identify with” brains might best explain revulsion at eating them, inferred sociologist Stephen Mennell in an interpretation of the results.

Most people lack an appetite for brains, but hunger and the brain remain closely intertwined in other ways, both literally and metaphorically. In the most concrete sense, brains are of course necessary for the perception of hunger in each of us. The cognitive basis for hunger revolves largely around a group of cells that live in a brain region called the hypothalamus. Some of these cells secrete a hormone called Agouti-related peptide


  • "This philosophical puzzle has been posed, in various forms, for centuries and is one of the starting points for Alan Jasanoff's elegant and spirited attack on what he calls our 'cerebral mystique' ... A lucid primer on current brain science that takes the form of a passionate warning about its limitations."—Wall Street Journal
  • "The Biological Mind is chock-full of fun facts that entertain. And best of all, it makes you think. I found myself debating with Jasanoff in my head as I read -- surely a sign of a worthy book."—New York Times Book Review
  • "Alan Jasanoff's The Biological Mind...stylishly sums up the state of current knowledge while emphasizing the limitations of neuroscientific understanding."—Wall Street Journal
  • "In this powerful treatise, neurological engineer Alan Jasanoff issues a corrective to the 'cerebral mystique.'"—Nature
  • "The book features a learned and experienced author who has the ability to take complex concepts of neuroanatomy and neurophysiology and explain them in easy to understand descriptions. The intelligent reader interested in 21st century understanding of the human brain and particularly those who may be involved in mental or physical health will find this book useful and interesting."—The New York Journal of Books
  • "[Jasanoff's] clear, lively writing reveals how our emotions, such as the fight-or-flight response and the suite of thoughts and actions associated with stress, provide strong evidence for a brain-body connection."—Science News
  • "Taking the brain off of its pedestal, Jasanoff offers an exhaustive, comprehensible, and at times playful (e.g., why do humans now study brains instead of eat them?) look at the brain. Appropriate for both neuroscientists as well as general readers interested in gaining a better understanding of this vital organ."—Library Journal
  • "Jasanoff writes with admirable clarity as he argues that the modern tendency of neuroscience to take a 'brain-centered view' that overlooks external sources of behavior can lead to epistemological dead ends."—Kirkus Reviews
  • "Jasanoff delivers a highly readable and enjoyable exploration of a series of compelling questions relating to the human experience."—CHOICE
  • "Neuroscientist Alan Jasanoff has identified a widespread 'Brain Mystique'--a collection of folk theories about the brain that are scientifically false. In The Biological Mind, Jasanoff dispels these theories while leading the reader on an engaging tour of real neuroscience, from the brain to the body to the social and physical world."—George Lakoff, coauthor of The Neural Mind
  • "Any book that opens with a historical account of the nutritional merits of eating animal brains and concludes with an imaginary account of the author's brain being removed from his body to take up residence in a vat is certainly worth a read, and Alan Jasanoff's The Biological Mind is precisely that. Thought-provoking and enjoyable, this book will provide readers with a new conception of who they are."—Robert Whitaker, author of Anatomy of an Epidemic
  • "The dark side of all the wonderful new neurotechnology at researchers' fingertips is that too many experts are now over-simplifying mental illness, reducing it to mere descriptions of brain physiology. Alan Jasanoff does an outstanding job of bringing much needed nuance, humanity, and compassion to the way we think about mental illness and the brain."—Sally Satel, M.D., Lecturer in Psychiatry at the Yale University School of Medicine
  • "Alan Jasanoff's The Biological Mind provides a provocative and accessible neuroscientific defense of the 'extended mind' thesis--the idea that we are much more than our brains, and even the bodies in which they are normally housed. By the conclusion, readers will be left wondering whether Jasanoff's findings suggest something even more radical: that our brains are actually platforms for launching any number of versions of who we really are."—Steve Fuller, Auguste Comte Chair in Social Epistemology at the University of Warwick and author of Humanity 2.0

On Sale
Mar 13, 2018
Page Count
304 pages
Basic Books

Alan Jasanoff

About the Author

Alan Jasanoff is the award-winning director of the MIT Center for Neurobiological Engineering. He lives near Cambridge, Massachusetts.
