Read by Jeremy Maxwell
Humans have become subservient to algorithms. Every day brings a new Moneyball fix—a math whiz who will crack open an industry with clean, fact-based analysis rather than human intuition and experience. As a result, we have stopped thinking. Machines do it for us.
Christian Madsbjerg argues that our fixation with data often masks stunning deficiencies, and the risks for humankind are enormous. Blind devotion to number crunching imperils our businesses, our educations, our governments, and our life savings. Too many companies have lost touch with the humanity of their customers, while marginalizing workers with liberal arts-based skills. Contrary to popular thinking, Madsbjerg shows how many of today’s biggest success stories stem not from “quant” thinking but from deep, nuanced engagement with culture, language, and history. He calls his method sensemaking.
In this landmark book, Madsbjerg lays out five principles for how business leaders, entrepreneurs, and individuals can use it to solve their thorniest problems. He profiles companies using sensemaking to connect with new customers, and takes readers inside the work process of sensemaking “connoisseurs” like investor George Soros, architect Bjarke Ingels, and others.
Both practical and philosophical, Sensemaking is a powerful rejoinder to corporate groupthink and an indispensable resource for leaders and innovators who want to stand out from the pack.
The End of Thinking
A senior executive at one of the world's largest health care technology companies sits in a conference room filled with whiteboards. PowerPoint slides blink by on a projector screen. After growing in the double digits for almost a decade and securing a comfortable lead in the market for diabetes care products, the EVP's department has missed its sales targets for the third time this year. A few months ago, he signed off on an extensive round of market research to understand why. The marketers conducted surveys with thousands of diabetes patients all over the United States and Europe, measuring hundreds of different factors that account for their compliance in taking their medication. He learned that "43 percent of Type 2 diabetes patients are noncompliant, and 84 percent of those noncompliant patients cite forgetfulness as the main reason for their noncompliance." With only a short period of time to turn the company around before the board tears into him, the EVP is consumed with anger. "We already know that patients are noncompliant because they forget. We've known that for decades. We need to know what we can do to get them to change their behavior." The room is silent. After millions of dollars and months of research, no one in the room has any idea why people do what they do.
A candidate for Senate in a battleground state reviews the polling averages for her race. Her consultants tell her that the averages, when properly adjusted to reflect the current environment, assure her a win in November. They have sliced her electorate into ever-narrowing segments so she can shape her talking points accordingly. We've seen this before, they say. This November will be like the last election and the one before. But over the spring, something completely unexpected happens. A surprising new candidate puts his hat in the ring. Instead of talking points and segmentation, he captures the imagination of the electorate with an oratorical gift, weaving together seemingly disparate cultural themes and patterns into a powerful metaphor for the future. When the frontrunner watches a video of one of his rallies, she can see the swell of energy and excitement in the voters. The mood of the event fills her with a deep sense of foreboding. With all the numbers showing support for her campaign, why does it feel as though this man is making a more meaningful connection with the voters than she? In a moment of terror, it dawns on her that she is going to lose this election despite doing everything exactly right.
An entrepreneur at a start-up specializing in solar power solutions struggles to make sense of changes in the market. As energy distribution moves from a centralized model—provided by utilities over the grid—to an increasingly distributed model with a constellation of players, the entrepreneur must synthesize a variety of different data streams. Her team prides itself on its engineering expertise—cutting-edge technical knowledge of the solar market—so they spend little time dealing with all of the cultural and political dynamics that occur within corporate sustainability initiatives. And yet, despite their industry knowledge, her company is losing clients. Just today, one of their biggest corporate clients, a retail chain eager to use sustainability initiatives in their marketing platforms, announced that they were signing on with another solar company selling products technically inferior and more expensive than the product she is offering. She must find partners to replace them immediately or she won't make company payroll in a month or two. Why are we losing market share to competitors with far less technical knowledge? she thinks. What are we missing?
Although the word algorithm is in the subtitle, this is not really a book about algorithms. Nor is this a book about computer programming or the future of machine learning. This is a book about people. More specifically, this is a book about culture and the pendulum shifts of our age. Today we are so focused on STEM-based knowledge—theories from science, technology, engineering and math, and the abstractions of "big data"—that alternative frameworks for explaining reality have been rendered close to obsolete. This pendulum shift is doing great damage to our businesses, governments, and institutions. As each of these three scenarios illustrates, society devalues our human inferences and judgments at a great cost. Our fixation with STEM erodes our sensitivity to the nonlinear shifts that occur in all human behavior and dulls our natural ability to extract meaning from qualitative information. We stop seeing numbers and models as a representation of the world and we start seeing them as the truth—the only truth. We are in grave danger of completely eroding our sense of the human world in favor of these false abstractions.
Of course, the hard sciences are a good way of explaining quite a lot on our planet, namely material nature. They are tremendously effective at explaining chemistry, engineering, or physics, for example. But they are not good at explaining us. As the famed astrophysicist Neil deGrasse Tyson put it, "In science, when human behavior enters the equation, things go nonlinear. That's why physics is easy and sociology is hard."
Because, at the end of the day, it doesn't matter how much hard data we have in our hands, how many brain scans we've monitored on our screens, or how many different ways we have segmented our markets. If we don't have a perspective on the human behavior involved, our insights have no power. When we lose touch with the human circumstances present inside every single election, behind every breakout innovation and each and every successful corporate initiative, we limit our ability to genuinely understand our world.
If we want to truly make sense of our challenges, we must return to a process that feels old-fashioned and out of date in today's anesthetizing world of algorithmic promise. It's something that has been sorely lacking in all of our organizations and across all aspects of our civic discourse. It's called critical thinking. And, as a process, it has never felt so revolutionary nor so cutting edge.
The Human Factor
The essence of being human is that one does not seek perfection.
—George Orwell, In Front of Your Nose: 1945–1950
We humans have been getting some bad press lately. Not a day goes by without hearing about how irrational or inefficient we are when compared with machines. Next to our sleek silicon-powered computer counterparts, our brains are sluggish and burdened by emotions. In the work world, humans are basket cases, slowing down projects and turning black and white lines into gray with our penchant for ambiguity and complexity. We need to learn through experience, and what we learn doesn't have the same precision, rigor, or consistency as algorithms.
Our standing in the world has fallen so far that we have developed a mantra to excuse ourselves from our inadequacies. "I'm only human," we shrug to coworkers in the break room or at happy hour drinks. This idiom contains a singular truth about the way our culture conceives of our humanity: to be human is to be full of flaws.
In engineering circles, this is referred to as the human factor. The human factor—in domains as varied as aeronautics, supply chain management, and pharmaceuticals—is another way of saying: the capacity for error. There is even a burgeoning area of scholarship called human factors research, focused on how to optimize and correct for our flaws in human-computer interaction, or HCI. Human factors research explores how machines can best cope when we humans make one of our typical mistakes. Google utilizes human factors research, for example, when one of their driverless cars attempts to interpret the inconsistencies of a real human driver. Humans are notorious for behaving irregularly, frustrating the efforts of algorithms to achieve driving perfection.
To add to our list of woes, journalists and futurists are telling us that we humans will soon be turning over the majority of our jobs to robots. Factory and customer service employees have been the first to go, but it will soon be entire swaths of our labor force: restaurant workers, pharmacists, medical diagnosticians, lawyers, accountants—even caretakers for the elderly. For journalists and academics, the question is not if this will happen, but rather what we will do with ourselves when it happens.
The solution to the human problem seems straightforward. If we want to remain useful—and employed—we should cede territory to the algorithms all around us—even become subservient to them. Not a day goes by without hearing about a new Moneyball fix—bringing an Ivy League–educated economist in to crack open an industry with clean, fact-based analysis rather than human intuition and experience. We are inundated with stories of big data scores from Amazon, Google, and countless other apps and start-ups. Employment website Glassdoor named "data scientist" as the number-one job in America for 2016 based on the number of job openings, salary, and advancement opportunities. We fervently believe that more data will lead to more insights. If we learn X from looking at a data set of one hundred people, wouldn't we learn exponentially more after aggregating a data set of hundreds of thousands of people? Or hundreds of millions? Or billions? Facebook CEO Mark Zuckerberg captured our intoxication with big data when he recently told investors that he wanted machine learning at Facebook to create "the clearest model of everything there is to know in the world."
Our students are getting the message. At the most prestigious universities in the United States, liberal arts fields like English and history used to be among the most popular majors, but a surge in interest in engineering and the natural sciences has decimated many humanities departments. Since the 1960s, the number of degrees awarded in the humanities has shrunk by half. Funding for humanities research has declined precipitously. In 2011, it amounted to less than half of a percent of the funds for science and engineering research and development. Within the social sciences, quantitative studies like social network analysis and psychometrics dominate, while qualitative fields like sociology and anthropology are seen as passé. At a 2015 town hall meeting, U.S. Republican presidential candidate Jeb Bush told the audience that students majoring in disciplines like psychology were headed for jobs at Chick-fil-A. That same year, the Japanese education minister ordered Japanese universities to shut down social science and humanities departments or convert them "to serve areas that better meet society's needs."
The humanities—disciplines that explore culture, such as literature, history, philosophy, art, psychology, and anthropology—no longer meet "society's needs." A humanities-based understanding of different people and their worlds is now officially useless. After all, compared to the endless information accessible through big data, what value is there in human-led cultural inquiry? What value is there in actually reading a few great books when algorithms can "read" them all and give us an objective analysis of their content? What value is there in plays, paintings, historical studies, dances, political treatises, and pottery, in cultural knowledge that cannot be stripped of its specificity and context and transformed into vast sluices of information?
I write this book with one urgent message to impart: there most certainly is value.
We dismiss this cultural knowledge—cultivated through humanities thinking—at great risk to our future. When we focus solely on hard data and natural science methods—when we attempt to quantify human behavior only as so many quarks or widgets—we erode our sensitivity to all the forms of knowledge that are not reductionist. We lose touch with the books, music, art, and culture that allow us to experience ourselves in a complex social context.
This is not an esoteric subject for debate solely inside the ivory tower. In fact, I see the consequences of this phenomenon playing out in my consulting work every day. I see the dearth of cultured leadership at the senior levels of major corporations. Too many of the top cadre of leadership I have met are isolated in their worldview. They have lost touch with the humanity of their customers and their constituents and, as a result, they mistake numerical representations and models for real life. Their days are sliced and diced into tiny segments, so they feel they don't have time to wander around in the mess of real-world data. Instead, they jump into a problem-solving process and a conclusion without understanding the actual question.
As a result of all this, they tend to hire engineering- or MBA-trained junior executives to be their foot soldiers in the data trenches. Their fixation with hard data often masks stunning deficiencies, and many such lower-level managers will hit a glass ceiling in today's business world. They are reductionists without the sensitivity to recognize the most exciting and essential patterns. These are managers who did everything "right": they hacked the system and aced all the tests; they went to the best schools and got all the good grades; they spent their entire education training their minds to reduce the problems and then to solve them. And today, as a result, they simply don't have the intellectual sophistication required to move into the upper echelons of leadership.
It's not always easy to prove this point—that training in the humanities and the social sciences is just as important, if not more important, than STEM to a successful career—with hard data. But allow me to put this data question in context. In 2008, the Wall Street Journal reported on a large-scale study of global compensation by the research firm PayScale Inc. The study confirms that students with a pure STEM background generally get better-paying jobs right after graduating from college. Massachusetts Institute of Technology (MIT) and California Institute of Technology rank as the top two schools for starting median salary—at $72,000—and then at numbers 3 and 6, respectively, for the best mid-career median salaries.
But this study includes everyone who graduated from college across the whole of the United States, so median measurements both for starting salaries and for mid-career salaries favor STEM graduates. This is because the liberal arts graduates end up working in an incredibly wide range of jobs and fields across the whole of the nation. If you look at the most successful earners in the entire country—in the 90th percentile mid-career and beyond—the story starts to change. MIT doesn't show up until number 11, behind ten colleges and universities with strong liberal arts programs. Places like Yale University and Dartmouth College show the strongest median earnings—above $300,000. Of the other engineering- and STEM-centered colleges, only Carnegie Mellon University even makes the list of mid-career 90th-percentile salaries.
The study reveals the same story about majors. Generally, computer engineering and chemical engineering are high-ranking majors when it comes to salary, while it is much harder to find the humanities on the top 20 majors list when it comes to higher earners mid-career. But, again, if you look at the most successful 90th percentile earners in the whole of the country, suddenly political science, philosophy, drama, and history are placed prominently, often from pure liberal arts schools like Colgate University, Bucknell University, and Union College.
What we can take away from this data is that most STEM training will get students a good income at the starting gate and a decent career. But powerful earners—the people running the show, breaking through the glass ceilings, and changing the world—tend to have liberal arts degrees. This will come as surprising news if you listen to the general rhetoric coming out of Silicon Valley, from politicians, and even from many leaders in education today. But, if you have spent any time in a global company or one of the world's most powerful institutions, it makes sense. After nearly twenty years of counseling the very top executives and management around the world, I can tell you that the most successful leaders are curious, broadly educated people who can read both a novel and a spreadsheet.
After all, do we actually think that figuring out the future of a global insurance company or discerning the political and social implications of proposed legislation is a process based entirely on a linear decision tree or a set of numbers on a spreadsheet? In February 2007, Lehman Brothers had all of its balance sheets in order and reported a record high market cap of close to $60 billion. Little more than a year later, their stock had plunged 93 percent and they were filing for bankruptcy. Numerical data sets obscured the more complex reality that led to the Lehman Brothers collapse. In 2003 and 2004, the bank had acquired five mortgage lenders, including two sub-prime mortgage lenders that loaned money to borrowers without any real documentation. In the midst of a housing boom, the profits were unprecedented, but more and more people were gaining access to free money with less and less scrutiny of their ability to repay any of it. Meanwhile, these bad loans were being hidden away with all of the more legitimate loans, packaged together in complex financial products called CDOs, or collateralized debt obligations. The reality on the streets was available to any leader or executive willing to go out and observe it. It was inevitable that most of the borrowers in the sub-prime mortgage market were going to default. Unfortunately for all of us who had retirement savings invested in the stock markets in September 2008, few financial leaders found it worth their time to take their cues from real-world data. When we stop thinking, it's not just our intellects that are at stake. It's our businesses, our educations, our governments, and our life savings.
I am not alone in my concern. Many of our most prominent leaders are publicly calling out for more liberal arts–trained thinkers to meet this coming future. Norman Augustine, retired chairman and CEO of Lockheed Martin, wrote an op-ed for the Wall Street Journal in 2011 arguing for a stronger foundation in the humanities in our primary and secondary schools: "Far more than simply conveying the story of a country or civilization, an education in history can create critical thinkers who can digest, analyze, and synthesize information and articulate their findings. These are skills needed across a broad range of subjects and disciplines."
A. G. Lafley, former CEO of Procter & Gamble, had one single piece of advice for achieving business success in today's complex managerial environment: pursue a degree in the liberal arts. "By studying art, science, the humanities, social science, and languages," he wrote in the Huffington Post, "the mind develops the mental dexterity that opens a person to new ideas, which is the currency for success in a constantly changing environment. And just as an aspiring Major League pitcher needs a live arm and a calculating, cool head to pitch effectively, so too does a management prospect need to be educated broadly to respond effectively to ambiguity and uncertainty. Completing a broad liberal arts curriculum should enable a student to develop the conceptual, creative, and critical thinking skills that are the essential elements of a well-exercised mind."
These leaders, like so many others at the forefront of business, policy, and entrepreneurship, are sounding the alarm for a better-educated workforce. After all, it was not so long ago when it was commonplace for leaders in finance, media, or policy to have a background in the humanities: Ken Chenault, current CEO of American Express, cited his in-depth study of history as a touchstone for his leadership and managerial acumen; Sam Palmisano, former CEO of IBM, was a history major at Johns Hopkins; Hank Paulson, once secretary of the Treasury, studied English at Dartmouth; Carly Fiorina, CEO of Hewlett Packard from 1999 to 2005, described her undergraduate major in medieval history as the perfect foundation for understanding the high-tech world; Michael Eisner of Disney bypassed business and finance courses, eschewing them for majors in English and theater; famed investor Carl Icahn's senior thesis in philosophy at Princeton was titled, "The Problem of Formulating an Adequate Explication of the Empiricist Criterion of Meaning"; Sheila Bair, former FDIC chair, studied philosophy as an undergraduate at the University of Kansas; and Stephen Schwarzman, chairman and CEO of the private equity firm Blackstone, chose an interdisciplinary major at Yale that he described as "psychology, sociology, anthropology, and biology, which is really sort of the study of the human being."
More and more people, however, have come to view these disciplines as irrelevant when compared with the immediate utility of a degree in data analytics or an online crash course in the latest computer programming skill. The result of this cultural shift is that we stop seeing value in things like poetry, sculpture, novels, and music. And when we devalue humanistic endeavors, we lose our best opportunity for exploring worlds different from our own. When I read a great novel like Thomas Mann's The Magic Mountain, I can actually feel the devastation of a European continent during and after World War I; when I encounter a medieval tapestry like The Hunt of the Unicorn, I understand what was meaningful to people in France on the cusp of the Renaissance. And when I visit the Ryōan-ji Zen garden in Kyoto, the placement and texture of the stones shows me something essential about the Japanese worldview and aesthetic.
Whether you are studying Chinese architecture, Mexican history, or the philosophy of the Sufis, this kind of thinking trains our minds to synthesize all types of data, to explore without need of proving or disproving a narrow hypothesis, and to engage empathically with the particularities of a given world. I believe that this type of cultural engagement is an essential training ground for understanding any group of people. If you work for a pharmaceutical firm, for example, you need to understand the world of a person with diabetes, or all your attempts at drug development will surely fail. Say you make cars. You need to know what life is like for a driver living in western China, or your vehicles will be filled with irrelevant features in the world's largest car market. And if you work in the public sector, you need social science tools to think critically about the culture of bureaucracy.
Experiences with the humanities teach us how to imagine other worlds. But they offer so much more than that. Because when we can fully imagine other worlds—using cultural knowledge and explanations for our human experience—we inevitably develop a more acute perspective on our own world. We learn to see when models and financial innovations are diverging from the truth. We recognize patterns—distilled from both scientific facts and practical reality, from both existing situations and future possibilities—that shed light on insights and, ultimately, help us form a genuine perspective. And perspectives, in the long run, always prove far more profitable—to both your bank account and your life—than confinement in a cage of data.
This rigorous cultural engagement is the foundation of the practice I call sensemaking. Academics have used the term sensemaking to describe different concepts over the years, but I use it here, throughout this book, simply to describe an ancient practice of cultural inquiry, a process based on a set of values we are in great danger of forgetting. With sensemaking, we use human intelligence to develop a sensitivity toward meaningful differences—what matters to other people as well as to ourselves.
In the pages to come, sensemaking will take us on an intellectual adventure story grounded in the tenets of twentieth-century philosophy. We will look at the theories and methodologies that make up studies in the humanities and discuss different ways that these can help us extract meaning from nonlinear data. We will examine our genuine experience of creative insights and, in so doing, clear away some of the misguided notions that continue to circulate around innovation and breakout ideas. And we will meet master practitioners and look at how human intelligence is the only intelligence that can cultivate a perspective.
Never before has our culture been so seduced by the promises of artificial intelligence, machine learning, and cognitive computing. Never before has our world of overlapping political, financial, social, technical, and environmental systems been so inextricably linked. We must remind ourselves—and the culture at large—why the human factor is the most important factor when it comes to making sense of this world. The time to begin is right now.
Making Sense of the World
True genius resides in the capacity for evaluation of uncertain, hazardous, and conflicting information.
—Winston Churchill
The first thing a visitor to the headquarters of the Ford Motor Company will notice upon arrival is the flags. They surround the entrance to the imposing blue building in Dearborn, Michigan, each one from a country where Ford has operations. There are so many flags that the long walkway up to the main door of the building has the air of a United Nations assembly.
The lobby and lower floors of the headquarters maintain this atmosphere of diplomatic cheerfulness—people come and go with a friendly efficiency, dinging the countless elevators doing brisk business up and down. The very top floors of the headquarters, however, are oddly quiet. This is where Mark Fields, president and CEO since 2014, has his office. The main objective of almost everyone who spends any amount of their day up here is to protect his time and attention.
From his penthouse office, Fields looks out and sees the vast expanse of the Ford campus. There's a reason visitors need a car to go to meetings at Ford; it's impossible to cover the campus on foot. Through the windows, the company resembles a little country of engineers making power trains, brakes, and software. To the left is the headquarters of product development; and to the right, the marketing buildings tower over well-kept lawns.
From this singular vantage point, Mark Fields makes decisions that have lasting impact on hundreds of thousands of employees all over the world. But his perspective is fundamentally limited: like most CEOs, he is protected from the world by official and unofficial layers of people. Ford employees will spend months preparing for a one-hour meeting with him, rehearsing every aspect of their presentation, and their responses to every possible question he might have, over and over again. In ways both direct and indirect, the 199,000 people working for Ford are offering him a highly curated information stream. Some of them are glossing over issues because they don't want to be the bearers of bad news. Others, however, are simply editing out descriptions and details in the name of efficiency. With every single edit in every single conversation, Fields is in danger of losing touch with potent human intelligence that will help him make strategic decisions. And yet, he can't pay attention to everything. Somehow, within these limitations, Fields has to make daily decisions that will determine the future of more than $150 billion in annual revenue.
- "At Ford, we believe the key to creating products and experiences that truly make people's lives better is to deeply understand our customers. Technology alone isn't enough. So we've changed our product development process to focus on the customer experience—and not just the vehicle itself. In Sensemaking, Christian Madsbjerg explains with depth and structure how this is done."—Mark Fields, president and CEO, Ford Motor Company
- "This book makes powerful sense. Madsbjerg is a fascinating fellow, philosophically astute and immensely business savvy. Packed with a rich array of concrete examples and thick data, Madsbjerg shows how the problems of the coming century are cultural and how we require the tools of the humanities—especially philosophy—in order to confront them successfully. This is essential reading for anyone in the world of business and everyone with a concern for how human beings make sense of their world. Highly recommended."—Simon Critchley, Hans Jonas Professor of Philosophy, The New School
- "Having helped some of the world's largest companies transition for the digital age, it's clear to me that those best positioned to win in today's marketplace possess a deep and human understanding of their customers. Companies must master not just big data, but thick data—insight into culture, history, and the social structures underlying human behavior. Sensemaking is the road map for how this works, and it is essential reading for anyone looking to thrive in a world of digital disruption."—Francisco D'Souza, CEO, Cognizant
- "Almost twenty years ago, I wrote, 'To be qualified to be a chief executive officer, you must be broad-gauged, widely read, and have many diverse interests.' This remains just as true in today's world, where companies have become enthralled with quantitative analysis. Christian Madsbjerg's Sensemaking is a powerful defense of human intelligence to solve problems. Anyone who dreams of leading a company should read it—and heed his wonderfully contrarian advice."—Jeffrey Fox, bestselling author of How to Become CEO and How to Become a Rainmaker
- "Many have decried the widespread conclusion that the humanities have lost relevance, but few have proposed how to respond. Offering neither a rearguard defense of the humanities as we have known them, nor an unrealistic plea to other fields simply to take them seriously, Christian Madsbjerg offers a ringing endorsement of how humanities knowledge is still critically necessary to make sense of the world and its problems. With roots in Aristotle, Sensemaking calls on humanists to reinterpret their contribution while showing others how they cannot do without it. It is a book of the first importance."—Samuel Moyn, author of The Last Utopia: Human Rights in History and Jeremiah Smith, Jr. Professor of Law and Professor of History, Harvard University
- "Producing a mixture of how-to text and trenchant philosophy, Madsbjerg illustrates his formula for problem-solving with rich, captivating anecdotes.... Madsbjerg is no Luddite—he fully understands the value of data generated by algorithms—but he feels certain that one finely tuned human mind can solve problems that are beyond the grasp of emotionless computers."—Kirkus Reviews
- "Madsbjerg thinks that if businesses accept pure data as the only truth, they are in danger of losing their ability to understand people. But it is by no means the author's aim to dismiss STEM subjects. Through his particular method, his intention is to help companies find the right balance. The best CEOs can read a novel and a spreadsheet."—Financial Times
On Sale: March 21, 2017 (Hachette Audio)