Jump-Starting America

How Breakthrough Science Can Revive Economic Growth and the American Dream

Contributors

By Jonathan Gruber

By Simon Johnson

Formats and Prices

Price: $39.00 / $49.00 CAD


The untold story of how America once created the most successful economy the world has ever seen and how we can do it again.

The American economy glitters on the outside, but the reality is quite different. Job opportunities and economic growth are increasingly concentrated in a few crowded coastal enclaves. Corporations and investors are disproportionately developing technologies that benefit the wealthiest Americans in the most prosperous areas — and destroying middle class jobs elsewhere. To turn this tide, we must look to a brilliant and all-but-forgotten American success story and embark on a plan that will create the industries of the future — and the jobs that go with them.

Beginning in 1940, massive public investment generated breakthroughs in science and technology that first helped win WWII and then created the most successful economy the world has ever seen. Private enterprise then built on these breakthroughs to create new industries — such as radar, jet engines, digital computers, mobile telecommunications, life-saving medicines, and the internet — that became the catalyst for broader economic growth that generated millions of good jobs. We lifted almost all boats, not just the yachts.

Jonathan Gruber and Simon Johnson tell the story of this first American growth engine and provide the blueprint for a second. It’s a visionary, pragmatic, sure-to-be controversial plan that will lead to job growth and a new American economy in places now left behind.

Excerpt

Introduction:

Endless Invention

A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade, regardless of its mechanical skill.

—Vannevar Bush, head of the US World War II scientific effort1

In June 1940, the future of the world hung in the balance. Germany had attacked the Netherlands, Belgium, and France just over a month earlier, and the Nazi victories were nothing short of stunning. Using military technology in new and inventive ways, the Germans demonstrated a form of warfare that combined quick movement, powerful weaponry, and dominance of the air. On paper, and according to conventional thinking, the combined British and French forces should have been able to stop the German advance, but within six weeks, the British were scrambling to evacuate their beleaguered forces from Dunkirk, and Paris fell.

America waited indecisively on the fringes of this fast-spreading conflict, with a competent but small navy, an air force that had fallen behind its potential adversaries, and an army that was so short of rifles that soldiers had to practice with brooms instead. In all of 1939, the United States built only six medium tanks.2

US military technology at the start of World War II was also seriously flawed. There were “grave defects with the depth-control mechanism and the exploder” in US torpedoes; many did not detonate when they hit targets.3 There was no consistently reliable way to track the presence of German U-boats in the Atlantic—thousands of sailors died as a result, and Britain came close to starvation.4 American armor was initially no match for what the Germans had in the field and under development.

A mere four years later, led by newly developed American technology, the Allies scored a decisive victory against both Germany and Japan. The United States had transformed warfare through the development and rapid deployment of advanced radar, proximity fuses, more effective armor, automated fire control mechanisms, amphibious vehicles, and high-performance aircraft—as well as by more effective ways to limit bacterial infections and control malaria.5 The German submarine fleet, once so close to victory in the Atlantic, was broken by the use of techniques, including radar detection, that seemed fantastical just a few years before. Japan’s surrender was forced by the detonation of two atomic bombs, based on essentially brand-new science.

How did this technological transformation happen—and so quickly? Start with June 12, 1940, and a visit to the White House by Vannevar Bush.

Vannevar Bush was an accomplished man. Previously vice president and dean of engineering at MIT, on the eve of World War II, he was running the Carnegie Institution for Science, a leading research organization in Washington, DC. Tough and experienced as an administrator, Bush was also a technology visionary and an entrepreneur with two successful start-ups under his belt, including as cofounder of Raytheon—an early technology company that grew up to become a substantial military contractor.

Bush represented American private enterprise, both academic and profit-making, at its best. Like many private-sector leaders of his generation, he also had a deeply rooted dislike of government involvement both in the economy and in science.

Bush had good reason to feel on edge waiting for his first White House meeting with President Franklin Delano Roosevelt. Despite the urgency of the moment, Bush did not have a new weapon or potential technology to unveil. Instead, his idea was prosaic and literally written on a single sheet of paper. In short, Vannevar Bush wanted to create a new government committee.

Washington, DC, has never been short of committees, and the summer of 1940 was no exception. But what Bush had in mind was no ordinary additional level of bureaucracy. The powerful people, with a clear mandate to develop weapons, would no longer just be admirals and generals or established industrial companies or even the private sector’s top research labs but rather Bush and a few university colleagues, none of whom had experienced combat. By any political standards, this was a breathtaking move—and by outsiders with very little political experience. Thirty years later, this was Bush’s assessment:

There were those who protested that the action of setting up N.D.R.C. [the National Defense Research Committee] was an end run, a grab by which a small company of scientists and engineers, acting outside established channels, got hold of the authority and the money for the program of developing new weapons. That, in fact, is exactly what it was.6

It worked. FDR was well aware that war was approaching—and was looking for good ideas that would not trigger congressional opposition. The president’s prior experience as assistant secretary of the navy encouraged him both to think about military technology and to be skeptical of admirals. Bush had prepared the ground well through key advisors, and FDR approved the idea inside of fifteen minutes. The National Defense Research Committee (NDRC) sprang into being.

Bush proved an inspired choice, abrasive enough to get the job done but also always focused on improving coordination and cooperation, even among people who did not like him. His friends were good at recruiting and managing talented scientists—other founding members of the NDRC included Karl Compton (president of MIT), James B. Conant (president of Harvard), Frank B. Jewett (president of Bell Labs and the National Academy of Sciences), and Richard C. Tolman (dean of the graduate school at Caltech).

Bush had long rubbed shoulders with all the smartest scientific people, who worked on everything from theories about the atom to far-fetched notions about how electricity passes through various materials. His new government committee idea was, in effect, simply about harnessing these individuals and their protégés in a productive manner to the coming effort of national defense.

This team, and the people they worked with, built—for its day—an enormous operation. At the peak of the activity, Bush directed the work of around thirty thousand people, of whom six thousand were scientists. Perhaps two-thirds of all physicists in the United States were employed in this operation.7 There had never been a greater concentration of scientific effort in the world.

In 1938, on the eve of world war, federal and state governments spent a combined 0.076 percent of national income on scientific research, a trivial amount. By 1944, the US government was spending nearly 0.5 percent of national income on science—a sevenfold increase, most of which was channeled through Bush’s organization from 1940.8 The effects of this unprecedented surge were simply incredible and, for America’s enemies, ultimately devastating.

Then, in 1945, Vannevar Bush had what may be considered his most profound insight. The war had been won, in part, because scientists under his general direction had figured out how to apply the existing stock of knowledge for military purposes and because American industry proved very good at turning those ideas quickly into a large number of physical goods—weapons of war.

What was needed next, Bush argued, was a redirection to focus on winning the peace. In a simple yet forceful way, Bush asked: What are the scientists to do next? His answer: a lot more science—funded by the federal government. In a 1945 report titled Science: The Endless Frontier, prepared for President Roosevelt, Bush argued dramatically that more than narrowly defined national security was at stake. Invention, including in ways that could not be forecast, would save lives, increase the standard of living, and create jobs.

Government itself should not do science. Bush was scathing about bureaucracy in general and had the metaphorical scars to prove that military bureaucracy in particular was not conducive to scientific inquiry.

At the same time, based on Bush’s deep personal experience, the private sector—firms, rich individuals, and the best universities—by itself could not fund and carry out the innovative science that was required. Private business was very good at incremental change based on existing knowledge. But by the mid-twentieth century, the age of individual inventors providing breakthroughs by themselves was substantially over, and private sector science was being carried out in large-scale corporate labs. The executives running these labs were not generally inclined to fund the invention of new technologies that could undermine or even ruin their company’s existing business model.

The controversial yet deep insight of Bush’s wartime model was combining the traditionally quite separate world of corporate management with the quirky faculty of universities to find solutions for what the military needed—sometimes even before the military knew precisely what that was. In his report, Bush proposed that the United States combine that university–private sector partnership with ongoing large-scale government funding to produce a postwar innovation machine.

This is—eventually—exactly what the United States did. The idea of large-scale government funding for university-based science took a while to gain traction, and the precise structures created were not exactly what Bush had in mind.9 Nevertheless, in the decades that followed World War II, his broad vision was implemented to a substantial degree.

An essential part of this approach was a transformation of higher education, including a great expansion in the number of university-trained engineers and scientists—made possible through federal government support, beginning with the GI Bill of 1944. New sectors developed, millions of jobs were created, and these vacancies, including occupations that had not previously existed, were filled by people with recently acquired high levels of skill. For example, the government backed investments in the technology needed to develop jet aircraft, creating the basis for a large commercial sector, which in turn needed—and was able to hire—thousands of skilled mechanics and engineers.

This combination of new technology and a larger number of skilled people increased productivity and created the scientific and practical basis for almost everything that characterizes our modern economy. For the next two decades, wages rose for university-educated people and—a key point—also for those with only a high school diploma.

The catalyst for this effort was federal government funding at a scale never previously experienced, which generated some of the highest-return investments the world has ever seen.

From 1940 to 1964, federal funding for research and development increased twentyfold. At its peak in the mid-1960s, this spending amount was around 2 percent of annual gross domestic product—roughly one in every fifty dollars in the United States was devoted to government funding of research and development (equivalent, relative to GDP, to almost $400 billion today). The impact on our economy, on Americans, and on the world was simply transformational.
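A quick back-of-the-envelope check makes that scale concrete. In the sketch below, the present-day GDP figure is an assumed round number used only for illustration, not a value from the text.

```python
# Back-of-the-envelope check of the mid-1960s federal R&D figures quoted above.
# The ~$20 trillion value for current US GDP is an assumption for illustration.

peak_rd_share = 0.02        # federal R&D at its mid-1960s peak, ~2% of GDP
current_gdp_usd = 20e12     # rough present-day US GDP (assumed)

print(f"Share of each dollar: 1 in {1 / peak_rd_share:.0f}")                                # 1 in 50
print(f"Equivalent spending today: ${peak_rd_share * current_gdp_usd / 1e9:.0f} billion")   # ~$400 billion
```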

There is a good chance you are alive today because of this work. Penicillin was a pre–World War II British invention—a brilliant albeit accidental discovery. But it was the Americans, under Bush’s leadership, who figured out how to scale up production and distribute millions of high-quality doses around the world.10 This effort sparked interest in other potentially important soil microorganisms, leading indirectly to the development of streptomycin (effective against tuberculosis) and other antibiotics.11 Cortisone and other steroids were created.12 An ambitious worldwide anti-malaria campaign was launched.13 Childhood vaccination, the decline in maternal mortality, and the control of infectious disease more broadly all sprang directly from this work. The leading American pharmaceutical companies of today owe their expansion and subsequent fortune to this public push, which began under Bush’s auspices, to improve medical science.14

Digital computers are another area where the federal government had great impact.15 By 1945, the US military saw that it faced important problems—from the automated control of naval guns to the management of complex early-warning radar systems—that called out for faster computation than was humanly possible. Their funding for both basic research and more applied development eventually made possible both the development of new machines (hardware)—including transistors, which became the essential silicon-based component—and the instructions that run on those machines (software). From this national defense–oriented investment flowed everything that has changed how we handle, analyze, and use information—up to and including Apple’s iPhone.

The examples go on, including jet aircraft, satellites, improved telecommunications, and the internet. It is hard to find any aspect of modern life that has not been profoundly affected by innovation that can be traced back either to the Bush-era efforts or to inventions that were supported by various government programs in the years that followed.

Prior to 1940, university education was primarily a luxury, available to only a few people. Subsequent to the expansion in the potential for technological progress—and the availability of government support for research and teaching—the number and quality of places for studying science, engineering, and all their applications greatly increased. For the first time, the United States became the best country in the world to study, develop, and commercialize new technology.

The backbone of the US economy in the postwar years was built on a visionary model that created not just great companies and amazing products but also a large number of good jobs—the basis for the largest and most successful expansion of a middle class that the world has ever seen. The fruits of government investment were indirectly shared with all citizens through a US corporate sector that provided stable employment at high wages with relative equality, at least by today’s standards.

Median family income in the United States doubled from 1947 through 1970. The increase in wealth was shared throughout the country, with growth not only on the coasts but in the industrial Midwest and the newly dynamic South.

The broader benefits of this new technology were felt around the world. There was a general American desire to support a more stable world—primarily to avoid a repetition of the Great Depression and two world wars. However, what drove the spread of useful and productivity-enhancing technology was not primarily any form of altruism or even a deliberate desire to help. Ideas, once manifest in the form of a usable technology, are hard to control and spread to wherever people find them appealing.

Naturally, other countries responded—by investing in their own scientific endeavors, in effect trying to create their own version of what was working well in the United States. This became the age of deliberate government-supported but private sector–led technological innovation.

WHAT WENT WRONG?

Despite a remarkable run of technological and economic success, the United States now faces serious problems. In World War II and during the Cold War, the country built a powerful and stable engine of growth through the application of scientific research to practical problems. The associated technologies proved transformative, resulting in new products, new companies, and an almost insatiable demand for American goods and services around the world.

Unfortunately, we failed to maintain the engine. From the mid-1960s onward, based on concerns about the environmental, military, and ethical implications of unfettered science, compounded by shortsighted budget math, the government curtailed its investments in scientific research. Economic difficulties during the 1970s, followed by the Reagan Revolution and the anti-tax movement, resulted in an even broader retreat from federally funded activities. Most recently, the impact of a global financial crisis in 2008 and consequent economic pressures—known as the Great Recession—have further squeezed investments in the scientific future.

Federal spending on research and development peaked at nearly 2 percent of economic output in 1964 and over the next fifty years fell to only around 0.7 percent of the economy.16 Converted to the same fraction of GDP today, that decline represents roughly $240 billion per year that we no longer spend on creating the next generation of good jobs.
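That $240 billion figure can be reproduced with a rough calculation; the GDP value used here is an assumed round number rather than one given by the authors.

```python
# Rough check of the "$240 billion per year" figure: the decline in the federal
# R&D share of GDP, applied to an assumed present-day GDP of about $18.5 trillion.

peak_share = 0.020       # federal R&D share of GDP at the 1964 peak (~2%)
current_share = 0.007    # federal R&D share of GDP roughly fifty years later (~0.7%)
gdp_usd = 18.5e12        # assumed US GDP for the comparison (not from the text)

forgone = (peak_share - current_share) * gdp_usd
print(f"Forgone annual R&D spending: ${forgone / 1e9:.0f} billion")   # ~$240 billion
```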

Should we care? If there is socially beneficial research and product development to be done, surely the innovative companies of today will take this on?

In fact, they won’t. Invention is a public good, in the sense that every dollar of spending on science by a private company is paid for by that company (a private cost), while some of the benefits from discoveries invariably become public—ideas, methods, and even new products (once patents expire) are shared with the world.

The private sector, by definition, focuses solely on assessing if the private returns—to this firm, its managers and investors—of any investment are high enough to justify the risks. Executives running these companies do not account for the spillover benefits that accrue from producing general knowledge, and they do not share proprietary research that might benefit others.
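A minimal numeric sketch of this logic, with purely made-up numbers chosen only to show the gap between private and social returns:

```python
# Illustration of the public-good argument above, with hypothetical numbers:
# a research project whose total (social) payoff exceeds its cost, but whose
# privately capturable payoff does not, so no single firm will fund it.

cost = 100.0              # what the firm would pay to do the research (illustrative)
private_payoff = 80.0     # value the firm itself can capture (illustrative)
spillover_payoff = 70.0   # value that leaks out to rivals and the public (illustrative)

private_return = private_payoff - cost                       # -20: the firm walks away
social_return = private_payoff + spillover_payoff - cost     # +50: society loses out

print(f"Private return: {private_return:+.0f}, social return: {social_return:+.0f}")
```

Because the firm can capture only part of the payoff, a project that is clearly worthwhile for society never gets funded privately; that is the gap public research funding is meant to fill.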

Moreover, new invention in the private sector is constrained by financing. The venture capital sector that has created so many high-tech success stories has, at the same time, avoided the type of very-long-run and capital-intensive investments that lead to technological breakthroughs—and create new industries and jobs.

As a result, the government retreat from research and development has not been fully offset by the private sector. Consequently, our stock of knowledge increased more slowly than it would have otherwise—over time, this means lower growth and less job creation.

Missed opportunities for invention directly contribute to the stagnation of incomes. From World War II through the early 1970s, our economy—total gross domestic product—grew close to 4 percent per year on average.17 Over the last forty years, our growth performance has slipped, averaging under 3 percent per annum since the early 1970s and decelerating further to under 2 percent per annum since 2000.18 By the mid-2020s, the Congressional Budget Office expects annual growth in total GDP will average only 1.7 percent per year.19
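Compounded over a generation, these percentage-point differences become large. The sketch below applies the growth rates quoted above over a twenty-five-year horizon chosen purely for illustration.

```python
# How much bigger the economy ends up after 25 years at each quoted growth rate.

years = 25
for rate in (0.04, 0.03, 0.02, 0.017):
    growth_factor = (1 + rate) ** years
    print(f"{rate:.1%} per year for {years} years -> economy {growth_factor:.2f}x its starting size")
```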

At its core, economic growth is all about what happens to productivity—how much we collectively produce per person.20 The information technology revolution is much hyped—smartphones for everyone!—but has proven profoundly disappointing in terms of its impact on productivity, and there is no sign this will soon change. The boom-bust decade that started the 2000s only further undermined our ability to grow.

Good jobs—at decent wages with reasonable benefits—are disappearing and being replaced by low-paying jobs that do not support a sufficient standard of living. A process of job destruction is a normal part of any market economy and also existed during the boom years of the 1950s and 1960s. But the new information technology that failed to boost overall productivity growth served to accelerate the elimination of high-paying jobs that were previously held by people with just a high school education. As a result, after doubling in only twenty-three years after World War II, the median US household income grew by only 20 percent over the next forty-five years.
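The contrast between those two episodes is easier to see as annual rates. A small sketch of the implied compound growth rates, using the "doubled in twenty-three years" and "20 percent over forty-five years" figures from the text:

```python
# Implied annual growth rates behind the two median-income episodes described above.

def annual_rate(total_growth_factor: float, years: float) -> float:
    """Constant annual rate that compounds to the given total growth over `years`."""
    return total_growth_factor ** (1 / years) - 1

postwar = annual_rate(2.0, 23)   # income doubled in 23 years
recent = annual_rate(1.2, 45)    # income grew 20% over the next 45 years

print(f"Postwar era: {postwar:.1%} per year")   # about 3.1% per year
print(f"Since then:  {recent:.1%} per year")    # about 0.4% per year
```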

While we have retreated from Vannevar Bush’s innovation engine, the rest of the world is picking up the slack. Total research funding is growing at a much faster rate, relative to the economy, in the rest of the world than it is in the United States, led in many countries by active government policies. This is particularly true in our largest economic rival, China, whose rising investments have paid off, including in areas such as computing and, increasingly, medical research, where the United States once dominated.

The middle class is already under enormous pressure, with stagnant wages and a rising cost of higher education that makes it harder and harder to move up the economic ladder. At the same time, there is a discernible and hard-to-reverse geographic impact: good jobs are created disproportionately in a small number of cities, largely on the East and West Coasts. Restrictive zoning policies and high land prices in these cities make it difficult for many people to migrate to where the good jobs are, leaving them behind in slower-growing areas and contributing to a sense of economic unease.

We need a transformative and politically sustainable new way to boost growth and create jobs—by jump-starting our growth engine.

JUMP-STARTING THE ENGINE

The economic slowdown of the past few decades is not inevitable. Our economy can become dazzling again—both in terms of inventions and, more importantly, in terms of the prospects for most Americans. To do this, America needs to become much more of a technology-driven economy. That sounds surprising because most of us think we are a country driven by leading technologies and technological players. After all, isn’t Silicon Valley already the engine of world growth?

Actually, no—Silicon Valley impacts only a small part of the US economy.21 The American private sector invests in new products but not in basic science. To really improve the performance of the American economy—and to raise incomes across the board—we need to invest heavily in the underlying science of computing, human health, clean energy, and more.

The necessary conditions are largely in place. We have the world’s leading universities, favorable conditions for starting new businesses, and plenty of capital willing to take risks. We have learned a great deal about what works and what does not in terms of the public-private partnership around science and innovation.

What we need is a sustained public- and private-sector push that scales up the innovation system, focusing on the creation of ideas that can be converted into technology—just like the early work on digital computers ended up creating an entirely different structure for the organization and dissemination of information. This will require the type of commitment to the federal funding of science that helped support our post–World War II boom.

We should support this with a major expansion in science education across all ages, with the goal of producing—and employing—many more university graduates with technical skills. This combined increase in demand and supply can, over time, create millions of new, high-paying jobs.

But to make this push both economically sensible and politically sustainable, we need to distribute the benefits of growth more broadly, in two senses.

First, we must ensure that the new high-tech jobs do not follow the pattern of the past forty years and fall into just a narrow set of “superstar” cities on the East and West Coasts. There are dozens of other cities throughout the United States that meet the conditions for creating a new technology hub. These are cities that have the preconditions for success—a large pool of skilled workers, high-quality universities, and a low cost of living—and where people desperately want more jobs at good wages. But they are places that are losing out today because they do not have enough scientific infrastructure to become new centers of innovation, nor do they have the base of venture capital that can turn new ideas into profitable companies.

The federal government can select the best places using the type of competitive selection mechanism most recently employed by one of the country’s most valuable companies. In late 2017, as we mentioned in the prologue, Amazon announced that it would place a second headquarters operation somewhere in North America, creating perhaps around fifty thousand good jobs. A total of 238 cities and regions from all over the United States (and Canada)—irrespective of political inclination—submitted bids, laying out various kinds of welcome carpets, including tax breaks and supportive infrastructure.

Amazon, however, eventually chose two locations that will presumably help it make a bigger profit—partly by receiving the largest possible tax breaks. This is what companies do: they serve the interests of their shareholders, not the public. The result is a zero-sum tax competition that does nothing to raise the wealth of the nation as a whole.

The competition we have in mind would serve the interest of the nation, not individual companies. Places would compete not on the basis of tax breaks but on the basis of their qualifications to become a new technology hub. This would involve demonstrating the proper preconditions for scientific innovation, including research infrastructure, and support for better scientific education, from high school through college. It would involve ensuring sustainable development plans for the area so that we don’t just create new congested and high-cost-of-living cities. Places would need to demonstrate partnerships with the private sector that can lead from lab science to product development.

Second, we should share the benefits of innovation more directly with the US taxpayer. For too long, the government has funded basic research—such as digital computers, the internet, and the Human Genome Project—that has essentially become windfall profits for a small number of investors who are able to get in early on enough technology-development projects.22 The increasing shift in the returns from production toward capital owners (people who own companies, property, and so on, rather than workers), combined with falling effective rates of taxation on those returns to capital, leaves many Americans rightly suspicious of government investments that lead to more profitable firms.

As part of our competitive criteria for areas to attract the additional federal science funding, local governments would need to provide a way for taxpayers to share directly in the upside. For example, local and state government could hold a large, publicly owned parcel of land for development in and around these new research hubs—with the government getting the upside, in higher rents or capital appreciation, as this land becomes increasingly valuable. Profits would be paid out directly to citizens as a cash dividend every year.

We have a great model of how to do this from a relatively conservative state: the Alaska Permanent Fund, which distributes the revenues from natural resources (oil and gas) equally to all state residents. An annual innovation dividend would be paid out in cash terms equally to all Americans, illustrating vividly the returns from the public’s investment in advancing science.

Taxpayers take risks all the time, whether they know it or not. Ever since the creation of the American Republic—and much more so since 1940—the federal government has invested in pushing frontiers forward, first in a geographical sense and more recently in terms of technology.

When projects go wrong—like the collapse of solar manufacturer Solyndra, which borrowed more than $500 million from the federal government—there are accusations, investigations, and some attempt to assign blame. The taxpayer has to absorb the losses.

When projects go well—radar, penicillin, jet planes, satellites, the internet, and most recently the Human Genome Project—great fortunes are created, but only for the lucky few. It’s time for all Americans to get a serious piece of the upside from accelerated innovation.

A ROAD MAP

Praise

  • "Something has gone profoundly wrong with the US economy over the last two decades. Economic growth has been disappointing, and the little of it we have witnessed has benefited the already rich and left everybody else behind. This wonderfully readable book by two leading scholars explains why and what to do about it. It is a powerful call for action for the government to get involved, encourage innovation in local clusters, and help the economy get back to creating good jobs for ordinary Americans. A must read."—Daron Acemoglu, coauthor of WhyNations Fail, and Elizabeth and James Killian Professor of Economics, MIT
  • "In this meticulously researched, highly readable, and exquisitely timed book, Jonathan Gruber and Simon Johnson of MIT propose a new, national plan, rooted in expanded scientific research, for accelerating US growth, reducing inequality, and jump-starting regions of America which have been falling behind. And, they show the funding mechanisms, federal and local decision-making processes, and actual areas of new research which would undergird it. It is brilliant at historical, economic, and political levels."—Roger Altman, former deputy secretary of the Treasury, founder andsenior chairman of Evercore
  • "This brilliant book brings together economic history, urban economics, and the design of incentives to build an ambitious proposal to jump-start growth across geographies and mitigate inequality."—SusanAthey, Economics of Technology Professor and director, Initiative for SharedProsperity and Innovation, Stanford University
  • "Opportunities for technological breakthroughs have never been greater, but America is fumbling its historic leadership. Gruber and Johnson explain with clarity, authority and insight how America can regain its innovation mojo."—ErikBrynjolfsson, director of the MIT Initiative on the Digital Economy, and coauthorof The Second Machine Age
  • "What has been missing from our ongoing debate about inequality and bringing the fruits of prosperity to a much wider segment of people is an approach to make it happen in a way that is familiar to American society, tradition, and historical political success. Jon Gruber and Simon Johnson provide that missing link by demonstrating how smart public investment in science will build the capabilities and infrastructure that is the sine qua non for the investment that will generate returns in the form of new products, services, and other benefits that employ millions and are widespread geographically."—Ellen Dulberger, member of the Board on Science, Technology, andEconomic Policy, the National Academies of Science, Engineering, and Medicine
  • "This is the book America needs now. The blueprint for a dazzling future, filled with invention and growth, can be found in our recent past. Johnson and Gruber have resurrected the lost history of American science, and the era of big government that funded it. They have written a manifesto brimming with novel proscriptions that are themselves evidence that this country hasn't lost its capacity to innovate."—FranklinFoer, author of World Without Mind
  • "In Jump-Starting America, Jonathan Gruber and Simon Johnson present an innovative and compelling case to invest more in innovation, and they propose a bold plan to ensure that the benefits are shared throughout the US and the money is spent wisely. Nothing less than the preeminence of the US economy hangs in the balance."—Alan B. Krueger, former chairman of The Council of Economic Advisors,and Bendheim Professor of Economics and Public Affairs, Princeton University
  • "Jump-Starting America is a brilliant, fascinating, timely, and important book. It makes a compelling case that thoughtful government investment in science is the key to achieving a second golden age for the American economy. A joy to read."—SteveLevitt, coauthor of Freakonomics, andWilliam B. Ogden DistinguishedService Professor of Economics, University of Chicago
  • "America's future prosperity depends on investing in our entire country, most especially those areas 'left behind.' Gruber and Johnson show us the way through ingenious ideas based on how it was done in the past and that cut through today's political gridlock, providing the inspiration and optimism we need for enabling many more Americans to secure an economically bright future."—EricSchmidt, former CEO and executive chairman of Google
  • "Long derided, at best for the hubris of officials trying vainly to do science and pick winners and at worst for its cronyism, industrial policy is making a comeback in many countries. Jonathan Gruber and Simon Johnson show how the government can promote innovation while avoiding the classic pitfalls of such policies. The United States of DARPA, NASA, the NIH or the NSF offers a perhaps-unexpected role model. This very important book by two world leading academics is a must-read not only for scholars, but also for all policymakers, from those who still doubt the power of industrial policy to those who might be tempted to apply it carelessly."—Jean Tirole, Toulouse School of Economics, 2014 Nobel laureate ineconomics
  • "Jonathan Gruber and Simon Johnson's important Jump-Starting America argues that public investment in knowledge and research can help put American economic growth back on track... Gruber and Johnson have produced a superbly argued case... Their analyses are always insightful ... One way or another, our economy could use a jump-start, and the authors' vision of a research- and education-led American comeback is compelling."—Wall Street Journal

On Sale: Apr 9, 2019
Page Count: 368 pages
Publisher: PublicAffairs
ISBN-13: 9781541762480

Jonathan Gruber

About the Author

Jonathan Gruber is the Ford Professor of Economics at MIT. A key architect of both Romneycare and Obamacare, he appears regularly on both Fox News and MSNBC. Slate has named him one of the top twenty-five “Most Innovative and Practical Thinkers of Our Time.” In addition to over 160 academic articles, he is the author of Health Care Reform (Hill & Wang), a graphic novel about the Affordable Care Act.


Simon Johnson

About the Author

Simon Johnson is the Kurtz Professor of Entrepreneurship at MIT and a former chief economist at the IMF. His much-viewed opinion pieces have appeared in the New York Times, the Wall Street Journal, the Financial Times, the Atlantic, and elsewhere. With law professor James Kwak, Simon is the co-author of the bestsellers 13 Bankers and White House Burning and a founder of the widely cited economics blog The Baseline Scenario.
