To put some intuitive meat on the bones of the productivity idea, the discussion starts with a basic example of productivity for an Iowa corn farmer.
"In 1870, a family farmer planting corn in Iowa would have expected to grow 35 bushels an acre. Today, that settler’s descendant can grow nearly 180 bushels an acre and uses sophisticated equipment to work many times the acreage of his or her forbearer. Because of higher yields and the use of time-saving machinery, the quantity of corn produced by an hour of farm labor has risen from an estimated 0.64 bushel in 1870 to more than 60 bushels in 2013. This 90-fold increase in labor productivity—that is, bushels of corn (real output) an hour—corresponds to an annual rate of increase of 3.2 percent compounded over 143 years. In 1870, a bushel of corn sold for approximately $0.80, about two days of earnings for a typical manufacturing worker; today, that bushel sells for approximately $4.30, or 12 minutes worth of average earnings.
"This extraordinary increase in corn output, fall in the real price of corn, and the resulting improvement in physical well-being, did not come about because we are stronger, harder-working, or tougher today than the early settlers who first plowed the prairies. Rather, through a combination of invention, more advanced equipment, and better education, the Iowa farmer today uses more productive strains of corn and sophisticated farming methods to get more output an acre. ... Technological advances such as corn hybridization, fertilizer technology, disease resistance, and mechanical planting and harvesting have resulted from decades of research and development."
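The arithmetic behind the 90-fold figure and the 3.2 percent compound rate can be checked directly. Here's a quick sketch, using the bushels-per-hour figures quoted in the passage above:

```python
# Verify the compound annual growth rate implied by the CEA's corn figures.
bushels_per_hour_1870 = 0.64
bushels_per_hour_2013 = 60.0   # "more than 60 bushels" in the quote
years = 2013 - 1870            # 143 years

fold_increase = bushels_per_hour_2013 / bushels_per_hour_1870
annual_rate = fold_increase ** (1 / years) - 1

print(f"roughly {fold_increase:.0f}-fold increase")
print(f"about {annual_rate:.1%} per year, compounded over {years} years")
```

Using exactly 60 bushels gives a bit more than 90-fold; either way the compound annual rate works out to about 3.2 percent, as the report says.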
In the picture, a typical American worker had more than four times the output per hour of a worker in 1948. As the table shows, about 10% of the gain can be traced to higher education levels and about 38% of the gain to workers working with capital investments of greater value. But the majority of the change is growth in multifactor productivity: that is, innovations big and small that make it possible for a given worker with a given amount of capital to produce more.
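The "majority" claim follows from the decomposition itself: multifactor productivity is the residual once education and capital deepening are accounted for. A minimal sketch, using the shares cited above:

```python
# Growth-accounting decomposition of the gain in output per hour,
# using the shares cited above.
education_share = 0.10   # higher education levels
capital_share = 0.38     # more valuable capital per worker

# Multifactor productivity is the residual share of the gain.
mfp_share = 1.0 - education_share - capital_share
print(f"multifactor productivity accounts for about {mfp_share:.0%} of the gain")
```

With those shares, the residual is about 52 percent, which is why multifactor productivity accounts for the majority of the change.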
The U.S. productivity challenge can be seen in the statistics of the last few decades. U.S. productivity growth was high and healthy in the 1950s and 1960s, slowed sharply from the early 1970s to the mid-1990s, and has rebounded somewhat since then.
The reasons for the productivity slowdown around 1970 are not fully understood. The report lists some of the likely candidates: energy price shocks that made a lot of energy-guzzling capital investment nearly obsolete; a relatively less-experienced labor force as a result of the baby boom generation entering the labor force and the widespread entry of women into the (paid) labor force; and a slowdown after the boost that productivity had received from World War II innovations like jet engines and synthetic rubber, as well as the completion of the interstate highway system in the 1950s. The bounceback of productivity since the mid-1990s is typically traced to information and communications technology, both making it and finding ways to use it. There is considerable controversy about whether future productivity growth is likely to be faster or slower. But given that economists failed to predict either the productivity slowdown of the 1970s (and still don't fully understand it) or the productivity surge of the 1990s, I am not filled with optimism about our ability to foretell future productivity trends.
Sometimes people look at the vertical axis on these productivity graphs and wonder what all the fuss is about. Does the fall from 1.8% to 0.4% matter all that much? Aren't they both really small? But remember that the growth rate of productivity is an annual rate that shapes how much the overall economy grows. Say that from 1974 to 1995 the productivity growth rate had been 1% per year faster. After 22 years, with the growth rate compounding, the U.S. economy would have been about 25% larger. If U.S. GDP had been 25% larger in 2014, it would have been $21.5 trillion instead of $17.2 trillion.
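The compounding counterfactual is easy to verify. A quick sketch, using the years and GDP figure from the paragraph above:

```python
# Counterfactual: productivity growth 1 percentage point per year faster
# from 1974 to 1995, compounding over the period.
years = 22            # 1974 to 1995, as the text counts it
extra_growth = 0.01   # 1% per year faster

scale = (1 + extra_growth) ** years
print(f"economy roughly {scale - 1:.0%} larger")  # roughly a quarter larger

gdp_2014 = 17.2  # U.S. GDP in 2014, trillions of dollars
print(f"implying about ${gdp_2014 * scale:.1f} trillion instead of ${gdp_2014} trillion")
```

A seemingly small 1% difference, compounded for two decades, scales the whole economy by about a quarter; that is why the vertical axis on those graphs matters.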
Policy-makers spend an inordinate amount of time trying to fine-tune the outcomes of the market system: for example, consider the recent arguments over raising the minimum wage, raising pay for those on federal contracts, changing how overtime compensation is calculated, or raising the top tax rate for those with high incomes. Given the rise in inequality in recent decades, I feel some sympathy with the impetus behind policies that seek to slice the pie differently--although I'm sometimes more skeptical about the actual policies proposed. But 20 or 30 years in the future, what will really matter in the U.S. economy is whether annual rates of productivity growth have on average been, say, 1% higher per year.
The agenda for productivity growth is a broad one, and it would include improving education and job training for American workers; tax and regulatory conditions to support business investment; innovation clusters that mix government, higher education, and the private sector; and sensible enforcement of intellectual property law. But here, I'll add a few words about research and development spending, which is often at the root of the growth in innovative ideas that is a primary reason for rises in productivity over time. The Council of Economic Advisers writes:
"Investments in R&D often have “spillover” effects; that is, a part of the returns to the investment accrue to parties other than the investor. As a result, investments that are worth making for society at large might not be profitable for any one firm, leaving aggregate R&D investment below the socially optimal level (for example, Nelson 1959). This tendency toward underinvestment creates a role for research that is performed or funded by the government as well as by nonprofit organizations such as universities. These positive spillovers can be particularly large for basic scientific research. Discoveries made through basic research are often of great social value because of their broad applicability, but are of little value to any individual private firm, which would likely have few, if any, profitable applications for them. The empirical analyses of Jones and Williams (1998) and Bloom et al. (2012) suggest that the optimal level of R&D investment is two to four times the actual level."
In other words, it's been clear to economists for a long time that society probably underinvests in R&D. Indeed, one of the biggest cliches of the last few decades is that we are moving to a "knowledge economy" or an "information economy." We should be thinking about doubling our levels of R&D spending, just for starters. But here's what U.S. R&D spending as a share of GDP looks like: a boost related to aerospace R&D in the late 1950s and into the 1960s, which then drops off, and a basically flat line since around 1980.
How best to increase R&D spending is a worthy subject: Direct government spending on R&D? Matching grants from government to universities or corporations? Tax breaks for corporate R&D? Helping collaborative R&D efforts across industry and across public-private lines? But whether we should increase R&D spending is a settled question, and the answer is "yes."
Finally, here's an article from the New York Times last weekend on how the U.S. research establishment is depending more and more on private-sector and non-profit funding. The graph above includes all R&D spending--government, private-sector, nonprofit--not just government. Nonprofit private foundations can do some extremely productive work, and I'm all for them. But they are currently filling in the gaps for research programs that lack other support, not causing total R&D spending to rise.