Thursday, June 20, 2013

Technology and Job Destruction

Is there something about the latest wave of information and communication technologies that is especially destructive to jobs? David Rotman offers an overview of the arguments in "How Technology Is Destroying Jobs," in the July/August 2013 issue of the MIT Technology Review.

On one side Rotman emphasizes the work of Erik Brynjolfsson and Andrew McAfee: "That robots, automation, and software can replace people might seem obvious to anyone who’s worked in automotive manufacturing or as a travel agent. But Brynjolfsson and McAfee’s claim is more troubling and controversial. They believe that rapid technological change has been destroying jobs faster than it is creating them, contributing to the stagnation of median income and the growth of inequality in the United States. And, they suspect, something similar is happening in other technologically advanced countries."

As one piece of evidence, they offer this graph showing productivity growth and private-sector employment growth. Going back to 1947, the two grew at more or less the same speed. But starting around 2000, a gap opens up, with productivity growing faster than private-sector employment.



The figure sent me over to the U.S. Bureau of Labor Statistics website to look at total jobs. Total U.S. jobs were 132.6 million in December 2000. Then there's a drop associated with the recession of 2001, a rise associated with the housing and finance bubble, a drop associated with the Great Recession, and more recently a bounceback to 135.6 million jobs in May 2013. But put it all together, and from December 2000 to May 2013, total U.S. jobs are now about 2.2% higher than they were back at the start of the century.
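As a rough check on those numbers, here is a back-of-the-envelope sketch of the arithmetic behind that 2.2% figure, along with the implied annualized rate of job growth. It uses only the payroll figures quoted above (132.6 million in December 2000, 135.6 million in May 2013, a span of 149 months), not a fresh pull from the BLS data:

```python
# Back-of-the-envelope check of the job numbers cited in the text.
# Figures are the ones quoted above, not pulled live from BLS.

jobs_dec_2000 = 132.6   # millions, total jobs, December 2000
jobs_may_2013 = 135.6   # millions, total jobs, May 2013
months = 149            # December 2000 through May 2013
years = months / 12

cumulative_growth = jobs_may_2013 / jobs_dec_2000 - 1
annualized_growth = (jobs_may_2013 / jobs_dec_2000) ** (1 / years) - 1

print(f"Cumulative job growth: {cumulative_growth:.2%}")   # about 2.26%
print(f"Annualized job growth: {annualized_growth:.2%}")   # about 0.18% per year
```

In other words, over those twelve-plus years, total jobs grew at well under one-quarter of a percent per year.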

Why the change? The arguments rooted in technological developments sound like this: "Technologies like the Web, artificial intelligence, big data, and improved analytics—all made possible by the ever increasing availability of cheap computing power and storage capacity—are automating many routine tasks. Countless traditional white-collar jobs, such as many in the post office and in customer service, have disappeared. W. Brian Arthur, a visiting researcher at the Xerox Palo Alto Research Center’s intelligence systems lab and a former economics professor at Stanford University, calls it the “autonomous economy.” It’s far more subtle than the idea of robots and automation doing human jobs, he says: it involves “digital processes talking to other digital processes and creating new processes,” enabling us to do many things with fewer people and making yet other human jobs obsolete."

Of course, there are other arguments about slower job growth rooted in other factors. Looking at the year 2000 as a starting point is not a fair comparison, because the U.S. economy was at the time in the midst of the unsustainable dot-com bubble. The current economy is still recovering from its worst episode since the Great Depression. In addition, earlier decades saw demographic changes like the flood of baby boomers entering the workforce from the 1960s through the 1980s, along with a flood of women entering the (paid) workforce. As those trends eased off, the total number of jobs would be expected to grow more slowly.

Another response to the technology-is-killing-jobs argument is that while technology has long been disruptive, the economy has shown an historical pattern of adjusting over time. Rotman writes: "At least since the Industrial Revolution began in the 1700s, improvements in technology have changed the nature of work and destroyed some types of jobs in the process. In 1900, 41 percent of Americans worked in agriculture; by 2000, it was only 2 percent. Likewise, the proportion of Americans employed in manufacturing has dropped from 30 percent in the post–World War II years to around 10 percent today—partly because of increasing automation, especially during the 1980s. ... Even if today’s digital technologies are holding down job creation, history suggests that it is most likely a temporary, albeit painful, shock; as workers adjust their skills and entrepreneurs create opportunities based on the new technologies, the number of jobs will rebound. That, at least, has always been the pattern. The question, then, is whether today’s computing technologies will be different, creating long-term involuntary unemployment."

Given that the U.S. and other high-income economies have been experiencing technological change for well over a century, and the U.S. unemployment rate was below 6% as recently as the four straight years from 2004 through 2007, it seems premature to me to be forecasting that technology is now about to bring a dearth of jobs. Maybe this fear will turn out to be right this time, but it flies in the face of a couple of centuries of economic history.

However, it does seem plausible to me that technological development, in tandem with globalization, is altering pay levels in the labor force, contributing to higher pay at the top of the income distribution and lower pay in the middle. For some discussion of technology and income inequality, see my post earlier this week on "Rock Music, Technology, and the Top 1%," and for some discussion of technology and the "hollowing out" of the middle skill levels of the labor force, see my post on "Job Polarization by Skill Level" or this April 2010 paper by David Autor. (Full disclosure: Autor is also editor of the Journal of Economic Perspectives, and thus is my boss.)

Given that new technological developments can be quite disruptive for existing workers, the conclusion I draw is the importance of helping more workers find ways to work with computers and robots that magnify their productivity. Rotman mentions a previous example of such a social transition: "Harvard’s [Larry] Katz has shown that the United States prospered in the early 1900s in part because secondary education became accessible to many people at a time when employment in agriculture was drying up. The result, at least through the 1980s, was an increase in educated workers who found jobs in the industrial sectors, boosting incomes and reducing inequality. Katz’s lesson: painful long-term consequences for the labor force do not follow inevitably from technological changes." It feels to me as if we need a widespread national effort, in both the private and the public sector, to figure out ways in which every worker in every job can use information technology to become more productive.


The arguments over how technology affects jobs remind me a bit of an old story from the development economics literature. An economist is visiting a public works project in a developing country. The project involves building a dam, and dozens of workers are shoveling dirt and carrying it over to the dam. The economist watches for a while, and then turns to the project manager and says: "With all these workers using shovels, this project is going to take forever, and it's not going to be very high quality. Why not get a few bulldozers in here?" The project manager responds: "I can tell that you are unfamiliar with the political economy of a project like this one. Sure, we want to build the dam eventually, but really, one of the main purposes of this project is to provide jobs. Getting a bulldozer would wipe out these jobs." The economist mulls this answer a bit, and then replies: "Well, if the real emphasis here is on creating jobs, why give the workers shovels? Wouldn't it create even more jobs if they used spoons to move the dirt?"

The notion that everyone could stay employed if only those new technologies would stay out of the way has a long history. But the rest of the world is not going to back off on using new technologies. And future U.S. prosperity won't be built by workers using the metaphorical equivalent of spoons, rather than bulldozers.