Tuesday, September 9, 2014

Stuck on Economics

I have lamented in the past that when your brain is stuck on economics, it can be hard to escape the obsession. For example, I explain here what it's like to be driving around northern Montana wondering why the local population was obsessed with GNP, when everyone knows that the economy is now more commonly measured by GDP. Or here is how I ended up "Endorsing Association 3E: Ethics, Excellence, Economics"--and it tastes excellent on nibbles of sourdough bread. Or here is how the Economic Geyser spouts even in the middle of Yellowstone National Park.

Now McDonald's is messing with my ability to turn off the economics portion of my brain. A few years back they prominently advertised the CBO, which we all know stands for Congressional Budget Office, thus causing me to twitch every time I passed a billboard.


Of course, now it's the eco-nom-nom-nomics advertisements. Most of what I watch on television is live sports, and I'm just trying to sit and relax and watch my baseball or football game in peace, when suddenly my brain is jolted into awareness of economics. Please make it stop.




Of course, my children think the ads are hilarious, partly because they make Dad twitch. The children are also fans of the "lolcats" books, which feature photos of cats with funny but ungrammatical captions (captions that badly need the work of an economics journal editor to fix them all right now. Sorry, lost my train of thought there for a moment.) Oh yes, the lolcats also say "nom nom nom" from time to time. So now the lolcats trigger thoughts of economics in my mind, too. Thanks a lot, McDonald's. I need another month of summer vacation.



Monday, September 8, 2014

19th Century Fencing and Information Technology

It's no surprise that U.S. investment is disproportionately focused on information technology. The broad category of information processing technology and equipment was 8% of all private nonresidential U.S. investment in 1950, but 30% of all investment by 2012. This raises the question: Is there a previous time in U.S. history when investment was so heavily focused on a single category?

David Autor offers a possible answer: Investment in fences in the late 19th century U.S. economy. The answer is a side comment in Autor's paper "Polanyi's Paradox and the Shape of Employment Growth," presented in August at the Jackson Hole conference sponsored by the Kansas City Federal Reserve. The paper is well worth reading for what it has to say about the links from automation to jobs and wages. Here, I'll offer some thoughts of my own about fencing and information technology. (Full disclosure: Autor is the Editor of the Journal of Economic Perspectives, and thus my boss.)

Richard Hornbeck published "Barbed Wire: Property Rights and Agricultural Development" in a 2010 issue of the Quarterly Journal of Economics (vol. 125:2, pp. 767-810). He argues for the importance of fencing in understanding the development of the American West. Hornbeck writes (citations and footnotes omitted):

"In 1872, fencing capital stock in the United States was roughly equal to the value of all livestock, the national debt, or the railroads; annual fencing repair costs were greater than combined annual tax receipts at all levels of government ... Fencing became increasingly costly as settlement moved into areas with little woodland. High transportation costs made it impractical to supply low-woodland areas with enough timber for fencing. Although wood scarcity encouraged experimentation, hedge fences were costly to control and smooth iron fences could be broken by animals and were prone to rust. Writers in agricultural journals argued that the major barrier to settlement was the lack of timber for fencing: the Union Agriculturist and Western Prairie Farmer in 1841, the Prairie Farmer in 1848, and the Iowa Homestead in 1863 ... Farmers mainly adjusted to fencing material shortages by settling in areas with nearby timber plots."
Then in 1874, Joseph Glidden patented "the most practical and ultimately successful design for barbed wire." The fencing business took off. Hornbeck quotes a story from a 1931 history: "Glidden himself could hardly realize the magnitude of his business. One day he received an order for a hundred tons; 'he was dumbfounded and telegraphed to the purchaser asking if his order should not read one hundred pounds.'"

Remember that fencing was already of central importance to the U.S. capital stock in 1872. Hornbeck presents estimates of how the total stock of fencing expanded over the decades. The pent-up demand was enormous, and cheaper steel was becoming widely available after the 1870s. From 1880 to 1900, for example, the total amount of fencing in the Prairie states went from 80 million rods (where a rod equals 16.5 feet or about 5 meters) to 607 million rods; in the Southwest region, the rise was from 162 million rods in 1880 to 710 million rods by 1900. In the South Central states, the gains were comparatively smaller, only about a doubling from 344 million rods in 1880 to 685 million rods in 1900. (A quick back-of-the-envelope conversion of these quantities appears after the quotation below.) By comparing regions with and without fencing as the barbed wire arrived, Hornbeck argues:
"Barbed wire may affect cattle production and county specialization through multiple channels, but these results suggest that barbed wire’s effects are not simply the direct technological benefits that would be expected for an isolated farm. On the contrary, it appears that barbed wire affected agricultural development largely by reducing the threat of encroachment by others’ cattle."

The juxtaposition between 19th century fencing and 21st century information technology offers an irresistible chance for loose speculations and comparisons. Fencing in the 19th century made property rights to U.S. land more valuable, especially in the Prairie and Southwest regions, because it protected the farmers' crops. Of course, there was also considerable conflict and dislocation as the land was fenced, including conflicts between farmers and ranchers and between settlers and Native Americans. But for many Americans, the fencing of the American West felt like a clear-cut opening of productive opportunities.

The economic gains from modern information technology often seem to arrive in less clear form. True, for some workers the vast gains of electronic technology feel like a brand-new frontier. But many workers throughout the economy experience information technology as a continual mix of gains, costs, and disruptions. For example, email is great, and email eats up my day. Information technology can offer vast cost savings in office work, greater efficiency in logistics and shipping, and faster development of new designs and technologies--all of which also disrupt companies and workers.

New information technology is far more mutable than fencing: it finds ways to slither into aspects of almost every job, including how that job is scheduled, organized, and paid for. Moreover, information technology is really a series of new technologies, as Moore's law drives the cost of computing lower and lower, creating waves of distinctively different growth opportunities. As Hornbeck points out, barbed-wire fencing did get substantially cheaper over time, with the cost falling by half from 1874 to 1880, and then again almost another two-thirds by 1890, and falling almost to half of that amount by 1897. But that impressive technological record is dwarfed by the productivity gains in information technology.
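To see how those successive price declines compound, here is a small sketch of my own arithmetic, treating Hornbeck's "by half," "two-thirds," and "half of that" as if they were exact fractions (they are only approximate):

```python
# Compounding the successive barbed-wire price declines described above,
# treating the approximate fractions as exact for illustration.

price = 1.0          # index the 1874 price to 1.0
price *= 0.5         # roughly halved by 1880
price *= 1 - 2 / 3   # fell by about another two-thirds by 1890
price *= 0.5         # and to roughly half of that by 1897

print(f"1897 price as a share of the 1874 price: {price:.2f}")
print(f"Cumulative decline: {1 - price:.0%}")
```

Compounded, the declines imply a barbed-wire price in 1897 of a bit above 8 percent of the 1874 price, a drop of more than 90 percent--impressive, but still modest next to the decline in the cost of computing over a comparable span of decades.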

In short, 19th-century fencing may well have been an investment similar in relative size to modern information technology (although the economic statistics of the late 19th century don't allow anything resembling an apples-to-apples comparison). But at least to me, information technology seems considerably more disruptive, transformative, and ultimately beneficial for the economy.



Friday, September 5, 2014

Shaping the Direction of Health Care Innovation

My hope would be that the health care innovations of the future focus on two goals: how to attain improvements in health across the population, and how to provide the same or more effective health care at lower cost. My worry is that the direction of health care innovation is instead shaped by incentives based on beliefs about what can be brought to market, what patients will demand, and what health care providers will receive with favor, and those incentives are not necessarily well-aligned with these goals. Steven Garber, Susan M. Gates, Emmett B. Keeler, Mary E. Vaiana, Andrew W. Mulcahy, Christopher Lau and Arthur L. Kellermann tackle these issues in "Redirecting Innovation in U.S. Health Care: Options to Decrease Spending and Increase Value," a report from the RAND Corporation.

The authors point out that since the 1950s, growth in U.S. health care spending has typically been about 2% per year faster than growth in GDP, and that most economists trace this cost difference to the continual arrival of new and more expensive health care technologies. They write: "As we argue in this report, the U.S. health care system provides strong incentives for U.S. medical product innovators to invent high-cost products and provides relatively weak incentives to invent low-cost ones." The system also provides strong incentives to focus on drugs, devices, and health information technologies that will generate profits in high-income countries, not to find low-cost ways of addressing health problems in the rest of the world. Here are four of the examples they offer.
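That 2 percentage point gap may sound small, but it compounds. Here is an illustrative sketch (my own arithmetic, not a calculation from the RAND report; the starting share and time horizons are assumptions chosen only to show the mechanics):

```python
# Illustrative only: how health care spending growth that runs about 2
# percentage points above GDP growth compounds into a rising share of GDP.
# The starting share and horizons below are assumptions, not RAND figures.

start_share = 0.05      # assumed initial health spending share of GDP
excess_growth = 0.02    # spending grows ~2% per year faster than GDP

for years in (10, 25, 50):
    share = start_share * (1 + excess_growth) ** years
    print(f"After {years} years: health spending is {share:.1%} of GDP")
```

A sector growing 2 percent per year faster than GDP roughly doubles its share of the economy every 35 years.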

The cardiovascular “polypill” "refers to a multidrug combination pill intended to reduce blood pressure and cholesterol, known risk factors for the development of cardiovascular disease. The rationale is that combining four beneficial drugs in low doses in a single pill should produce an easy and affordable way to dramatically modify cardiovascular risk." But as the authors point out, even though a "polypill" only combines existing drugs, putting them in a single pill means that it would have to go through very expensive and lengthy health and safety testing. The result would be a product that might be cheaper and more effective, but given that people could still take a handful of the other pills separately, the "polypill" would almost certainly be a low-profit product. Moreover, there have been several patents granted on aspects of a "polypill," so any company seeking to test such a pill would be likely to face a patent battle. No private company is likely to push this kind of innovation.

Better use of health information technology in patient records could save a lot of money in terms of lower paperwork costs, and also provide considerable health benefits by informing health care providers about past and current health experiences--for example, helping to minimize risks of allergic reactions or bad drug interactions. But despite various pushes and shoves, the health care sector has not been a leader in adopting and using information technology. Indeed, in many cases it seems to have soaked up the time of health care providers on the one hand, while providing a tool for increasing the amount billed to insurance companies on the other.

The implantable cardioverter-defibrillator (ICD) is "an implantable device consisting of a small pulse generator (roughly half the size of a smartphone) and one or more thin wire leads threaded through large blood vessels into the heart. ICDs are designed to sense a life-threatening cardiac arrhythmia and automatically provide a dose of direct current (DC) electricity to jolt the patient’s heart back to normal." This technology works very well for some patients with heart disease, but not for others: specifically, it isn't recommended for certain groups, "such as patients who are undergoing bypass surgery or in the early period following a heart attack, the first three months following coronary revascularization, severe heart failure (New York Heart Association Class IV), and those with newly diagnosed heart failure." Thus, this is a case of a positive and useful innovation that is quite likely overused--at substantial cost.

The prostate-specific antigen (PSA) test screens men for prostate cancer. The authors write: "Despite PSA screening’s initial promise, multiple studies in the United States and in Europe have found that it does not reduce prostate cancer–specific mortality. Moreover, screening is associated with substantial harms caused by over-diagnosis and the complications that can occur from aggressive treatment. . . . Based on unfavorable findings, in 2012 the United States Preventive Services Task Force recommended against routine PSA screening for prostate cancer because the harms of screening outweigh the potential benefits. However, because federal law has not been changed, Medicare must still pay for the test’s use, as well as for the subsequent biopsies, surgical procedures, nonsurgical treatments, and complications that these procedures can cause."

The RAND authors point out a number of features of the U.S. health care system that can push innovation away from the methods that would most improve health and decrease costs. For example, the existing incentives for innovation don't tend to reward methods that will lead to reduced spending. As they note, in a market full of insured third-party payers, there is "[l]imited price sensitivity on the part of consumers and payers." In addition, a bias arises from the "limited time horizon of providers when they decide which medical products to use for which patients: In many instances, the health benefits from using a drug, device, or HIT are not realized until years in the future, at which time the patient is likely to be covered by a different insurer, such as Medicare. When this is the case, only the later insurer will obtain the financial benefits associated with the (long-delayed) health benefits." More broadly, "[m]any [health care] provider systems are siloed. When this is the case, most decisionmakers consider only the costs and benefits for their parts of their organizations, and few take into account savings that accrue outside of their silos."

They also write of "treatment creep" and the "medical arms race."
"Undesirable treatment creep often occurs when a medical product that provides substantial benefits to some patients is used for other patients for whom the health benefits are much smaller or completely absent. Treatment creep is encouraged by FFS [fee-for-service] payment arrangements, and it is enabled by lack of knowledge about which patients would truly benefit from which products. Treatment creep often involves using products for indications not approved by the FDA. Such “off-label” use—which delivers good value in some instances—is widespread and difficult to control. Treatment creep may reward developers with additional profits for inventing products whose use can be expanded to groups of patients who will benefit little. ..." 
"The “medical arms race” refers to hospitals and other facilities competing for business by making themselves attractive to physicians, who may care more about using new high-tech services than they care about lower prices. ... Robotic surgery for prostate cancer and proton beam radiation therapy provide striking examples of undesirable treatment creep: Although there is little or no evidence that they are superior to traditional treatments, these high-cost technologies have been successfully marketed directly to patients, hospitals, and physicians. High market rewards for such expensive technologies encourage inventors and investors to develop more of them—regardless of how much they improve health."
The authors have an eminently reasonable list of ways to alter the direction of health care innovation: basically, thinking through the sources of R&D funding, regulatory approval, and decision-making by third-party payers. For example, there could be public prize contests for certain innovations; some patents that seem to offer substantial health benefits could be bought out and placed in the public domain; and third-party payers (including Medicare and Medicaid) could place more emphasis on being willing to buy new technologies that cut costs. But I confess that as I look over their list of policy recommendations, I'm not sure they suffice to overcome the incentives currently built into the U.S. health care system.





Thursday, September 4, 2014

And Here Come the Interest Payments

The federal government has been on a borrowing binge since the start of the Great Recession. I've argued that in the short run, the path of the budget deficits has been basically correct, because the deficits have helped to cushion the brutal economy of 2008-2009 and the sluggish recovery since then. But the long-term budget deficit picture is a problem. And even those of us who have largely supported the budget deficits of the last few years need to face the fact that the bills will eventually come due, and interest payments by the federal government are likely to head sharply upward in the next few years.

For some perspective, here's a figure from the August 2014 Congressional Budget Office report, "An Update to the Budget and Economic Outlook: 2014 to 2024." The spending categories are expressed as a share of GDP. Thus, over the next decade Social Security and Major Health Care programs rise, and a number of other categories fall a bit. But the biggest spending jump in any of these categories is for interest payments.



Interest payments jump for two reasons: the recent accumulation of federal debt and the expectation that interest rates are going to rise. "Between calendar years 2014 and 2019, CBO expects, the interest rate on 3-month Treasury bills will rise from 0.1 percent to 3.5 percent and the rate on 10-year Treasury notes will rise from 2.8 percent to 4.7 percent; both will remain at those levels through 2024." Of course, predictions don't always come true. But the CBO has already scaled down how much it expects interest rates to rise, and its projections of future deficits may well be on the optimistic side.

When looking at spending as a share of GDP, it's useful to remember that GDP is now around $17 trillion. This prediction shows a rise in federal interest payments from 1.3 percent of GDP in 2014 to 3.0 percent of GDP by 2024. Converted to actual dollars, interest payments are projected to rise from $231 billion in 2014 to $799 billion in 2024--more than tripling in unadjusted dollars. By 2024, that's an extra $568 billion per year that isn't available for other spending or to finance tax cuts. It's going to bite hard.
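As a quick check on the arithmetic behind those figures (my own back-of-the-envelope sketch, not a calculation from the CBO report):

```python
# Back-of-the-envelope check of the interest-payment figures discussed above.
interest_2014 = 231e9   # projected net interest in 2014, dollars
interest_2024 = 799e9   # projected net interest in 2024, dollars

print(f"Increase: ${(interest_2024 - interest_2014) / 1e9:.0f} billion per year")
print(f"Ratio: {interest_2024 / interest_2014:.2f}x, i.e., more than tripling")

# Cross-check against the percent-of-GDP figure: 1.3 percent of roughly
# $17 trillion. The small gap from $231 billion just reflects 2014 GDP
# running somewhat above $17 trillion.
print(f"1.3% of $17 trillion: ${0.013 * 17e12 / 1e9:.0f} billion")
```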

For a historical comparison, a December 2010 CBO report looked at "Federal Debt and Interest Costs." The light blue line shows interest payments in nominal dollars, not adjusted for inflation or the size of the economy, and thus isn't useful for looking back several decades. The dark blue line helps to illustrate that interest payments are headed for their highest levels since the mid-1990s, when the government was still paying off the borrowing of the mid-1980s at relatively high interest rates.





When economic times are dire, as they were in the U.S. economy in 2008-2009, having the government borrow money makes sense. Given the lethargic pace of the growth that followed, and the underlying financial fragility of the economy, it made some sense not to make a dramatic push for lower deficits in the last few years. But the coming surge in interest payments is a warning signal that it's past time to start thinking about how to bring down budget deficits over the medium and long term.






Wednesday, September 3, 2014

Competition as a Form of Cooperation

Like most economists, I find myself from time to time confronting the complaint that economics is all about competition, when we should be emphasizing cooperation instead. One standard response to this concern focuses on making a distinction between the way people and firms actually behave and the ways in which moralists might prefer that they behave. But I often try a different answer, pointing out that the idea of cooperation is actually embedded in the meaning of the word "compete."

Check the etymology of "compete" in the Oxford English Dictionary. It tells you that the word derives from Latin, in which "com-" means "together" and "petĕre" has a variety of meanings, which include "to fall upon, assail, aim at, make for, try to reach, strive after, sue for, solicit, ask, seek." Based on this derivation, valid meanings of competition would be "to aim at together," "to try to reach together" and "to strive after together."

Competition can come in many forms. The kind of market competition that economists typically invoke is not about wolves competing in a pen full of sheep, nor is it competition between weeds to choke the flowerbed. The market-based competition envisioned in economics is disciplined by rules and reputations, and those who break the rules through fraud or theft or manipulation are clearly viewed as outside the shared process of competition. Market-based competition is closer in spirit to the interaction between Olympic figure-skaters, in which pressure from other competitors and from outside judges pushes individuals to strive to do the old and familiar better, while also seeking out new innovations. Sure, the figure-skaters are trying their hardest to win. But in a broader sense, their process of training and coming together under agreed-upon rules is a deeply cooperative and shared enterprise.

In fact, competition within a market context actually happens as a series of cooperative decisions, every time a buyer and seller come together in a mutually agreed and voluntarily made transaction. This idea of cooperation within the market is at the heart of what the philosopher Robert Nozick in his 1974 work Anarchy, State, and Utopia referred to as “capitalist acts between consenting adults.”

Tuesday, September 2, 2014

Attendance Rates for U.S. K-12 Teachers

My heart always sinks a bit when one of my children reports over dinner that their class had a substitute teacher that day. What usually follows is a discussion of the video they watched, or the worksheet they filled out, or how many other children in the class (never mine, of course) misbehaved. How prevalent is teacher absence from classes in the U.S.? The National Council on Teacher Quality collects some evidence in its June 2014 report "Roll Call: The Importance of Teacher Attendance."

The study collected data from 40 urban school districts across the United States for the 2012-13 school year. The definition of "absence" in this study was that a substitute teacher was used in the classroom. Thus, the overall totals mix together the times when a teacher was absent from the classroom for sickness, for other personal leave, and for some kind of professional development. As the authors of the study note: "Importantly, we looked only at short-term absences, which are absences of 1 to 10 consecutive days. Long-term absences (absences lasting more than 10 consecutive days) were not included to exclude leave taken for serious illness and maternity/paternity leave."

The average teacher across these 40 districts was absent 11 days during the school year. This amount of teacher absence matters to students. The NCTQ study cites studies to make the point: "As common sense suggests, teacher attendance is directly related to student outcomes: the more teachers are absent, the more their students’ achievement suffers. When teachers are absent 10 days, the decrease in student achievement is equivalent to the difference between having a brand new teacher and one with two or three years more experience."

Here's a figure showing average rates of absence across the 40 districts. Again, these include professional development activities that take teachers out of the classroom, but do not include long-term absences or parental leave. Indianapolis, the District of Columbia, Louisville, and Milwaukee lead the way with relatively few teacher absences, while Cleveland, Columbus (what's with Ohio teachers?), Nashville, and Portland have relatively high numbers of teacher absences.

Based on little more than my own gut reaction, an average of 11 teacher absences per year seems a little on the high side to me. But as with so many issues in education, the real problem doesn't lie with the averages, but with the tail end of the distribution. The study calculates that 28% of teachers are "frequently absent," meaning that they missed 11-17 days of class, and an additional 16% are "chronically absent," meaning that they missed 18 or more days of class.

Here's a city-by-city chart showing the breakdown of teacher absence by category.

I'm willing to cut some slack to teachers who happen to have a lousy personal year and are chronically absent. But I have a hard time believing that across the United States, 1/6 of all teachers--that is, about 16%--are simultaneously having the kind of lousy year that forces them to miss 18 or more school days. (Again, remember that these numbers don't include long-term sickness or parental leave.) Those who can't find a way to show up for the job of classroom teacher, year after year, need to face some consequences.

A few years back in the Winter 2006 issue of the Journal of Economic Perspectives, Nazmul Chaudhury, Jeffrey Hammer, Michael Kremer, Karthik Muralidharan, and F. Halsey Rogers reported on "Missing in Action: Teacher and Health Worker Absence in Developing Countries." They wrote: "In this paper, we report results from surveys in which enumerators made unannounced visits to primary schools and health clinics in Bangladesh, Ecuador, India, Indonesia, Peru and Uganda and recorded whether they found teachers and health workers in the facilities. Averaging across the countries, about 19 percent of teachers and 35 percent of health workers were absent. The survey focused on whether providers were present in their facilities, but since many providers who were at their facilities were not working, even these figures may present too favorable a picture." The situation with U.S. teacher absence isn't directly comparable, of course. One suspects that the provision of substitute teachers is a lot better in Cleveland, Columbus, Nashville, and Portland than in Bangladesh, Ecuador, India, and Indonesia. Still, wherever it occurs, an institutional culture where many teachers don't show up needs to be confronted.

Monday, September 1, 2014

The Origins of Labor Day

It's clear that the first Labor Day celebration was held on Tuesday, September 5, 1882, and organized by the Central Labor Union, an early trade union organization operating in the greater New York City area in the 1880s. By the early 1890s, more than 20 states had adopted the holiday. On June 28, 1894, President Grover Cleveland signed into law: "The first Monday of September in each year, being the day celebrated and known as Labor's Holiday, is hereby made a legal public holiday, to all intents and purposes, in the same manner as Christmas, the first day of January, the twenty-second day of February, the thirtieth day of May, and the fourth day of July are now made by law public holidays." (Note: This post has been reprinted on this blog each year since 2011.)

What is less well known, at least to me, is that the very first Labor Day parade almost didn't happen, and that historians now dispute which person is most responsible for that first Labor Day. The U.S. Department of Labor tells how the first Labor Day almost didn't happen, for lack of a band:

"On the morning of September 5, 1882, a crowd of spectators filled the sidewalks of lower Manhattan near city hall and along Broadway. They had come early, well before the Labor Day Parade marchers, to claim the best vantage points from which to view the first Labor Day Parade. A newspaper account of the day described "...men on horseback, men wearing regalia, men with society aprons, and men with flags, musical instruments, badges, and all the other paraphernalia of a procession.
The police, wary that a riot would break out, were out in force that morning as well. By 9 a.m., columns of police and club-wielding officers on horseback surrounded city hall. By 10 a.m., the Grand Marshall of the parade, William McCabe, his aides and their police escort were all in place for the start of the parade. There was only one problem: none of the men had moved. The few marchers that had shown up had no music.
According to McCabe, the spectators began to suggest that he give up the idea of parading, but he was determined to start on time with the few marchers that had shown up. Suddenly, Mathew Maguire of the Central Labor Union of New York (and probably the father of Labor Day) ran across the lawn and told McCabe that two hundred marchers from the Jewelers Union of Newark Two had just crossed the ferry — and they had a band!
Just after 10 a.m., the marching jewelers turned onto lower Broadway — they were playing "When I First Put This Uniform On," from Patience, an opera by Gilbert and Sullivan. The police escort then took its place in the street. When the jewelers marched past McCabe and his aides, they followed in behind. Then, spectators began to join the march. Eventually there were 700 men in line in the first of three divisions of Labor Day marchers.
With all of the pieces in place, the parade marched through lower Manhattan. The New York Tribune reported that, "The windows and roofs and even the lamp posts and awning frames were occupied by persons anxious to get a good view of the first parade in New York of workingmen of all trades united in one organization."
At noon, the marchers arrived at Reservoir Park, the termination point of the parade. While some returned to work, most continued on to the post-parade party at Wendel's Elm Park at 92nd Street and Ninth Avenue; even some unions that had not participated in the parade showed up to join in the post-parade festivities that included speeches, a picnic, an abundance of cigars and, "Lager beer kegs... mounted in every conceivable place."
From 1 p.m. until 9 p.m. that night, nearly 25,000 union members and their families filled the park and celebrated the very first, and almost entirely disastrous, Labor Day."

As to the originator of Labor Day, the traditional story I learned back in the day gave credit to Peter McGuire, the founder of the Carpenters Union and a co-founder of the American Federation of Labor. At a meeting of the Central Labor Union of New York on May 8, 1882, the story went, he recommended that Labor Day be designated to honor "those who from rude nature have delved and carved all the grandeur we behold." McGuire also typically received credit for suggesting the first Monday in September for the holiday, "as it would come at the most pleasant season of the year, nearly midway between the Fourth of July and Thanksgiving, and would fill a wide gap in the chronology of legal holidays." He envisioned that the day would begin with a parade, "which would publicly show the strength and esprit de corps of the trade and labor organizations," and then continue with "a picnic or festival in some grove."

But in recent years, the International Association of Machinists has also staked its claim, because one of its members, Matthew Maguire, a machinist, was serving as secretary of the Central Labor Union in New York in 1882 and clearly played a major role in organizing the day. The U.S. Department of Labor has a quick summary of the controversy.
"According to the New Jersey Historical Society, after President Cleveland signed into law the creation of a national Labor Day, The Paterson (N.J.) Morning Call published an opinion piece entitled, "Honor to Whom Honor is Due," which stated that "the souvenir pen should go to Alderman Matthew Maguire of this city, who is the undisputed author of Labor Day as a holiday." This editorial also referred to Maguire as the "Father of the Labor Day holiday.
So why has Matthew Maguire been overlooked as the "Father of Labor Day"? According to The First Labor Day Parade, by Ted Watts, Maguire held some political beliefs that were considered fairly radical for the day and also for Samuel Gompers and his American Federation of Labor. Allegedly, Gompers did not want Labor Day to become associated with the sort of "radical" politics of Matthew Maguire, so in a 1897 interview, Gompers' close friend Peter J. McGuire was assigned the credit for the origination of Labor Day."