Monday, November 7, 2011

Costs of Air Pollution in the U.S.

What costs does air pollution impose on the U.S. economy? Nicholas Z. Muller, Robert Mendelsohn, and William Nordhaus tackle that question in the August 2011 issue of the American Economic Review. AER articles are typically available only by subscription to the journal, but their article, "Environmental Accounting for Pollution in the United States," is publicly available. I'll start here by summarizing some of their findings, and then backtrack to offer a quick overview of their methodology.

Total "gross external damages" the six "criterion" air pollutants in 2002--sulfur dioxide, nitrogen oxides, volatile organic compounds, ammonia, fine particulate matter,and coarse particulate matter--was $182 billion. Since GDP was about $10.5 trillion in 2002,  the cost of air pollution was a bit under 2% of the total."The effects included in the model calculations are adverse consequences for human health, decreased timber and agriculture yields, reduced visibility, accelerated depreciation of materials, and reductions in recreation services." This total does not include costs of carbon emissions, for which comprehensive sector-by-sector, place-by-place data are not available in 2002.


The sectors with the biggest air pollution costs measured in terms of "gross external damages" (GED) (counting the same six pollutants but again not counting carbon emissions) are utilities, agriculture/forestry, transportation, and manufacturing. If one looks at the ratio of gross external damages to value-added in the sector, agriculture/forestry and utilities lead the way by far with ratios above one-third. Manufacturing has fairly high gross external damages, but the GED/VA ratio for the sector as a whole is only 0.01.


If one breaks the sectors down into specific industries, here is a list of all industries that have either more than $4 billion in gross external damages from air pollution or a GED/VA ratio (gross external damages divided by value added) of more than 0.45. Coal-fired power generation, in particular, jumps off the list to me, with its very large GED and a GED/VA ratio of 2.2.

If pollution taxes or tradeable pollution permits were imposed, so that industry was required to take the social costs of pollution into account, the value of the gross external damages caused by air pollution would be reduced by about four-fifths.

Taking carbon emissions into account, which they do for the electric power industry, makes a relatively small difference to the harms of coal-fired plants, but a larger relative difference for natural gas power plants. Coal-fired power plants already have gross external damages of $53.4 billion, and adding the costs of carbon (priced at $27/ton of emissions) raises that total to $68.7 billion. However, natural gas power plants emit relatively small levels of the six "criteria" pollutants and have gross external damages of only $0.9 billion. For them, adding costs of carbon emissions nearly quadruples their gross external damages to $3.4 billion--and raises their GED/VA ratio from a worrisome 0.34 to an eyebrow-raising 1.30.
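As a quick sanity check on those numbers, here is a minimal Python sketch using only the figures quoted above. The value-added figure for natural gas generation is backed out from the quoted GED/VA ratio, so it is an approximation for illustration, not a number taken from the paper.

```python
# Back-of-the-envelope check on the coal vs. natural gas comparison above,
# using only the figures quoted in this post. The value-added number for
# natural gas is inferred from the quoted GED/VA ratio, so this is an
# illustrative sketch, not a reproduction of the authors' calculations.

def ged_va_ratio(ged_billion, value_added_billion):
    """Gross external damages divided by value added."""
    return ged_billion / value_added_billion

# Natural gas: GED of $0.9 billion and a GED/VA ratio of 0.34 imply
# value added of roughly 0.9 / 0.34, or about $2.6 billion.
gas_va = 0.9 / 0.34

print("Gas, criteria pollutants only:", round(ged_va_ratio(0.9, gas_va), 2))   # ~0.34
print("Gas, adding carbon at $27/ton:", round(ged_va_ratio(3.4, gas_va), 2))   # ~1.3

# Coal: the rise from $53.4 to $68.7 billion is roughly a 29% increase,
# much smaller in relative terms than the near-quadrupling for gas.
print("Coal GED increase from carbon:", round((68.7 - 53.4) / 53.4, 2))        # ~0.29
```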



The methodology behind these estimates is quite reasonable, which of necessity means that it is also comprehensive and complex. The basic approach in this kind of work is to choose what you think are the most plausible estimates, but also to compare them with other data sources, other models, and other estimates, so that you can continually double-check the reasonableness of your choices.

They start with an inventory of all U.S. air pollution emissions published by the Environmental Protection Agency, covering emissions in 2002. It includes about 10,000 emissions sources: 656 point sources (individual facilities) plus area sources, like vehicles and smaller stationary sources, measured at the county level. Emissions sources are distinguished by height--which affects their environmental costs--and also categorized by six-digit industry code. The authors then use the Air Pollution Emission Experiments and Policy model to estimate how these emissions spread, and they cross-check this approach against another model, the Community Multiscale Air Quality model. They draw on particular studies for how each of these pollutants affects health and other costs--and compare their chosen studies to others that are available.
They use a value of a statistical life for an average worker of $6 million, but also do illustrative calculations using $2 million and $10 million.  The overall result is like building up a mosaic one tile at a time: even if you disagree with the placement or color of an individual tile here or there, the overall picture is persuasive.
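To give a flavor of the damage-valuation step, here is a stylized Python sketch of how a pollution-induced change in mortality risk gets converted into dollars using a value of a statistical life. Every number below is hypothetical, and the calculation is far simpler than the authors' integrated assessment model; it only illustrates why the choice among $2 million, $6 million, and $10 million matters.

```python
# Stylized illustration of valuing mortality damages: a small increase in
# annual mortality risk, spread across an exposed population, is converted
# to dollars with a value of a statistical life (VSL). All numbers are
# hypothetical and chosen only for illustration.

def mortality_damages(delta_risk, exposed_population, vsl):
    """Expected statistical deaths times the value of a statistical life."""
    expected_deaths = delta_risk * exposed_population
    return expected_deaths * vsl

delta_risk = 1e-6        # hypothetical 1-in-a-million added annual mortality risk
population = 5_000_000   # hypothetical exposed population

for vsl in (2e6, 6e6, 10e6):  # the paper's low, central, and high VSL values
    damages = mortality_damages(delta_risk, population, vsl)
    print(f"VSL = ${vsl/1e6:.0f} million -> damages = ${damages/1e6:.0f} million per year")
```

Scaling the VSL from $2 million to $10 million scales the monetized mortality damages by the same factor of five, which is why the authors report how sensitive their totals are to this choice.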

To me, a lesson that emerges from these calculations is that the costs of air pollution and of burning fossil fuels are very high, both in absolute terms and compared to the value-added of certain industries, even without taking carbon emissions into account. Environmentalists who are discouraged by their inability to persuade more people of the risks of climate change might have more luck in reducing carbon emissions if they deemphasized that topic--and instead focused on the costs of these old-fashioned pollutants. 



Friday, November 4, 2011

The Diminishing Gender Wage Gap in the U.S.

Natalia Kolesnikova and Yang Liu of the St. Louis Fed have an interesting overview of the evidence: "Gender Wage Gap May Be Much Smaller Than Most Think."

Start with a provocative figure, comparing median weekly earnings of full-time male and female workers from 1979 to 2011. Back when I was starting college in 1979, it was common to hear the claim that women earned only about 70% of what men earned. The figure shows a wage gap of about 35% in 1979, which backs up that claim. But since then, the gap has fallen to 16.5%.

Of course, this sort of graph is just the beginning of a serious discussion. An obvious next step is to adjust these median wage differentials for characteristics like educational attainment, work experience, occupation, career interruptions, overtime worked, availability of fringe benefits, and the like. These sorts of adjustments typically push the remaining gender wage gap down into the low single digits. Moreover, the higher levels of women now attending college certainly suggest that the wage gap will diminish further in the future.
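For readers curious about what "adjusting" the gap means in practice, here is a minimal sketch of the standard approach: regress log wages on a female indicator plus control variables, and read the adjusted gap off the coefficient on the indicator. The data below are simulated, so the numbers are purely illustrative and are not from the Kolesnikova and Liu article.

```python
# Minimal sketch of a regression-adjusted gender wage gap on simulated data.
# The raw gap mixes a direct "female" effect with differences in observable
# characteristics (here, years of experience); controlling for experience
# shrinks the measured gap toward the direct effect.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
female = rng.integers(0, 2, n)
# In this toy example, women average fewer years of experience.
experience = rng.normal(15 - 3 * female, 5, n).clip(0)
log_wage = 2.5 + 0.02 * experience - 0.05 * female + rng.normal(0, 0.3, n)

# Raw gap: difference in mean log wages (roughly a proportional gap).
raw_gap = log_wage[female == 0].mean() - log_wage[female == 1].mean()

# Adjusted gap: coefficient on the female indicator, controlling for experience.
X = np.column_stack([np.ones(n), female, experience])
coef, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
adjusted_gap = -coef[1]

print(f"raw gap:      {raw_gap:.3f}")       # about 0.11: direct effect plus experience gap
print(f"adjusted gap: {adjusted_gap:.3f}")  # about 0.05: the direct effect alone
```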

The standard response is to point out that a number of these adjustments to the wage gap are not exogenous choices by women, but instead are part of societal pressures. For example, the ease with which women can leave or re-enter the labor force is related to social, legal, and government support that makes it easier to do so. Adjusting for occupation means adjusting away the fact that women are still more likely than men to be teachers, nurses, and office clerks, and less likely to be lawyers, doctors, and top executives. Indeed, using the median wage in the figure above, rather than an average, means that the wage ratios are not affected by the much higher growth of incomes in the top few percentage points of the wage distribution--wage growth that has disproportionately benefited men.

Decades ago, newspapers used to run separate help-wanted ads for men's jobs and women's jobs, and if a woman who was teaching school married, she often was required to quit her job. That sort of egregious gender discrimination is largely in the past. But a more delicate interplay of gender roles, legal rules, and labor market outcomes remains. 


"Big Oil"--Actually Small and Vulnerable

When it comes to Big Oil, I run into a lot of people who are still living in the 1960s. They still think that Exxon and Mobil and Shell and BP and a few others dominate world oil markets. The October 29 issue of the Economist has a nice article ("Big Oil’s bigger brothers"), which puts the modern reality of Big Oil in context. 


When it comes to reserves, the big state-owned national oil companies control 80% of the world's oil, while the private oil companies are bit players.


ExxonMobil, Shell and BP do continue to have enormous expertise in getting to hard-to-reach oil. When world oil prices are high, and this costly technology gets put to work, they can make high profits. But their technological leadership is under continual challenge  both from the state-owned oil companies and from small technology-intensive private firms--often working together. Moreover, Big Oil is very vulnerable to a drop in oil prices. As the Economist writes: 


"Life is getting harder for the supermajors. Their edge over their rivals—the ability to extract oil from difficult places—is terrifically useful while prices are high. But since it is terrifically costly to extract oil from difficult places, their competitive advantage fizzles if oil prices fall. If it does, their bumper profits could vanish like a pool of petrol into which a lighted match has been carelessly dropped."

Thursday, November 3, 2011

Recognizing Non-formal and Informal Learning

When information is imperfect, markets may not work well. If consumers are highly uncertain about the quality of what they are buying, they become less likely to buy. If a lender is highly uncertain about whether a potential borrower will repay a loan, the lender is less likely to make that loan. The more uncertain that an employer is about the quality of a potential employee, the less likely the employer is to hire that person. This problem of imperfect information in labor markets is especially severe now, with an unemployment rate that has been between 8.8% and 10.1% since April 2009. If someone hasn't been working, how can an employer judge their skills and talents?


Is there a way for workers with experience to demonstrate their skills and knowledge that doesn't involve taking a class or getting a degree? There are some recent experiments along these lines in the U.S. economy, but it turns out that several other countries have already developed processes for recognizing non-formal and informal learning.

Jeff Selingo, who is editorial director of the Chronicle of Higher Education, tackled the question of whether colleges might lose their near-monopoly power over anointing people with job credentials in a short article last month. Selingo writes:

The day when other organizations besides colleges provide a nondegree credential to signify learning might not be as far off as we think. One interesting project on this front is an effort to create “digital badges,” which would allow people to demonstrate their skills and knowledge to prospective employers without necessarily having a degree. Badges could recognize, for example, informal learning that happens outside the classroom; “soft skills,” such as critical thinking and communication; and new literacies, such as aggregating information from various sources and judging its quality. And in a digital age, the badge could include links back to documents and other artifacts demonstrating the work that led to earning the stamp of approval.

Until now an interesting-but-somewhat-fringe idea, digital badges received a big boost last week, when the John D. and Catherine T. MacArthur Foundation announced a $2-million competition to create and develop badges and a badge system. (The contest is also supported by Mozilla and the Humanities, Arts, Sciences, and Technology Advance Collaboratory, otherwise known as Hastac.)

At the announcement in Washington, the U.S. secretary of education, Arne Duncan, called badges a “game-changing strategy” and said his agency would join with the Department of Veterans Affairs to award $25,000 for the best badge prototype that serves veterans looking for well-paying jobs. Under a badge system, colleges would no longer be the sole providers of a credential. While badges could be awarded by traditional colleges, they could also be given out by professional organizations, online and open-courseware providers, companies, or community groups.

In the Autumn 2011 issue of the Wilson Quarterly (not available free online), Kevin Carey writes in an essay about "College for All?" about how the Western Governors University is awarding degrees based on competency, not classroom hours. Carey writes:

"While American higher education is diverse in many ways, encompassing a variety of missions and constituencies, it is remarkably undiverse when it comes to awarding degrees. Every institution grants the same two- and four-year credentials that signify little more than how many hours the bearer sat in classrooms. Newer institutions such as Western Governors University (WGU) are turning that equation upside-down, awarding degrees when students demonstrate defined competencies, regardless of how long it took to achieve them."

"WGU is a fully accredited nonprofit institution founded in the 1990s by the governors of 19 western states that now enrolls 25,000 mostly adult students online. It currently focuses on occupation-specific fields such as education, business, and health care. But efforts are afoot to expand the model into more traditional academic fields.

The WGU experiment points to a future public education system in which public subsidies are tied to commonly understood goals for learning, not how old the student happens to be or where he or she happens to live. In increasingly digital learning environments, it will be possible to track, store, and summarize evidence of learning in ways that render traditional time-based credentials obsolete."

On the international front, Patrick Werquin wrote an OECD report on "Recognising Non-Formal and Informal Learning: Outcomes, Policies and Practices" which was published in spring 2010 (and to my knowledge is not freely available on-line). Some highlights (omitting some references for readability):

"All data on lifelong learning indicate that the highest qualification held by the great majority of people is obtained in the formal system of education and initial training, which in the case of many adults occurred some time ago. This is confirmed by other sources revealing that almost 90%of adult learning initiatives do not lead to a qualification, even though, depending on the country, 20-60% of  individuals who embark on learning do so primarily to obtain one. ... There is therefore a patent lack of visibility as regards people's real knowledge, skills and competences, since those acquired during their working lives or other activities remain invisible. This lack of visibility is all the more significant for those who left the initial education and training system many ears earlier. It is also especially detrimental to those with a low level of qualification ..."

"More recently, OECD (2007) ranked the recognition of non-formal and informal learning outcomes high on a list of 20 mechanisms identified as potentially capable of motivating learning. At the same time, major international organizations are showing a close interest in the recognition of learning outcomes. All these studies point in the same direction: formal learning alone cannot account for all of the learning encompassed by the concept of lifelong learning. There is thus no shortage of studies that argue for the recognition of non-formal and informal learning outcomes. ..."

Werquin's report for the OECD lists mechanisms for recognizing non-formal and informal learning in 21 countries--notably, with no mention of any such effort in the United States. Two countries with especially well-developed policies along these lines are Ireland, which has certificates for Recognition of Prior Learning (RPL), Accreditation of Prior Experiential Learning (APEL), Recognition of Current Competences (RCC), Learning Outside Formal Teaching (LOFT), and others, and Norway, which has a "skills passport" system.

These systems for recognizing non-formal and informal learning vary considerably across countries, and I can imagine a number of practical concerns. But for many Americans, maybe especially those with lower and medium skill levels, their educational credentials (often from long ago) don't reveal their true skill set. For many of them, going back to school for some additional degree or certificate is impractical, and frankly a waste of time--because whether or not they have a piece of paper from an educational institution to prove it, they have already acquired the skills they need for many jobs. America should be thinking more about ways of connecting potential workers to the labor market that don't involve telling those who don't flourish in school that they need to keep attending. A couple of weeks ago I posted on Apprenticeships for the U.S. Economy as one such option. Ways to recognize competences achieved through non-formal and informal learning seem like a complementary approach.



Wednesday, November 2, 2011

What if Country Size Was Relative to Population? A World Map

Joseph Chamie, former director of the UN Population Division and now Director of the Center for Migration Studies, is interviewed in the Third Quarter 2011 issue of Southwest Economy, a publication of the Dallas Fed:
On the Record: Shifting from World Population Explosion to Global Aging--A Conversation with Joseph Chamie.

The interview includes one of those usefully provocative maps: What would a map of the world look like if it were distorted so that the size of every country were proportional to its population? The patterns are as expected: In North America, Canada shrinks and Mexico grows. In the rest of the world, Russia shrinks and China and India grow. Japan looks a lot larger when weighted by population; Australia looks smaller. Africa appears notably larger than South America. For me, such maps also emphasize that U.S. economic growth over the next few decades is likely to be related to how extensively the American economy participates in the growth that is happening in the rest of the world.


Chamie also points out that the growth rate of world population is slowing dramatically. One of the next main demographic preoccupations will be population aging: soon, the number of people over 65 in the world will exceed the number of children for the first time in world history.

"Two thousand years ago, world population was estimated at about 300 million. It reached the first billion mark at the beginning of the 19th century—the estimate is about 1804—when Thomas Jefferson was U.S. president. The second billion mark was reached in 1927. We had a tripling of world population from 1927 to near the end of the 20th century, when it reached 6 billion. We’re now approaching 7 billion people.
Why did that happen? It’s because we had this wonderful thing occur: a decline in mortality rates. This decrease in mortality is humanity’s greatest achievement. Every government wishes to see lower mortality and longer life. The world benefited from modern medicine and public health; antibiotics, of course; also better nutrition, better facilities, better working conditions. What lagged behind were changes in birth rates. This difference between birth rates and death rates gave rise to what is commonly called the population explosion. We reached a peak population growth rate of about 2.1 percent in the late ’60s, and we reached the peak annual increase of about 87 million people in the late ’80s. The latest United Nations projections show a world of about 10.1 billion people by the end of the 21st century. ..."

"While the 20th century was the century of demographic growth (and this growth will continue through the 21st century—we are likely to add 2 to 3 billion people), the world’s population is aging. Very soon, we will see a reversal where the number of children, which has historically been more than the number of people above 65, will become less than the elderly. The aging of the world’s population will be pervasive; it will affect every household. It will affect the economy, social interactions, voting patterns, lifestyles."

Tuesday, November 1, 2011

Lorenz curves and Gini coefficients: CBO #3.

This is the third of three posts based on the Congressional Budget Office report, "Trends in the Distribution of Household Income Between 1979 and 2007." The first was Incomes of the Top 1%, and the second was Federal Redistribution is Dropping.

This post focuses on explaining some basic tools for measuring inequality. The Lorenz curve offers an intuitively clear picture of inequality. The Gini coefficient, which is based on the curve, offers a way of measuring inequality across the income distribution as a single number--and thus is often used in graphs and figures about inequality. The CBO report has a nice clear explanation of these topics.

The Lorenz curve

The Lorenz curve was developed by an American statistician and economist named Max Lorenz when he was a graduate student at the University of Wisconsin. His article on the topic, "Methods of Measuring the Concentration of Wealth," appeared in Publications of the American Statistical Association, Vol. 9, No. 70 (June 1905), pp. 209-219. The CBO report explains it this way:

"The cumulative percentage of income can be plotted against the cumulative percentage of the population, producing a so-called Lorenz curve (see the figure). The more even the income distribution is, the closer to a 45-degree line the Lorenz curve is. At one extreme, if each income group had the same income, then the cumulative income share would equal the cumulative population share, and the Lorenz curve would follow the 45-degree line, known as the line of equality. At the other extreme, if the highest income group earned all the income, the Lorenz curve would be flat across the vast majority of the income range,following the bottom edge of the figure, and then jump to the top of the figure at the very right-hand edge.

Lorenz curves for actual income distributions fall between those two hypothetical extremes. Typically, they intersect the diagonal line only at the very first and last points. Between those points, the curves are bow-shaped below the 45-degree line. The Lorenz curve of market income falls to the right and below the curve for after-tax income, reflecting its greater inequality. Both curves fall to the right and below the line of equality, reflecting the inequality in both market income and after-tax income."

The Gini coefficient

The Gini coefficient was developed by the Italian statistician (and noted fascist thinker) Corrado Gini in a 1912 paper written in Italian (and to my knowledge not freely available on the web). The intuition is straightforward (although the mathematical formula looks a little messier). On a Lorenz curve, greater equality means that the line based on actual data is closer to the 45-degree line that shows a perfectly equal distribution; greater inequality means that the line based on actual data is more "bowed" away from the 45-degree line. The Gini coefficient is based on the area between the 45-degree line and the actual data line. As the CBO writes:

"The Gini index is equal to twice the area between the 45-degree line and the Lorenz curve. Once again, the
extreme cases of complete equality and complete inequality bound the measure. At one extreme, if
income was evenly distributed and the Lorenz curve followed the 45-degree line, there would be no area
between the curve and the line, so the Gini index would be zero. At the other extreme, if all income was
in the highest income group, the area between the line and the curve would be equal to the entire area
under the line, and the Gini index would equal one. The Gini index for [U.S.] after-tax income in 2007 was
0.489—about halfway between those two extremes."
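Translating that verbal description into a calculation is straightforward. Here is a small Python sketch that builds a Lorenz curve from a list of incomes and computes the Gini coefficient as twice the area between the 45-degree line and the curve. The income figures are made up for illustration; they are not CBO data.

```python
# Build a Lorenz curve from individual incomes and compute the Gini index
# as twice the area between the line of equality and the Lorenz curve.
# The incomes below are made-up numbers, used only for illustration.
import numpy as np

def lorenz_curve(incomes):
    """Return cumulative population shares and cumulative income shares."""
    x = np.sort(np.asarray(incomes, dtype=float))
    cum_income = np.insert(np.cumsum(x), 0, 0.0) / x.sum()
    cum_population = np.linspace(0.0, 1.0, len(x) + 1)
    return cum_population, cum_income

def gini(incomes):
    """Gini index: twice the area between the 45-degree line and the Lorenz curve."""
    pop, inc = lorenz_curve(incomes)
    area_under_lorenz = np.trapz(inc, pop)
    return 2 * (0.5 - area_under_lorenz)

equal = [50_000] * 5
unequal = [10_000, 20_000, 30_000, 40_000, 400_000]
print(round(gini(equal), 2))    # 0.0: everyone has the same income
print(round(gini(unequal), 2))  # 0.64: most income is in the top group
```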


Federal Redistribution is Dropping: CBO #2

This is the second of three posts on the recent Congressional Budget Office Report "Trends in the Distribution of Household Income Between 1979 and 2007."  The first post on Incomes of the Top 1% is here, while the third explains the concepts of the Lorenz Curve and the Gini Coefficient.

The federal government can redistribute income in two ways: by taxing those with high incomes relatively more, and by making transfer payments to those with lower incomes. Using the Gini index (explained in the third post in this grouping) as a measure of inequality, CBO reports: "The dispersion of after-tax income in 2007 is about four-fifths as large as the dispersion of market income. Roughly 60 percent of the difference in dispersion between market income and after-tax income is attributable to transfers and roughly 40 percent is attributable to federal taxes. The redistributive effect of transfers and federal taxes was smaller in 2007 than in 1979 ..."

Redistribution through federal taxes

Here are three figures showing average federal tax rates paid in each year from 1979 to 2007 by the top 1% of the income distribution, by the 81st to 99th percentiles, by the 21st to 80th percentiles, and by the lowest 20%. The first graph shows average payments as a share of income for the individual income tax, the second shows payroll taxes, and the third shows all federal taxes combined. Here are a few patterns that jump out.

  • The top 1% pays more of its income on average in income taxes, but much less in payroll taxes. Of course, this is because the income on which Social Security payroll taxes must be paid is capped, so such taxes are a smaller share of income for those with very high incomes. 
  • With income taxes, the lowest quintile pays on average a negative tax rate: that is, with refundable tax credits, they receive more from the federal government through the tax code than they pay. 
  • Total taxes paid as a share of income have dropped off somewhat for all groups in the last decade or so; if one looks back to the mid-1990s, the drop in tax rates for the top 1% looks larger than for other groups.
  • The overall patterns here seem to be that the federal tax code as a whole became less progressive in the 1980s, more progressive in the 1990s, and since then has either not changed or become slightly less progressive, depending on what statistical measure one chooses to emphasize.



These sorts of graphs always turn my mind to Warren Buffett, and his claim that he pays a lower tax rate than his secretary. For example, see Buffett's August 14 article, "Stop Coddling the Super-Rich," in the New York Times, where he writes: "Last year my federal tax bill — the income tax I paid, as well as payroll taxes paid by me and on my behalf — was $6,938,744. That sounds like a lot of money. But what I paid was only 17.4 percent of my taxable income — and that’s actually a lower percentage than was paid by any of the other 20 people in our office. Their tax burdens ranged from 33 percent to 41 percent and averaged 36 percent."
What Buffett pays as a share of income sounds plausible to me: he is well-known for taking a fairly small annual salary and then receiving most of his income in the form of gains from his investments. Because many of those investments have been held for long periods of time, they are subject to a lower capital gains tax rate. But the tax burdens that Buffett claims for his staff look unrealistically high. Let's say his office staff are in the 81st-99th percentiles of the income distribution. Average tax rates for that group are in the range of 22-23% in recent years. Buffett's staff might face a marginal tax rate of 36% or 41%, depending on rules about phase-outs of deductions and the like. But if Buffett's staff really are paying an average federal tax rate of 36% as a share of total income, they need access to better accountants or tax lawyers.
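The distinction at work here is between the marginal tax rate on the last dollar earned and the average tax rate on all income. Here is a minimal Python sketch with a hypothetical progressive schedule (the brackets are made up, not the actual tax code) showing how a 36% marginal rate can coexist with a much lower average rate.

```python
# Illustration of why a high marginal tax rate is consistent with a much
# lower average tax rate. The brackets below are hypothetical and are not
# the actual federal tax schedule.

BRACKETS = [              # (upper limit of bracket, marginal rate)
    (20_000, 0.10),
    (80_000, 0.20),
    (170_000, 0.28),
    (float("inf"), 0.36),
]

def tax_owed(income):
    """Apply each marginal rate only to the income falling within its bracket."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        tax += rate * (min(income, upper) - lower)
        lower = upper
    return tax

income = 200_000
tax = tax_owed(income)
print("marginal rate on the last dollar: 36%")
print(f"average rate on all income: {tax / income:.1%}")  # 25.0% under these made-up brackets
```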

 Redistribution through federal transfer payments

Federal government transfer payments were about 10-12% of household market income over the 1979-2007 period. Spending in this category is heavily driven by Social Security and Medicare. In recent years, about half of federal transfer spending has gone to Social Security, and about a third has gone to health-related programs like Medicare and Medicaid. The rest goes to programs like unemployment insurance and welfare.

The share of federal transfer spending on the elderly is rising. In 1979 about 62% of all federal transfer payments went to elderly childless households, while about 19% went to nonelderly childless households and another 19% to households with children. By 2007, 69% of all federal transfer payments went to elderly childless households. The share going to nonelderly childless households stayed about the same, and the share going to households with children fell to about 11%.

Of course, Social Security and Medicare are not means-tested programs, so as they took a larger share of the federal transfer pie, the share going to the poor declined. Not coincidentally, back in 1979 about 54% of federal transfers went to households in the lowest quintile of income; by 2007, only about 36% of federal transfers went to households in the lowest quintile of income.

Summing up the redistribution by federal taxes and transfers

Here's a final figure showing how federal transfers and taxes affect income inequality, measured by the percentage by which these policies reduce the Gini index of inequality. The extent to which these policies reduce income inequality dropped in the 1980s, rose in the early 1990s, dropped in the late 1990s, rose in the early 2000s, and has fallen since then. Interestingly, the diminished effect of federal redistribution since the mid-1990s is, by this measure, much more traceable to the changes in transfer payments than to the changes in the progressivity of taxes.