In a 1957 work, An Economic Theory of Democracy, the economist Anthony Downs stated the problem this way: “It seems probable that for a great many citizens in a democracy, rational behavior excludes any investment whatever in political information per se. No matter how significant a difference between parties is revealed to the rational citizen by his free information, or how uncertain he is about which party to support, he realizes that his vote has almost no chance of influencing the outcome. . . . He will not even utilize all the free information available, since assimilating it takes time.” In his classic 1948 novel Walden Two, the psychologist B. F. Skinner put the issue even more succinctly via one of his characters, who states: “The chance that one man’s vote will decide the issue in a national election . . . is less than the chance that he will be killed on his way to the polls.”
Indeed, one of the arguments for compulsory voting is that otherwise, not enough people will bother. Explanations of why people do vote quickly come to rest on social motivations: people vote because they feel part of a broader society, and participating in that society is a social norm that seems worthwhile to them. But finding a way to confirm that this feeling exists, rather than simply asserting it, and to measure its intensity has proven elusive.
Stefano DellaVigna, John A. List, Ulrike Malmendier, and Gautam Rao offer a deeply interesting experiment along these lines in "Voting to Tell Others," available as NBER Working Paper #19832 (January 2014); they have also written a readable overview for the Vox website. Along with what the study has to say about voting, it offers an interesting, hands-on method for doing social science.
The authors mix together two sorts of data. One set of data they create themselves by interviewing people in some Chicago suburbs about whether they voted. However, some of the people received a flyer on their doorknob in advance. Some of the flyers said that someone would come to do a five-minute survey. Other flyers said that it would be a five-minute survey "on your voter participation in the 2010 Congressional election." Some of the flyers also promised to pay $10 for participating in the survey, and some said that the survey would take 10 rather than 5 minutes. Thus, the first set of data is how many people come to the door with and without the flyers, with longer and shorter survey times, and with and without a promise of monetary payment. But the second big source of data is that the researchers had already accessed the voting rolls, so they already knew whether people had voted. Thus, they could compare the answers people gave, and whether people were more likely to respond to the survey, according to whether they had actually voted.
With this study design in place, they draw several conclusions:
Finding 1: Voters do not feel pride from saying they voted, but non-voters do feel shame ... In fact, voting households are slightly less likely to answer the door and do the survey when they are informed about the turnout question. However, non-voters sort out significantly, decreasing their survey participation by 20%. ... We find that the effect of reducing payment by $10 is comparable to the sorting response of non-voters to the election flyer. In other words, non-voters appear to dislike being asked whether they voted as much as they dislike being paid $10 less for completing the survey. ...
Finding 2: Non-voters lie and claim they voted half the time, while voters tell the truth ... We find that voters tell the truth and say they voted 90% of the time, while non-voters lie and claim to have voted 46% of the time.

This kind of research clearly doesn't fit the stereotype of the economist sitting in an office, downloading electronic data to a spreadsheet. Instead, these researchers hired "many" undergraduate students to distribute the flyers, and 50 people to carry out the surveys, accumulating a dataset of over 13,000 households. Because of the pre-planned and random variation across the different households, with and without flyers before the survey, and with different information on the flyers, there is compelling reason to believe that they are capturing something real about how people think about voting.
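The core of the measurement behind Finding 2 is a simple cross-tabulation: because the researchers held the voter rolls, every survey answer could be classified as truthful or not. A minimal sketch of that comparison, using invented toy data (the household records and the `rate` helper here are illustrative assumptions, not the authors' actual dataset or code):

```python
# Hypothetical sketch: cross-tabulating self-reported turnout against
# official voter-roll records. All data below are invented for
# illustration only.
from collections import Counter

# Each record: (actually_voted per the rolls, claimed_voted in the survey)
households = [
    (True, True), (True, True), (True, False), (True, True),      # voters
    (False, True), (False, False), (False, True), (False, False), # non-voters
]

tallies = Counter(households)

def rate(actual, claimed):
    """Share of households with a given roll status who gave a given answer."""
    total = sum(n for (a, _), n in tallies.items() if a == actual)
    return tallies[(actual, claimed)] / total

print(f"Voters who truthfully said they voted: {rate(True, True):.0%}")
print(f"Non-voters who falsely claimed to have voted: {rate(False, True):.0%}")
```

Scaled up to 13,000-plus households, and interacted with the randomized flyer treatments, this kind of tabulation is what lets the authors separate truth-telling from misreporting by actual turnout status.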
Of course, one may raise concerns that the 2010 Congressional elections in Illinois were a special case in some way, and that the results might not generalize elsewhere. Those who raise such objections now have a template for the follow-up research they should pursue to address those concerns.