The February 2012 issue of Economy and Society is a special issue focused on a theme of "Strategic unknowns: towards a sociology of ignorance." The opening essay with this title, by Linsey McGoey, is freely available here. Many academics will have access to the rest of the issue through their library subscriptions.
The central theme of the issue is that ambiguity and ignorance are not just the absence of knowledge, waiting to be illuminated by facts and disclosure. Instead, ambiguity and ignorance are in certain situations the preferred strategic outcome. McGoey writes (citations omitted): "Ignorance is knowledge: that is the starting premise and impetus of the following collection of papers. Together, they contribute to a small but growing literature which explores how different forms of strategic ignorance and social unknowing help both to maintain and to disrupt social and political orders, allowing both governors and the governed to deny awareness of things it is not in their interest to acknowledge ..."
Many of the examples are sociological in nature, but others are based in economic and policy situations. For example, consider a number of situations that have to do with a policy response to risky situations: the risk that smoking causes cancer, the risk that growing carbon emissions will lead to climate change, the risk of future terrorist actions (and whether invading certain countries will increase or reduce those risks), and the risk of fluctuations in financial markets. McGoey writes:
"Within the game of predicting risk, one often wins regardless of whether risks materialize or not. If a predicted threat fails to emerge, the identification of the threat is credited for deterring it. If a predicted threat does emerge, authorities are commended for their foresight. If an unpredicted threat appears, authorities have a right to call for more resources to combat their own earlier ignorance. ‘The beauty of a futuristic vision, of course, is that it does not have to be true’, writes Kaushik Sunder Rajan (2006, p. 121) in a study of the way expectations surrounding new biotechnologies help to create funding opportunities and foster faith in the technology regardless of whether expectations prove true or not. In fact, expectations are often particularly fruitful when they fail to materialize, for more hope and hype are needed to remedy thwarted expectations. Attention to the resilience of risks, the way that claims of risk often feed on their own inaccuracy, helps to highlight the value of conditionality for those in political authority."
One of the essays in the volume, by William Davies and Linsey McGoey, applies this framework to thinking about the recent financial crisis. They point out that many financial professionals begin from the starting point that risk and uncertainty are huge problems, and thus one needs their high-priced help to address these issues. In this way, claims of ambiguity and ignorance are an asset for the finance industry. If investments go well, then the financial professionals claim credit for steering successfully through these oceans of uncertainty. But when investments and decisions go badly, as in the Great Recession, they claim absolution for their decisions by reiterating just how ambiguous and unclear the financial markets are, and how no one could have really known what was going to happen. And somehow, this just proves that their expertise is more needed than ever. They write: "We examine the usefulness of the failure or refusal to act on warning signs, regardless of the motivations why. We look at the double value of ignorance: the ways that social silence surrounding unsettling facts enabled profitable activities to endure despite unease about their implications and, second, the way earlier silences are then harnessed and mobilized to absolve earlier inaction."
In another essay, Jacqueline Best applies these ideas in the context of the World Bank's "good governance agenda" and the IMF's "conditionality policy." She writes: "Both policies have been ambiguously defined throughout their history, enabling them to be interpreted and applied in different ways. This ambiguity has facilitated the gradual expansion of the scope of the policies. ... Actors at both the IMF and the World Bank were not only aware of the central role of ambiguity in their policies, but were also ambivalent about it. ... Finally, although staff and directors at both institutions may have been ambivalent about the role of ambiguity in these policies, they ultimately ensured that ambiguities persisted and even proliferated." Best also notes that ambiguity is hard to control, and can lead to unintended consequences.
In yet another essay, Steve Rayner writes about "Uncomfortable knowledge: the social construction of ignorance in science and environmental policy discourses." He writes: "My interest is therefore in how information is kept out rather than kept in and my approach is to treat ignorance as a necessary social achievement rather than a simple background failure to acquire, store, and retrieve knowledge." Rayner writes: "An example of clumsy or incompletely theorized arrangements is the implicit consensus on US nuclear energy policy that emerged in the 1980s and persisted for the best part of three decades. Despite the complete absence of any Act of Congress or Presidential Order, it was implicitly accepted by government, industry, and environmental NGOs that the US would continue to support nuclear R&D while operating an informal moratorium on the addition of new nuclear generating capacity. All of the parties agreed to this, but for various reasons, all had a stake in not acknowledging the existence of a settlement."
One might add that many environmental laws and other regulatory policies are chock-full of ambiguous language, which gives regulators the ability to interpret these rules as tough-minded while also giving potential offenders the possibility of saying that they had no way of knowing the rules would be applied in this way. Rayner also offers a nicely provocative claim about tendencies to dismiss and deny in the context of warnings about climate change: "It seems odd that climate science has been held to a 'platinum standard' of precision and reliability that goes well beyond anything that is normally required to make significant decisions in either the public or private sectors. Governments have recently gone to war based on much lower-quality intelligence than that which science offers us about climate change. Similarly, firms embark on product launches and mergers on the basis of much lower-quality information."
Academic research of course often uses a feigned ignorance to generate a greater persuasive effect. The title of a research paper is often written in the form of a question, and the theory and data are often presented as if the author were a Solomonic figure encountering this material for the first time, guided only by a disinterested pursuit of Truth (with a capital T). The reputational implications of past work, or its political implications, are shunted off to the side. Research would have less persuasive effect if it started off by saying, "I've been hammering on this same conclusion for 25 years now, and I find pretty much exactly the same result every time I look at any data set from any time or place--and by the way, this conclusion also supports the political outcomes I prefer."
One of many implications of thinking about ignorance and ambiguity as assets and as strategic behavior is that it highlights that many economic actors and policy-makers have strong incentives to promote both their own ignorance, and more broadly, the idea that ambiguity makes true knowledge impossible. Ignorance can be a power grab, and the basis for a job, and a get-out-of-jail-free card.