Iwo Hwerka provides a short, readable overview of some of the evidence behind the "curse of knowledge" at the Towards Data Science blog (November 26, 2019). For example, one study asked a group of experienced salespeople how long it would take a novice to learn to do certain tasks with a cellphone: novices actually took about twice as long as the experts estimated.
One aspect of the curse of knowledge is what psychologists sometimes call "hindsight bias." Say that you make a prediction, and later events show that your prediction was incorrect. Do you remember making the incorrect prediction? Or do you find some reason to believe that your prediction was actually correct all along? One of the early studies of this phenomenon was "'I Knew It Would Happen': Remembered Probabilities of Once-Future Things" by Baruch Fischhoff and Ruth Beyth (Organizational Behavior and Human Performance, 1975, 13, 1-16).
For example, one of their sets of questions revolved around President Nixon's trip to China in 1972. Before Nixon went, they distributed a questionnaire to students asking them to estimate the probabilities of specific events: for example, "(1) The USA will establish a permanent diplomatic mission in Peking, but not grant diplomatic recognition; (2) President Nixon will meet Mao at least once; (3) President Nixon will announce that his trip was successful." Several weeks after the trip, they returned to the same students, asked them whether these events had actually happened, and asked them to remember what they had predicted. As it turns out, when students believed that an event had happened, they were more likely to believe that they had previously predicted it.
Fischhoff and Beyth refer to this pattern as "creeping determinism," by which they mean that once something has happened, we can't readily imagine it not happening. Scholars of events like wars (say, the US Civil War or World War II) or election outcomes often emphasize that the outcome was not preordained. It could have gone the other way. There was an element of chance involved in the outcome. But once the event has happened, for many of us the nuance quickly falls away, and it becomes easy to explain--with the full benefit of 20/20 hindsight--why the outcome that happened was really almost certain to happen all along.
The label of this bias seems to have originated in a 1989 Journal of Political Economy article, "The Curse of Knowledge in Economic Settings: An Experimental Analysis," by Colin Camerer, George Loewenstein, and Martin Weber. They write that the term was suggested to them by Robin Hogarth. Their article is focused on a point that will immediately have occurred to economists: in most models, a party with more knowledge can in some way benefit from that knowledge over the party with less knowledge. But the curse of knowledge seems to suggest that the party with more knowledge won't be able to imagine not having that knowledge, and thus will not benefit from it (or at least will not benefit as much as expected).
They set up a series of classroom experiments in which one group of students was given financial information about companies from 1970-1979 and then asked to make a prediction about those companies for 1980. Another group of students was given the same information from 1970-1979, but was also told the actual outcome for the companies in 1980. The set-up of the experiment then rewarded the second group of students (the ones who knew the outcome in 1980) for being able to estimate the predictions of the first group of students (the ones who did not know the outcome in 1980). Could the students ignore the outcome they knew had happened, and instead just replicate the thinking of the other students, if there was a cash reward on the line? The answer is "partly": "[W]e find that market forces reduce the curse by approximately 50 percent but do not eliminate it."
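To make the idea of "reducing the curse by approximately 50 percent" concrete, here is a minimal sketch in Python--my own illustration, not the authors' method or data--of the weighted-average logic this literature uses to quantify the curse: an informed forecaster's guess about the uninformed prediction drifts toward the known outcome with some weight, and halving the curse means halving that weight. All of the numbers below are invented.

```python
# A minimal, invented sketch -- not code from Camerer, Loewenstein, and Weber --
# of the weighted-average logic used to quantify the curse of knowledge.
# An informed forecaster's guess about what uninformed forecasters will predict
# gets pulled toward the known outcome with weight w: w = 0 means no curse,
# w = 1 means the informed forecaster cannot discount the known outcome at all.

def cursed_guess(uninformed_prediction: float, actual_outcome: float, w: float) -> float:
    """Informed guess of the uninformed prediction, biased toward the outcome."""
    return w * actual_outcome + (1 - w) * uninformed_prediction

# Hypothetical numbers: uninformed students forecast earnings of $2.00/share,
# while actual 1980 earnings turn out to be $3.00/share.
uninformed = 2.00
actual = 3.00

no_market = cursed_guess(uninformed, actual, w=0.6)    # hypothetical curse weight
with_market = cursed_guess(uninformed, actual, w=0.3)  # weight cut in half, echoing the ~50% finding

print(f"Uninformed prediction:      {uninformed:.2f}")
print(f"Cursed guess (no market):   {no_market:.2f}")   # 2.60, biased toward the outcome
print(f"Cursed guess (with market): {with_market:.2f}")  # 2.30, bias roughly halved
```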
Indeed, a different study found that those selling cars tend to overestimate how much consumers know about cars, and thus underestimate how much less-informed customers would have been willing to pay.
The "curse of knowledge" leads to a variety of bad communications outcomes. The psychologist Stephen Pinker wrote:
I once attended a lecture on biology addressed to a large general audience at a conference on technology, entertainment and design. The lecture was also being filmed for distribution over the Internet to millions of other laypeople. The speaker was an eminent biologist who had been invited to explain his recent breakthrough in the structure of DNA. He launched into a jargon-packed technical presentation that was geared to his fellow molecular biologists, and it was immediately apparent to everyone in the room that none of them understood a word and he was wasting their time. Apparent to everyone, that is, except the eminent biologist. When the host interrupted and asked him to explain the work more clearly, he seemed genuinely surprised and not a little annoyed. This is the kind of stupidity I am talking about. Call it the Curse of Knowledge: a difficulty in imagining what it is like for someone else not to know something that you know.

Haven't we all been an audience of that kind, at one time or another? Maybe it was an academic lecture. Maybe it was your car mechanic telling you what was wrong with the engine, or your neighbor explaining their gardening tips, or a distant relative explaining their job to you. Pinker also writes:
The curse of knowledge is the single best explanation of why good people write bad prose. It simply doesn't occur to the writer that her readers don't know what she knows--that they haven't mastered the argot of her guild, can't divine the missing steps that seem too obvious to mention, have no way to visualize a scene that to her is as clear as day. And so the writer doesn't bother to explain the jargon, or spell out the logic, or supply the necessary detail.

My guess is that the curse of knowledge goes well beyond these settings, and causes problems in all kinds of communication between specialists in one area and others. In many companies, communication between engineers and marketing departments is fraught with misunderstandings. When doctors and patients interact, can doctors really remember what it was like not to know about symptoms and health conditions?
How does one fight the cognitive bubble that is the curse of knowledge? One of my own methods is to get comfortable with saying: "I was wrong about that" or "I really didn't expect that." Admitting that you had inaccurate expectations is not a confession of weakness or gullibility: no one has a crystal ball for the future. After all, even if you are 90% confident that something will happen, you should expect to be wrong 10% of the time; indeed, Damon Runyon's law (as exposited by some characters in his 1935 story "A Nice Price") holds that nothing between human beings deserves odds of more than three to one.
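To spell out the arithmetic behind Runyon's law (this gloss is mine, not Runyon's): odds of a to one correspond to a probability of a/(a + 1), so three-to-one odds work out to 3/(3 + 1) = 0.75. In other words, on Runyon's view you should never be more than about 75 percent sure of what other human beings will do--well short of the 90 percent confidence in the example above.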
Perhaps the deeper challenge is to be aware of how others perceive a given topic, including the likelihood that they may just not know (or care) much about it. If you want to communicate with them, you need to reach out and meet them where they are, not where you are.
I can't complain about the curse of knowledge: after all, as the editor of an academic journal, revising papers by authors afflicted to greater or lesser extents by the curse of knowledge is how I make my living. In some ways, this blog is an effort to avoid the curse of knowledge, too.
Still, saying that one's professional goal is to avoid "the curse of knowledge" doesn't exactly sound complimentary. In a way, the "curse of knowledge" is misnamed, because it's not the knowledge that's the problem; instead, one might more properly call it the "curse of socially oblivious knowledge." There. Now I feel much better.