Overcoming The Curse of Knowledge

[crossposted at LessWrong]

What is the Curse of Knowledge, and how does it apply to science education, persuasion, and communication? No, it’s not a reference to the Garden of Eden story. I’m referring to a particular psychological phenomenon that can make our messages backfire if we’re not careful.

Communication isn’t a solo activity; it involves both you and the audience. Writing a diary entry is a great way to sort out thoughts, but if you want to be informative and persuasive to others, you need to figure out what they’ll understand and be persuaded by. A common habit is to use ourselves as a mental model – assuming that everyone else will laugh at what we find funny, agree with what we find convincing, and interpret words the way we use them. The model works to an extent – especially with people similar to us – but other times our efforts fall flat. You can present the best argument you’ve ever heard, only to have it fall on dumb – sorry, deaf – ears.

That’s not necessarily your fault – maybe they’re just dense! Maybe the argument is brilliant! But if we want to communicate successfully, pointing fingers and assigning blame gets us nowhere. What matters is getting our point across, and we can’t do that if we’re stuck in our own heads, unable to see things from our audience’s perspective. We need to figure out what words will work.

Unfortunately, that’s where the Curse of Knowledge comes in. In 1990, Elizabeth Newton ran a fascinating psychology experiment. She paired participants into teams of two: one tapper and one listener. The tappers picked one of 25 well-known songs and tapped out the rhythm on a table. Their partner – the designated listener – was asked to guess the song. How do you think they did?

Not well. Of the 120 songs tapped out on the table, the listeners only guessed 3 of them correctly – a measly 2.5 percent. But get this: before the listeners gave their answer, the tappers were asked to predict how likely their partner was to get it right. Their guess? Tappers thought their partners would get the song 50 percent of the time. You know, only overconfident by a factor of 20. What made the tappers so far off?
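
For anyone who wants the arithmetic behind those figures spelled out, here is a minimal back-of-the-envelope check (the little script below is just an illustration; the raw numbers are the ones reported above):

```python
# Back-of-the-envelope check of the tapper/listener numbers (Newton, 1990)
songs_tapped = 120        # total songs tapped out across all pairs
correct_guesses = 3       # songs the listeners actually identified
predicted_rate = 0.50     # tappers' average prediction of listener success

actual_rate = correct_guesses / songs_tapped    # 0.025, i.e. 2.5 percent
overconfidence = predicted_rate / actual_rate   # 20x

print(f"Actual success rate:    {actual_rate:.1%}")
print(f"Predicted success rate: {predicted_rate:.0%}")
print(f"Overconfident by a factor of {overconfidence:.0f}")
```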

They lost perspective because they were “cursed” with the additional knowledge of the song title. Chip and Dan Heath use the story in their book Made to Stick to introduce the term:

“The problem is that tappers have been given knowledge (the song title) that makes it impossible for them to imagine what it’s like to lack that knowledge. When they’re tapping, they can’t imagine what it’s like for the listeners to hear isolated taps rather than a song. This is the Curse of Knowledge. Once we know something, we find it hard to imagine what it was like not to know it. Our knowledge has “cursed” us. And it becomes difficult for us to share our knowledge with others, because we can’t readily re-create our listeners’ state of mind.”

So it goes with communicating complex information. Because we have all the background knowledge and understanding, we’re overconfident that what we’re saying is clear to everyone else. WE know what we mean! Why don’t they get it? It’s tough to remember that other people won’t make the same inferences, have the same word-meaning connections, or share our associations.

Overcoming the curse is particularly important in science education. The more time a person spends in a field, the more the field’s obscure language becomes second nature. Unless we pay special attention, audiences might not understand the words being used – or worse yet, they might get the wrong impression.

Over at the American Geophysical Union blog, Callan Bentley gives a fantastic list of “Terms that have different meanings for scientists and the public.”

What great examples! Even though the scientific terms are technically correct in context, they’re obviously the wrong ones to use when talking to the public about climate change. An inattentive scientist could know all the material yet still leave the audience with the wrong message.

We need to put in the effort to phrase ideas in a way the audience will understand. Is that the same as “dumbing down” a message? After all, complicated ideas require complicated words and nuanced answers, right? Well, no. A real expert on a topic can give a simple distillation of the material, identifying the core of the issue. Bentley did an outstanding job rephrasing technical, scientific terms in a way that conveys the intended message to the public.

That’s not dumbing things down; it’s showing a mastery of the concepts. And he was able to do it by overcoming the “curse of knowledge,” seeing the issue from other people’s perspective. Kudos to him – it’s an essential part of science education, and something I really admire.

31 Responses to Overcoming The Curse of Knowledge

  1. Jesse Galef says:

    By the way, I chose that image for a reason – I bet once you see the baby in the tree you won’t be able to ‘unsee’ it. (image via Richard Wiseman)

  2. Betsy Haibel says:

    I have such a complicated reaction to this.

    On the one hand, it is good practical advice.

    On the other hand, language shapes thought – using these words in their scientific-jargon ways, and understanding them, reflects and reinforces a worldview in which it’s possible to accept both the limits of our knowledge and the fact that – despite its limits – working from current scientific consensus about our world is pretty much the best existing way to make decisions about it.

    And I think getting people into that mindset is *really important.*

    • PCGuyIV says:

      I agree that the issue is a conflict of mindsets. The general public does not, and never will, view things through a quantitative filter. Scientific terminology is, by necessity, very quantitative and specific. The quantitative definitions of those terms still apply and exist outside of the realm of science, but more often, those words will evoke their more qualitative and general meanings in the average person.

      There seem to be, in my opinion, two possible outcomes to using words in a scientific context when providing information to the general public. The first possibility is that the person will not recognize the scientific context, and will therefore misinterpret the true meanings of the words used and receive the wrong message, as stated in the post. The second possibility is that the listener or reader will understand that it is a scientific context and immediately quit paying attention because it’s “all science-talk”, and thus not get any message, even a wrong one. Statistically, there probably would be a few who would recognize the context and apply the appropriate interpretation to the words used, but my guess is that they would be a very small group, possibly small enough to count as a statistical anomaly.

  3. wallowinmaya says:

    Eliezer Yudkowsky wrote a related post on the problem of “inferential distances”. (http://lesswrong.com/lw/kg/expecting_short_inferential_distances)

  4. I think you made the same error, a false leap in logic, with the word “distillation.”

    Hip hip for great minds

  5. Pingback: The curse of knowledge | Interchange Project

  6. Max says:

    Professionals often talk past each other, especially when they’re in different fields. For example, “bandwidth” can refer to the rate of data transfer or a range of frequencies, and “resolution” can refer to the number of pixels or the smallest detectable distance between parallel lines. Not to mention higher-level terms like “information” and “complexity.”

    It helps to be more specific, like distinguishing digital bandwidth from analog bandwidth, but a picture is worth a thousand words. I usually have an image in my head anyway, so I either draw it or say exactly what I’m picturing.

    Richard Feynman noticed that people count seconds differently. Feynman was able to read but not speak while counting seconds, but John Tukey could speak but not read. When they compared notes, they realized that Feynman was saying the numbers in his head, whereas Tukey was picturing numbers scrolling by on a tape.

  7. Max says:

    Dumbing down just means oversimplifying. The more you remove nuance and clarity, the more you dumb down, until you end up with “Global Warming bad.”
    I once saw a TV show where a guest chef was making a citrus supreme salad, and the host laughed that “citrus supreme” is just a fancy way to say oranges. In the time he was joking around, he could’ve explained what citrus supreme really means.

    • PCGuyIV says:

      A perfect example. I was trying to recall one but couldn’t. (Too much flotsam floating in the brain lately.) Even if he had just said that it was a fancy way of saying “orange section,” that would have been an improvement, though still not quite accurate.

    • Jesse Galef says:

      “The more you remove nuance and clarity, the more you dumb down, until you end up with “Global Warming bad.””

      Do you think it’s possible to remove nuance without removing clarity? What would such a statement look like?

      • Betsy Haibel says:

        I think it’s possible to remove nuance without removing clarity – that’s what simplification is. I have greater doubts about people’s ability to preserve precision when removing nuance; these seem like directly opposed goals to me.

      • Max says:

        Hehe, I guess “clarity” is unclear. I was equating clarity with precision, because to me “citrus supreme” is more clear than “orange pieces,” and “90% probability” is more clear than “very likely.” But when I think about, say, legalese fine print, I can see how too much precision can be confusing.

      • PCGuyIV says:

        I would say that it is, because clarity, to me, seems to be relative to the recipient. The scientific meanings of terms used in a scientific context are perfectly clear to another scientist, but not so to the general public.

        For example, let’s say tomorrow some scientist comes out and says the following:
        “Findings from our recent survey of 50 years of meteorological data indicate a sustained positive increase in average global temperature of 0.0200 degrees Celsius per calendar year, with a significant statistical variance of 0.0030 degrees per year.”

        Another scientist would understand the implications immediately, as the meaning of that statement is perfectly clear to him. The general public would think that the global climate was improving slowly.

        Now for the dumbed down versions:
        “New study shows Global Warming bad” (Obvious and not very precise or accurate based on the original statement.)
        “New study shows the world is getting warmer” (Likewise)

        Now for the “sanitized for public consumption” version:
        Five decades of weather monitoring indicates that the rate at which the world’s average temperature is rising is roughly 1 degree every 50 years.

        This statement, while possibly not as precise as the scientific statement, is equally accurate and, to the general public, far more informative.

      • Max says:

        Dr. Hibbert: Homer, I’m afraid you’ll have to undergo a coronary bypass operation.
        Homer: Say it in English, Doc!
        Dr. Hibbert: You’re going to need open-heart surgery.
        Homer: Spare me your medical mumbo jumbo!
        Dr. Hibbert: We’re going to cut you open and tinker with your ticker.
        Homer: Could you dumb it down a shade?

  8. Max says:

    If we define “clarity” as communicating something in such a way that the audience understands it the same way you do, then yeah, removing nuance can increase clarity.
    For example, “brain surgery” and “rocket science” are less nuanced terms than “neurosurgery” and “aerospace engineering,” but at least people have a clue about their meaning.

  9. When I saw the title of this post, it made me think of the things I’m more cautious about doing now that I know the consequences if things were to go awry… for example, climbing trees and skateboarding. I sometimes wish to rid myself of my own “curse of knowledge” in order to enjoy things more fully without worry or fear.

  10. Pingback: How ‘Overcoming The Curse Of Knowledge’ will make you a better communicator | paulbalcerak's blog

  11. ClimateBites says:

    Excellent post. When I worked at NOAA, I saw first-hand how deadly the “Curse of Knowledge” can be to effective communication. I made Made to Stick my bible, and started to collect examples of climate language that are clear, correct, concise… and sticky.

    We now post these metaphors, quotes, quips and stories at http://www.ClimateBites.org, and would sure welcome any additions!

  12. Pingback: The Talk « Monkeytraps

  13. Pingback: Consider your Audience to Improve Communication | Tech Thoughts

  14. Pingback: #1 Reason You CAN’T Write Your Own Resume « Technical Resume Writing Services – IT, Healthcare IT, Engineering, and Science Resumes

  15. Pingback: The 5 Simple Rules of Concept Branding Alchemy | Endurance Leader

  16. Julia in France says:

    A German writer (I forgot his name) said it would make perfect sense to have the arts teacher teach math to the kids and the chemistry teacher do English, and so on, so they’d have less “knowledge” and more shared curiosity about the subject… That would help students sooooo much…

  17. Pingback: How to Remove the Curse of Knowledge from Your Writing

  18. Pingback: We know too much - Hans' Blog: Modern Industrial Marketing

  19. Pingback: Improve your editing skills - 5 easy tips - Peter Rey's Blog
