The Need for Closure

Maria Konnikova has an interesting post at The New Yorker about Why We Need Answers.

Uncertainty agitates the human mind. People need and want “cognitive closure.” This means we want an explanation for why things happen the way they do. We want it settled in our mind. And once we find (or invent) that explanation, we invest ourselves in it and have a hard time letting go of our belief regardless of evidence to the contrary – politics is filled with nagging examples.

Arie Kruglanski and Donna Webster (1994) developed a way to measure our Need for Closure (NFC). Their measure looked at five motivational tendencies: preference for order, predictability, decisiveness, discomfort with ambiguity, and closed-mindedness. The combined measure of these five tendencies tells us where we’re at in our need for cognitive closure in any given situation. The important point in measuring NFC is that when we “rush” (as many of us do) to find closure, we bias our choices, we generate fewer hypotheses, we form judgments too early, and we arrest our search for information. And worse, we don’t even realize just how poorly we’ve formed our judgment.

Kruglanski’s research suggests people pass through two stages on their way to cognitive closure. Konnikova writes:

In the first stage, we are driven by urgency, or the need to reach closure quickly: we “seize” whatever information we can, without necessarily taking the time to verify it as we otherwise would. In the second stage, we are driven by permanence, or the need to preserve that closure for as long as possible: we “freeze” our knowledge and do what we can to safeguard it. (So, for instance, we support policies or arguments that validate our initial view). And once we’ve frozen? Our confidence increases apace.

It’s a self-reinforcing loop: we search energetically, but once we’ve seized onto an idea we remain crystallized at that point. And if we’ve externally committed ourselves to our position by tweeting or posting or speaking? We crystallize our judgment all the more, so as not to appear inconsistent.

I suspect many of you have experienced the highly “crystallized” positions people reach on matters where the evidence, or even their own (unrecognized) self-interests, are to the contrary. People can become so invested in their belief for the sake of politics, or competitiveness, or ego that almost no evidence to the contrary will change their mind. The denial of human-caused global warming is one of the best examples. There is always the possibility that approximately 90% of the world’s scientists, about 95% of climatologists, and many of the world’s major science foundations are wrong…but what if the experts are actually right? (Hmmm, novel idea.) This, of course, should be the prime consideration of a sensible and free-thinking person, because the consequences of being wrong, of not listening to the scientific community on this, are cataclysmic. And yet we’re speeding (alarmingly fast) down this road of denial toward a wall with our eyes wide open.

In discussing how we might mitigate the negative effects of NFC in people, Konnikova reports what I would consider the most obvious way most of us deal with people who hold very closed and crystallized positions: we point out the personal costs. Once you’re able to make a person “see” how their belief or position will personally cost them something (money, health, personal safety, personal reputation, etc.), they tend to soften their position and reconsider. (This doesn’t always work, I have to admit. I’ve seen more than a few people willing to cut off their nose to spite their face. We can only hope those people are never in leadership positions…wishful thinking, I realize.)

For me personally, there’s a downside to being able to successfully turn a person’s position by elucidating the personal costs. I’m mostly thinking about big issues that have a national, local, or community impact. If it’s all about reconsidering only when there’s a personal cost, then obviously one isn’t thinking about the social costs – the costs to other people or to future generations. One is, well, thinking only about oneself, one’s group, or one’s political self-interest. This is the stuff of tragedy. History and psychology teach us that this is actually the norm, and we shouldn’t be surprised or feel we’re all doomed. But it’s hard not to sometimes.

The ancient historian Thucydides said that people are motivated by fear, honor, and self-interest. And there’s a lot of truth in that. But it never ceases to amaze (or shock) me that some people can be motivated only by that.

3 thoughts on “The Need for Closure”

  1. One could argue that the 90% of climatologists have a high need for closure, since they are not open-minded enough to consider that they might be incorrect or partially incorrect. Consensus science is not science; it is dogmatic power. Many skeptics I know of appear to have a low need for closure. They want better science before we make monumental decisions. They are taking a wait-and-see approach – this indicates a low need for closure. Both sides of any argument will have folks with a dysfunctionally high need for closure. If you go back to the original Skeptics (in philosophy), you find folks that had a *truly* low need for closure. Just saying.

    If separation of church and state is a good idea, separation of science and state is an even better idea.

  2. Very incisive… So much so, in fact, it hurts! Now I’m afraid I’ll spend the day (and doubtless should invest more than one day) asking myself, “am I guilty?” Thank you for providing me such good grist for the mill!

    • I think it’s hard to be human and not fall into these very human errors. I think the important quality to develop is that of critical self-reflection and awareness that one will tend to do this. It’s hard to overcome our natural default settings unless we’re aware of them and have a real need or want to overcome them.

      Thanks for commenting, Noble.
