The Fox and the Grapes

One hot summer’s day a Fox was strolling through an orchard till he came to a bunch of Grapes just ripening on a vine which had been trained over a lofty branch.

“Just the thing to quench my thirst,” quoth he.

Drawing back a few paces, he took a run and a jump, and just missed the bunch. Turning round again with a One, Two, Three, he jumped up, but with no greater success.

Again and again he tried after the tempting morsel, but at last had to give it up, and walked away with his nose in the air, saying: “I am sure they are sour.”

It is easy to despise what you cannot get.

This fable by Aesop (circa 620-564 BC) offers a powerful illustration of the theory of cognitive dissonance. The Fox desires a bunch of grapes hanging high on a vine. After numerous failed attempts to reach them, he “decides” that he doesn’t want them after all: they are probably sour. The Fox rationalizes his failure.

Cognitive dissonance refers to the mental stress or discomfort experienced when we:

  • hold two or more contradictory beliefs, ideas, or values at the same time;
  • perform an action that contradicts one or more of our beliefs, ideas, or values; or
  • are confronted by new information that conflicts with our existing beliefs, ideas, or values.

Notice the pattern in the tale of the Fox: he desires something, finds it unattainable, and reduces his cognitive dissonance (conflict) by criticizing it.

He can’t have the grapes, so he tells himself he didn’t really want them anyway.

****

It is human nature to dislike being wrong. When we make a mistake, it is hard to admit it.

We resort to mental gymnastics to avoid accepting that our logic – or our belief system itself – is flawed. Lying, denying, and rationalizing are among the tactics we employ to dance around the truth and avoid the discomfort that contradiction creates. We avoid or toss aside information that isn’t consistent with our current beliefs. Emotions trump logic and evidence. Once our minds are made up, it is very difficult to change them.

This is cognitive dissonance.

****

The late social psychologist Leon Festinger began developing the theory of cognitive dissonance in the mid-1950s.  He found that humans strive for internal consistency, and when we experience inconsistency (dissonance), we tend to become psychologically uncomfortable. This motivates us to try to reduce this dissonance and actively avoid situations and information that are likely to increase it.

This natural drive to maintain cognitive consistency can give rise to irrational and sometimes maladaptive behavior.

****

The development of Festinger’s theory was inspired by his observations of a group called the Seekers, a doomsday cult that believed that the world would end before dawn on December 21, 1954. The group’s prophetess, Dorothy Martin (who was given the alias Marian Keech by Festinger in his writings to protect her identity), claimed to have received a message from a fictional planet called Clarion. The message revealed that a cataclysmic flood would end the world, and aliens would save the few true believers.

Festinger was interested in how the cult would respond to the discrepancy between their beliefs and the inevitable failure of the apocalypse prophecy, so he and a few of his colleagues pretended to be believers and infiltrated the group.

On the expected doomsday, dawn came and went. When the apocalypse failed to materialize, the cult members attempted to find an explanation.

At 4:45 a.m., Keech claimed to receive another message from Clarion via automatic writing. It stated, in essence, that the God of Earth had decided to spare the planet.

The cataclysm had been called off:

The little group, sitting all night long, had spread so much light that God had saved the world from destruction.

The cult members faced acute cognitive dissonance: Had they been the victims of a hoax? Had they donated their worldly possessions in vain?

Rather than consider the possibility that the prophecy was false, that Keech was lying or delusional (or both), and that the apocalypse was never going to happen, the group deepened its faith.

To deal with a reality that did not meet their expectations, they settled on a less dissonant belief: Keech’s claim that Earth had been given a second chance. Now the group was empowered to spread the word that Earth-spoiling must stop.

That afternoon, the group – which had previously avoided publicity – called newspapers and sought interviews. They began an urgent campaign to spread their message to as broad an audience as possible.

The group dramatically increased its proselytizing despite, or rather because of, the failed prophecy.

Keech and her group had committed – at considerable expense – to their belief, so altering or abandoning it would have been very difficult. By attempting to enlist social support for the belief, the cult was seeking to validate it. After all, there is strength in numbers.

As Festinger wrote,

If more and more people can be persuaded that the system of belief is correct, then clearly it must after all be correct.

But WHY did the group become more resolute in their belief?

Festinger explained that five conditions must be present if someone is to become a more fervent believer after a failure or disconfirmation:

  • A belief must be held with deep conviction and it must have some relevance to action, that is, to what the believer does or how he or she behaves.
  • The person holding the belief must have committed himself to it; that is, for the sake of his belief, he must have taken some important action that is difficult to undo. In general, the more important such actions are, and the more difficult they are to undo, the greater is the individual’s commitment to the belief.
  • The belief must be sufficiently specific and sufficiently concerned with the real world so that events may unequivocally refute the belief.
  • Such undeniable disconfirmatory evidence must occur and must be recognized by the individual holding the belief.
  • The individual believer must have social support. It is unlikely that one isolated believer could withstand the kind of disconfirming evidence that has been specified. If, however, the believer is a member of a group of convinced persons who can support one another, the belief may be maintained and the believers may attempt to proselytize or persuade nonmembers that the belief is correct.


In 1956, Festinger, Henry Riecken, and Stanley Schachter shared their experience and findings in the book When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World, which is considered a classic work of social psychology.

The following year, Festinger published the book A Theory of Cognitive Dissonance, which has been a major influence on the study of psychology, politics, economics, and other fields.

Festinger described the basic hypotheses of cognitive dissonance as follows:

  1. The existence of dissonance [or inconsistency], being psychologically uncomfortable, will motivate the person to try to reduce the dissonance and achieve consonance [or consistency].
  2. When dissonance is present, in addition to trying to reduce it, the person will actively avoid situations and information which would likely increase the dissonance.


The amount of dissonance produced by two conflicting cognitions or actions (as well as the subsequent psychological distress) depends on two factors:

  1. The importance of cognitions: The more that the elements are personally valued, the greater the magnitude of the dissonant relationship.
  2. Ratio of cognitions: The proportion of dissonant to consonant elements.


The pressure to reduce cognitive dissonance is related to the magnitude of the dissonance.
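
These two factors lend themselves to a rough back-of-the-envelope formalization. The sketch below is my own illustration, not Festinger’s notation: each cognition is reduced to an importance weight, and dissonance is computed as the weighted proportion of dissonant elements.

```python
def dissonance_ratio(dissonant_weights, consonant_weights):
    """Illustrative formalization of Festinger's two factors (not his notation).

    Each list holds one importance weight per cognition; higher means
    more personally valued. Returns the weighted share of dissonant
    elements: 0.0 = full consonance, 1.0 = maximal dissonance.
    """
    d = sum(dissonant_weights)
    c = sum(consonant_weights)
    return d / (d + c)

# The Fox: the highly valued "I want those grapes" (0.9) clashes with
# walking away empty-handed, which only "jumping is exhausting" (0.1) supports.
before = dissonance_ratio([0.9], [0.1])       # 0.9 -- acutely uncomfortable

# "They are probably sour" adds a strong consonant cognition (0.8).
after = dissonance_ratio([0.9], [0.1, 0.8])   # 0.5 -- relief via rationalization
```

Note that the Fox never removes the dissonant cognition; he merely dilutes it. The four reduction methods described below each manipulate these two sets in a different way.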

According to Festinger, dissonance reduction can be achieved in four ways. 

Take the example of a person who has adopted the attitude that they will no longer eat high-fat food, but eats a high-fat doughnut anyway. In this scenario, the four methods of reduction are as follows (the sketch after the list shows how each one moves the dissonance ratio):

  1. Change behavior or cognition (“I will not eat any more of this doughnut”)
  2. Justify behavior or cognition by changing the conflicting cognition (“I’m allowed to cheat every once in a while”)
  3. Justify behavior or cognition by adding new cognitions (“I’ll spend 30 extra minutes at the gym to work this off”)
  4. Ignore or deny any information that conflicts with existing beliefs (“This doughnut is not high in fat”)
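
Continuing the illustrative dissonance_ratio sketch from above (again, my framing rather than Festinger’s), each of the four methods can be read as an operation on the two sets of cognitions:

```python
# Doughnut scenario: the dissonant cognition "I'm eating a high-fat
# doughnut despite my rule" starts out heavily weighted.
baseline = dissonance_ratio([0.8], [0.1])        # ~0.89

# 1. Change behavior: stop eating, and the dissonant element disappears.
method_1 = dissonance_ratio([], [0.1])           # 0.0

# 2. Change the conflicting cognition: "I'm allowed to cheat every once
#    in a while" shrinks the weight of the dissonant element.
method_2 = dissonance_ratio([0.2], [0.1])        # ~0.67

# 3. Add new cognitions: "I'll spend 30 extra minutes at the gym" piles
#    weight onto the consonant side.
method_3 = dissonance_ratio([0.8], [0.1, 0.7])   # 0.5

# 4. Ignore or deny: "this doughnut is not high in fat" rejects the
#    dissonant element outright.
method_4 = dissonance_ratio([], [0.1])           # 0.0
```

Methods 1 and 4 come out identical in this toy arithmetic (the dissonant element simply vanishes); psychologically, of course, one changes reality while the other denies it.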


Festinger also provided an example of a smoker who has acknowledged that smoking is bad for his health. The smoker may reduce dissonance by choosing to quit smoking, by changing his thoughts about the effects of smoking (e.g., smoking is not as bad for your health as others claim), or by acquiring knowledge pointing to the positive effects of smoking (e.g., smoking prevents weight gain).

A key tenet of cognitive dissonance theory is that those who have heavily invested in a position may go to greater lengths to justify their position when they are confronted with disconfirming evidence.

Important research generated by cognitive dissonance theory has been concerned with the consequences of exposure to information inconsistent with a prior belief, what happens after individuals act in ways that are inconsistent with their prior attitudes, what happens after individuals make decisions, and the effects of effort expenditure.

Most of the research on cognitive dissonance employs one of four major paradigms:

Belief disconfirmation paradigm: People experience dissonance when they are confronted with information that is inconsistent with their beliefs. If the dissonance is not reduced by changing one’s belief, consonance can be restored by misperceiving, rejecting, or refuting the information, seeking support from others who share the belief, and attempting to persuade others.

Induced compliance paradigm: Festinger and James M. Carlsmith conducted a cognitive dissonance experiment in 1959. They wanted to find out what happens to an individual’s personal opinion if he is forced to do or say something contrary to that opinion. They asked study participants to spend an hour on boring and tedious tasks (turning pegs a quarter turn, over and over again). The tasks were designed to generate a strong negative attitude. Once the subjects completed the task, the experimenters asked some of them for a simple favor: to talk to another “subject” (who was actually an actor) and persuade him that the tasks were interesting and engaging. Some participants were paid $20 for this favor, another group was paid $1, and a control group was not asked to perform the favor.

At the conclusion of the study, the participants were asked to rate the boring tasks (not in the presence of the other “subject”). Those in the $1 group rated them more positively than those in the $20 and control groups. Festinger and Carlsmith explained this as evidence for cognitive dissonance: people experienced dissonance between the conflicting cognitions “I told someone that the task was interesting” and “I actually found it boring.” When paid only $1, participants were forced to internalize the attitude they had been induced to express, because they had no other justification. Those in the $20 condition, however, had an obvious external justification for their behavior and thus experienced less dissonance. In other words, being paid only $1 was not sufficient incentive for lying, so those who were paid $1 experienced dissonance; they could overcome it only by coming to believe that the tasks really were interesting and enjoyable. Being paid $20 provided ample justification for the lie, and so produced no dissonance.

Free choice paradigm: In a different type of experiment, conducted by researcher Jack Brehm, 225 female students rated a series of common appliances and were then allowed to choose one of two appliances to take home as a gift. During a second round of ratings, the participants increased their ratings of the item they had chosen and lowered their ratings of the rejected item.

This can be explained in terms of cognitive dissonance. When making a difficult decision, there are always aspects of the rejected choice that one finds appealing and these features are dissonant with choosing something else. In other words, the cognition, “I chose X” is dissonant with the cognition, “There are some things I like about Y.”
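
For a concrete sense of how such a result is scored, here is a minimal sketch of the “spreading of alternatives” measure commonly used in free choice studies. The function name, item names, and numbers are hypothetical illustrations, not Brehm’s actual data or analysis:

```python
def spreading_of_alternatives(pre, post, chosen, rejected):
    """Post-choice attitude change: how far the two alternatives drift apart.

    `pre` and `post` map item -> rating. A positive spread means the chosen
    item was rated higher and/or the rejected item lower after the choice,
    the signature of choice-induced dissonance reduction.
    """
    return (post[chosen] - pre[chosen]) + (pre[rejected] - post[rejected])

# Hypothetical ratings on an 8-point desirability scale.
pre = {"toaster": 6, "coffee maker": 6}    # a difficult, closely matched choice
post = {"toaster": 7, "coffee maker": 5}   # after choosing the toaster
print(spreading_of_alternatives(pre, post, "toaster", "coffee maker"))  # 2
```

Dissonance theory predicts spreading mainly after difficult choices between similarly rated items; an easy choice generates little dissonance to reduce.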

Effort justification paradigm: Dissonance flares when individuals voluntarily engage in an unpleasant activity to achieve some desired goal. Dissonance can be reduced by exaggerating the desirability of the goal. In one study, individuals were asked to undergo a severe or mild “initiation” to join a group. In the severe-initiation condition, the individuals engaged in an embarrassing activity. The group they joined turned out to be dull and boring. The individuals in the severe-initiation condition evaluated the group as more interesting than the individuals in the mild-initiation condition.

****

If you think you are immune to cognitive dissonance, you are likely mistaken (yes, you).

Several studies using imaging technology (functional magnetic resonance imaging (fMRI) in particular) have been conducted to observe what happens to the brain when we experience cognitive dissonance.

Neuroscientists have found that when we’re confronted with dissonant information and use rationalization to compensate, the reasoning areas of our brains essentially shut down while the emotion circuits of the brain light up with activity.

Keise Izuma, a lecturer in the department of psychology at the University of York in England, explains some of his team’s findings:

The area implicated most consistently is the posterior part of the medial frontal cortex (pMFC), known to play an important role in avoiding aversive outcomes, a powerful built-in survival instinct. In fMRI studies, when subjects lie to a peer despite knowing that lying is wrong—a task that puts their actions and beliefs in conflict—the pMFC lights up.

Recently my colleagues and I demonstrated a causal link between pMFC activity and the attitude change required to reduce dissonance. We induced cognitive dissonance in 52 participants by having them rate two wallpapers. When asked to evaluate their choices on a second viewing, some participants realized that they had actually rejected their preferred wallpaper, whereas others had initially chosen their least favorite option. We found that by temporarily decreasing activity in the pMFC using a technique called transcranial magnetic stimulation (TMS), we could also diminish their attitude changes and their desire to create consistency.

Additional studies have revealed that cognitive dissonance engages other brain regions, such as the insula and dorsolateral prefrontal cortex (DLPFC). The insula, which processes emotions, often becomes more active when people are upset or angry, and the DLPFC is strongly associated with cognitive control. One study found that disrupting the activity of the DLPFC by zapping it with electrodes reduces the extent to which we try to rationalize our beliefs following cognitive dissonance.

In 2006, Emory University psychology professor Drew Westen, PhD, and colleagues published a study in the Journal of Cognitive Neuroscience that described the neural correlates of political judgment and decision-making.

Using fMRI, the team examined the brain activity of 30 men. Half were self-described “strong” Republicans and half were “strong” Democrats. The men were tasked with assessing statements by both George W. Bush and John Kerry in which the candidates clearly contradicted themselves.

In their assessments, Republicans were as critical of Kerry as Democrats were of Bush, yet both let their own candidate off the hook.

While that result might not be surprising, the neuroimaging results were: the part of the brain most associated with reasoning – the dorsolateral prefrontal cortex – was inactive. Most active were the orbital frontal cortex, which is involved in the processing of emotions; the anterior cingulate, which is associated with conflict resolution; the posterior cingulate, which is concerned with making judgments about moral accountability; and – once subjects had arrived at a conclusion that made them emotionally comfortable – the ventral striatum, which is related to reward and pleasure.

Westen said the results suggest that the notion of “partisan reasoning” is an oxymoron, and that most of the time, partisans feel their way to beliefs rather than use their thinking caps.

In an interview, Westen elaborated on his study’s findings:

We ultimately found that reason and knowledge contribute very little. From three studies during the Clinton impeachment era to the disputed vote count of 2000 to people’s reactions to Abu Ghraib, we found we could predict somewhere between 80 percent and 85 percent of the time which way people would go on questions of presumed fact from emotions alone. Even when we gave them empirical data that pushed them one way or the other, that had no impact, or it only hardened their emotionally biased views.

In an Emory University press release, Westen is quoted as saying:

We did not see any increased activation of the parts of the brain normally engaged during reasoning. What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts.

Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones.

You may be thinking, “No, I don’t suffer from this! I am not like that! I’m always a rational, objective, unbiased thinker!”

But here’s the thing: you don’t consciously notice that you are experiencing cognitive dissonance. It happens outside of your awareness.

The discomfort can become relentless, however. At some point, it will drive you to seek relief.

It is often assumed that humans are incessant information-gatherers with analytical minds.

But are we consistently truth-seeking beings?

Evidence indicates that maintaining our emotional stability is much more important to us than facing reality.

So, what do we do with this information?

Mark Tyrrell, therapist and co-founder of Uncommon Knowledge, has some ideas:

I really think we need to be braver, and willing to face what are sometimes uncomfortable truths about ourselves and our behavior.

Interestingly, self justifications don’t always have to put us in a positive light – just a consistent one.

Cognitive dissonance is essentially a matter of commitment to the choices one has made, and the ongoing need to satisfactorily justify that commitment, even in the face of convincing but conflicting evidence. This is why it can take a long time to leave a cult or an abusive relationship – or even to stop smoking. Life’s commitments, whether to a job, a social cause, or a romantic partner, require heavy emotional investment, and so carry significant emotional risks.

Cognitive dissonance can actually help me mature, if I can bring myself, first, to notice it (making it conscious) and second, to be more open to the message it brings me, in spite of the discomfort. As dissonance increases, providing I do not run away into self-justification, I can get a clearer and clearer sense of what has changed, and what I need to do about it.

By looking at our thoughts and actions critically, objectively, and dispassionately, we can avoid making mistakes and be open to changing our minds as new information comes in. We can break the cycle of lying, denying, and rationalizing – if we remain vigilant and open-minded.

Related Reading

Just Following Orders: It Is Shockingly Easy to Get People to Do Bad Things

What’s the Best Way to Overcome Your Fears? Face Them.

Feeling Like a Fraud? How to Overcome Impostor Syndrome
