Saturday, May 4, 2013

Cognitive dissonance - I

Mistakes Were Made (But Not by Me) is a book about cognitive dissonance and the effect it has on decision making in various fields like medicine, criminal justice, etc. The term was coined by Leon Festinger to explain the behaviour of the members of a doomsday cult when their prophecy failed. Cognitive dissonance occurs when an individual holds two views that are opposed to each other. He then goes through various mental hoops to reduce the dissonance he feels between these views.

All of us try to reduce cognitive dissonance all the time. For instance, we will use office time to surf the net. But we want to think of ourselves as good, ethical human beings, so we come up with self-justifying reasons - we are not paid enough, everybody does it, anyway the boss doesn't appreciate our hard work... The difference between the self-justifications of most of us and those of powerful people is that the latter have big consequences. Dissonance theory has some disturbing implications:
  1. Severe initiation rites increase the loyalty of a member. So if a person undergoes severe ragging before getting into a group, his loyalty to the group increases.
  2. If we come across any information that is consonant with our views, we will view it positively. If the information is dissonant, we will dismiss it as biased or sloppy. The confirmation bias ensures that even the absence of evidence becomes evidence for our beliefs.
  3. People become more certain of something they have recently done if it is irrevocable. So, asking a person who has recently purchased an expensive item whether you should buy it is not a good idea. He will be highly motivated to persuade you to buy it. A person who has just spent a lot of money on something is unlikely to say that it was a waste.
  4. The escalation of brutality by perpetrators will be greater if the victims are helpless than if they are armed and able to strike back. When I see scenes like these, this is the explanation that occurs to me. A similar penchant for cruelty was shown in the Stanford prison experiment. It is sobering to think that, given the right conditions, I could also behave in the same way. (Here is a YouTube video about the Lucifer effect. Warning: a little bit of it is NSFW.)
We often hear from powerful people about, say, police reform, military purchases, etc. They will rarely admit that they made a mistake. The first impulse will be to deny any mistake for the obvious reason of protecting one's job, reputation and colleagues. But there are powerful internal reasons for such denial: they would like to think of themselves as honourable, competent people who would never commit the errors that they are accused of. They thus convince themselves that conditions were different back then, funds and staff were insufficient, the situation was more complicated than realised... Admitting to the error would be very difficult because it would be antithetical to their perception of themselves as competent individuals.

When the atom bomb was exploded, Einstein said, “The release of atomic power has changed everything except our way of thinking ... the solution to this problem lies in the heart of mankind. If only I had known, I should have become a watchmaker.” The problem is that this way of thinking is hardwired into our brain.

Neuroscientists have shown that biases are built into the way the brain processes information. Self-justification is not the same as lying. It is lying to oneself. It allows people to convince themselves that what they did was the best thing they could have done. It minimises our mistakes and is the reason why everybody can see a hypocrite in action except the hypocrite himself. (There is my excuse for the hypocrisies that you have seen in this blog - I don't know that they exist! But I can see the hypocrisies in others' statements every other day!) The authors write:
The brain is designed with blind spots, optical and psychological, and one of its cleverest tricks is to confer on us the comforting delusion that we, personally, do not have any. In a sense, dissonance theory is a theory of blind spots - of how and why people unintentionally blind themselves so that they fail to notice vital events and information that might make them question their behaviour or their convictions. Along with the confirmation bias, the brain comes packaged with other self-serving habits that allow us to justify our own perceptions and beliefs as being accurate, realistic and unbiased.... We assume that other reasonable people see things the same way as we do. If they disagree with us, they obviously aren't seeing clearly.
