Book review

Mistakes Were Made (but Not by Me)

Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts

By Carol Tavris, Elliot Aronson


Many people (but obviously not me) find it difficult to admit to their mistakes. This book explains why (although clearly it doesn’t apply to me, because I’m not like that), illustrating its points with some great examples, and explaining the experimentally-verified underlying psychological tendencies. Most important is the concept of cognitive dissonance: how we (well you lot, anyway, I’m too smart to do it) will resolve the conflict between our view of ourselves as fundamentally honest and right-minded, and any actions we may have taken that aren’t quite as honest or sensible as they might have been:

“Most people, when directly confronted by evidence that they are wrong, do not change their point of view or plan of action but justify it even more tenaciously.”

We are terrifically good at this, too. Over time we nudge our memories so they align better with how we wish things had gone; we tell half-truths to others that we come to fully believe are justified, and perhaps even a more accurate version of what happened. And we do all this entirely unconsciously – our minds are simply seeking to resolve the cognitive dissonance, and do it for us without our awareness. For example, talking of the 2003 invasion of Iraq:

“Before the invasion, about 46 percent of Democrats supported it; by 2006, only 21 percent remembered having done so. Just before the war, 72 percent of Democrats said they thought Iraq had WMDs, but later, only 26 percent remembered having believed this.”

Mostly we do this to make ourselves feel better, but more precisely it’s about reinforcing whatever our self-image happens to be. If you are that way inclined and have a negative image of yourself, you will reinforce that view instead. “Yes, I won the Pulitzer Prize, but it was just a fluke; I’ll never write anything half as good again,” such people tell themselves – quite sincerely believing it.

While we may have noticed this inclination in others, the authors are at pains to point out that we all do it – they give various anecdotes about how they did just this on one occasion or another.

“The brain is designed with blind spots, optical and psychological, and one of its cleverest tricks is to confer on its owner the comforting delusion that he or she does not have any.”

The worst aspect of this isn’t what it may do to our individual lives, but how it can play out on a larger canvas. Several chapters are given over to examples from our courts and our political system, where the implications have included everything from the wrongful imprisonment of innocent people for decades, to children taken away from their parents, to corruption, to thousands of deaths in wars and accidents.

Even when we think we are certain we are not doing it this time, we may still be doing it:

“The weakness of the relationship between accuracy and confidence is one of the best-documented phenomena in the 100-year history of eyewitness memory research”

How or why do we do this? One plausible explanation is that we tend to attribute what people do either to the situation or to an innate tendency on their part. When we make mistakes, it’s due to the situation – I bumped into you because I was rushing to get somewhere. When others make mistakes, all too often we blame it on their character: they bumped into me because they are just plain rude. This is the so-called “fundamental attribution error”, and again, we see it in others much more easily than we see it in ourselves.

The final chapter talks a little about how we might get better at spotting these problems in ourselves. If I have one complaint about this book, it’s how short this section is. The advice is basically “now you know it’s a risk, you can be on the lookout for it, and catch yourself doing it”. True, I’m sure, but how exactly, and what else might we do as well?

I shall give the last word to George Orwell, whom the authors quote as putting the whole thing quite magnificently thus:

“We are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield.”