If you are anything like I am, you’ve probably found yourself wondering how on earth people can cling to the low-fat diet when all the data out there shows it is vastly inferior to the low-carb diet in virtually all parameters.
If you’ve had great results yourself with a low-carb diet, you’ve also probably wondered why it is so hard to persuade others to try it. And you may have asked yourself – as I have asked myself – why every little study that comes out purporting to show that low-carb diets are somehow dangerous gets media coverage out the wazoo while studies showing the superiority of low-carb diets are ignored.
A few weeks ago I followed the recommendation of one of the readers of this blog and purchased the book Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. This book gave me the answers to all the above questions. The good news is that now I know why people have the prejudices and biases they have. The bad news is that now I know how difficult it really is to change them. The sort-of-good news is that I can keep a watchful eye on my own tendency (which is inherent in everyone) to slip into the same prejudiced, biased way of thinking myself.
Mistakes Were Made is one of the better books I’ve read in the last couple of years. It is well written, humorous in places, tragic in others where the devastating effects of pigheaded bias have led to disaster, and informative all the way through. I can give it my highest rating as a must-read book for all intelligent people.
The authors, Carol Tavris and Elliot Aronson, are both distinguished social psychologists who have published widely and have been in practice for years. Their writing is a model of clarity, devoid of the jargon for which most psychologists are infamous. It’s been a long time since I’ve pored through a book containing so much literally life-changing information that was such a pleasure to read.
Why do people make crazy decisions, then stick by them when all evidence should point them in the opposite direction? It all starts with an internal effort to resolve dissonance.
The book begins by giving the reader the real, psychological definition of cognitive dissonance, which the authors refer to as the engine of self-justification.
The engine that drives self-justification, the energy that produces the need to justify our actions and decisions – especially the wrong ones – is an unpleasant feeling…called “cognitive dissonance.” Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day.”
Dissonance produces mental discomfort, ranging from minor pangs to deep anguish; people don’t rest easy until they find a way to reduce it. In this example, the most direct way for a smoker to reduce dissonance is by quitting. But if she has tried to quit and failed, now she must reduce dissonance by convincing herself that smoking isn’t really so harmful, or that smoking is worth the risk because it helps her relax or prevents her from gaining weight (and, after all, obesity is a health risk, too), and so on. Most smokers manage to reduce dissonance in many such ingenious, if self-deluding, ways.
Dissonance is disquieting because to hold two ideas that contradict each other is to flirt with absurdity and…we humans are creatures who spend our lives trying to convince ourselves that our existence is not absurd.
Once we find ourselves on the horns of a cognitively dissonant dilemma, we try to reduce our stress by making a decision that resolves the dissonance. Once we’ve made that decision, another factor enters the picture, one we’re all familiar with: the confirmation bias.
Dissonance theory also exploded the self-flattering idea that we humans…process information logically. On the contrary: If the new information is consonant with our beliefs, we think it is well founded and useful: “Just what I always said!” But if the new information is dissonant, then we consider it biased or foolish: “What a dumb argument!” So powerful is the need for consonance that when people are forced to look at disconfirming evidence, they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief. This mental contortion is called the “confirmation bias.”
What this means is that Dean Ornish and others like him are not any stupider than the rest of us.
Ornish believes (or so he says) in the sanctity of animal life and is a PETA supporter. This viewpoint would create great dissonance if he ate meat. But there is a ton of evidence that eating meat is a healthful thing to do and there is evidence that a vegetarian lifestyle is not so healthful. I am sure Ornish knows this, but to resolve his dissonance he has decided that the vegetarian diet is more healthful, and he chooses to consider only that evidence that confirms his bias. When scientific research produces irrefutable evidence that is counter to his bias, he distorts the meaning of the data to one more consonant with his bias. Look here to see what I mean.
What he should be doing is using the scientific method to try to shoot down his own theories.
The scientific method consists of the use of procedures designed to show not that our predictions and hypotheses are right, but that they might be wrong. Scientific reasoning is useful to anyone in any job because it makes us face the possibility, even the dire reality, that we were mistaken. It forces us to confront our self-justifications and put them on public display for others to puncture. At its core, therefore, science is a form of arrogance control.
Instead of trying to justify why his way of thinking is correct, Ornish should be trying to poke holes in it. If he tries hard and can’t refute his own idea, then it probably has some merit. But he isn’t the only one guilty of that method of operation; we pretty much all do the same thing.
Consider this quote from Lenny Bruce after he watched the televised Kennedy-Nixon presidential debate in 1960.
I would be with a bunch of Kennedy fans watching the debate and their comment would be, “He’s really slaughtering Nixon.” Then we would all go to another apartment, and the Nixon fans would say, “How do you like the shellacking he gave Kennedy?” And then I realized that each group loved their candidate so that a guy would have to be this blatant – he would have to look into the camera and say: “I am a thief, a crook, do you hear me, I am the worst choice you could ever make for the Presidency!” And even then his following would say, “Now there’s an honest man for you. It takes a big guy to admit that. There’s the kind of guy we need for President.”
We’ve all seen this in action in politics.
Have you ever pointed out the failings of a candidate to someone who is a fan of that candidate and had him/her say: “Oh, they all do that.” How many times have you said that about your own candidate? It’s called resolving dissonance because you can’t support a candidate who is a crook, cheater, embezzler, whatever, but if whatever your candidate is alleged to have done is simply something ‘they all do’ then you’re off the hook. No cognitive dissonance.
Mistakes Were Made describes how pharmaceutical execs read the data in a way that makes their drugs look like gifts from God. It ain’t greed. Most pharmaceutical execs are no different from you and me – they are nice, friendly, intelligent folks who love their families and wouldn’t consider doing anything harmful.
So how then can they promote statins to everyone who breathes despite the evidence that statins don’t do any good for most people and are actually harmful to some? If they thought they were harming people they would be deep in the throes of cognitive dissonance, so they make the decision that they are really helping people, then use confirmation bias to convince themselves they really are helping people. And they ignore or blow off any data to the contrary as the ramblings of malcontent alternative healthcare types.
And it’s not just pigheadedness that keeps people thinking the wrong way once they’ve made a decision. Their brains actually change.
Neuroscientists have recently shown that these biases in thinking are built into the very way the brain processes information – all brains, regardless of their owners’ political affiliations. For example, in a study of people who were being monitored by magnetic resonance imaging (MRI) while they were trying to process dissonant or consonant information about George Bush or John Kerry, [researchers] found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information and the emotion circuits of the brain lit up happily when consonance was restored. These mechanisms provide a neurological basis for the observation that once our minds are made up, it is hard to change them.
This book describes how the combination of cognitive dissonance and the confirmation bias operates in the medical profession, scientific research, business, law, police work, love, and war. The chapter on recovered memory is alone worth the price of the book, and it could serve as a surrogate chapter for the history of the low-fat diet: a ton of trouble due to doctrinaire adherence to a faulty hypothesis.
Many books such as this one are great at the start, with all the early chapters filled with valuable information, but then run out of steam. It becomes obvious that the author didn’t have enough material to stretch to a full-length book because the chapters near the end are filled with fluff. Not so with Mistakes Were Made. I gained some of my most valuable insights from the later chapters. In my opinion the book is a winner from beginning to end.
I’ll leave you with one last quote, this one from Albert Speer, the architect of Hitler’s war machine. Speer’s description must mirror the way academic physicians feel with their unshakable faith in the power of the low-fat diet and statin drugs to cure the world.
In normal circumstances, people who turn their backs on reality are soon set straight by the mockery and criticism of those around them, which makes them aware they have lost credibility. In the Third Reich there were no such correctives, especially for those who belonged to the upper stratum. On the contrary, every self-deception was multiplied as in a hall of distorting mirrors, becoming a repeatedly confirmed picture of a fantastical dream world which no longer bore any relationship to the grim outside world. In those mirrors I could see nothing but my own face reproduced many times over.
Buy the book and read it. You’ll be glad you did.
Hat tip to Sue for recommending the book to me.