By Massimo Pigliucci
It will be our pleasure at the Rationally Speaking podcast to have Carol Tavris as a guest. She co-authored with Elliot Aronson the delightful Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts.
The book is about the incredible ability of the human mind to rationalize events and beliefs so that we, personally, always end up being better than average at being right. The basic idea is that cognitive dissonance is a powerful engine of self-justification, as the authors put it, because we absolutely need somehow to square our most dearly held opinions with the nasty tendency of some facts to contradict them. (This reminds me of the famous disclaimer on the Hitchhiker’s Guide to the Galaxy, to the effect that in any case in which the Guide and reality are at odds, reality is to be blamed...)
As Carol and her co-author make clear throughout the book, this isn’t just an interesting academic problem; it affects all spheres of our lives, from politics to religion, from our personal relationships to the (mal)functioning of our system of justice. Take the case of pharmaceutical companies lavishing an estimated $22 billion in gifts on physicians in 2003 alone. Shouldn’t that make our medical practitioners a bit uncomfortable about essentially being bought to push on us drugs that may not do much good, or that at the very least may not be better than cheaper generic alternatives? Never fear, leave it to our good doctors’ rationalizing engines to convince themselves that they are completely uninfluenced in their decisions by the presents they receive from Big Pharma.
Or consider the very well known tendency of people who fall in love to seek confirmation that the other person is the desperately sought “other half” of their own selves (a metaphor that goes back to Plato’s Symposium and a rather funny theory of love propounded in that dialogue by the playwright Aristophanes). Not only do we find plenty of signs to confirm our hopes, but we systematically ignore all the conflicting information that might otherwise alert us to trouble down the road. If things do turn sour, our brain doesn’t skip a beat and immediately concocts perfectly reasonable narratives to coherently explain the whole thing — previous statements, hopes and expectations suddenly forgotten or reinterpreted through a new filter. Anything except admitting that we could possibly have been wrong all along.
So, what’s your favorite story of rationalization — obviously indulged in by someone other than you? What would you like to ask Carol during the podcast?
I can't wait for this one!
I haven't read the book, but could you please ask Dr. Tavris why it is that people rationalize their beliefs and actions when they contradict facts, instead of changing them? Wouldn't it be more beneficial for an individual to change his wrong and/or harmful beliefs instead of making up a story to justify them? Who benefits by doing that? I mean, I know that humans aren't perfect pattern-seeking machines, but when the evidence for a true pattern is pointed out to us we should be able to soak it up like a sponge. Having some built-in machinery that prevents us from doing that seems very counterproductive, seeing as that machinery probably evolved for some purpose.
Also, does she think that rationalizing applies only to false beliefs? They say that humans are storytelling creatures, and to me that means we create myths out of the building blocks of our culture (call them memes if you want) to explain why we did something or why we believe something. If the story is good, then we gain the acceptance of our peers, and if not, then we are punished or marginalized. But it's the appeal of the story, not the validity of the claim, that matters. We rationalize true beliefs all the time, because that's what we do. Besides, we don't ever really know if the claim is true or not. We just believe it. Well, that's my PetTheory™ anyway.
Looking forward to this episode
This is a great topic. I was wondering what Carol's thoughts are on "Error Management Theory". The idea behind the theory is that it's better to be wrong on the safe side: we tend to make mistakes, and rationalize things, in ways that can enhance our fitness or reproductive success. For example, it makes perfect sense to think that the person we are dating is interested in us even when he or she is not, because if we don't think that way we might lose a mating opportunity. From this perspective we can predict that men will make more such false positive mistakes, because the cost of reproduction is lower for them. Women, on the other hand, will make more false negative mistakes in this context, because it's safer to be a bit more restrained and not fall for someone who is not willing to commit.
I was also wondering whether there are general sex differences in the types of mistakes men and women make (with or without relating them to evolutionary explanations).
This is more general, but applicable.
I justified my conservative Christianity because it felt good to be religious. I kept tricking myself into believing there was an explanation somewhere for the Abrahamic god's absence.
Because when I prayed I felt 'something.' When I heard Bach I was moved. Since I knew the "truth," I felt secure... so I found explanations: in a miraculous coincidence, a ritual, other people's testimonies, and especially the bible.
Fortunately the cognitive dissonance became too great. No amount of exegesis compensated for the bible's obscene contradictions about this god's character. Eventually I couldn't justify choosing good feelings over clarity.
IMHO... (please persevere through the preparatory waffle)
Deep down, we know that we should aim to proportion the strength of our beliefs to the strength of the evidence. We also know that we should assess the rationality of others' reasoning on whether they appear to be doing the same.
In some circumstances we may use the term "proof" to indicate overwhelming strength of empirical evidence but we know this is merely shorthand since no empirical claim can be completely proved and countless other explanations (such as deceptive daemons) will always remain possible.
I notice that when we find our views looking shaky under attack, we often flick from reasoning about balance of evidence to reasoning about what can or cannot be proved or disproved.
I think one of the most common tactics on hearing new criticism is to check that it doesn't 100% disprove our pre-held views and then dismiss it if it doesn't. It would be much better to question: "how does this criticism affect my understanding of the balance of evidence and should it make me reconsider which explanation I think most likely?"
I propose that we should all hear alarm bells when we hear anyone make this jump from balance of evidence to possibility of proof. I propose we should question the leap and question whether the resulting argument is still as strong if phrased in terms of balance of evidence.
"You can't prove my view X is unreasonable since you can't disprove it" is a weak but widely used argument that fails to discriminate against many claims that are obviously silly. In contrast, "my view X looks highly likely on the balance of evidence" is much harder to argue and much stronger for it.
PS. Massimo: To be a little cheeky (and feel free not to bite :-) ), I'm tempted to say that you sometimes elide the distinction between proof and balance of evidence. In the discussion about whether the appropriate degree of belief in the supernatural could be affected by the balance of available evidence, you jump between this "balance of evidence" question to a quite different "proof" question with the "Gods could never be proven because it could always be technological aliens instead" point.
Are there any examples of endemic, domain-scoped rationalization?
For example, given the obvious success of the 'scientific method,' are there examples of copy-cat attempts to apply it to inappropriate domains, and what rationalizations were used? I suppose creationism is an obvious example, but I was thinking more along the lines of history, geography, cooking, dress-making, etc...? :)
I look forward to Dr. Tavris' wise counsel on the topic of breaking through the self-delusional practice of self-justification and rationalization. I've heard her before on this topic and she did not disappoint.
But I wonder if she could address what might be something of a challenge to the (classical?) concept of cognitive dissonance. I read a paper a few years ago by Robert Kurzban called Are Psychologists too 'Self'ish? in which he challenges the widespread assumption that it is necessary to presuppose a unitary 'self' that animates many of our psychological theories. He singles out cognitive dissonance theory as being guilty of this assumption and proposes an alternative explanation for the unease of cognitive dissonance: it's not that we can't stand holding contradictory ideas but that we don't want to get publicly called out as incoherent or hypocritical for it. So it's the social threat of being found out that motivates reconciling and justifying our attitudes, beliefs, and actions, not the clash of cognitions per se.
What does Dr. Tavris think of this reevaluation or reinterpretation of the experience of cognitive dissonance? Thanks.
On average, drivers tend to view themselves as above average.
Two things regarding this book.
Cognitive dissonance seems a very powerful explanatory theory and after reading the book I see it everywhere. What are the limits to the theory and how does one stop oneself from stretching beyond where its limits lie?
In some ways I found reading the book very disheartening because of how futile it seemed to actually try to reason with people. And even though each chapter did end with a story of hope, they still seemed so few and far between. Is being disheartened justified in the face of the book's thesis?
This is tangentially on topic with respect to "everybody making mistakes, except us... " about global warming.
In your Skeptic 16(1) article, "Science by Think Tank," Massimo, you wrote, "think tanks like the American Enterprise Institute that have the gall to bribe scientists so that they speak critically of reports about global warming..."
Can you provide any factual evidence that a scientist has accepted a bribe to lie about global warming?
That's what you really meant by the equivocal "critically," isn't it? Lie. That a scientist accepted money to lie.
Who has lied for money, Massimo?
Pat, in the book I cite the BBC article that talks about that incident. Scientists were paid by the AEI for speaking tours on condition that they be critical of global warming. If that doesn't constitute lying for money, I don't know what does. And what on earth did you mean when you accused me of doing the same? The Skeptic article was not paid for, you may want to know.
Carol should be an interesting guest. Her book points to the workings and failings of human logic. We don't know when we are cognitively dissonant. Not knowing is part of the definition of cognitive dissonance. And that's the particularly scary part.
Seeing this, it is amazing that we evolved at all and that we have the bare semblance of a cohesive society that we do. As the world becomes more complicated, are we prepared for the novel ways we'll dissonanate? (Probably not a real word, dissonanate: to have perpetrated dissonance.)
This brings up the notion that cognitive dissonance can be self-delusion or can be intentionally and systematically introduced. These are the ideas Brendan Nyhan was discussing in your podcast #19.
I'm only a short way into the book and there have been a couple of tips for dissonance avoidance but I want you to pry more tips out of her.
In a video of a talk given at the "Achieving Clinical Excellence 2010 Conference," Scott Miller, PhD, shows problems with clinicians' perceptions of success in their psychotherapeutic treatments. He really hits hard at this idea that we are all above average. He shows that therapists suffer cognitive dissonance when it comes to ideas of their own success rates, and that those who actually do statistically better typically don't think, act, or claim to be better. In fact, the opposite: they don't feel they make enough of an impact and are more humble about their accomplishments.
Scott is a great presenter. Even though the video production is poor, it's worth a look (3 parts): http://qik.com/video/16395220
The example that comes to mind when I think about cognitive dissonance is Hans Christian Andersen's "The Emperor's New Clothes." People convince themselves that they see clothes rather than be thought "unfit for their position."
My question for Carol Tavris is how much the theory of cognitive dissonance itself is used to rationalize the behavior of others.
Is it used as a catch-all explanation for someone else's behavior, even when there is significant evidence supporting alternative explanations, because it allows the person invoking it to resolve their own cognitive dissonance?
Are people projecting a psychologically soothing explanation for their own behavior on other people's behavior?
Massimo, I don't have your book. Can you please cite the BBC article here, on your blog? I'd like to read it.
Also, I made no allegation against you in my post. I quoted what you wrote, and interpreted what you meant by "critically." Thank you for confirming my understanding that you meant 'lie' when you wrote "speak critically."
The last sentence in my prior post is a question about who you suggest has lied about climate. It was not directed at you or your Skeptic article. Sorry for the misunderstanding.
thanks for clarifying, I was in a hurry last night (I'm at Notre Dame for a couple of invited talks, and they're working me hard!).
The article in question turned out to be in The Guardian, 2 February 2007.
Here is a link to the Guardian article. It opens with this: "Scientists and economists have been offered $10,000 each by a lobby group funded by one of the world's largest oil companies to undermine a major climate change report due to be published today."
Here's a link to AEI's explanation of the incident, which includes copies of the letters AEI sent to the various scientists.
It is hardly the offer of a bribe. Among the climate scientists solicited, AEI included Jerry North and Steve Schroeder, both of Texas A&M. Jerry North is completely committed to the IPCC view.
To Schroeder, AEI wrote that they had, "considerable respect for the integrity with which your lab approaches the characterization of climate modeling data. We are hoping to sponsor a paper by you and Prof. North that thoughtfully explores the limitations of climate model outputs as they pertain to the development of climate policy ..."
This is hardly unreasonable. In fact, the outcome of such a study would bring the strong caveats about reliability, presently deeply buried in the Supplemental part of the IPCC's WG1 Assessment Reports, forward into public view where they should have been all along.
In your Skeptic article, you accused the American Enterprise Institute of having actually bribed ("the gall to bribe"), not of having attempted to bribe. In constructing your sentence that way, you implied there are scientists who accepted such bribes.
The word "bribe" does not appear in the Guardian article. It does, however, appear on environmentalist blogs. Is that where your "bribe" allegation really originated?
It seems evident that you accepted the Guardian story at its word, and extended it into a very defamatory allegation against scientists who "speak critically" (and when did speaking critically suddenly become wrong?), about the science offered in support of imminent and dangerous climate warming.
There doesn't seem to be any evidence of a bribe, Massimo. Unless someone can document a secret understanding between AEI and certain scientists, your defamation appears to be evidence of a secular demonology.
my intention was indeed to say that AEI attempted a bribe, not that it succeeded. And no, my source was not an environmentalist blog, it was the Guardian article. And yes, I do trust the Guardian a hell of a lot more than the AEI. That's because of the evident pattern of behavior by the AEI and similar think tanks (like the CATO Institute), which I document in the book.
Massimo, the Guardian article does not say or even imply that a bribe was offered.
The letters put into public evidence on their website by AEI contain no hint whatsoever of skulduggery, false dealing, or of an inferential bribe.
Any recipient of one of their letters could have disputed the factual case publicly displayed by AEI. I saw no evidence of any dispute about those facts by the recipients. Did you?
Here is Andrew Dessler's follow-up comments on Grist about the incident. He mentions nothing of any attempt at bribery.
What we're left with is that you accused AEI of attempted bribery on no grounds except your own personal conviction that they're untrustworthy.
And you wrote the accusation in such a way as to imply that scientists have been dishonest for merely venal motives. There's no evidence of that, either.
one more time, I have not accused any scientist of any such thing. I have simply taken the lead from the Guardian article, which came out immediately after the release of the most recent IPCC report. I consider AEI a bunch of ideologues with no intellectual integrity, so I simply do not trust anything that they publish on their web site.
Massimo, you wrote, "I have not accused any scientist of any such thing. [as taking bribes.]"
In Skeptic, your statement is this: "think tanks like the American Enterprise Institute that have the gall to bribe scientists so that they speak critically of reports about global warming..."
That is worded as an accusation that scientists have in fact taken bribes. And the way you wrote it, Massimo, you defamed them as a class.
Latterly resting your innocence on a claim that you didn't accuse any scientist, when in fact you accused them all, looks a lot like an exculpatory equivocation.
I know some of these "critically speaking" scientists, both by their work and by email contact. Scientists such as Willie Soon, Chris Essex, Ross McKitrick, and Demetris Koutsoyiannis, among others. They're honorable people, and the way you worded your comment, as a declaration of fact, you defamed them by implication. That's why I decided to ask you about it, here on your blog.
The accusation of bribery, in fact or by attempt, is not supported by the Guardian article. It's not supported by Andrew Dessler, and it's not supported by the facts put in evidence by the AEI. Facts that remain unchallenged.
Your allegations give every evidence of being factually false, and of merely reflecting your own personal bias.
Your last post is also bereft of evidence. The accusation that the AEI offered bribes is factually unsupported and now rests only on your a priori personal opinion that the AEI are "a bunch of ideologues with no intellectual integrity."
So the support for your first allegation is your second allegation.
To believe you, we have to trust you to know the truth without benefit of factual evidence.
Revelation, secular style.
Skeptic has printed a speculative and defamatory allegation under your by-line. I wonder whether you have put Michael Shermer in a legally vulnerable position.
I have already conceded that my wording may not have been clear, but I repeat that there was no intention at all to criticize the scientists in question, just AEI. If you don't believe it, so be it.
As for AEI itself, mine is a considered opinion of that outlet, based on what I've read on their web site, which I examined thoroughly in preparation for the book. Of course my opinions are informed by my political views. Aren't yours?
Finally, the Guardian article. Its title is "Scientists offered cash to dispute climate study." Does that NOT sound like an attempt at bribery to you? And the article says, in part:
"Scientists and economists have been offered $10,000 each by a lobby group funded by one of the world's largest oil companies to undermine a major climate change report due to be published today.
Letters sent by the American Enterprise Institute (AEI), an ExxonMobil-funded thinktank with close links to the Bush administration, offered the payments for articles that emphasise the shortcomings of a report from the UN's Intergovernmental Panel on Climate Change (IPCC)."
Again, where exactly am I misreading it?
1. You directly implied that scientists had accepted bribes. You now write that's not a criticism of scientists. What I believe is unimportant. What your words in Skeptic show is unambiguous.
2. I do my best to dissociate my views about factual proofs away from my political opinions. Don't you? The facts in question are about whether AEI offered bribes for lies. Views about them are immune from politics. Or should be so.
3. The title of the article is inconsistent with the explanation offered in the article by AEI's Kenneth Green: "Right now, the whole debate is polarised ... We don't think that approach has a lot of utility for intelligent policy."
The disparity between the message of the title and the Guardian's portrayal, and Green's reasonable explanation, should have been a warning to you that something was not right. That in turn should have led you to look further.
What possible rational objection could there be for what Green proposed? The letters posted on the AEI website completely vindicate Green's description of their motive.
You may decide that Kenneth Green is lying in his statement, or that the AEI was being disingenuous. But what factual evidence is there for that, apart from your personal and politically based distrust?
The letters sent out by AEI are entirely reasonable. They request a candid scientific appraisal of the strengths and weaknesses of, e.g., climate models. Those letters do not support the implied criticism in the Guardian article or your own latter accusation of bribes for lies.
End of reply part I
Reply part II
Andrew Dessler commented on the incident 4 years ago. The AEI's web page with their explanation and the copies of the letters they sent out was put up more than 3.5 years ago. Your book is copyright 2010. You had plenty of time to investigate the issue of bribery.
4. The "ExxonMobil-funded thinktank" part of the Guardian article clearly plays large for you. So, let's look here at other stuff that ExxonMobil funds. "Malaria No More" in Africa received $10M over three years, with the second $3.3M disbursed in 2009. So, "Malaria No More" is an ExxonMobil-funded program. Obviously untrustworthy.
ExxonMobil has also given $2M to the Accordia Foundation over the last two years. This group of prima facie untrustworthy activists is sponsoring the JUMP program in Uganda. That's the Joint Ugandan Malaria Training Program.
It should be obvious that "ExxonMobil-funded" is a propaganda-motivated code-word meant to defame, but one that actually carries no moral or ethical message. ExxonMobil turns out to fund lots of worthy projects. So, what we're left with is that "ExxonMobil-funded" is a pejorative only when the object -- the AEI in this case -- is already in politically (not factually) bad odor.
"ExxonMobil-funded" just offers an opportunistic fact, lending specious justification to a political prejudice. Ethical window-dressing on a political hovel.
The Climate Research Unit of the University of East Anglia, home of Phil Jones and a center of global warming prediction, is funded in part by British Petroleum and Shell. So, the CRU is 'BP/Shell-funded.' Are they, therefore, untrustworthy liars, too?
Why weren't you able to see through a luridly written article in the Guardian? Should you not have checked past that article before making defamatory allegations? You had plenty of time. But apparently you didn't bother.
"ExxonMobil-funded," and "close links to the Bush administration" were all the evidence necessary, apparently, for a judgment of guilty.
Really, Massimo, I don't know why you are belaboring this issue with a clearly irrelevant defense. The content of the Guardian article is not a defense. It doesn't mention bribes. It mentions giving money for work that would critically examine claims about climate. Claims that have since been discovered to have been prejudiced by an in-house IPCC tendentious incompetence, and by the ethical failings of scientists whose names are found in the FOIA file leaked from the CRU a year ago.
You were at best misled by the Guardian and negligent in your fact checking. You were ethically careless in making a defamatory allegation and portraying it as fact. Your allegation was offensive and offended me.
There's really no more I can write on this. It all seems patently obvious. This post, though more detailed, is mostly repetitious. So, I'll leave things here. Best wishes.
I've already told you what I think and why, but thanks for the additional explanation of your point of view.
I would be interested to hear why cognitive dissonance is stronger than our survival instincts. Several "controversies" involve scenarios that threaten our personal, and collective, survival: e.g., global warming and vaccination.
Although I appreciate the influence cognitive dissonance has, what would be more compelling than preventing death by vaccinating yourself and your loved ones?
I just finished reading the book and I have to admit I found it disappointing. It felt more like a self-help book than a science-popularizing book: one neatly packaged gospel made "manifest" by a billion contrived, redundant anecdotal examples, plus a few studies woven into the storyline. It certainly didn't improve my view of psychology. And the funny thing is that I can't really disagree with any of its points (well, basically there's only one point in the entire book).
I wonder what psychologists' definitions of explanation and theory are. How can you talk about cognitive dissonance increasing or decreasing when you can't measure it? How can you talk about dissonance theory predicting something when clearly it can't predict anything of importance? I wonder if psychologists actually have some hypothesis to confirm or disprove before performing an experiment (as opposed to after...). Can you write a book about screwdrivers without knowing what screwdrivers are supposed to do and without mentioning screws, furniture, machines, or other tools? Well, how can you write a book like this without mentioning other minds and the interests of the bodies they inhabit? At some point the authors note that one of the reasons we rationalize our decisions is so that we don't lose face. Near the end of the book they tell us not only that we don't achieve that, but that in order to do so we have to overcome our nature and stop rationalizing. Well, why then is our nature the way it is? Shouldn't they have included some words about why we do these things we do? Any explanation of the mind, or of some specific aspect of it, should include some connection with the real world and not just the functions of the mind itself. Minds don't float in a vacuum. People (sometimes) rationalize because some people buy the excuses, so they can avoid the consequences of their actions. Why do some people buy them and some don't? Isn't it funny that all the examples concern "mistakes" that are likely to be perceived as such by the people most likely to read the book? Why not include some controversial examples and let the readers decide for themselves? Well, of course we don't want to cause dissonance in the minds of the readers...
If anyone has any material that justifies why psychology should be considered a science, I would love to get my hands on it. Until then I will stick to physics :P
I don't mean to offend any psychologists or their particular field of study. I find issues like these very engaging and interesting, but I really can't see how the conclusions they reach are objectively true or useful somehow.
To clarify my comment: ... Which is not to say the scientific method cannot be applied in areas of geography etc. (some areas of geography are science domains). I'm just wondering if it has been applied inappropriately in some areas, just because it seems to work in other contexts, and if so what rationalizations were made.