The Backfire Effect
There is an interesting article, “The Backfire Effect” by David McRaney, cross-posted on OpEdNews with the subtitle: Why Showing People the Truth Sometimes Makes Them Believe BS Even More. It is interesting not just for the article itself, but for the reactions in the comments that follow.
It starts out well enough, citing a recent study previously mentioned in this blog, conveniently summarized:
The Misconception: When your beliefs are challenged with facts, you alter your opinions and incorporate the new information into your thinking.
The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.
In 2006, Brendan Nyhan and Jason Reifler at The University of Michigan and Georgia State University created fake newspaper articles about polarizing political issues. The articles were written in a way which would confirm a widespread misconception about certain ideas in American politics. As soon as a person read a fake article, researchers then handed over a true article which corrected the first. For instance, one article suggested the United States found weapons of mass destruction in Iraq. The next said the U.S. never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second. Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second. These reactions shouldn’t surprise you. What should give you pause though is how conservatives felt about the correction. After reading that there were no WMDs, they reported being even more certain than before there actually were WMDs and their original beliefs were correct.
Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper.
Understanding the nature of this backfire effect is critical for somehow dealing with it in a way that results in more people waking up.
But one thing that is also critical is being able to discern when you yourself are engaging in the same backfire effect, falling back on your beliefs because believing otherwise is just too difficult. Here is a good example to start with:
This is why hardcore doubters who believe Barack Obama was not born in the United States will never be satisfied with any amount of evidence put forth suggesting otherwise. When the Obama administration released his long-form birth certificate in April of 2011, the reaction from birthers was as the backfire effect predicts.
I’m willing to bet that most readers will side with the official story on this, and discount any amount of evidence to the contrary as mere wacko conspiracy theory. But hold on there a second. Whose evidence counts more? Or more apropos, since it is not the quantity of evidence but its quality that should weigh most heavily, how do we decide what evidence qualifies? Have YOU looked at the evidence? We can’t all spend our entire lives digging into everyone’s latest theory, so we have to rely on others to do that for us. But whom do you trust to verify that the evidence is accurate and consistent?
For myself, I would have to honestly admit that I don’t know about Obama’s birth certificate. (But I am not bothered by not knowing, mostly because I don’t think it is particularly important. I mean, what difference does it really make? It’s a mere technicality at best.) I’ve actually looked into it a bit, watching one video that claims the long-form birth certificate was obviously a forgery because the document retained the history of editing changes, and you can see how they patched the date, etc. Well, I ask myself, how do we know this claim about the document is not the real forgery? And I don’t know, though I could go further and dig up the sources, corroborating evidence, etc. But then I remind myself, what difference does it really make? I’ve got better things to do with my time.
But here is the key thing to observe. Unless YOU have looked into the evidence yourself to verify that it is true, you really should admit that you are falling back on the trust of others and your trustworthy beliefs. Something that is stated as a fact, even by someone you have trusted, is not necessarily a fact. People lie. THAT is a fact!
Hence there is a little problem in deciding what is really evidence of the backfire effect. Is the victim of the backfire effect the one who counters your “facts” or is it YOU, by assuming that your “facts” must be true?
Indeed, the author of this article states as much:
What should be evident from the studies on the backfire effect is you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making the opponent feel as though they are even more sure of their position than before you started the debate. As they match your fervor, the same thing happens in your skull. The backfire effect pushes both of you deeper into your original beliefs.
I’m a bit happier with this neutral position, which takes no sides about whether you or your opponent is right. But are we all just fools, forever awash in the bliss of ignorance, or fulminating uselessly against each other?
Backfire or Forward Progress?
What this article lacks is any way out of this dilemma. How are we to decide who is really right? Not to mention, how do we avoid assuming we are right? It is almost as if the author believes there is no hope. The title of the associated book, “You Are Not So Smart”, and the blog’s tagline, “A Celebration of Self Delusion”, say it quite plainly. The page about the book says “After reading You Are Not So Smart you’ll see you’re just as deluded as the rest of us, and that’s just fine because it keeps you sane.” So, we should be happy about this?
But there is another way. David Chandler, a physics teacher, added one of the best comments on the OpEdNews posting.
The problem with this article is it leaves the problem of determining the truth completely up in the air. Certain positions are declared, a priori, to be the truth and others error, or misconception, or conspiracy theory. Not very useful. This article is essentially self referential because apart from anything substantive, its assertions will tend to bolster the beliefs of readers.
And a very reasonable question by Jonathan Allen:
Backfire Effect versus scientific training
Do we all suffer from the “backfire effect” to the same extent? Does education make a difference?
More specifically, has anyone determined whether the “backfire effect” is any weaker or stronger among test subjects who are professional scientists or at least have scientific training? I am wondering whether such training and practice can overcome what the article implies is an innate weakness of the human mind.
which led to my indirect response:
Rational thinking should override beliefs
If anyone can overcome this backfire effect, it would be those educated in rational scientific thinking, where the basis for truth is what we should all agree on, careful measurements and observations of the world, correct use of logic and statistics.
This is not to say that scientists are always right, and they will admit they are not always right. But to claim that scientific thinking is the wrong way to determine what is true – that cannot be rational.
A few comments gave the contrary view, that science is not the way, and that scientists are chief among the deluded. Dante DeNavarre replied to my comment with:
All the rationality, all the rigor, all the certainty wrought by the scientific method is dependent on repeatable experimental results. Absent repeatability, scientists are no more open minded than literal creationists. They should have an advantage due to their knowledge and practice of critical thought, but they become so invested in certainty of the latest truth that they attack new ideas that challenge the edifice they wrote textbooks about. If they don’t, they lose their paycheck.
I take that as a back-handed attack on science, though he is right that scientists can also be invested in what they have come to believe. What he misses is that while individual scientists, being human, may be flawed, science as a whole moves on and self-corrects, eventually.
9/11 Truth vs Denial vs Pseudo-truths and Other Lies
The first comment by David Watts pushed a lot of the discussion in the direction of taking on the official story about 9/11. I’ll quote the whole thing because of his very relevant quotes of others:
Perfect example

“The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.”
9/11 is a perfect example. Despite the mountain of contradictory evidence of the official story, most people’s belief that what they are being told by the authorities and media becomes even stronger. If people were able to assimilate the contradictory evidence, they would understand that 9/11 was a false flag operation.
“The great masses of the people… will more easily fall victims to a big lie than to a small one.” — Adolf Hitler, Mein Kampf
“The individual is handicapped by coming face to face with a conspiracy so monstrous he cannot believe it exists.” — J. Edgar Hoover, former FBI director
“Only the small secrets need to be protected. The big ones are kept secret by public incredulity.” — Marshall McLuhan, Canadian philosopher and communication theorist
What is interesting about the comments that follow is the disproportionately strong support by about four people for pseudo-science nonsense about vaporized steel and energy pulse weapons, all supporting each other and pretending to be more scientific while being critical of the official story. All nonsense, I assure you, but not because I am falling back on beliefs. I have looked into this enough to know that they have no rational basis for their beliefs. It’s all pretense. And as such, it constitutes a subversive attack on rationality, and on 9/11 truth. But don’t believe me either – please look into it yourself.
Why Fire Back?
I’ll end with an interesting observation, from the blog, about how we react to contrary information.
Information which lines up with what you already believe passes through the mind like a vapor, but when you come across something which threatens your beliefs, something which conflicts with your preconceived notions of how the world works, you seize up and take notice.
So by taking notice of contrary beliefs, and then reaffirming our original beliefs, we go out of our way merely to return to normal? Why bother? Why not just ignore the contrary beliefs? In fact, I suspect we do that too, particularly when we believe the contrary beliefs don’t matter much (e.g., Obama’s birth certificate). When we do take notice, I think we are not merely defending our beliefs but first questioning, however briefly, whether they are true. By listening to the challenge, we are hypothesizing, asking ourselves “what if I am wrong?” And the reaction is, often enough, “no, that is simply impossible to believe, so I must be right.” Even though we may briefly question our beliefs, without sufficient reason to change them we feel safe reaffirming, “I am right, and you are wrong.”
So ironically enough, I suspect this act of denial in the backfire effect has an upside, the subject of a future blog.