“Just as terror, even in its pre-total, merely tyrannical form ruins all relationships between men, so the self-compulsion of ideological thinking ruins all relationships with reality. The preparation has succeeded when people have lost contact with their fellow men as well as the reality around them; for together with these contacts, men lose the capacity of both experience and thought. The ideal subject of totalitarian rule is not the convinced Nazi or the convinced Communist, but people for whom the distinction between fact and fiction (i.e., the reality of experience) and the distinction between true and false (i.e., the standards of thought) no longer exist.”
~ Hannah Arendt, The Origins of Totalitarianism
In this supposed post-fact age dominated by alt-facts, it has come to be questioned how much truth matters. This is hardly a new concern that arose simply because we now have a proud ignoramus as president, as Ron Suskind wrote years ago of Karl Rove:
“The aide said that guys like me were ‘in what we call the reality-based community,’ which he defined as people who ‘believe that solutions emerge from your judicious study of discernible reality.’ […] ‘That’s not the way the world really works anymore,’ he continued. ‘We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality—judiciously, as you will—we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors…and you, all of you, will be left to just study what we do’.”
When ignorance is cynically wielded as a weapon, what power can truth have? The problem of ignorance isn’t only about what we don’t know but about what we ignore, sometimes what we pretend not to know, sometimes even to ourselves, whether by way of dissociation or by way of welcoming any comforting lie. There are cognitive biases and failures that we are prone to as our shared human inheritance, but it has been claimed that some are more prone than others — as I’ve argued in the past (6 years ago):
Research shows that liberals are more willing to challenge authority and so lack the submissive attitude of unquestioning respect toward authority which is common among conservatives. For example, more liberals than conservatives state they’d be willing to slap their own father. ‘Openness’ is the main psychological trait that correlates to liberalism. What ‘openness’ is about is cognitive complexity, capacity for cognitive dissonance, intellectual curiosity, desire to experiment and explore, etc. But ‘openness’ also relates to being less inclined to fall into motivated reasoning (confirmation bias, backfire effect, etc)… on issues related to politics, anyway. I’ll point out the obvious fact that ‘openness’ can’t operate while submitting to authority. […]
Relatively speaking, liberals are more rational than conservatives when it comes to political issues (or so the research shows it to be the case in liberal democracies like the US). This is significant since the political issues that provoke the strongest motivated reasoning are always mired in moral issues, all of politics ultimately being inseparable from morality. In practical terms, this doesn’t necessarily mean liberals are more well informed for that has more to do with education and there are plenty of well educated conservatives; but what it does mean (as shown by research; read Mooney’s book for a helpful summary) is that liberals are less misinformed while conservatives are more misinformed. The odd part is that conservatives are more misinformed to the degree they are informed, what is described as the “smart idiot” effect. This also relates to how conservatives and experts (well educated conservatives fitting both categories) are most prone to the backfire effect which is when challenging info causes someone to become even stronger in their opinions.
Is that true? Does the evidence still support this assessment? That is what I’ll explore.
Let me be clear. One of my favorite hobbies is criticizing and complaining about liberals (e.g., Liberalism: Weaknesses & Failures) and increasingly left-wingers as well (e.g., Is there a balance point in a society of extremes?). I end up obsessing more about the political left than the political right and my conclusions are often far from kind, to such an extent that I’ve lost some liberal friends these past couple of years (even my sister-in-law, a good liberal and partisan Democrat, who likes me on a personal level, admitted that she blocked me on Facebook because of my political views). I personally know liberalism as someone who is a liberal, having been raised in a liberal church and having spent most of my life in a liberal town. But when I speak of conservatism, I also do so from a personal perspective, having been raised by conservative parents and having spent much of my life in conservative places (even this liberal town is in a rural farm state that is conservative in many ways, the state government presently controlled by right-wing Republicans).
My picking on conservatism isn’t separate from my picking on liberals. One of the main irritations about liberals is how easily, under conditions of stress and cognitive overload, they begin thinking and acting like conservatives. Under those conditions, liberals will share the same tendencies and biases as conservatives. The difference is that it requires pushing liberals out of their preferred mindset to get this response from them. What interests me more are the conditions that create and change ideological mindsets — that isn’t exactly my focus here, but it relates.
My own view is more in line with Chris Mooney, as opposed to Jonathan Haidt (I should point out that when I first read about Haidt’s research many years ago I found it quite compelling or at least interesting, but I later changed my mind as I read his book and analyzed his arguments and data more closely). Some see these two thinkers as making the same basic argument. It’s true that they rarely disagree about much (at least, not strongly when the two dialogue in person), and Mooney goes so far as to praise Haidt while sometimes dismissing apparent differences. I understand how their arguments resonate, as they both started from a liberal position and from there sought to understand the American ideological divide. They share a common goal, to improve understanding and communication. Still, I sense something fundamentally different not just about their views but how they approach and hold those views. Their ultimate conclusions diverge greatly, Mooney leaning to the left and Haidt leaning over backwards toward the right. As I see it, much of what Haidt says is way off the mark. And for this reason, he is an example of the kind of public intellectual that confuses and annoys me, despite his amiable personality and liberal-minded good intentions. Mooney, though also a fairly standard liberal, has a way of being more direct and so can seem more honest, calling a spade a spade (The Republican Brain, Kindle Locations 2075-2079):
“You will probably have noted by now that the moral intuition research of Haidt and Ditto is not fully separate from the [cognitive] research covered in the last chapter. It overlaps. For instance, take conservatives’ greater respect for authority, and their stronger loyalty to the in-group, the tribe, the team. Respect for authority, at its extreme, is hard to distinguish from authoritarianism. And viewing the world with a strong distinction between the in-group and the out-group clearly relates to having lower integrative complexity and less tolerance of difference (although it can also, on a more positive note, mean showing loyalty and allegiance to one’s friends, and more patriotism).”
As I compared the two elsewhere:
So, Haidt’s view of intuition being greater than reasoning has some truth to it while also containing much speculation. We know that all people are predisposed to motivated reasoning. Yes, such bias can manifest as post hoc rationalizations of our intuited moral values. What Haidt ignores or doesn’t fully acknowledge, intentionally or not, is that not all people are equally predisposed to motivated reasoning in all types of situations. Mooney’s book presents a logical argument based on damning evidence about how conservatives are more predisposed to motivated reasoning when it comes to political issues, and it is clear that political issues are inseparable from moral issues in these cases of motivated reasoning.
A major example of motivated reasoning is the backfire effect. It has been well researched at this point. And the research shows it to be complex and context-dependent, as is presumably true of any cognitive bias. One early finding was that two oddly paired groups were most prone to the backfire effect, conservatives and the highly educated, with highly educated conservatives being the worst (I’ll further discuss this finding below).
What can we make of this? As always, it depends. It’s not that conservatives are inherently anti-truth and anti-fact, anti-intellectual and anti-science. If you go back almost half a century, conservatives actually had slightly greater trust in science than liberals at the time, the two having switched places over time (the same was true of average IQ, which was higher among Republicans under Reagan but has since been higher among Democrats, though that intriguing piece of data is straying too far afield).
Why is that? Why did this change occur? There might be a simple explanation for it. During the Cold War, scientists were highly respected and science heavily funded by government in the fight against communism. For conservatives, the Cold War was all about an ideological war and a defense of the American Way. A major form that took was technological competition between the two global superpowers, a space race and a nuclear arms race. Science was a tool of ideology and the ideology in question was in line with an authoritarian vision of establishment power and a socially conservative vision of a status quo social order (an era during which perceived leftist radicals and deviants were the victims of big gov and big biz oppression, targeted by witch-hunts, blackballing, COINTELPRO, etc). Government funding of science and technology was often directly linked to the military (e.g., R&D that created an early version of the internet as a communication system that would survive a military attack), and hence proof and expression of American greatness as part of the Whiggish view of White Man’s Burden and Manifest Destiny. Liberal values were also useful in the fight against communism and, unsurprisingly, during the early Cold War even conservatives like Ike and Nixon would publicly praise liberalism.
Humans in general are swayed by consensus views as an indicator of social norms. But conservatives are particularly motivated, as consensus among authority figures can be useful for conformity within and enforcement of the social order. In the anti-communist mindset back then, science and liberalism were part of the status quo of idealized American greatness as embodied in the American Dream (industrialized technology being commodified and experienced through a growing middle class of citizen-consumers; e.g., “Better living through chemistry”), what supposedly differentiated us from the backward authoritarianism of the Soviet regime (the ‘progressive’ authoritarianism of neocon corporatism is so much better!).
As the USSR weakened and eventually the Cold War ended, that consensus was broken and there was no longer a foreign authoritarian power posing a real threat. Liberalism and science no longer served any ideological purpose for the conservative agenda. So, to the conservative mind, liberalism once again became the enemy and so scientists were treated as liberal elites to be opposed (of course, excluding all of the scientists working for corporations and right-wing think tanks, as the big money of capitalism washes away their sins of intellectual pride; and also conveniently ignoring the sizable proportion of scientists along with engineering and tech field professors in universities who are on the political right).
When the US lost its only major global competitor with the collapse of the Soviet Union, consensus seemed irrelevant. America ruled the world and the Cold War had pushed conservatives into power. Conservatives didn’t need to make any concessions or compromises with the ideological opposition, as decades of persecution had broken the back of the political left. Conservatives no longer felt a need to justify themselves or look for allies. But that is changing now that the American star is on the decline and new global competitors are taking the stage. We have the opportunity to put pressure back on the political right, for they are vulnerable right now to persuasion by anyone who will take advantage of it.
This brings me back to some of the research on the backfire effect. This pressure seems to work. In Cosmos Magazine, Jeff Glorfeld offers a happy thought: “The added negative effect of conservatism plus high education was completely neutralised through exposure to the fact of scientific agreement around man-made climate change.” Consensus prevails! What this means is that defeating the backfire effect requires pulling out the big guns. Repeat, repeat, repeat the facts of consensus. Don’t be shy about it!
More generally, I must admit that the backfire effect research doesn’t allow for simple conclusions. Some of it even seems contradictory, but I suspect this is because of the multiple factors (many of them confounding) involved. There is no single population and single set of conditions and so it’s unsurprising that various studies using different subjects from different backgrounds would come to different results (and we aren’t even talking about the even larger biases and problems of this kind of WEIRD research). Some of what we presently think we know about backfire effect and similar motivated reasoning might turn out to be wrong, misinterpreted, or more nuanced.
Let me give an example. Related to the above discussion about consensus, previous research wasn’t replicated by recent research (see: Wood & Porter’s published The Elusive Backfire Effect; Guess & Coppock’s unpublished The Exception, Not the Rule?). It indicates the backfire effect might not be so strong and common after all (not that the original researchers ever claimed it was ubiquitous and, showing no backfire effect of their own, the original researchers have supported the publishing of this new data). Also, the new evidence shows no ideological disparity; if anything, it demonstrates that moderates are the least prone to it (are we to assume moderates are the least ideologically dogmatic in the partisan sense, or are they simply the most apathetic, with fewer ideological commitments because of intellectual laziness, thoughtlessness, or whatever?). Does this disprove the prior research? Flynn, Nyhan, and Reifler responded with some commentary.
Whatever it might or might not mean, I wouldn’t take too much comfort from it. Even though “[t]his finding is contested by other research that finds limited evidence that corrective information contributes to such a ‘backfire effect,'” write Jennifer Kavanagh and Michael D. Rich (Truth Decay, p. 83), “even this research suggests that altering preexisting beliefs can be difficult.” One of the authors of the published work, Ethan Porter, admits that, “What our work shows is that people do accept new information, but we have no evidence that this then affects their downstream policy attitudes.”
This latter suspicion was confirmed, at least among certain people. The original researchers collaborated with the challenging researchers. They again couldn’t find a backfire effect, which seems to put the original research into doubt, although it is a bit early to come to strong conclusions. What they did find was maybe even more disheartening, as written about in a Vox piece by Brian Resnick — that “facts make an impression. They just don’t matter for our decision-making, which is a conclusion that’s abundant in psychology science.” And this is specifically relevant for the present: “there’s still a big problem: Trump supporters know their candidate lies, but that doesn’t change how they feel about him. Which prompts a scary thought: Is this just a Trump phenomenon? Or can any charismatic politician get away with being called out on lies?” It still doesn’t disprove the backfire effect, since it’s possible that they had already backfired as far back as they could go at this point: “Many of his supporters may have already come to terms with his record of misstatements by the time this study was conducted.” Further research will be required.
If we take this latest research as is, it would simply justify the view that the backfire effect is the least of our worries. The backfire effect can only occur after facts are shown to someone and they actually look at them. But how often do political debates even get to the point where facts get exchanged, much less acknowledged?
“At least it’s nice to know that facts do make an impression, right? On the other hand, we tend to avoid confronting facts that run hostile to our political allegiances. Getting partisans to confront facts might be easy in the context of an online experiment. It’s much harder to do in the real world.”
* * *
Let me make a note. Ideological mindsets are as much social constructs as are races. They are part of a particular social order and cultural worldview. Conservatives and liberals didn’t exist until the Enlightenment. Any such labels are one of many possible ways of grouping diverse potentials and tendencies within human nature.
That might explain why, as research shows (in the American population at least), there is an overlap between conservatism and authoritarianism. But that is just another way of saying all authoritarians, left and right, are socially conservative (the reason why it is sometimes referred to as right-wing authoritarianism, as there is no such thing as socially liberal authoritarianism) — whereas fiscal conservatism has no known positive or negative correlation to authoritarianism (so-called fiscal conservatism simply being an old form of liberalism, i.e., classical liberalism). So, this is the reason authoritarians are mostly found on the political right in countries like the United States and on the political left in countries like Russia (left and liberal not being the same thing, as always depending on what specific ideologies we are talking about).
It depends on context, on definition and perception. There is no singular ‘conservatism’, for it’s just a general way of speaking about overlapping patterns of ideology, culture, personality, and neurology. The overlap of social conservatism and fiscal conservatism in contemporary American thought might be more of a fluke of historical conditions. Russell Kirk, the godfather of modern American conservatism, actually thought the two were fundamentally incompatible.
* * *
Why the Right Wing’s War on Facts Is Driving the Divide in America
by Sophia A. McClennen
A recent study by the Duke Reporters’ Lab shows that, in addition to a partisan difference in the frequency of lying, there is a partisan division over the very idea of fact-checking itself.
The researchers logged 792 statements mentioning fact-checkers and coded them as positive, negative or neutral. While a majority of citations (68 percent) were neutral, they found a dramatic divide in the source of negative comments. The study noted 71 accusations of bias against fact-checkers. Conservative websites were responsible for 97 percent of them.
The study shows that conservative sites take a hostile, negative attitude toward the practice of fact-checking. In some cases the tone is hardly subtle. In one example, they cite Jonah Goldberg of National Review Online, who noted that Hillary Clinton’s record with the truth was far from spotless. “Even PolitiFact, the hackiest and most biased of the fact-checking outfits, which bends over like a Bangkok hooker to defend Democrats, has a long list of her more recent lies.”
Goldberg seems pleased that Politifact has a list of Clinton’s lies, but at the same time he feels compelled to denigrate the fact-checking operation that produced the list. In fact, the Duke study shows that even when conservative sites are happy to reference fact-checks that bolster their ideological perspective, they often still find a way to denigrate their sources.
How Campaign Messages Are Received and Processed
by David Helfert
Left Brain, Right Brain
Other neurological studies seem congruent with Westen’s findings. In the 1980s, pop psychology began to describe people as either left or right brained and suggested that the characteristic determined whether they tended to be more artistic, sensitive, thoughtful, creative, emotional, or analytical, depending on which lobes of the brain dominated their thought processing and behavior. The theory that everyone is either one or the other has been roundly disputed in recent years. Now, however, it appears there may be something to the basic idea after all, and that the unique characteristics of the left and right lobes of the brain may have consequences in political communication.
Journalist and author Chris Mooney has written extensively on how different kinds of political messages are received and processed by different people. Mooney has built on Westen’s research about neurological differences in processing varying kinds of messages. In his 2012 book The Republican Brain: The Science of Why They Deny Science—and Reality, he points to research that finds the predisposition to process stimuli in one lobe of the brain or the other is due to an actual physical difference in the size of the respective lobes.
Some people, says Mooney, actually have a larger right brain lobe, including the limbic system, which supports emotion, behavior, motivation, and long-term memory. Other people, he says, have a larger left brain lobe and tend to process most information through their prefrontal cortex, the lobes that help in reasoning and logical processing.
Mooney suggests that this neurological difference can reflect political tendencies. In The Republican Brain, Mooney describes “a recent magnetic resonance imaging (MRI) study of 90 University College of London students that found on average, political conservatives actually had a larger right lobe, including the amygdalae, while political liberals had more gray matter in the anterior cingulate cortex (ACC),” part of the brain’s frontal lobe, with many links to the prefrontal cortex.
This seems consistent with studies conducted in 2013 by Darren Schreiber, a researcher in neuropolitics at the University of Exeter in the UK, and colleagues at the University of California. Their research was described in “Red Brain, Blue Brain: Evaluative Processes Differ in Democrats and Republicans” in the international online journal PLOS ONE.
The study used data from a previous experiment in which a group of people were asked to play a simple gambling task. Schreiber’s team took the brain activity measurements of eighty-two people and cross-referenced them with the participants’ publicly available political party registration data. They found that Republicans tended to use their right amygdala, the part of the brain associated with the body’s fight-or-flight system, when making risk-taking decisions; Democrats tended to show greater activity in their left insula, an area associated with self and social awareness.
Schreiber claims the insula/amygdala brain function model offers an 82.9 percent accuracy rate in predicting whether a person is a Democrat or Republican. In comparison, the longstanding model using the party affiliation of parents to predict a child’s affiliation is accurate about 69.5 percent of the time. Another model based on the differences in brain structure distinguishes liberals from conservatives with 71.6 percent accuracy.
Mooney cites other academic research findings indicating that people whose limbic system is more involved in processing information are less likely to change their minds. Once they have arrived at a position on an issue that is congruent with their belief system and values, they are unlikely to change their minds even when presented with irrefutable evidence to support a different viewpoint. They will actually reject or discount facts or attempt to discredit the source of facts that conflict with their position.
Motivated Reasoning
A series of related behavioral concepts could shed light on why different people seem to react differently to various political messages. One of the best known concepts is motivated reasoning, which is based on research findings, such as that described by Mooney, that some people tend to process most information through the prefrontal cortex of their brains while others tend to receive and process information through the limbic system.
Other research has found that subjects who tend to process information through the prefrontal lobes of the brain tend to be more open to new information, and to be more politically liberal. Those subjects who tend to process information through the emotion-centers in the brain tend to be more politically conservative.
How Warnings About False Claims Become Recommendations
by Skurnik, Yoon, Park, & Schwarz
Telling people that a consumer claim is false can make them misremember it as true. In two experiments older adults were especially susceptible to this “illusion of truth” effect. Repeatedly identifying a claim as false helped older adults remember it as false in the short term, but paradoxically made them more likely to remember it as true after a three-day delay. This unintended effect of repetition comes from increased familiarity with the claim itself, but decreased recollection of the claim’s original context. Findings provide insight into susceptibility over time to memory distortions and exploitation via repetition of claims in media and advertising.
Misinformation lingers in memory: Failure of three pro-vaccination strategies
by Pluviano, Watt, & Sala
People’s inability to update their memories in light of corrective information may have important public health consequences, as in the case of vaccination choice. In the present study, we compare three potentially effective strategies in vaccine promotion: one contrasting myths vs. facts, one employing fact and icon boxes, and one showing images of non-vaccinated sick children. Beliefs in the autism/vaccines link and in vaccines side effects, along with intention to vaccinate a future child, were evaluated both immediately after the correction intervention and after a 7-day delay to reveal possible backfire effects. Results show that existing strategies to correct vaccine misinformation are ineffective and often backfire, resulting in the unintended opposite effect, reinforcing ill-founded beliefs about vaccination and reducing intentions to vaccinate.
Sometimes busting myths can backfire
by Bethany Brookshire
But bursting mythical bubbles can also backfire. The first problem is that people are easily persuaded by things they hear more often. “The mere repetition of a myth leads people to believe it to be more true,” notes Christina Peter, a communication scientist at the Ludwig Maximilian University of Munich.
And unfortunately, our brains don’t remember myths in a very helpful way. “There’s a lot of research that tells us people have a hard time remembering negations,” says Stephan Lewandowsky, a cognitive scientist at the University of Bristol in England. We remember myths not as myths, but rather as statements that are additionally tagged as “false.” So instead of remembering “cheese is nothing like crack,” our brains remember “cheese is like crack (false).” As our memories fade, the qualifier on the statement may fade too, leaving us with the false idea that brie really is the next cocaine.
Peter and her colleague Thomas Koch decided to find out how best to combat this backfire effect — our tendency to misremember myths as fact — when confronted with scientific information. They recruited 335 volunteers and asked them to read three newspaper articles. The first and last were decoys. The important one was in the middle, and concerned a new in-home bowel cancer test. The article included eight statements about the new test, with each immediately identified as fact or myth, and with an explanation of why the items were true or false.
The scientists also asked the participants to focus on different things. They asked one group to form an opinion about the articles as they read them. They asked another just to study the language.
After all the groups were done reading, Peter and Koch presented them with the eight statements from the bowel test article, and asked them whether they were true or false. Then the scientists asked the participants those questions again after five days to test what they retained.
Readers who focused just on the language of the articles suffered from the backfire effect. They were more likely to remember false statements as true than to remember true statements as false. This backfire effect got stronger when they saw the statements again five days later, and it influenced what they thought of the bowel test. The articles described the test in a slightly negative light. But if people remembered more of the myths as facts, they ended up with a positive view of the test. Oops.
But the backfire effect changed if participants formed an opinion as they read. Participants who were making up their minds on the fly made errors half as often as those who were reading only for language.
Peter says the results suggest that when presenting readers with new information, “try to avoid repeating false information,” since that may be what remains in people’s minds. And in some situations, Peter says, asking readers for their opinion or getting them to form an opinion as they read might help them distinguish between what is truth and what is myth. Peter and Koch published their results in the January Science Communication.
Backfire Effect Not Significant
by Steven Novella
For me there are two main limitations of this study – the first is that it is difficult to extrapolate from the artificial setting of a psychological study to an emotional discussion around the dinner table (or in the comments to a blog). It seems likely that people are much more willing to be reasonable in the former setting.
Second, we have no idea how persistent the correction effect is. People may immediately correct their belief, but then quickly forget the new information that runs counter to their narrative. That would be consistent with my personal experience, at least some of the time. It seems I can correct someone’s false information, with objective references, but then a month later they repeat their original claim as if the prior conversation never happened. I would love to see some long term follow up to these studies.
So if people do not respond to ideologically inconvenient facts by forming counterarguments and moving away from them (again – that is the backfire effect) then what do they do? The authors discuss a competing hypothesis, that people are fundamentally intellectually lazy. In fact, forming counterarguments is a lot of mental work that people will tend to avoid. It is much easier to just ignore the new facts.
Further there is evidence that to some extent people not only ignore facts, they may think that facts are not important. They may conclude that the specific fact they are being presented is not relevant to their ideological belief. Or they may believe that facts in general are not important.
What that generally means is that they dismiss facts as being biased and subjective. You have your facts, but I have my facts, and everyone is entitled to their opinion – meaning they get to choose which facts to believe.
Of course all of this is exacerbated by the echo chamber effect. People overwhelmingly seek out sources of information that are in line with their ideology.
I think it is very important to recognize that the backfire effect is a small or perhaps even nonexistent phenomenon. The problem with belief in the backfire effect is that it portrays people as hopelessly biased, and suggests that attempts at educating people or changing their minds are fruitless. It suggests that the problem of incorrect beliefs is an unfixable inherent problem with human psychology.
Mick West says:
January 4, 2018 at 11:52 am
The primary problem with this study is that it is only measuring the IMMEDIATE effect of corrections. As they say in the final sentence of the discussion, there’s little backfire effect to correcting ideologically biased misinformation “at least for a brief moment”. It tells us nothing about what might happen weeks or months later. In fact the design of the study seems more like a reading comprehension test than about measuring changes in belief.
I’d recommend people have a look at the overview of backfire effects in The Debunking Handbook by Cook & Lewandowsky (free online). They identify three types: Familiarity Backfire, Overkill Backfire, and Worldview Backfire. Worldview backfire (which the Wood & Porter study measures) is more manifest as a disconfirmation bias, something which Wood and Porter dismiss, but don’t measure – not because people are too lazy to come up with alternative explanations, but because the immediate nature of the study does not allow the participants time for any mental gymnastics. The other two forms of backfire are likewise things that happen over time.
So I’d not put too large an asterisk on the backfire effect just yet.
B.S. says:
January 4, 2018 at 2:35 pm
I think that the backfire effect is most likely an emotional response. I’m reading “Crucial Conversations” right now and this book describes emotional responses to uncomfortable conversations: attacking someone who disagrees with you (perceived as an adversary) and defending yourself without thinking are a huge portion of this book. This model seems to fit both anecdotal observations of the backfire effect and this new research.
The Mechanical Turk questions appear to be emotionless and have no cues from an opponent with an opposing view. The corrections were all “neutral data from [cited] governmental sources.” I’d bet that changing the factual correction to “No it isn’t you asshole! President Obama has deported illegal immigrants at twice the rate of Bush!” (note no source cited, because we rarely remember them in conversations) would elicit some sort of backfire effect that would likely be even larger if delivered emotionally and in person by an “adversary”. Maybe this all means that the key to eliminating any backfire effect is removing emotion from your response and accurately citing neutral sources. Maybe this means that dispassionate real-time fact checking of politicians could actually make a difference. Regardless, this is an interesting addition to the literature and conversation. It restores some of my hope.
NiroZ says:
January 4, 2018 at 11:37 pm
I’d wager that the reason for this would be in line with the research on motivational interviewing (a therapy technique) as well as the research around stigma, shame and vulnerability. Basically, when people make arguments that appear to be part of the ‘backfire’ effect, they’re actually responding to the feeling of being cornered, the loss of control and power in being found incorrect, and the possible sense of alienation they feel about identifying with an ‘incorrect’ belief. If this is correct, it’s likely that these people would, under the right circumstances / to people they feel safe with, admit that X belief is wrong, but that they need to adhere to it for other reasons (to belong in a group, to annoy someone they dislike, to avoid losing face).
Nidwin says:
January 5, 2018 at 3:41 am
From my experience the backfire effect kicks in when folks can’t say “woops, was I wrong on that one”.
Folks only change their minds as long as the subject doesn’t breach their little personal cocoon. And even then it’s often FIFO (first in, first out).
At this point, I’d say the media itself is probably the biggest “post-fact” source. It is basically the Pravda machine for the very rich.
Its job is to sell neoliberalism and a neoconservative foreign policy to the general public.
That goes to another level of the issue. Psychology can help explain the process of deception, delusion, and disinformation. But it doesn’t necessarily explain what is causing it. We have to move up the food chain to find the source of the problem.