Backfire Effect – Cognitive Biases (Pt.7)
The backfire effect is the tendency to reject evidence, facts or information that goes against beliefs we are attached to. Normally we give ourselves and others more credit than we deserve, assuming that our perceptions of reality will change when presented with more accurate or new information. But often that isn’t the case.
The information we first absorb and use has an anchoring effect on our worldview. Even when it is later exposed as a misperception or misinformation, the discrediting is treated as irrelevant; the original belief only grows stronger and becomes more deeply entrenched.
This is how things function automatically. We are aware of neither our biased tendencies nor the unconscious formation of our beliefs, let alone how they compare to the truth of reality. Admitting we are wrong is a threat to our worldview, forcing us to correct a now fractured perspective. We prefer to remain attached to how we currently accept and view the things around us. Not only do we prefer to remain attached, but we attach ourselves even more tightly when we feel threatened by an attack on information we hold dear.
We like consistency in life. It’s a natural part of our psychology. Contradictions between our perception of reality and actual objective reality produce cognitive dissonance, a negative emotional state we are driven to alleviate. This is where the backfire effect comes in: the dissonance gets released, but the underlying false perception is left unresolved.
Feeling negative is not desired by consciousness as a whole. We generally like to believe whatever we want to believe. We like to believe we are right even when we are wrong, since we tend not to actually care about truth and what is right, preferring attachment to ourselves and our current information. This is part of the feel-good pleasure trap in consciousness. But if we feel good about ourselves at the time, and are not threatened into correcting ourselves or admitting we are wrong, then we are more likely to be willing to accept new information.
This ties into the manipulation of people through emotional control. The more you put people into conflicted, agitated, controversial, tense and frictional emotional states, the less likely they are to listen to each other, and the more easily they can be controlled through their polarized, myopic perspectives. Most, if not all, physical conflicts come from emotional friction.
Misinformation that supports our beliefs and false perceptions reinforces our sense of being right, of having a handle on things. This attachment leads our beliefs to dictate which facts we later accept, a form of motivated reasoning. Bad information is more easily accepted if it fits into those beliefs. Confirmation bias reinforces our sense of being right even more, along with the likelihood that we will continue to reject new information.
Therefore, those whose beliefs most need correcting are the least likely to correct them. Couple this with the Dunning-Kruger effect (to be covered later), and it becomes easier to understand why people have a hard time altering their socially engineered, conditioned worldviews. People who know less think they know more, and they stick with this self-assessment even when contradictory information is presented. There is a difference between being uninformed and being misinformed, but not when it comes to being wrong yet thinking we are “right”.
To handle all the information the brain processes, we rely on cognitive shortcuts like intuition and inference, but also on cognitive biases. They make things go more quickly; without these shortcuts, not much would get done. But that’s no excuse not to improve the quality of our thinking or to correct ourselves.
The more self-aware we become through self-knowledge, through knowing ourselves, the more we can overcome these biologically automated, unconscious and subconscious conditions and programmed modalities of being. Knowing reality, existence and truth is important in life. Knowing ourselves and how we function allows us to more honestly know the world around us.