Confirmation bias
Confirmation bias, also called myside bias, is the tendency to search for or interpret information in a way that confirms one's beliefs or hypotheses. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).
It's related to a phenomenon called motivated reasoning:
The processes of motivated reasoning are a type of inferred justification strategy which is used to mitigate cognitive dissonance. When people form and cling to false beliefs despite overwhelming evidence, the phenomenon is labeled "motivated reasoning". In other words, "rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe."[2] This is "a form of implicit emotion regulation in which the brain converges on judgments that minimize negative and maximize positive affect states associated with threat to or attainment of motives."
Both are psychological strategies used to cope with cognitive dissonance:
"the mental stress or discomfort experienced by an individual who holds two or more contradictory beliefs, ideas, or values at the same time, or is confronted by new information that conflicts with existing beliefs, ideas, or values."
Lest you think I'm dumping on you personally here, this applies to everyone on the planet. These strategies appear to be an intrinsic part of human nature, probably the result of a long series of psychological adaptations stretching back through tens or hundreds of thousands of years of evolution. Consequently, I am as prone to them as anyone else.
The best we can do to combat their blinkering effects is, first, to recognize their operation in the views we form; then to expose ourselves deliberately to alternative and contradictory viewpoints; and above all, to practice accepting the mental discomfort of the resulting cognitive dissonance. Which explains why you're not on my Ignore list.