
HuckleB

(35,773 posts)
Mon Apr 18, 2016, 12:05 PM

Know This First: Risk Perception Is Always Irrational.

http://undark.org/article/know-this-first-risk-perception-is-always-irrational/

"... (Examples of risk perception mistakes.)

For anyone outside the emotions that produced these choices, it’s hard not to feel frustration at hearing about them. It’s hard not to call them ignorant, selfish, and irrational, or to label such behavior, as some do — often with more than a hint of derision — “science denialism.” It’s hard, but it’s necessary, because treating such decision-making as merely flawed thinking that can be rectified with cold hard reason flies in the face of compelling evidence to the contrary.

In fact, the evidence is clear that we sometimes can’t help making such mistakes. Our perceptions, of risk or anything else, are products of cognitive processes that operate outside our conscious control — running facts through the filters of our feelings and producing subjective judgments that disregard the evidence. The behavioral scientists Melissa Finucane and Paul Slovic call this the Affect Heuristic; it gives rise to what I call the risk perception gap, the dangers produced when we worry more than the evidence says we need to, or less than the evidence says we should. This is literally built in to the wiring and chemistry of the brain. Our apparent irrationality is as innate as the functioning of our DNA or our cells.

...

The evidence from decades of research in a range of fields is convincing. If the definition of “rational” is “thinking based on facts or reason and not on emotions or feelings” (Merriam-Webster), then humans are decidedly not rational. The truth is closer to what Charles Darwin observed as he stood at a puff adder exhibit in the London Zoo, nose pressed against the glass, knowing he was safe but unable to keep from flinching whenever the snake struck: “My will and reason were powerless against the imagination of a danger which had never been experienced.”

Somehow, though, we continue to deny the evidence and cling to our anthropocentric faith in human intellectual power and the myth of our ability to use dispassionate, objective analysis to know “the truth.” In that belief, we sniff with superiority at those whose perceptions of risk don’t match the facts, as though we’re smarter because we can see what those science-denying dummies can’t.

..."


------------------------------------------------------------

A very worthy reminder.


AuntPatsy

(9,904 posts)
1. Excellent article, but if we continue to exhibit these irrational behaviors arising from
Mon Apr 18, 2016, 12:21 PM

what the author seems to theorize is wired in, then responsibility becomes a moot point, and one is left wondering: was there even a point to begin with?

HuckleB

(35,773 posts)
2. I guess the point is the hope that recognizing such traits...
Mon Apr 18, 2016, 12:39 PM

... may motivate one to challenge one's own choices and, perhaps, make more accurate choices through that questioning, even knowing that emotions will continue to play a role.

AuntPatsy

(9,904 posts)
5. The problem I see with that simplicity regarding human interactions is that one's early
Mon Apr 18, 2016, 01:04 PM

as well as recent social interactions must play an important role in reshaping these hard-wired reactions to just about every scenario a person could be forced to deal with. That makes it harder, if not impossible, for some to make what others see as the right choice...

So how does one deal with the simple truth that one man's sin is another man's glory?

A simple thought: in all honesty, our differences should not be seen as threats but as learning tools for our species' evolution...

I don't predict a favorable outcome for our species, since history and present-day theatrics continue to be driven by the need to be victorious, no matter the cost...



Igel

(35,282 posts)
6. His biases (or non-biases) notwithstanding,
Mon Apr 18, 2016, 03:28 PM

what's said is accurate.

We have brains that engage in parallel processing. The first answer arrived at is assumed to be true--we aren't usually aware of the parallel processing.

We have brains that (mis)generalize over past events. We usually forget facts that we don't think are important, but often that's because they don't confirm what we already think or want to think; we remember slights that we think are important. We forget things, period (but with a bias). We like to think we don't forget relevant or important things.

Our memories are flexible. We can start to remember things that we don't actually remember; we can revise memories. Invariably, more people remember voting for a popular president than actually voted for him. Even here, I've had people say that X never happened, and I've tracked down threads reporting not only the thing that "never happened" but also extensive comments from the very people now saying X never happened.

We substitute information. If a doctor has a stylish, well-maintained waiting room, we take that as evidence that the doctor is good. If a receptionist or bank teller donates to the right cause, we think they are more competent and virtuous. If a good actor speaks, we assume that the actor, adept at repeating another's words, is qualified to judge GMOs or vaccinations.

In grading, if a good student makes a mistake, we assume it's an accident; if a bad student makes the same mistake, it's a sign of incompetence. This often serves to reinforce group boundaries. This is called the "halo effect." It's a kind of attribution error: assuming that behavior reflects not circumstances and incidental factors but a deep truth about a person.

We value the next 5 minutes more than 2020-2025. A trivial present advantage is worth much more to us than a larger future advantage.

Critical thinking is evaluating an argument, a claim, a premise, or facts, given the facts we know, an awareness of facts we might know, and a full awareness of our innate biases. It assumes knowledge of the most common fallacies as well as our own flaws. Parroting adults' critical stances is commonplace among children. Learning the mechanisms and processes to shred others' (disliked) ideas and claims becomes fairly common among teenagers and undergrads.

Slow, conscious consideration of our thinking gives parallel processing time to produce a second answer. "I pick A ... No, on second thought, that's wrong. I'll go with choice B."

My practice of critical thinking matters most when it's applied to what I think and say. It's natural to criticize others I disagree with; there's no special skill or kudos in that. But breaking the assumption that I'm always right takes training and awareness, and is an ongoing process.

As an undergrad I was taught enough critical thinking to defend myself against others--to examine my views just enough to be able to get up, make a claim, and recognize how others might attack it. But that's still not a search for truth; it's a defense of my own ego. It wasn't until grad school that a dedicated professor taught me to worry about truth--it's better for an enemy to demolish my argument if he's right than for me to succeed in defending my views if they're wrong. (She was a sociopathic b****, but that doesn't mean she was wrong.)

A lot of people don't like to accept that their perceptions are, simply put, often wrong. "How dare you challenge my perceptions!" At the same time, competing perceptions are usually denigrated as hopelessly misguided.
