Japanese and Koreans invaded Asia. We apologize.

Confirmation bias Backfire effect

2013年09月30日 18時42分43秒 | Weblog
The Science of Why We Don't Believe Science
How our brains fool us on climate, creationism, and the vaccine-autism link.
―By Chris Mooney | May/June 2011 Issue





In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers (PDF). Our "reasoning" is a means to a predetermined end―winning our "case"―and is shot through with biases. They include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.



It's just that we have other important goals besides accuracy―including identity affirmation and protecting one's sense of self―and often those make us highly resistant to changing our beliefs when the facts say we should.



And it's not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan and his colleagues, people's deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place.


In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views.


And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts―they may hold their wrong views more tenaciously than ever.



Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it.


The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?



Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn't trigger a defensive, emotional reaction.


This is a rather interesting article, one I found cited in another piece.

It offers a scientific explanation for why people fall for unscientific beliefs. Are people actually persuaded by evidence and argument? Often not. Many start from a predetermined conclusion: they gather only the evidence that suits it, try to tear down evidence that doesn't, and refuse in the first place to recognize those with inconvenient opinions as scientifically legitimate. Even when calmly confronted with evidence and reasoned argument, they dig in and cling to their own view all the more stubbornly. The reason, the article says, is that people do not want to accept information that would destabilize their position, their sense of what and who they are.

So when you want to persuade someone, the advice is to avoid rubbing their core beliefs the wrong way, and instead wrap the message in packaging they can accept.



