Do we often make mistakes by convincing ourselves of what we want to believe?

Hat tip to Political Betting at

 http://www1.politicalbetting.com/index.php/archives/2014/01/03/how-strong-political-views-can-impact-on-our-ability-to-analyse-data/

for an American article which attempted to provide objective and measurable evidence that people are more likely to make mistakes in their mathematical calculations if the result of those mistakes reinforces rather than contradicts their strongly held views.

As a statistician I have significant concerns about the way the results of the study are presented, so I would not take them as gospel. In particular, the suggestion that people were 18% more likely to get a calculation wrong if this reinforced their particular personal opinions is rather oversimplified on the evidence quoted. Nor would I agree that if the study is right - and the jury is well and truly out on that - it would prove only that "politics ruin our ability to think." Any other set of strong opinions - about religion (believer or atheist), about sport, or about tastes in music - might well show the same effect.
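To see why a bare "18% more likely" figure is oversimplified, note that without a stated baseline it could describe either an absolute or a relative difference in error rates, and the two readings imply quite different results. A minimal sketch with hypothetical figures (not taken from the study):

```python
# Why "18% more likely to get it wrong" is ambiguous without a baseline.
# All figures below are hypothetical, not taken from the study.

baseline_error_rate = 0.40  # hypothetical error rate on a neutral question

# Reading 1: 18 percentage points more errors (absolute difference)
absolute_reading = baseline_error_rate + 0.18

# Reading 2: 18% more errors in relative terms (relative risk of 1.18)
relative_reading = baseline_error_rate * 1.18

print(f"Absolute reading: {absolute_reading:.0%} get it wrong")  # 58%
print(f"Relative reading: {relative_reading:.1%} get it wrong")  # 47.2%
```

Without the baseline error rate, and without knowing which comparison was made, the headline number is very hard to interpret.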

What made me sit up and take notice of the article was not the statistics quoted but the graphs.

For a non-political question (on skin cream), the graphs showing how accurately people with different levels of numerical ability interpreted two sets of data were very similar for the Democrat and Republican members of the sample group, irrespective of what conclusion should actually have been drawn from the data.

However, for a political question (on gun control), the four lines on the graph were all different. The great majority of those Democrat voters in the sample of 1,111 respondents who generally had good numeracy skills correctly interpreted the set of data which supported gun control, but about half of even the most numerate Democrats misinterpreted the set of data which did not support gun control.

By contrast, the great majority of those Republican voters in the sample who generally had good numeracy skills correctly interpreted the set of data which did not support gun control, but about half of even the most numerate Republicans misinterpreted the set of data which did support gun control.
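For readers wondering what kind of question could trip up even numerate respondents: tasks like these typically present a 2x2 table where the correct answer requires comparing proportions within each group rather than raw counts. A minimal sketch using the skin-cream framing, with hypothetical counts that are not necessarily the study's actual figures:

```python
# A sketch of the kind of 2x2 interpretation task described above,
# using the skin-cream framing. The counts are hypothetical and are
# not necessarily the study's actual figures.

improved_with_cream = 223
worsened_with_cream = 75
improved_without_cream = 107
worsened_without_cream = 21

# The intuitive (wrong) approach compares raw counts: 223 > 107,
# which seems to suggest the cream helps.
# The correct approach compares proportions within each group:
rate_with = improved_with_cream / (improved_with_cream + worsened_with_cream)
rate_without = improved_without_cream / (improved_without_cream + worsened_without_cream)

print(f"Improved with cream:    {rate_with:.0%}")    # ~75%
print(f"Improved without cream: {rate_without:.0%}")  # ~84%
# A higher proportion improved WITHOUT the cream, so the data actually
# point the other way - exactly the trap such tasks are built around.
```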

The moral of this story is that if you are analysing a set of data and the results come out exactly in line with your prior beliefs, you should check your analysis as carefully and objectively as possible to make sure you are not convincing yourself that the new evidence conveniently fits what you were already disposed to believe.

Comments

Jim said…
Confirmation bias, much like the placebo effect in trials of new medical drugs, is a very well understood phenomenon in the scientific community. Great steps are taken to overcome it; that is one reason the peer review process is very, very slow.

It tends to have the greatest effect on a strong or long-held belief, or, quite naturally, one in which a person has an interest.

The interest could be financial, or could improve the person's situation in other ways - or indeed their situation could be hindered if things looked the "other way".

Also, one of the easiest ways to slip when looking at data is the fallacy that correlation always implies causation - of course it does not.
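To make that concrete: two quantities can correlate strongly simply because both are driven by a third factor. A minimal sketch with synthetic, purely illustrative data:

```python
import random

# Synthetic illustration: ice cream sales and drowning incidents both
# rise with temperature, so they correlate strongly even though
# neither causes the other.
random.seed(0)
temps = [random.uniform(0, 35) for _ in range(1000)]
ice_cream_sales = [2.0 * t + random.gauss(0, 5) for t in temps]
drownings = [0.5 * t + random.gauss(0, 2) for t in temps]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

# High correlation, yet banning ice cream would not prevent drownings.
print(f"correlation: {pearson(ice_cream_sales, drownings):.2f}")
```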

Another is the "black and white" fallacy: the idea that by showing something not to be the case, you have by default shown something else to be the case.
To illustrate this I will use a really obvious example from the USA in recent years. There was a great campaign to teach "intelligent design" as an alternative to evolution in high schools. When asked for evidence of intelligent design, however, all that was offered was (easily answered and refuted) "evidence" that evolution was not correct; not once was evidence in favour of intelligent design offered.

So in short, the answer to the question in the title of your post - "Do we often make mistakes by convincing ourselves of what we want to believe?" - is yes: human beings do it all the time.
Chris Whiteside said…
Indeed.

The reason I don't support those who want "Intelligent Design" taught in science classes is not that I personally disagree with the philosophical idea per se: it's the fact that it IS a philosophical idea masquerading as a scientific one. "Intelligent design" is not a scientific idea because it cannot be expressed as a testable proposition which hard evidence could either support or falsify.

I suspect most people would agree with your comment that human beings do this all the time.

The alarming thing about the American study quoted on Political Betting is that, although I don't think they explained their results very well, they found evidence which not only supports your view but suggests that the most numerate people fall into this trap MORE often than the less numerate.
