I found this article from Ben Goldacre fascinating. A group published a study in the journal Political Behavior examining reactions to an article that restated President Bush's claim that there were Weapons of Mass Destruction in Iraq. The article was shown to a group of people, some pro-war, some anti-war. One group was given a version with a correction stating that the claim was based on incorrect evidence; the other was not.

What is so interesting is that whilst the correction had almost no impact on those who already disagreed with the claim that Iraq had WMD, for those who believed the claim was correct, the correction actually reinforced their belief. I don't think this has anything to do with the political bias of the people involved; it could easily have been the other way round. But it reminds me of something I increasingly think about human beings. Humans like clear distinctions and don't like facts that muddy the waters. We tend to think in big categories, not qualifications, and this I suspect affects our politics, but more importantly our own epistemologies: the explanations we prefer to use to explain the mysteries of our lives, and those we discard.