Tuesday, 3 February 2015

Chapter 7. Part E

A human mind experiences much cognitive dissonance when it keeps observing evidence that does not fit any of its mental models. The person who is attempting to explain a piece of observed evidence that is inconsistent with his worldview, i.e. the person trying to hang on to his background beliefs and to shut out the new theory his colleagues are talking about, keeps trying to tell himself that the observations of this evidence can't be correct. Some systematic error must be leading those other researchers to keep thinking they have observed (E), but they must be wrong. (E) is not what they say it is. "That can't be right," he says.

In the meantime, his more subversive colleague down the hall is arguing, even if it is only in her own head, "I know what I saw. I know how careful I've been. (E) is right; therefore, the probability of (H), at least in my mind, has just grown hugely. And it's such a relief to see a way out of all the cognitive dissonance I have been experiencing for the last few months. I get it now. Wow, does this feel good!"

Settling a score with a stubborn bit of old data that refused to fit into any of a scientist’s models of reality is a bit like finally whipping a bully who picked on him in elementary school: not really that significant logically, but still very satisfying.

   Normally, testing a new hypothesis involves performing an experiment that will generate new evidence. If the experiment delivers new evidence that was predicted by the hypothesis, but not by my background set of concepts, then the hypothesis, as a way of explaining the real world, seems more likely or probable to me. The new evidence “confirms” the hypothesis.
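To make that logic concrete, here is a minimal sketch of the update in Python. All of the numbers in it (the prior on H and the two likelihoods) are invented purely for illustration; they are not drawn from any actual experiment.

```python
# A minimal sketch of the confirmation idea above, using Bayes' rule
# with made-up numbers. The point is only the direction of the change:
# evidence that H predicts well, and that my background view makes
# surprising, raises my confidence in H.

def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) via Bayes' rule."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

prior_h = 0.10          # I start out fairly doubtful of the new hypothesis H
p_e_given_h = 0.90      # H strongly predicts the new evidence E
p_e_given_not_h = 0.05  # my background concepts make E very surprising

print(posterior(prior_h, p_e_given_h, p_e_given_not_h))  # ~0.67: E "confirms" H
```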

   But I may also decide to try to use a hypothesis and the theory or model that it is based on to explain some old, problematic evidence. I will be looking to see whether what the hypothesis and its base theory predict did in fact occur in the old evidence situations. If I find that the new hypothesis and the theory that it is based on – this theory that I am considering adopting as one of my background concepts and thus accepting into my regular thinking patterns – do successfully explain that problematic old evidence, what I am actually confirming is not just the hypothesis/theory, but also the consistency between the evidence, the hypothesis, and even my background set of assumptions and concepts.
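The same arithmetic can be run on evidence I already have in hand. The sketch below, again with invented numbers, compares how well the candidate hypothesis together with my background concepts accounts for the old anomaly against how well the background concepts do on their own; a large ratio is what "successfully explains that problematic old evidence" looks like in numbers.

```python
# A sketch of the "old evidence" case: the evidence E_old is already known,
# so the question is how much better the candidate theory (H together with
# my background concepts B) accounts for it than B alone does.
# The probabilities below are again purely illustrative.

p_e_old_given_h_and_b = 0.85  # H plus background explain the old anomaly well
p_e_old_given_b = 0.02        # background alone left it very improbable

bayes_factor = p_e_old_given_h_and_b / p_e_old_given_b
print(f"Bayes factor in favour of H: {bayes_factor:.1f}")  # 42.5
```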

[Image: Harry Houdini with his "vanishing" elephant, Jenni]

    And no, it is not obvious that evidence seen with my own eyes is 100% reliable, not even if I have seen a particular phenomenon repeated many times. Neither my longest-held, most familiar background concepts nor the ordinary sensory data that I see in everyday experience are trusted that much. If they were, then I and all humans who trusted gravity and light and human anatomy would be unable to watch a good magic show without having a nervous breakdown. Elephants disappear, men float, and women get sawn in half. By pure logic, if my most basic concepts were believed at the 100% level, then either I would have to gouge my eyes out or go mad.

   But I know, and I confidently tell my kids, that it is all a trick of some kind. And I choose, for this one night, to suspend my desire to harmonize all of my sense data with my set of background concepts. It is supposed to be a night of fun and wonder. If I did figure out how the trick is done, I would ruin my kids’ fun... and my own.
