A very good TED Talk by Tali Sharot, The optimism bias, tells us something many of us have long known (or shall we say suspected): the brains of healthy (non-depressed) people are fundamentally tilted towards optimism. For example, even when told that 40% of all marriages end in divorce, newlyweds still don’t think there is any chance they themselves will get divorced. Likewise, we underestimate the likelihood that bad things will happen to us personally. Sharot suggests that when drawing up financial budgets or when in dangerous situations, we should remember that we have this innate bias and deliberately adjust for it. But most of the time, the human optimism bias is a good thing. Otherwise we would never dare to venture into anything new or risky.

The flipside of all this optimism is obviously that we always think we’re right. There is a great TED Talk on this by Kathryn Schulz: On being wrong. We will do almost anything to avoid that nasty feeling creeping up on us when we realize we were wrong. When somebody disagrees with us, we would rather assume that he or she:

  1. doesn’t have the same information as we do,
  2. is plain stupid, or, if neither of those applies,
  3. is plain evil.

Only rarely does the thought cross our minds that we ourselves might be wrong. Or that different people simply have different models of the world, models in which they think and into which they integrate new information, and that there might be a multitude of valid worldviews. So we should pause more often, reflect on the complexities of our world, and allow ourselves to admit that we may not know, or that we may be wrong.

To the smart fellow who chips in now and says we should just abide by the scientific method and reason objectively: 1) objectivity does not exist, see intersubjectivity; and 2) read the closing words of this unsettling article by Jonah Lehrer in The New Yorker about the ‘decline effect’ observed when trying to reproduce the results of scientific studies in fields such as medicine or psychology:

> Although many scientific ideas generate conflicting results and suffer from falling effect sizes, they continue to get cited in the textbooks and drive standard medical practice. Why? Because these ideas seem true. Because they make sense. Because we can’t bear to let them go. And this is why the decline effect is so troubling. Not because it reveals the human fallibility of science, in which data are tweaked and beliefs shape perceptions. (Such shortcomings aren’t surprising, at least for scientists.) And not because it reveals that many of our most exciting theories are fleeting fads and will soon be rejected. (That idea has been around since Thomas Kuhn.) The decline effect is troubling because it reminds us how difficult it is to prove anything. We like to pretend that our experiments define the truth for us. But that’s often not the case. Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe.

While I always say that it’s problematic to apply the terms “prove” or “proved” to anything other than pure mathematics (in science, theories can only be substantiated or falsified), I agree with the point being made. In anything that involves living beings (as opposed to, say, physics or chemistry), interactions quickly become so complex that it is hard to say anything with certainty. That doesn’t mean the scientific method isn’t the best (or shall we say, a very good) way to approach these problems as well. But we should keep in mind that just because some theory has been substantiated by a scientific study, or even two, that doesn’t make it the absolute truth.

So the bottom line is that it is good to question things every once in a while, to remember that we actually know pretty little, and that we’re wrong way more often than we think. And to really consider other people’s worldviews and try to learn from them. But at some point, if you want to stay sane, you have to start believing in something again. How could you ever start a venture or bring any change to the world without believing in it in the first place? That’s when our innate optimism bias is essential: it makes us act, even when all the estimates say the odds are low, to see whether it might be possible nonetheless. And sometimes it is. Just sometimes it is.