Barbara Ehrenreich’s book Bright-sided starts with an interesting dilemma in breast cancer treatment. On the one hand, your odds of surviving–say–stage 4 breast cancer are quite low (22%). On the other hand, there is evidence that optimism and hopefulness will increase your chances. Being optimistic won’t raise your chances above 50%, but it will help.
So: if you are diagnosed with stage 4 breast cancer, what should you believe? Should you believe that your chances are 22%–pretty low–and allow yourself to feel the sense of mortality, loss, and despair that belief may provoke? Or should you believe that your chances of survival are quite high or guaranteed by God’s divine grace or some untested medical trial–and thus increase your odds a bit?
We have, then, at least two kinds of reasons to adopt a belief: the best evidence and the practical effects. Allowing considerations like health benefits to cause us to overestimate the odds of some outcome is sometimes referred to as “pragmatic encroachment.” There are lots of reasons to allow pragmatic considerations to encroach on our purely evidentiary reasons for believing: the classic example is Pascal’s wager, where the cost of skepticism about God’s existence outweighs the benefits. You might also find that beliefs that are personally disadvantageous are easier to deny than beliefs that are advantageous: for instance, if you make a lot of money at your job, you may have a hard time accepting that you are not very good at it or that you are overpaid. (This could be true of both hedge fund managers and teachers.) If you benefit from white or male or class privilege, then you may not want to believe that your achievements are the result of systematic inequalities.
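The structure of Pascal’s wager–practical stakes swamping evidential probability–can be made concrete with a short expected-value sketch. The payoff numbers below are entirely hypothetical, chosen only to illustrate how a huge asymmetry in stakes can dominate even a tiny probability:

```python
# Illustrative expected-value sketch of Pascal's wager.
# All payoff values here are made up for illustration; only the
# structure of the argument (asymmetric stakes) matters.

def expected_value(p_god: float, payoff_if_god: float, payoff_if_not: float) -> float:
    """Expected payoff of an action, given a credence that God exists."""
    return p_god * payoff_if_god + (1 - p_god) * payoff_if_not

p = 0.01  # even a very small credence that God exists...

# Believing: enormous (stand-in for infinite) reward if God exists,
# a small cost of piety otherwise.
believe = expected_value(p, payoff_if_god=1e9, payoff_if_not=-1)

# Skepticism: enormous loss if God exists, a small gain otherwise.
doubt = expected_value(p, payoff_if_god=-1e9, payoff_if_not=1)

# The practical stakes, not the evidence, settle which comes out ahead.
assert believe > doubt
```

The point of the sketch is that the comparison is insensitive to `p`: as long as the credence is nonzero and the stakes are lopsided enough, the pragmatic calculation favors belief regardless of what the evidence says.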
It’s also the case that if you’re excited about a research program or a public policy, that excitement and passion are a kind of reason to believe that the program or policy will be effective. But it’s a non-epistemic reason, and there’s good reason to discount it: both for others who are potentially infected by your excitement and for yourself in quiet moments of contemplation. It’s still a tricky thing to decide what to do with those doubts, because while “I want this to work” is not the same as “this will work,” it’s also true that “this probably won’t work” isn’t the same as “this will not work.” Overconfidence spurs us to take both important risks and stupid ones. It may be that we can’t weed out the stupid ones in advance, which is why I call this a dilemma and not a fallacy or a bias.