Warning Signs: Beliefs that Signal Loyalty or Ability

My last post has generated some controversy on Facebook, where the audience is a bit more diverse, faith-wise, than those who read the blog. I thought it might be useful to continue pressing on the critique of instrumental beliefs with an instructive list, from Robin Hanson, of warning signs that your opinions are primarily instrumental. How many of us derive our beliefs about the economy by negating whatever is popularly prescribed on Fox News? How often do you find yourself thinking that Glenn Beck or John Boehner may have a point? (Even a stopped clock, etc.) How often do you reject a line of research because you can’t think of a good journal to publish it in? How often do you acknowledge that your opinions on an important matter are fairly conventional? If we notice these kinds of patterns, shouldn’t we wonder whether our non-theological beliefs follow a similar structure to other kinds of instrumental beliefs?

Hanson’s list of warning signs:

  1. You find it hard to be enthusiastic for something until you know that others oppose it.
  2. You have little interest in getting clear on what exactly is the position being argued.
  3. Realizing that a topic is important and neglected doesn’t make you much interested.
  4. You have little interest in digging to bigger topics behind commonly argued topics.
  5. You are less interested in a topic when you don’t foresee being able to talk about it.
  6. You are uncomfortable taking a position near the middle of the opinion distribution.
  7. You are uncomfortable taking a position of high uncertainty about who is right.
  8. You care far more about current nearby events than similar distant or past/future events.
  9. You find it easy to conclude that those who disagree with you are insincere or stupid.
  10. You are reluctant to change your publicly stated positions in response to new info.
  11. You are reluctant to agree with a rival’s claim, even if you had no prior opinion on the topic.

Tyler Cowen adds this, which helpfully sums up the way in which status games (and honor talk!) can blind us to the truth:

12. You feel uncomfortable taking a position which raises the status of the people you usually disagree with.

The only problem I have with this list is that it doesn’t model instrumental beliefs among academics quite so well, because we have a different incentive structure and are a bit better at creating opportunities to converse on neglected topics, provided we can prove to our fellows that we have important insights. But even there, we might question our own motives when, for instance, we find ourselves taking a provocative and unpopular view.

A frequent point on Hanson’s blog is that many of our beliefs, and not just our beliefs about the divine, are primarily instrumental. He calls this the “homo hypocritus” hypothesis. Given the popularity of ideological and psychoanalytic analyses among continental philosophers, I think my friends would benefit from checking out his blog, “Overcoming Bias.” Hanson, for his part, would benefit from reading more Foucault, or (if he already is, as his writing frequently suggests) from acknowledging his non-economist scholarly influences more often.

One topic that I think deserves a lot more attention in political philosophy and ethics is the status of the biases-and-heuristics research coming out of the social sciences, especially psychology and, to a lesser extent, behavioral economics. So this post is part of my effort to raise the relative status of such questions among my own philosophically inclined readership.
