Cowen’s Epistemic Portfolio Theory and Flaubert’s Maxim

A portfolio theory is a way to minimize risk by diversifying one’s commitments or investments, ideally using countercyclical strategies so that some part of your portfolio is always growing. Tyler Cowen suggests an amusing epistemic portfolio theory:

That is, most people have an internal psychological need to fulfill a “quota of dogmatism.”  If you’re very dogmatic in one area, you may be less dogmatic in others.  I’ve also met people — I won’t name names — who are extremely dogmatic on ethical issues but quite open-minded on empirics.  The ethical dogmatism frees them up to follow the evidence on the empirics, as they don’t feel their overall beliefs are threatened by the empirical results.

Some people, if they feel they must always follow the evidence, respond by skewing their interpretation of that evidence.

There’s a lesson here.  If you wish to be a more open-minded thinker, adhere to some extreme and perhaps unreasonable fandoms, the more firmly believed the better and the more obscure the area the better.  This will help fulfill your dogmatism quota, yet without much skewing your more important beliefs.

Frankly, I suspect Cowen is recapitulating Gustave Flaubert. Here’s Flaubert:

Be regular and orderly in your life like a bourgeois, so that you may be violent and original in your work.

Of course, Flaubert’s maxim describes a kind of portfolio of habits balancing personal and professional radicalism, where Cowen argues for a specifically epistemic portfolio. But his inclusion of ethical beliefs suggests the two may not be far apart. I find that I am mildly suspicious of the claim, and I lean toward the belief that dogmatism and fallibilism each tend to reinforce themselves. Why can’t we adopt Bayesian or fallibilist commitments slowly, expanding them as we find the time and energy?

Consider Descartes in the Meditations: must we be suspicious of the authenticity of Cartesian doubt in order to maintain portfolio theory? Descartes looks like a kind of dogmatist about self and God, after all… though that could either mean that he is an example of a portfolio doubter, or that he didn’t work sufficiently carefully through the doubts and had a kind of epistemic bubble and crash.

What I imagine is the opposite of Cartesian doubt: rather than doubt everything all at once and risk epistemic shocks and a resurgence of unemployed credulity, we work on sustainable growth in GDP: Gross Doubt Production. The goal is to root out our dogmatisms throughout a lifetime, growing milder and less certain with age.

In short, where Cowen adopts an epistemic Keynesianism, I’m advocating epistemic Hayekianism.

Advice

Ever since the markets became front-page news, I’ve been caught in some sort of economics blog vortex. At this point, most of my reading is no longer directed towards macroeconomic issues and institutional critique, but rather focuses on the economics department at George Mason. The problem is that it seems like these people really do know more about some things of general interest than ordinary folks.

So when Bryan Caplan started advising his colleagues on what to do this year (making hypothetical resolutions for them) I especially perked up when he suggested that Tyler Cowen write a book of advice:

Tyler Cowen should write what I call a “book of answers” with the working title Social Intelligence: What I Know About People That You Don’t. The key point of departure: The goal of the book is not to “get readers to ask themselves questions,” but to convey definite answers that Tyler defends without irony.  If you think this goes against his nature, I’ve seen him do this many times first-hand – just not in print.

Cowen apparently agrees with Caplan’s assessment, and responded with some advice about advice:

You don’t know what a person really thinks until you hear his or her advice.  Along these lines, if you really want to know what a person thinks, ask for advice and he or she will open up.

Ben Casnocha jumped in with 14 thoughts about advice, the best of which is:

Even if you know the other person is biased, studies show you still don’t discount that bias enough. Your car mechanic wants to sell you more parts, and you know that he wants to do that, but we still don’t discount his advice as much as we should.

Advising seems to be the space most often shot through with status games, power relations, and biases. This is a pretty standard cautionary line in psychotherapy and psychoanalysis: being consulted encourages us to forget our fallibility, because an intimate request for action items short-circuits the standards of public justification that would normally guide a person seeking the truth.

We ought to be most cautious when our own advice is sought: the risk is that, as advisors, we will trick ourselves into believing that our consultor has knowingly and legitimately granted us status as ‘The One Who Knows,’ and thus fail to subject our own judgments to appropriate testing and skepticism. At the same time, the consultor who really doesn’t know the right answer (rather than using advice-seeking as a method to develop trust) will notice how rarely personal advice is qualified and be inclined to assume that the advisor has a legitimate expertise beyond prejudice and preference. As a result, two people (or many people) move from probably-justified uncertainty to probably-unjustified certainty through a method that reflection shows is not trustworthy.

On this basis, I suspect that the best advice is the most tentative advice, which regularly and honestly signals its own fallibility. (But beware false modesty!) Giving and receiving this sort of advice is most likely to model an authentic inquiry in which participants will continue seeking beyond the initial consultation.

Appreciative Thinking

I’ve been having a debate on a friend’s Facebook page about the value of Martha Nussbaum’s work (I’m a fan) and serendipitously I found this post on “appreciative thinking” via Tyler Cowen. It’s a kind of inverted critical thinking, from Seth Roberts:

When it comes to scientific papers, to teach appreciative thinking means to help students see such aspects of a paper as:

  1. What can we learn from it? What new ideas does it suggest? What already-existing plausible ideas does it make more plausible or less plausible?
  2. How is it an improvement over previous work? Does it use new methods? Does it use old methods in a new way? Does it show a better way to do something?
  3. Did the authors show good taste in their choice of problem? Is this a problem both important and possibly solvable?
  4. Are details done well? Is it well-written? Is the context of the work made clear? Are the data well-analyzed? Does it make good use of graphs? Is the discussion imaginative rather than formulaic?
  5. What’s interesting or enjoyable about it?

That sort of thing. In my experience few papers are worthless. But I’ve heard lots of papers called worthless.

The framing for these rules is worth looking at as well. Obviously, a lot of these skills are part and parcel of any true critical thinking or good close reading, but it’s nice to see folks emphasizing the positive element of reading. “Appreciative thinking” also seems like a good way to introduce a version of the principle of charity that Augustine describes in his On Christian Doctrine. The nice thing about this is the way it’s framed as a “checklist skill,” the kind you can put on your syllabus and design assignments around.

Anyway, it doesn’t exactly resolve the issue of Martha Nussbaum, but it does suggest some perspectives from which her work might be valuable even if some of her conclusions are also wrong.