Cultural Cognition is Not a Bias

Some recent posts by Dan Kahan on the subject of “cultural cognition” deserve attention:

(Cultural cognition refers to the tendency of individuals to conform their beliefs about disputed matters of fact (e.g., whether global warming is a serious threat; whether the death penalty deters murder; whether gun control makes society safer or less safe) to values that define their cultural identities.)

There’s no remotely plausible account of human rationality—of our ability to accumulate genuine knowledge about how the world works—that doesn’t treat as central individuals’ amazing capacity to reliably identify and put themselves in intimate contact with others who can transmit to them what is known collectively as a result of science.

Indeed, as I said at the outset, it is not correct even to describe cultural cognition as a heuristic. A heuristic is a mental “shortcut”—an alternative to a more effortful and more intricate mental operation that might well exceed the time and capacity of most people in most circumstances.

But there is no substitute for relying on the authority of those who know what they are talking about as a means of building and transmitting collective knowledge. Cultural cognition is no shortcut; it is an integral component in the machinery of human rationality.

Unsurprisingly, the faculties that we use in exercising this feature of our rationality can be compromised by influences that undermine their reliability. One of those influences is the binding of antagonistic cultural meanings to risk and other policy-relevant facts. But it makes about as much sense to treat the disorienting impact of antagonistic meanings as evidence that cultural cognition is a bias as it does to describe the toxicity of lead paint as evidence that human intelligence is a “bias.”

Look: people aren’t stupid. They know they can’t resolve difficult empirical issues (on climate change, on HPV-vaccine risks, on nuclear power, on gun control, etc.) on their own, so they do the smart thing: they seek out the views of experts whom they trust to help them figure out what the evidence is. But the experts they are most likely to trust, not surprisingly, are the ones who share their values.

What makes me feel bleak about the prospects of reason isn’t anything we find in our studies; it is how often risk communicators fail to recruit culturally diverse messengers when they are trying to communicate sound science.

The number of scientific insights that make our lives better and that don’t culturally polarize us is orders of magnitude greater than the ones that do. There’s not a “culture war” over going to doctors when we are sick and following their advice to take antibiotics when they figure out we have infections. Individualists aren’t throttling egalitarians over whether it makes sense to pasteurize milk or whether high-voltage power lines are causing children to die of leukemia.

People (the vast majority of them) form the right beliefs on these and countless issues, moreover, not because they “understand the science” involved but because they are enmeshed in networks of trust and authority that certify whom to believe about what.

For sure, people with different cultural identities don’t rely on the same certification networks. But in the vast run of cases, those distinct cultural certifiers do converge on the best available information. Cultural communities that didn’t possess mechanisms for enabling their members to recognize the best information—ones that consistently made them distrust those who do know something about how the world works and trust those who don’t—just wouldn’t last very long: their adherents would end up dead.

Rational democratic deliberation about policy-relevant science, then, doesn’t require that people become experts on risk. It requires only that our society take the steps necessary to protect its science communication environment from a distinctive pathology that keeps ordinary citizens from using their (ordinarily) reliable ability to discern what it is that experts know.

Second Opinions