Some recent posts by Dan Kahan on the subject of “cultural cognition” deserve attention:
(Cultural cognition refers to the tendency of individuals to conform their beliefs about disputed matters of fact (e.g., whether global warming is a serious threat; whether the death penalty deters murder; whether gun control makes society safer or less safe) to values that define their cultural identities.)
- Nullius in verba? Surely you are joking, Mr. Hooke! (or Why cultural cognition is not a bias, part 1)
There’s no remotely plausible account of human rationality—of our ability to accumulate genuine knowledge about how the world works—that doesn’t treat as central individuals’ amazing capacity to reliably identify and put themselves in intimate contact with others who can transmit to them what is known collectively as a result of science.
- The cultural certification of truth in the Liberal Republic of Science (or part 2 of why cultural cognition is not a bias)
Indeed, as I said at the outset, it is not correct even to describe cultural cognition as a heuristic. A heuristic is a mental “shortcut”—an alternative to the use of a more effortful and more intricate mental operation that might well exceed the time and capacity of most people to exercise in most circumstances.
But there is no substitute for relying on the authority of those who know what they are talking about as a means of building and transmitting collective knowledge. Cultural cognition is no shortcut; it is an integral component in the machinery of human rationality.
Unsurprisingly, the faculties that we use in exercising this feature of our rationality can be compromised by influences that undermine their reliability. One of those influences is the binding of antagonistic cultural meanings to risk and other policy-relevant facts. But it makes about as much sense to treat the disorienting impact of antagonistic meanings as evidence that cultural cognition is a bias as it does to describe the toxicity of lead paint as evidence that human intelligence is a “bias.”
Look: people aren’t stupid. They know they can’t resolve difficult empirical issues (on climate change, on HPV-vaccine risks, on nuclear power, on gun control, etc.) on their own, so they do the smart thing: they seek out the views of experts whom they trust to help them figure out what the evidence is. But the experts they are most likely to trust, not surprisingly, are the ones who share their values.
What makes me feel bleak about the prospects of reason isn’t anything we find in our studies; it is how often risk communicators fail to recruit culturally diverse messengers when they are trying to communicate sound science.
The number of scientific insights that make our lives better and that don’t culturally polarize us is orders of magnitude greater than the ones that do. There’s not a “culture war” over going to doctors when we are sick and following their advice to take antibiotics when they figure out we have infections. Individualists aren’t throttling egalitarians over whether it makes sense to pasteurize milk or whether high-voltage power lines are causing children to die of leukemia.
People (the vast majority of them) form the right beliefs on these and countless issues, moreover, not because they “understand the science” involved but because they are enmeshed in networks of trust and authority that certify whom to believe about what.
For sure, people with different cultural identities don’t rely on the same certification networks. But in the vast run of cases, those distinct cultural certifiers do converge on the best available information. Cultural communities that didn’t possess mechanisms for enabling their members to recognize the best information—ones that consistently made them distrust those who do know something about how the world works and trust those who don’t—just wouldn’t last very long: their adherents would end up dead.
Rational democratic deliberation about policy-relevant science, then, doesn’t require that people become experts on risk. It requires only that our society take the steps necessary to protect its science communication environment from a distinctive pathology that prevents ordinary citizens from using their (ordinarily) reliable ability to discern what it is that experts know.