Ars Technica covered a new study about how the public perceives scientific conclusions and consensus. As you may imagine, climate change is a hot topic:
… The authors favor a model, called the cultural cognition of risk, which “refers to the tendency of individuals to form risk perceptions that are congenial to their values.” This wouldn’t apply directly to evolution, but would to climate change: if your cultural values make you less likely to accept the policy implications of our current scientific understanding, then you’ll be less likely to accept the science.
But, as the authors note, opponents of a scientific consensus often claim to oppose it on scientific rather than cultural grounds. "Public debates rarely feature open resistance to science," they note; "the parties to such disputes are much more likely to advance diametrically opposed claims about what the scientific evidence really shows." To get there, those doing the arguing must ultimately be selective about which evidence and experts they accept: they listen to, and remember, those who tell them what they want to hear. "The cultural cognition thesis predicts that individuals will more readily recall instances of experts taking the position that is consistent with their cultural predisposition than ones taking positions inconsistent with it," the paper suggests.
Having attended my fair share of climate change talks, I can say that dissenters do offer scientific arguments as to why the climate models are wrong. But they also cite things like the rise of the Chinese/Indian middle class, the prohibitive expense of abatement, and other economic or political roadblocks.
The study identifies two broad groups: egalitarians, who want and expect government regulation on topics of importance, and individualists, who tend to be pro-business and anti-regulation (read: left- and right-wingers). What's fascinating is that both groups seem to practice self-reinforcement of their existing ideas. When shown fictitious book excerpts from "experts" describing the extremes of the climate change debate:
Egalitarians accepted the expertise of someone suggesting climate change was high risk at a rate of 88 percent; the low-risk author was rated an expert a bit less than half the time. For individualists, in contrast, the low-risk author was considered an expert 86 percent of the time; the high-risk author was rated an expert by less than a quarter of participants.
I guess this is not so surprising: we believe what we want, and we want what we believe. This basic human tendency frames everything from politics to the brand of soda we prefer. What I want is a study that shows us how to reach a mutual understanding and begin whatever solutions seem necessary. What we need is a healthier framework for holding our debates, and people honest and open enough to change their minds when presented with sufficient evidence.