What if there were a set of questions that could predict, with a high degree of accuracy, your political views on a variety of issues? Social scientists suggest that we process information through our pre-existing worldviews: our cultural outlooks shape our thinking. Cultural Cognition Theory holds that these outlooks can be used to predict perspectives and to help us understand how they form.
Hot-button issues such as climate change continue to draw political battle lines among the general public, despite scientific consensus. Even neutral information is processed through our individual political filters. But why? Is it a lack of credible information, a failure to communicate evidence effectively, or something else entirely? Addressing this question is vital for understanding public perceptions of risk and for building support for crucial new policy.
Dan Kahan is a distinguished professor of law and psychology at Yale University whose research focuses on risk perception, science communication, and applications of decision science to law and public policy. He is part of the Cultural Cognition Project, which examines the impact of group values on perceptions of risk. Across a number of studies, his research has explored public divergence over climate change and scientific expertise in general.
The cultural theory of risk was developed by Mary Douglas and Aaron Wildavsky in the 1970s, asserting that people form risk perceptions and beliefs that are influenced by and harmonious with their ways of life. A simple example is the “white male effect”, which is a propensity for Caucasian men to perceive social threats as less significant than do women and minorities.
Kahan’s research has concluded that people form perceptions of societal risks that cohere with their worldviews and cultural outlooks. Thus, political polarization persists around contentious issues despite empirical data and scientific consensus. By analyzing how and why these perceptions form, this kind of research can offer insights into the best ways to shape and inform public opinion on risks to society, and to develop and implement better policy.
Intuitively, support for public policies that address societal risks like green technology, vaccinations and gun control should increase as people become aware of and sympathetic to these issues. The problem is that facts are less important than values in the formation of perceptions, and Kahan argues that “identity-protective cognition” causes people to dismiss information that conflicts with their values as a kind of “identity self-defense mechanism”.
Cultural cognition is evaluated through attitudinal scales, which Kahan says “should be thought of as measures of latent or unobserved dispositions, for which the items that make up the scales are simply observable indicators.”
Two continuous scales rank attitudes along two dimensions, referred to as “grid” and “group” ways of life. The first scale, “Hierarchy-egalitarianism”, runs from “high grid” individuals who support the maintenance of status-based systems through to “low grid” individuals who believe entitlements should be based on merit rather than position.
On the second scale, “Individualism-communitarianism”, individuals classed as “weak group” expect to fend for themselves while those classed as “strong group” value solidarity over competitiveness. Responses in agreement or disagreement with value statements are aggregated to form continuous “Hierarchy-egalitarianism” and “Individualism-communitarianism” worldview scores.
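The aggregation step described above can be sketched in code. This is a hypothetical illustration, not the Cultural Cognition Project's actual scoring procedure: the item names and the six-point response scale are invented for the example, and the real instruments use specific validated items.

```python
# Hypothetical sketch: aggregating Likert-style agreement ratings into a
# continuous worldview score, as described for the grid and group scales.
# Item names and the 1..6 rating scale are invented for illustration.

def worldview_score(responses, reverse_coded, scale_max=6):
    """Average Likert responses (1..scale_max) into a continuous score.

    responses     -- dict mapping item id -> agreement rating (1..scale_max)
    reverse_coded -- set of item ids worded in the opposite direction
                     (e.g. an egalitarian statement on the
                     hierarchy-egalitarianism scale)
    """
    total = 0.0
    for item, rating in responses.items():
        # Flip reverse-coded items so every item points the same way.
        if item in reverse_coded:
            rating = (scale_max + 1) - rating
        total += rating
    return total / len(responses)

# Example: three hypothetical "grid" items, one reverse-coded.
grid_items = {"HIERARCHY1": 5, "HIERARCHY2": 6, "EGAL1": 2}
score = worldview_score(grid_items, reverse_coded={"EGAL1"})
print(round(score, 2))  # higher -> more hierarchical ("high grid")
```

A respondent's position on each dimension is then just their mean (direction-corrected) agreement across that scale's items, which is what makes the scores continuous rather than categorical.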
Cultural cognition research has revealed a tendency for people to attribute knowledge, honesty, and shared interests to experts whom they believe share their values. A common idea in science communication is that evidence of environmental threats has been ineffectively conveyed to the public, or that scientific literacy is too low. However, it has been shown that polarization over environmental threats is actually greatest among the most science-literate. Dramatic public division on these issues is not a result of incomprehension, but instead stems from a distinct cultural conflict of interest.
A 2010 study on perceptions of HPV vaccine risk showed that people selectively accept evidence that validates previously held beliefs, which suggests that even a balanced argument may increase polarization between people with opposing values. People also base their perceptions of expert credibility on values rather than on the content of any argument. The study showed that when a person hears an argument they are predisposed to reject from an advocate whose values they share, or an argument they are predisposed to accept from an advocate whose values they oppose, polarization shrinks to insignificance.
Kahan’s research demonstrates that bombarding the public with information or expert evidence on social risks can create a backlash and thus become counterproductive. This is likely to occur in people regardless of their political party or cultural belief system. To reduce combative polarization, it is more effective to present a culturally congenial solution that fits within people’s existing worldviews.
As Kahan puts it, “don’t try to convince people to accept a solution by showing them there is a problem. Show them a solution they find culturally affirming, and then they are disposed to believe there really is a problem in need of solving.”
Cultural cognition theory has useful applications in the context of Earth Sharing and Henry George’s ideas about Land Value Taxation. While a policy argument built around a demonstrable problem is liable to be rejected on the basis of predetermined values, the same argument framed around its solution and aligned with the audience’s values is far more likely to succeed. Proponents of significant political change too often focus on highlighting the risks they believe need to be addressed, failing to speak to people’s core values. In the absence of a framework of values, the substance of the message is lost to partisan interpretations of the supposed risk.