Theories mentioned in the living literature review and from the published literature on vaccine hesitancy.
Confirmation Bias
A theory that suggests people seek out, favor, and recall information that confirms their existing beliefs. This is particularly true for issues that are politically and emotionally charged and that are connected to deeply held beliefs.
Construal Level Theory
Describes the relation between psychological distance and the extent to which people’s thinking (e.g., about objects and events) is abstract or concrete. The general idea is that the more distant an object is from the individual, the more abstractly it will be construed, while the closer the object is, the more concretely it will be construed. In CLT, psychological distance is defined on several dimensions, with temporal, spatial, social, and hypothetical distance considered most important, though there is some debate among social psychologists about further dimensions such as informational, experiential, or affective distance.
Cultural “tightness and looseness”
Refers to the extent to which a culture tolerates deviance and enforces norms. A “tight” culture has strong norms and low tolerance for deviance, while a “loose” culture has weaker norms and a higher tolerance for deviance. Anthropological examples range from tight cultures such as Israeli kibbutzim to looser cultures such as the !Kung Bushmen, the Cubeo, and the Skolt Lapps (Pelto, 1968). Among modern developed countries, tight societies such as Japan, Singapore, and Pakistan maintain strong norms and monitoring systems to detect deviations, which are severely punished. As such, these societies value order, formality, discipline, and conformity (Gelfand et al., 2006, 2011; Pelto, 1968). In contrast, norms in loose societies like those of Brazil, Israel, or the United States are more ambiguous, deviations from norms are tolerated, and punishments for deviations are less severe (Gelfand et al., 2013, p. 499).
“Cultural tightness–looseness has its theoretical roots in multiple disciplines, including anthropology (Pelto, 1968), sociology (Boldt, 1978a, 1978b), and psychology (Berry, 1966, 1967), and contrasts cultures that have strong norms and little tolerance for deviance with those that have weak norms and high tolerance for deviance (Gelfand et al., 2006; Gelfand et al., 2011; Harrington & Gelfand, 2014; Roos, Gelfand, Nau, & Lun, 2015; Triandis, 1989). Research has shown that nations vary widely in tightness–looseness and that the construct is distinct from cultural values (Carpenter, 2000; Gelfand et al., 2011)” (Aktas et al., 2015, p. 2).
Mert Aktas, Michele Gelfand, and Paul Hanges. (2015). Cultural Tightness–Looseness and Perceptions of Effective Leadership. Journal of Cross-Cultural Psychology 1–16. Available from: https://www.researchgate.net/publication/282324344_Cultural_Tightness-Looseness_and_Perceptions_of_Effective_Leadership [accessed Sep 03 2020].
Source comparing modern countries on tightness and looseness: Gelfand, M. J., LaFree, G., Fahey, S., & Feinberg, E. (2013). Culture and extremism. Journal of Social Issues, 69, 495-517.
Curse of Knowledge
The curse of knowledge is a cognitive bias that occurs when an individual, communicating with other individuals, unknowingly assumes that the others have the background knowledge needed to understand them.
Birch SAJ, Bloom P. The Curse of Knowledge in Reasoning About False Beliefs. Psychological Science. 2007;18(5):382-386.
False Equivalence
A logical fallacy in which an equivalence is drawn between two subjects based on flawed or false reasoning. A false equivalence argument often simultaneously condemns and excuses both sides in a dispute by claiming that both sides are (equally) guilty of inappropriate behavior or bad reasoning. The argument can appear to be treating both sides equally, but it is generally used to condemn an opponent or to excuse one’s own position (partly paraphrased from source: Palomar College).
Thomas Patterson wrote about the danger of false equivalencies in news coverage of the 2016 election: “[F]alse equivalencies are developing on a grand scale as a result of relentlessly negative news. If everything and everyone is portrayed negatively, there’s a leveling effect that opens the door to charlatans. The press historically has helped citizens recognize the difference between the earnest politician and the pretender. Today’s news coverage blurs the distinction.” Source: Thomas E. Patterson (December 7, 2016). “News Coverage of the 2016 General Election: How the Press Failed the Voters”
Information Avoidance
People are averse to information that makes them feel bad, obligates them to do something they do not want to do, or threatens their deeply held values, worldviews, and/or identity. Also known as the ostrich effect.
Inoculation Theory
A psychological framework developed in the 1960s that aims to induce pre-emptive resistance against unwanted persuasion attempts. Papageorgis & McGuire (1961) explain: “A previous study…showed that strong initial beliefs are more effectively immunized against persuasion by pre-exposing them to counterarguments…The present study tested the hypothesis that preexposure to refutations of some counterarguments against the belief would have a generalized immunization effect, making the beliefs more resistant to strong doses not only of the specific counterarguments…but also of alternative arguments against the given belief…As expected, the beliefs proved highly vulnerable to the strong counterarguments when there was no prior immunization. Immunization had a direct strengthening effect on the beliefs and also substantially reduced the effect of the subsequent strong counterarguments.”
Source: Papageorgis, D., & McGuire, W. J. (1961). The generality of immunity to persuasion produced by pre-exposure to weakened counterarguments. The Journal of Abnormal and Social Psychology, 62(3), 475–481. https://doi.org/10.1037/h0048430
Moral foundations theory
With roots in sociology and social psychology going back to Émile Durkheim, scholars in the 1990s–2000s coined “Moral Foundations Theory,” which proposes that several innate and universally available psychological systems are the foundations of “intuitive ethics.” Each culture then constructs virtues, narratives, and institutions on top of these foundations, thereby creating the unique moralities we see around the world, and the moral conflicts within nations too. The five foundations for which its proponents believe there is currently evidence are: 1) Care/harm; 2) Fairness/cheating; 3) Loyalty/betrayal; 4) Authority/subversion; 5) Sanctity/degradation. A sixth candidate, Liberty/oppression, has been proposed, and further research may identify more.
This finding is important for framing arguments: Feinberg and Willer (2015) found that frames targeted at an audience’s own moral foundations were more likely to succeed in changing minds.
Feinberg, Matthew and Robb Willer. (2015). From Gulf to Bridge: When Do Moral Arguments Facilitate Political Influence? Personality and Social Psychology Bulletin. Volume: 41 issue: 12, page(s): 1665-1681. https://journals.sagepub.com/doi/full/10.1177/0146167215607842
Haidt, Jonathan. (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Vintage; 1st Edition (March 13, 2012)
Nudge Theory
Although it is debated whether this is a genuinely new concept in behavioral economics, it was recently brought to the forefront of both intellectual debate and political policy by Richard Thaler and Cass Sunstein’s book “Nudge: Improving Decisions About Health, Wealth, and Happiness” (2008). Nudge theory is a concept in behavioral economics, political theory, and the broader behavioral sciences which holds that positive reinforcement, defaults, and indirect suggestions, while still allowing freedom of choice, can influence behavior and decision making, especially on issues of compliance.
Thaler and Sunstein define their concept: “A nudge, as we will use the term, is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives. To count as a mere nudge, the intervention must be easy and cheap to avoid. Nudges are not mandates. Putting fruit at eye level counts as a nudge. Banning junk food does not” (2008).
Richard Thaler and Cass Sunstein. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness. Penguin.
Pre-bunking (not debunking)
Because doubt is hard to dislodge once it settles in, pre-bunking “shoots first.” This relatively new approach (2000s) argues that providing people with correct information before they are exposed to false information reduces the likelihood that they will believe the false information.
Recent, widely cited work on pre-bunking is by Sander Van Der Linden & Jon Roozenbeek. In one study, they surveyed 2,000 people, first asking them how large the scientific consensus on climate change is, before showing them any documents. Participants were then exposed to some combination of three documents: a true brief about the scientific consensus, a false petition claiming there was no consensus and that 31,000 American scientists disputed climate change, and a brief refuting the petition.
The results were intriguing and displayed the potential power of pre-bunking. When participants were first asked about the scientific consensus on climate change, their average estimate was around 72%. They then changed their estimates based on what they read.
When the researchers provided a group with the ‘truth brief’, the average rose to 90%. For those who only read the petition, the average sank to 63%. When a third group read them both – first the ‘truth brief’ and then the petition – the average remained unchanged from participants’ original instincts: 72%. Thus, there is evidence that pre-bunking may be an effective method against false reports/news.
Sander Van Der Linden & Jon Roozenbeek. (2019). The new science of prebunking: how to inoculate against the spread of misinformation. On Society.
Prospect Theory
Prospect theory comes out of behavioral economics and is credited to Daniel Kahneman and Amos Tversky and their 1979 paper “Prospect Theory: An Analysis of Decision under Risk,” in which they argue that individuals assess gains and losses asymmetrically: people are more averse to losses than they are drawn to equivalent gains. This tendency, they argue, contributes to risk aversion in choices involving sure gains and to risk seeking in choices involving sure losses.
This has real world effects in that the overweighting of low probabilities may contribute to the attractiveness of insurance and gambling.
This theory stands in contrast to expected utility theory, which expects people to treat losses and gains symmetrically and always to maximize utility; unlike expected utility theory, prospect theory has held up under rigorous real-world studies.
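The asymmetry between gains and losses is often summarized by a piecewise value function. The form below is a standard textbook sketch, not quoted from this review; the specific parameter estimates come from Tversky and Kahneman’s later (1992) work on cumulative prospect theory and are illustrative only:

```latex
% Prospect theory value function (illustrative textbook form).
% Outcomes are evaluated relative to a reference point, with
% losses weighted more heavily than equivalent gains.
v(x) =
\begin{cases}
  x^{\alpha}            & \text{if } x \ge 0 \quad \text{(gains)} \\
  -\lambda \, (-x)^{\beta} & \text{if } x < 0 \quad \text{(losses)}
\end{cases}
% Loss aversion requires \lambda > 1; Tversky & Kahneman (1992)
% estimated \lambda \approx 2.25 and \alpha = \beta = 0.88, so a
% $100 loss feels roughly twice as intense as a $100 gain.
```

The concavity for gains and convexity for losses is what produces risk aversion over sure gains and risk seeking over sure losses described above.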
Kahneman, Daniel; Tversky, Amos (1979). “Prospect Theory: An Analysis of Decision under Risk”. Econometrica. 47 (2): 263–291.
Post, Thierry; van den Assem, Martijn J; Baltussen, Guido; Thaler, Richard H (2008). “Deal or No Deal? Decision Making under Risk in a Large-Payoff Game Show”. American Economic Review. 98 (1): 38–71
Pseudo Inefficacy/Psychic Numbing
People are less likely to take action when they feel their actions will not make a difference. Feelings of efficacy (power, agency, ability, and effectiveness) are important for increasing action. When a problem feels too big, or the devastation is hard to comprehend, we engage in psychic numbing (disengagement) as a cognitive defense.
Albert Bandura is one of the leading psychologists studying self-efficacy and its power to influence behavior (see Bandura, 1997; Bandura, 1999; Bandura and Locke, 2003). He writes: “Among the mechanisms of human agency, none is more central or pervasive than beliefs of personal efficacy. Whatever other factors serve as guides and motivators, they are rooted in the core belief that one has the power to produce desired effects; otherwise one has little incentive to act or to persevere in the face of difficulties. Self-efficacy beliefs regulate human functioning through cognitive, motivational, affective, and decisional processes (Bandura, 1997)” (Bandura and Locke, 2003, p. 87).
Bandura, A., & Locke, E. A. (2003). Negative self-efficacy and goal effects revisited. The Journal of Applied Psychology, 88(1), 87–99.
Bandura, A. (1999). Social Cognitive Theory: An Agentic Perspective. Asian Journal of Social Psychology, 2(1). https://doi.org/10.1111/1467-839X.00024
Social Identity Theory
People derive their sense of self from membership in a group or groups. How people define themselves tells us what they pay attention to, whom they trust, who their influencers are, and what their group’s norms are. We act differently toward people inside and outside our membership groups based on group status and affiliations, and we use cues and cultural symbols to assess who is in and who is out of the group.
Social Norms Theory
Social norms are informal and formal rules that govern how we act and what we see as normal and taboo. A social norms approach to change focuses less on changing beliefs and more on changing perceptions of what other people like us do. Our behavior is influenced by those around us. If we think something is a social norm (or becoming one), we will update our own actions to fit in.
Worldviews
People have different worldviews that guide how they think the world works and, therefore, which messages and solutions they support. Some people are more egalitarian and others more individualistic. Worldviews fall on a continuum, and what we believe can change by issue. It is important to identify where a community falls in order to build messages that will resonate.