Lately I’ve been thinking about the psychological aspects of anticipation and related practices. In the final chapter of my PhD thesis (which is under examination) I noted the need for further inquiry into the ways that knowledge practices are shaped by the cognitive tendencies and limits of human beings. Related to this, over the past month I purchased several books by psychologists, cognitive scientists and neuroscientists which I am slowly working my way through.
This post outlines and briefly considers one theory that strikes me as highly relevant: the theory of cognitive dissonance which was first proposed by Leon Festinger, an influential social psychologist.
Festinger originally developed the theory when he infiltrated and studied a group who believed the world would end on December 21, 1954 (see the book When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World). The surprising thing about this group is that the commitment of most of its true believers increased after the prophecy failed to come to pass, and they developed even greater faith in the group leader’s mystical abilities. Such behaviour doesn’t make any sense if we consider human beings to be rational creatures who – as independent rational agents – rationally evaluate their beliefs and make decisions in a logical manner.
A central claim that informs dissonance theory is the idea that “the human mind yearns for consonance and rejects information that questions our beliefs, decisions, or preferences” (Tavris & Aronson, 2015, p.299). In other words, “people strive to make sense out of contradictory ideas and lead lives that are, at least in their own minds, consistent and meaningful” (p.16). The theory proposes that when a person holds psychologically inconsistent cognitions this produces mental discomfort which the human mind will try to reduce as much as possible, such as by rejecting dissonance-creating information or by reducing the perceived credibility of such information. The cognitive process of self-justification and associated self-protecting cognitive biases (which can operate without our conscious awareness) are argued to be central to these mental processes. Tavris and Aronson (2015, p.12) argue that cognitive dissonance is a “hardwired psychological mechanism that creates self-justification and protects our certainties, self-esteem and tribal affiliations”.
Tavris and Aronson (2015) provide a simple illustrative example of a cigarette smoker who also believes that “smoking is a dumb thing to do because it could kill me”. Some such people will reduce the resulting dissonance by quitting, but others (e.g. if they tried to quit smoking but failed) may try to reduce dissonance by convincing themselves that smoking isn’t so harmful to their health and/or has redeeming benefits.
In their book on cognitive dissonance Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, Tavris and Aronson (2015) highlight the role of self-justification, which they argue is driven by the desire to reduce dissonance. They argue that self-justification is a major barrier to learning as it can prevent people from “even acknowledging that we made mistakes or foolish decisions” (p.41) and, consequently, it also impairs future action.
Of particular interest to me is Tavris and Aronson’s argument that “dissonance theory exploded the self-flattering idea that we humans, being Homo sapiens, process information logically” (p.21). Rather, they argue, the way we process information is determined by whether it is consonant with our dominant beliefs or produces dissonance. Related to this they argue that “in a sense, dissonance theory is a theory of blind spots – of how and why people unintentionally blind themselves so that they fail to notice vital events and information that might make them question their behaviour or their convictions” (p.54). The book presents a large number of fascinating case studies which the authors claim conclusively demonstrate this.
Numerous examples in sustainability contexts come to mind. For instance, climate denialists – particularly those who have publicly committed to such a position (e.g. on social media or in their communities) – are strongly motivated to dismiss the latest evidence of human-induced climate change (and/or evidence of its harmful effects) and to exclude from consideration forecasts of negative future scenarios and literature on such risks. Or we can consider peak oil activists, many of whom remain just as committed to their beliefs despite numerous failed predictions. Similarly, I cannot think of a single main participant in the ‘Cassandra/Cornucopian debate’ of the 1970s and 1980s (link) who significantly altered their views; most continued to reject the core arguments of their opponents and tended to dismiss or ignore disconfirming evidence.
Tavris and Aronson’s arguments about such information processing tendencies resonated with some of my earlier studies and work in marketing and advertising, particularly research I’ve read on how people process advertising and the mental processes that influence advertising effectiveness. For instance, some recent studies have challenged previous understandings of the cognitive processes involved, which underplayed the role of semi-automatic and automatic cognitive processes. Similarly, Tavris and Aronson argue dissonance reduction is a largely automatic cognitive process.
In my PhD research I reached similar conclusions, particularly with respect to interpretations of scenario plausibility. For example, whether a scenario was rejected or viewed as credible/plausible was influenced by whether it was consonant with the participant’s beliefs. These findings suggest that people process anticipatory knowledge in ways similar to other information.
Some further ideas on the relevance of dissonance theory are briefly noted below.
Expectation formation is something I think needs much greater attention, such as in relation to the beliefs that motivate people in sustainability-related movements. Dissonance theory might help shed light on the development of more extreme beliefs, why entrenched beliefs are so difficult to change, and what might enable people to review or alter their beliefs.
For example, over the past decade or so extreme positions have become much more common in environmental thought as more emphasis has been placed on the prospect of significant climate ‘breakdowns’, various forms of crisis and collapse that are argued to be imminent (e.g. related to resource limits), and/or a broader ‘planetary crisis’. In essence, many intellectuals and activists believe that the world stands on the brink of chaos and that major tipping points will inevitably be passed. Back in 2010, Australian scientist / activist Tim Flannery observed similar trends and argued that civilisational collapse style expectations/beliefs had become fashionable (Flannery, 2010).
Some such positions have a basis in science and evidence, and therefore should be taken seriously (not simply dismissed without due consideration). However, I’ve long suspected that more is often going on in such expectation formation, particularly psychological processes along with the influence of sociocultural factors.
Dissonance theory could be used or adapted to better understand this. It suggests that self-justification plays important roles, such as where expectation formation becomes more intense partly in response to peoples’ choices (e.g. to assure themselves they made the right call), and this may have reinforcing dynamics in terms of the strengthening of belief systems and subsequent actions. For example, consider the situation where someone quits a lucrative corporate career to join a climate change campaigning organisation. Plausibly they would avoid information which could lead them to question this decision, and also develop stronger beliefs about human-induced climate change, in order to avoid or reduce dissonance about whether they made the right career choice. Related to this, I’ve frequently observed that peoples’ beliefs often seem to become more extreme after they’ve made major life choices, perhaps to produce increased consonance between their actions and beliefs. Dissonance theory further suggests this often has more to do with reducing (or avoiding) dissonance than the rational basis of those beliefs.
Tavris and Aronson’s arguments in Mistakes Were Made (But Not by Me) further suggest that expectation formation should be understood as a gradual process. They point to a gradually intensifying sense of certainty which further closes the mind to other (future) possibilities and then shapes interpretations of relevant confirming and disconfirming evidence (as part of this process). If we consider current debates about renewable energy (e.g. see an earlier post), there is evidence of this. Such psychological mechanisms are a barrier to the open minds and greater humility many folks are calling for.
The question of whether, and under what conditions, Homo sapiens process information logically is also an important one to consider with respect to other expectation dynamics. For example, how open are people to disconfirming evidence (which questions their expectations)? Does exposure to such information cause dissonance and, if so, how do people try to reduce this and with what consequences? How do existing beliefs influence interpretations of ambiguous evidence (e.g. ‘weak signals’ of change) and related actor expectations? How are these psychological processes shaped by an individual’s membership of social groups? And, lastly, where information processing tends to be illogical, how can this be influenced (so that it’s more logical)? Is this possible?
Aside from insights into expectation dynamics, such inquiry could shed more light on the effects of forward-looking studies and inform the way projects are designed, conducted and reported.
There is also the obvious opportunity to study how people deal with failed prophecies. For example, many predictions have been made about major crises that will soon be encountered (e.g. due to physical limits) and related key claims, both in sustainability contexts and with respect to the employment implications of automation, as well as many other issues such as the dramatic predictions made about the Australian housing market (e.g. see this episode of Four Corners). Should such crises not eventuate when/as predicted, psychological theory provides the basis for a number of theoretically-informed hypotheses about peoples’ responses which could be explored.
Flannery, Tim. (2010), Here on Earth: An Argument for Hope, Text Publishing, Melbourne, Australia.
Tavris, C. & Aronson, E. (2015), Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, Mariner Books.