A new book popularising and discussing cognitive science, The Knowledge Illusion: Why We Never Think Alone, recently got my attention. It focusses on recent research findings that people tend to radically overestimate how much they know and, linked with this, greatly overestimate their knowledge of how things work (e.g. fairly simple things like how modern toilets work, as discussed in the book). Drawing on cognitive science, its authors Sloman and Fernbach suggest that human thinking is both powerful and shallow, and point to what they argue is the necessarily communal nature of intelligence and knowledge.
Steven Sloman and Philip Fernbach are cognitive scientists who point to what they describe as the “darker side” of cognitive science in terms of what such research has revealed. This research points to major cognitive limitations (e.g. finite memory, limited reasoning, etc.), and Sloman and Fernbach reach the conclusion that “we live in an illusion of understanding” and, therefore, don’t tend to “realize the depth of our ignorance” (p.10).
One of the key bits of the book that got my attention is their argument that “thinking is a social entity” (p.206). “Cognitively speaking, we’re a team” (p.263), they argue.
Related to this, they argue that we can’t (cognitively) deal with the actual complexity of the world, and consequently “instead of appreciating complexity, people tend to affiliate with one or another social dogma” (p.16). They then add:
Because our knowledge is enmeshed with that of others, the community shapes our beliefs and attitudes. It is so hard to reject an opinion shared by our peers that too often we don’t even try to evaluate claims based on their merits. We let our group do our thinking for us. Appreciating the communal nature of knowledge should make us more realistic about what’s determining our beliefs and values (p.16).
The book further develops this central concept of a community of knowledge. The essence of their idea is that people seek to “overcome the weakness and error inherent in our intuitive causal models by deliberating in step with our community” (p.80). The community of knowledge that one is part of “fills in the vast majority of details in our knowledge” (p.224). “Everyone’s understanding – that of scientists and non-scientists alike – is dependent on what others know” (p.224). A simple illustrative example is the way that people in a scientific discipline or field tend to influence and build on each other’s work, to the extent that attributing specific discoveries to particular individuals can often be difficult.
Being part of a community of knowledge can also be helpful in that one doesn’t have to keep all one’s knowledge in one’s own head, amongst many other benefits.
Sloman and Fernbach also suggest it contributes to other problems, such as science communication difficulties (e.g. where people’s beliefs are shaped more by the communities they’re part of than by “rational, detached evaluation of evidence”, p.160) and the intensification of opinion on political issues, which can become highly polarised. “The illusion of explanatory depth enables people to hold much stronger positions than they can support” (p.175), a tendency they suggest is further reinforced by communities of like-minded people.
Some of the broader implications are important. For example, they conclude that science literacy enhancement efforts need to focus more on addressing “the consensus of the community [that an individual is part of]” or, alternatively, must seek to “associate the learner with a different community” (p.163).
The short concluding chapter, subtitled “Appraising ignorance and illusion”, is a thought-provoking discussion. Sloman and Fernbach conclude that ignorance is both “our natural state” (p.257) and inevitable. The book spells out some of the many implications of being/living in such a ‘state’, such as the consequences of believing we understand things better than we actually do. But they also contend that such illusions can be positive in enabling or motivating action. For instance, illusions “can motivate us to attempt things we wouldn’t otherwise attempt” (p.261). Moreover:
The knowledge illusion gives people the self-confidence to enter new territory. Great explorers must believe they know more than they do to undertake novel adventures… Many great human achievements are underwritten by false belief in one’s own understanding (p.263).
If all this sounds interesting, I encourage you to check out The Knowledge Illusion. It assumes no prior knowledge of cognitive science and is thus quite an accessible book.