7 Comments

  1. You’ve hit on a number of behaviors that, in your final table of questions, show the value of brainstorming, collaboration, and (increasingly) cross-disciplinary research. There is a confluence of things that a pessimist might list as rhetoric, mythology, myopia, and politics, and which an optimist might instead list, respectively, as craft, modeling, focus, and goal-seeking. Every one of these things goes through fads, innovations, and phases, but clearly there is always plenty of room for debate. The most important thing to have available is a concept-testing environment. Part of that environment should be tools like the table of questions you have provided. Part of it should probably also be competition (with which we are extremely familiar in the form of the courtroom trial). And part of it is probably just the discipline of following clues (the key thing being that a clue may or may not be evidence, but you use it anyway): good old detective work. The direction of all this is toward arriving at a “good” hypothesis instead of a “preferred” hypothesis.

  2. Thanks for condensing these issues in a practical way. I’d like to use the table, with the proper acknowledgements of course, for the work we undertake to learn about the future.

    Heuristics are also a trap when trying to explore how the future may unfold. How can many of them be overcome? I’d be interested in your thoughts.

  3. Stephen McGrail

    Hi Sheryl, thanks for visiting. Yes, you’re most welcome to use the table etc.; acknowledgement is appreciated. I hoped others would find the analysis useful.

    Re: your question, Malcolm’s response contains some useful thoughts. I’ve noticed that some of these issues can be most problematic in homogeneous groups, or in organisations with strong cultures and/or hierarchies (which can lead to unchallengeable orthodoxies).

    I think building greater awareness of these issues is a good first step to overcoming and addressing them. Obvious, yes, but also fundamental.

    In my experience, denial can be overcome by creating ‘safe’ spaces in which people can discuss and examine their fears. In the school teachers example, this enabled them to begin to reframe those fears and move on to discussing opportunities.

    Re: denial, Ravetz and Ramirez discuss these issues in a paper on “feral futures” in the journal Futures (Volume 43, Issue 4). Specifically, they discuss how early warnings/weak signals can create ‘cognitive dissonance’, leading to denial. Their paper has some fascinating suggestions, such as the possible role of Zen mindfulness. It’s well worth a read.

    I’ve also seen that participatory scenario building exercises can be helpful for some of these issues (e.g. #5), if a sufficiently diverse group is involved. Adam Kahane’s books discuss this.

    Specific methods are also useful for countering particular futures-thinking issues. For example, horizon scanning (also termed ‘weak signal’ monitoring and analysis) can be used to counter context insensitivity. But it raises its own issues, such as whether the signals are taken seriously and the need to deal with potential information/cognitive overload (issue #4).

    Beyond this, institutional factors should be considered. In some organisations people feel unable to voice and explore “left-field” ideas, fearing they’ll lose credibility etc. Traditional forms of decision analysis also tend to privilege a focus on the “most likely future”, which can lead to an extrapolation approach and conventional thinking (e.g. with biases like zeitgeist bias coming into play). So cultural change is often part of the challenge.

    However, I would also caution that unpredictability must be taken more seriously (in my opinion). I agree with Nassim Taleb that true “black swan” events are inherently unpredictable and thus only make sense in hindsight, although many people will fraudulently claim they did predict them. Additionally, if we take complexity theory seriously, many complex adaptive systems will always be highly unpredictable, especially over the medium to longer term. So, perhaps, denial (#1) can also take the form of denying unpredictability.

    Are you able to share any of your strategies/practices? Also, Malcolm: anything to add?

    • Patrice Romzick

      Stephen,
      You’ve elevated my thinking in your post.
      I think you grasp the huge challenges of freeing up people’s thinking at the top of organizations while they still lead businesses whose people must focus their daily work on the here-and-now in order to deliver revenues, profits and customer service.
      Have you any advice on how leaders can bridge that gap? Can you point to any leader or organization that is in the process of navigating a major intervention while still keeping “the markets” satisfied?
      Thanks.

  4. Heuristics are an area where you are required to place a bet, but you do get to ask for new cards. That is, given the resources to keep making an effort, the best way to make heuristics useful is to run several heuristic models concurrently, and to run them all repeatedly. These days, the idea that heuristics are fundamentally (but not exclusively) formed through experience should simply embrace the possibility that multiple perspectives (frameworks of experience) can inform each other.

    I’m oversimplifying again, and borrowing from both chaos theory and soccer, but the continual comparison and overlaying provides an “attractor” that we can use as a more reliable pattern than a “trend”. In plain English, it is like saying that “tendencies” in an environment are more reliable than “trends” in a circumstance. It shifts thinking more towards environmental variables. So the practical aspect of this is to model environments and to compare different ideas of “rational behavior” that could be (or have been) occurring in a selected (modeled) environment. Not so different from what you’ve already pointed out, Stephen.

    I do think, however, that this advocacy of multiple perspectives raises the problem of understanding the difference between intelligence and expertise. Pushing this way out to the edge, I would use intelligence to try to forecast the environment; I would use expertise to try to forecast behavior. Intentionally different uses of different kinds of “information”. Within the organization, culture heavily predisposes how that information becomes available. If there are traps in heuristics, they may very well lie in attitudes people have about “information as property” or “information as assets”. For example, the trap here is in the terms offered for acquiring the information…
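
    A minimal sketch of what “running several heuristic models concurrently and comparing them” might look like in practice. This is purely illustrative and not from the comment above: the three toy forecasting heuristics and the sample data are invented, and the spread of their forecasts is used only as a rough signal of how much any single “trend” should be trusted.

    ```python
    # Illustrative only: three simple forecasting heuristics run concurrently
    # over the same series, compared to see where they agree (a "tendency")
    # and where they diverge (a warning against trusting any single trend).

    def naive_last_value(series):
        """Heuristic 1: tomorrow looks like today."""
        return series[-1]

    def moving_average(series, window=3):
        """Heuristic 2: tomorrow looks like the recent average."""
        recent = series[-window:]
        return sum(recent) / len(recent)

    def linear_trend(series):
        """Heuristic 3: extrapolate the average step-to-step change."""
        steps = [b - a for a, b in zip(series, series[1:])]
        return series[-1] + sum(steps) / len(steps)

    def compare_heuristics(series):
        """Run all heuristics on the same data and report their spread.

        A tight spread suggests a robust tendency; a wide spread is itself
        a signal that no single extrapolation should be relied upon.
        """
        forecasts = {
            "naive": naive_last_value(series),
            "moving_average": moving_average(series),
            "linear_trend": linear_trend(series),
        }
        values = list(forecasts.values())
        spread = max(values) - min(values)
        return forecasts, spread

    if __name__ == "__main__":
        history = [10, 12, 11, 13, 15, 14, 16]  # hypothetical data
        forecasts, spread = compare_heuristics(history)
        for name, value in forecasts.items():
            print(f"{name:>15}: {value:.2f}")
        print(f"{'spread':>15}: {spread:.2f}")
    ```

    The design choice mirrors the comment’s point: the output of interest is not any one forecast but the comparison across heuristics, repeated as new data arrives.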

  5. Great analysis, Stephen. Over time, one of the strategies I developed was to force myself to immerse myself, somewhat systematically, in information flowing from what I consider my “horror” perspectives: keeping abreast (somewhat) of the directly opposite viewpoints and/or psyches. It’s a painful way of getting a wake-up call about the diversity of the species, and I still wonder how much of a “reality” filter I then apply in the final analysis to screen out edge-dwellers who conflict with my views, minimising both their power and their potential activities, and thus their influence. Still, one can but try, and the indicators you describe are valuable.

    • Stephen McGrail

      Hi Karen, thanks for the feedback! Those are interesting personal foresight strategies – they sound a bit like strategies that can be used to avoid or reduce ‘confirmation bias’. That is, people often reach a conclusion FIRST and only thereafter gather facts in such a way as to support their pre-conceived conclusion. (I believe Richard Slaughter’s book Biggest Wake-up Call in History is a first-rate example of confirmation bias.) Two strategies to address this are: specifically looking for data that questions or contradicts your assumptions, hypotheses and conclusions; and making a careful habit of skeptically re-thinking your prior interpretations and conclusions.
