Some important reasons why most trend and futures analysis is of very little value

Having spent the best part of 15 years working in roles where I’ve conducted and taught trend analysis and scenario analysis, and having read the work of many others in this area, I’ve concluded that much of this sort of analysis (including a lot of my own work) is of very little value. This may seem like a strange conclusion to reach. This post outlines some of the reasons I’ve come to it, with a focus on the ways trend and futures analysis is often done, by whom, and with what effects.

The value

First, there is some value which I’d like to acknowledge. For example, these activities encourage greater awareness of change and the collection of related data, which can be useful (e.g. from a strategy or decision-making point of view). To the extent that such an “empirical attitude”, for lack of a better term, is fostered, great. Trend and ‘weak signal’ monitoring services and systems can be useful in this respect, and some academics have offered methods and frameworks for making these more robust (e.g. here). These approaches may, in some cases, have related benefits such as seeing new opportunities more rapidly than your competitors, or spotting risks earlier so as to enable a proactive approach.

In the courses I taught we certainly tried to achieve and to foster all of the above.

Important problems

Where these practices almost always fail, or at least fall down significantly, is when folk try to move from description to interpretation and then on to anticipation. For example, making “sense” of weak signals is extremely challenging – they can be strategic disinformation as often as useful strategic intelligence. More broadly, interpretation requires an understanding of why events occurred or why a particular trend has emerged. Such understandings can inform forward reasoning about whether the trend can reasonably be expected to continue or accelerate in the future, or whether the various ‘weak signals’ you’ve picked up are significant signals of particular future changes or events or, instead, can be ignored.

So the first key problem is that of meaningful interpretation. Most work tends to fall into the category of “armchair philosophising” rather than insightful analysis. Practitioners and executives try to do the work of social scientists and tend to fail miserably. In some cases, more informal data collection enabled by social networks and observational methods can improve this; most often, however, this aspect is very poor. One solution is to explicitly seek out and apply relevant social scientific theory, but this is difficult and challenging work that folk have little or no relevant training for.

More often than not, people fall back on basic heuristics such as ‘what goes up must come down’ and ‘what goes around comes around’ (both of which suggest skepticism about the continuation of a trend and imply a future counter-trend or a return to an earlier state), or the S-curve model of change, or ‘from little things big things grow’ (which takes an optimistic view of emerging novelties and suggests the opposite expectation to heuristics like ‘what goes up must come down’). Given the complexity that typically must be grappled with in applied social analysis, it is not surprising that simplifying heuristics are used.

Related challenges noted by Day and Schoemaker are that such analysis can be “compromised by information overload, organizational filters and cognitive biases”.

The second, closely related key problem is that of anticipation. For example, there is often a wide range of contrary yet legitimate views on the future of a particular issue or trend that the analyst needs to try to make sense of. Irreducible uncertainties also typically thwart such analyses and the search for clear answers or the certainties that decision-makers may be seeking – although most practitioners would argue that their methods and practices are about better coping with uncertainty, not removing it (something I wholeheartedly agree with). Most scenarios that I’ve read are fairly superficial, more or less entertaining “stories” told for a range of reasons (often driven more by the storyteller’s own agenda than by clear causal analysis), and they rarely provide new strategic insights.

What to do? What are some ways forward?

The first key point is that these tasks are too often underestimated. People often think that useful, robust scenarios can be developed in a one-day scenario workshop, or that anyone can be a skillful, insightful trend analyst. Neither assumption is true. Consequently, one part of a way forward is to not make these mistakes and to recognise the complexities and difficulties that are inherent to these tasks.

Personally, one task I assigned myself was to become more knowledgeable in applied social research methods and relevant theories (this will be a decades-long personal research project). Trend analysts make complex assessments of the social, psychological and cultural processes and factors related to the phenomena they are studying – often with little or no theoretical basis for their assessments.

An even more fundamental action is to recognise the need to develop a more sophisticated understanding of the nature of both “the social” (i.e., the social worlds that we study and observe, such as organisations, markets, communities and cities) and social scientific explanation. As the philosopher of the social sciences Daniel Little has argued, too many areas of social scientific research are “motivated by bad analogies with the natural sciences”, such as when false assumptions are made that “the goal of social science research should be the discovery of generalizations” across types of phenomena (like the physical laws developed by scientists), or when “facile assumptions [are made] about ‘social structures’ in analogy with ‘physical structures’” (the quotations are from this very interesting book chapter).

I’ve lost count of how many papers and analyses I’ve read in which practitioners uncritically apply social scientific theories (e.g. as part of a scenario study), mistakenly try to identify broad generalisations, or uncritically extrapolate these theories beyond the time and place in which they were constructed (technically termed the ‘external validity’ problem). The point is that unless we adequately engage with these issues we have little hope of deeply understanding the trends and changes we are studying (e.g. socio-cultural trends), meaningfully interpreting them, and anticipating future developments. More often, our “analysis” will be strategic disinformation and not a good basis for decision-making.

2 Comments

  1. Ruben Nelson

    First, I share the view that much (most?) “futures” work is somewhat shoddy. As one who has been in this work since 1960, I can say it was ever thus. Penetrating and grounded work has been and is still rare.

    Second, I agree that there are several reasons for this condition and for the lack of attention to it. Some follow:

    A. Stephen McGrail’s point that most practitioners have too thin a background in either the humanities or the social sciences, let alone both. Too true. A two-year “Masters” course is unlikely to do the trick. Jim Dator’s PhD program in Hawaii stands out as an exception.

    B. There are several reasons we as societies do not demand higher standards. A major reason is that in a democracy it is felt, rightly at some level, that the future belongs to all citizens. It should not be shut off behind a wall for “experts” only. Put another way, we have not figured out the role of experts in futures work vis-à-vis citizens. This is also true in many fields. Another reason is that as societies we do not yet believe that understanding “the social” requires the long and hard work that Stephen suggests. Sadly, in 2014, it does.

    C. The fact is that few of those who fund futures work know enough about reliable futures work to tell the difference between schlock and substance. Most folks judge the quality of the work by its sponsors. It seems reasonable to assume that if a major corporation, foundation, think tank, government agency or university has funded and published the work, then it must be sound, eh? Sadly, this is not the case. Equally sadly, few “futures professionals” are willing to bell the cat and say so. The result is that much “thin” futures work depends on and quotes other equally “thin” futures work. We take in each other’s laundry, so to speak. Of course, this is not unique to futures work, but we are among the culprits.

    D. Sadly, it is also the case that few significant organizations have a culture with enough of a commitment to learning to keep learning when learning is painful, embarrassing and even humiliating. So even most futures learning reinforces the deep and largely unseen trajectory of our organizations and whole societies.

    Enough for now. Thanks, Stephen, for opening up this line of thought. It matters. Our future may even hang on it. Or so it seems to me.

    PS If any reader is interested, I have written about the need for and nature of the next generation of foresight. See Futures, May 2010 or send me a note offline. rubennelson@shaw.ca

  2. Stephen McGrail

    Hi Ruben,

    Thank you for this insightful contribution – I find it heartening that others are thinking deeply about these issues. I also reviewed your paper on “Foresight 2.0” that you published in Futures. Some elements of the proposed “road ahead” resonated, including: the centrality of reflexivity; (re)formulating “foresight” practices with a social constructivist epistemology; and the importance of imagination (as per the discussion of the 9-11 Commission report).

    One comment I’d question is whether Jim Dator’s PhD program in Hawaii is an exception (re: practitioners having too thin a background in either the humanities or the social sciences). It clearly provides a deeper introduction to relevant social scientific knowledge and has some influential, insightful graduates. But it still seems too “thin” to me…

    Warm regards,
    Stephen
