Foresight is all about paying attention to certain insights, signals, and trends. It also means disregarding some sources of data and prioritizing others, correctly interpreting information to reach conclusions, and using all that data to make decisions about the future.
As with all thinking processes, we, as humans, tend to have all sorts of cognitive and emotional biases – flaws and prejudices in our mental models – when it comes to foresight. And these biases apply to every business leader, innovation manager, and trend scout.
Put another way – and sorry for the spoiler – there’s no such thing as the supposedly rational “homo economicus” and no leader, manager, or foresight practitioner is consistently rational.
For anyone attempting foresight, it’s crucial to recognize that biases are present every step of the way. Our cognitive and emotional biases are there when we spot future trends or emerging market needs, when we interpret these trends and needs, when we create and test scenarios, when we build strategies and plans, and when we make any future-related decisions.
Biases affect the insights gained through internally sourced knowledge or the signals gained from external sources, the content in trend and technology radars and how it’s evaluated, and the areas that gain the attention of decision makers when it comes to development, resource allocation, and investments (for example, emerging areas for R&D investments) in their organizations.
Emotional-cognitive biases influence the way in which we selectively search for and interpret data individually and in groups, and how we feel about that data.
The list of biases identified by different branches of science is lengthy: around 200. Some of these biases affect how we formulate beliefs, reason, make decisions, and behave in general, while others enhance or impair our ability to recall.
In their recent article, Schirrmeister, Göhring & Warnke (2020) identified 17 biases that influence one foresight method alone – the scenario process. These biases affect the smoothness of information processing during the scenario process, the influence of belief on information processing, the selection and integration of data, and interaction within groups.
The following are just a few examples of the biases inherent in the scenario process:
Information might be assessed as relevant solely on the basis that it is already known, whereas unknown alternatives might be deemed irrelevant (familiarity bias).
The present is seen as stable, and the future is expected to be a linear continuation of it (status quo bias).
Past events are seen as having been predictable and unavoidable (hindsight bias).
The probability of pleasant alternative futures is overestimated and/or the probability of unpleasant events is underestimated (optimism bias).
Information is selectively perceived or remembered and interpreted according to one’s expectations (confirmation bias).
Alternatives with uncertain outcomes are avoided whenever alternatives with secure outcomes are available (risk aversion).
Opposing opinions are self-censored within a group (groupthink).
Bias is as much a constituent of the foresight process as uncertainty and data. Although we cannot entirely avoid cognitive-emotional biases when thinking and planning for the future, we can become aware of the biases we display when embarking on foresight and making choices and decisions about the future, and we can use foresight to challenge choices and decisions we’ve already made. There are also ways to mitigate the harmful impact of our biases.
Here, we offer some practical tips on how to keep biases in check and minimize their negative impact during the foresight process.
Insights from the internal community and signals from external sources provide important data for the foresight process. To ensure that the collective insights and signals are not simply the “knowns,” not focused too narrowly, not limited to evidence of linear progression, and not cherry-picked, foresight teams and leaders could:
Frame the foresight scope broadly. For example, the future of mobility, rather than the future of cars, offers a wider breadth for gathering insights and signals.
Utilize both qualitative and quantitative data, and involve many stakeholders in the process.
Invite people to discuss, vote, like, and evaluate insights and trends. Here, platforms such as HYPE Strategy offer many features for engaging with the “hive mind.”
Trend and technology radars and scenarios can help to portray alternative futures and to make connections between future developments or uncertainties. To avoid establishing connections between unrelated topics, to avoid over- or underestimating the probability and impact of future developments, and to deal better with uncertainty, foresight teams and leaders could:
Invite internal and external subject-matter experts to evaluate the trends, technologies, and key uncertainties in the scenarios. For example, the Delphi method is a widely used way to engage with stakeholders with varied perspectives, to seek their input, and to arrive at a consensus. In scenario building or in evaluating the plausibility of two future outcomes happening around the same time, subject-matter experts can help to assess consistency among possible future events.
Share their analysis publicly and transparently for open discussion with customers/end-users, suppliers, and partners.
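The Delphi method mentioned above iterates anonymous expert ratings, with feedback between rounds, until the panel converges. As a rough, hypothetical sketch of the quantitative side of that loop (the function name, panel values, and the IQR-based stopping rule are illustrative assumptions, not part of any specific tool):

```python
import statistics

def delphi_round_converged(ratings, iqr_threshold=1.0):
    """Return (converged, median) for one Delphi round.

    A common quantitative stopping rule in Delphi studies: the panel is
    treated as having reached consensus once the interquartile range (IQR)
    of the ratings is small, i.e., the middle 50% of experts agree closely.
    """
    q1, _, q3 = statistics.quantiles(ratings, n=4)
    return (q3 - q1) <= iqr_threshold, statistics.median(ratings)

# Hypothetical panel rating the likelihood (1-10) of a trend materializing.
round1 = [3, 8, 5, 9, 2, 7]   # wide disagreement -> feed back and re-rate
round2 = [6, 6, 5, 7, 6, 6]   # ratings tighten after anonymous feedback

converged1, median1 = delphi_round_converged(round1)  # not yet converged
converged2, median2 = delphi_round_converged(round2)  # consensus reached
```

In a real Delphi study, the facilitator would share each round’s median and the anonymized reasoning behind outlier ratings, then re-run the round until the stopping rule is met or a round limit is reached.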
Foresight is seldom undertaken for the sake of “futuretainment.” Often, foresight feeds business decisions, such as strategy building or business development. To ensure diversity of ideas and opinions in decision-making, and to avoid the self-censoring of opposing opinions and the escalation of commitment (the sunk-cost fallacy), foresight teams and leaders could:
Give individuals time to write down their ideas and points of view before brainstorming or group discussion in a decision-making meeting.
Assign roles to different people involved in the decision-making. For example, in the “Disney method,” everyone looks at an idea or a decision from the point of view of a dreamer, realist, or spoiler.
Do a pre-mortem of the choices and decisions by imagining that they have failed significantly, and work backwards to identify the obstacles and risks that led to the imagined failure.
Biases cannot be removed from our thinking processes, but they can be recognized, openly shared, and, to a certain extent, overcome.
To spot, share, and overcome biases, consider trying out the “Stinky Fish” method in your next meeting or workshop. The stinky fish is a metaphor for “that thing that you carry around but don’t like to talk about – but the longer you hide it, the stinkier it gets.” By putting “stinky fish” on the table, team members can relate to each other, become more comfortable sharing and spotting one another’s biases, and uncover areas for learning and development.