Systems intelligent foresight

In December 2014, Torsten Slok from Deutsche Bank posted a graph showing how Wall Street economists had been consistently wrong in their forecasts of the 10-year interest rate. While the forecasts shown in the graph may be extreme examples of linear thinking about the futures, many forecasts fall into the same trap of simply extrapolating current trends or failing to account for the feedback and interaction between multiple factors.

Forecasts often show this type of "systems idiocy" quite clearly, but other examples are more subtle and have to do with implicit assumptions about the linearity of the future. These are often linked to somewhat taboo topics such as economic growth. The implicit assumption is that things will keep getting better or, in a more pessimistic or apocalyptic vision, that things will only go downhill. Attached to these are often one-solution-fits-all implications, such as "this technology will save us all or be the doom of us all".

Luckily, most foresight and futures thinking is not systems idiotic. A well-known example of applying systems thinking is the Limits to Growth report, which used system dynamics models to illustrate the consequences of prevailing thinking and to surface the counter-intuitive behaviour of the global economic system. Many of the methods used in foresight, such as the futures wheel, have elements that encourage systems thinking.
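To make the contrast between linear extrapolation and feedback-driven behaviour a little more concrete, here is a minimal, hypothetical sketch in Python. It is not the actual World3 model behind Limits to Growth; the function names, parameters and numbers are invented for illustration. It compares a single stock with a reinforcing growth loop and a balancing feedback from a finite carrying capacity against a naive straight-line extrapolation of its early trend, the kind of extrapolation the interest rate forecasts above fell into.

```python
# Illustrative toy model only: a single stock with a reinforcing growth
# loop and a balancing feedback from a finite carrying capacity,
# compared with a naive linear extrapolation fitted to its early trend.
# All names and parameter values are invented for this sketch.

def simulate_stock(years=80, initial=1.0, growth_rate=0.15, capacity=100.0):
    """Stock-and-flow loop: growth slows as the stock approaches
    the carrying capacity (a logistic-style balancing feedback)."""
    stock = initial
    trajectory = []
    for _ in range(years):
        trajectory.append(stock)
        inflow = growth_rate * stock * (1 - stock / capacity)
        stock += inflow
    return trajectory


def linear_forecast(history, horizon):
    """'Linear thinking' forecast: extend the latest year-on-year
    change as a straight line, ignoring all feedback."""
    slope = history[-1] - history[-2]
    return [history[-1] + slope * (step + 1) for step in range(horizon)]


if __name__ == "__main__":
    actual = simulate_stock()
    observed = actual[:40]                    # the data the forecaster has seen
    forecast = linear_forecast(observed, 40)  # straight-line extrapolation
    for year in (50, 65, 79):
        print(f"year {year}: feedback model = {actual[year]:6.1f}, "
              f"linear forecast = {forecast[year - 40]:6.1f}")
```

In this toy setup the feedback version levels off as it approaches the carrying capacity, while the straight-line forecast keeps climbing well past it; the point is only that even one balancing loop is enough to make trend extrapolation misleading.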

Systems thinking has been influential in foresight since the 1960s, and newer concepts such as "Innovation System Foresight" have also been proposed recently. Foresight thus has deep roots in systems thinking. The question, then, is to what extent it is systems intelligent. What can the concept of systems intelligence add to current foresight approaches?

Systems intelligence is defined as

...intelligent behaviour in the context of complex systems involving interaction and feedback. A subject acting with systems intelligence engages successfully and productively with the holistic feedback mechanisms of her environment. She perceives herself as part of the whole, the influence of the whole upon herself as well as her own influence upon the whole. Observing her own interdependency with the feedback-intensive environment, she is able to act intelligently.

The concept of systems intelligence was proposed in 2004 by Raimo Hämäläinen and Esa Saarinen, and it has been further developed in the systems intelligence research group at Aalto. Last November they published a book called “Being Better Better”, which elaborates on eight aspects of systems intelligence: systems perception; attuning to systems, others and self; reflection; positive engagement; attitude; spirited discovery; effective responsiveness; and wise action.

I think that foresight could benefit from a more systems intelligent approach. This applies both to thinking about futures, that is, to what future-orientation is, and to the practice of foresight, that is, the design, facilitation, implementation and documentation of foresight exercises. Below are six examples of what systems intelligence could mean in the context of foresight. All quotes are from the book “Being Better Better”.

Systems perception: the Bird and the Worm

"We act systems intelligently when we remind ourselves to take the perspective of both the bird and the worm, when we see the connections between ourselves and the big picture"

In foresight it is usually understood that there is a need to look at the bigger picture, to see the connections between different issues. This is the bird’s eye view. However, it is only one view of the topic – or of the system – and one that implies objectivity and control. The other, which is perhaps less present in foresight, is the worm’s eye view: the grassroots-level view connected to the everyday life of the people involved.

While there are approaches that also emphasise the individual – such as transformative scenario planning by Adam Kahane or Integral Futures by Richard Slaughter – much of foresight tends to objectify the future, to present images of the future that are somewhat general. The connection between the personal level and these general futures is often lacking. This comes up both in terms of scaling the scenarios and in assessing the impact of foresight. A systems intelligent view includes both perspectives and emphasises the connections between them.

Control and no control

"Part of being systems intelligent is acting as if we can make happen whatever it is we want to make happen, knowing that we cannot and yet being willing to work with whatever does happen"

The bird’s eye and worm’s eye views also point to different approaches to control. The systems intelligent view is that control is not possible, but that we should nonetheless aim to change the system and be sensitive to the changes in it. The European Foresight Platform describes foresight as thinking, debating and shaping futures. The systems intelligence view sheds new light on this: instead of being separate activities, thinking, debating and shaping are interconnected and can happen at the same time. The aim is thus to shape the future at the level of the worm through debating, informed by the bird’s eye view of thinking about the futures.

Systems intelligence also offers a slightly different view of shaping the future. It is not about shaping something that is disconnected from us: the system is not separate from us. This means that we can change it, but also that it can change us, or at least our behaviour. The process of shaping the futures is thus also a process of shaping ourselves and our behaviour.

While the coexistence of control and no control may seem paradoxical, aiming to achieve something and then adapting is actually quite natural to us in everyday life. We solve problems and learn. The paradox arises if the problem or topic in foresight is presented as external to the stakeholders, because then the emphasis is on control: shaping from the outside. But if it is approached as a process of living, feeling and learning as part of the problem and its context, then it makes more sense.

Celebrating the weird

“We have the capacity to celebrate the weird, the wonderful, the unique”

Connected to finding the balance between shaping and thinking in foresight is the question of exploration and exploitation: how much to open up and challenge, and how much to build on the existing or apply what is already known or anticipated. For example, Dator’s second “law” of futures studies states that “every useful idea about the future should appear to be ridiculous”. Despite this, there is often a tendency in foresight to focus on pressing present problems at the expense of exploring futures. Ridiculous ideas are analysed and every bit of value for the present that they might hold is squeezed out of them.

A systems intelligent view would be to celebrate the ridiculous and the weird for what they are: something we don’t quite understand because it is “from another world”. A positive attitude towards the weird makes it possible to build on it and to explore that world further, because positivity enhances creativity.

Foresight with Sisu

“To cultivate happiness about the future we can learn how to hope and how to talk ourselves out of pessimistic thoughts”

A positive attitude and positive engagement are also essential to building futures worth having. Currently there seems to be a tendency, at least in Europe, to paint pictures of doom and gloom, and for good reason. The economic situation is bleak after a long period of relatively steady growth, environmental problems are becoming more pressing and manifesting themselves in concrete ways, and the current political and financial systems seem unable to cope with these problems.

However, I don’t think these gloomy futures serve us as anything other than warnings – they tell us where not to go, not where to go. For that we need images of flourishing, prosperity and happiness. This does not mean ignoring the problems, but rather developing a path out of them, using the viewpoints mentioned above: celebrating the weird, acknowledging control and no control, and taking both a bird’s and a worm’s eye view.

This attitude of aiming for flourishing against great adversity could also be called foresight with Sisu. Sisu is defined by Emilia Lahti, a researcher specialising in Sisu, as “extraordinary determination, courage and resoluteness in the face of extreme adversity”. It is more than perseverance or grit. It is about acknowledging the present challenges but still working towards a goal despite the overwhelming obstacles. I think this is what current foresight needs.

Holding back

“We co-create a system, a less than optimal way of interacting where everyone is holding back, without seeing our own part in that”

The last two examples focus more on what systems intelligence could contribute to the practice of foresight. The first is the concept of systems of holding back. This means that each person acting in the system is unaware that they are maintaining a situation that no one, including themselves, would prefer. In other words, they are holding back from doing something that would be beneficial for everyone. Not apologising, not complimenting, not being positive, not being kind to each other – these are all examples of holding back.

In a foresight process, the facilitators need to be sensitive in identifying systems of holding back and work to deconstruct them. This means encouraging openness, the sharing of ideas, listening and so on – for example, encouraging participants to move from promoting their own agendas and ideas to being open to the ideas of others.

Crazy wisdom

“Pushing people out of their comfort zones into new behaviors is called “crazy wisdom” – the ability to be shocking, colorful, dramatic, and wise as one awakens others to new possibilities”

Deconstructing systems of holding back also means creating an atmosphere where flourishing, flow and creativity are allowed. When people feel safe to explore, they can be pushed out of their comfort zones, where new behaviour can occur, giving rise to changes in the system between the participants. Foresight practitioners thus need to embrace “crazy wisdom” and engage positively in the situation.

To conclude, the idea of systems is nothing new in foresight. However, it might be beneficial to enhance it with the concept of systems intelligence. Systems intelligence is well aligned with existing foresight approaches, emphasising participation, thinking about connections, creativity and so on, but it also highlights aspects that have not been so prominent in foresight. These aspects include, for example:

  • highlighting the importance of emotions and affect both in thinking about the futures and in the situation of a foresight exercise,

  • understanding system-level behaviour as something that emerges out of the interaction between individuals who are part of that system,

  • attuning to the situation, to others and to self when engaged in foresight,

  • having a positive attitude and embracing hope in order to create futures worth having, and

  • putting more focus on the individual level and providing a frame to relate the individual and system levels.

In other words, systems intelligence can contribute both to thinking about the futures and to the practice of foresight.