🧩 Predicting The Unpredicted

Oliver Jack Dean

Many of the decisions we make when managing and designing systems - whether running empirical experiments or solving software networking problems - are based on "mechanical" prediction.

Actually, in general, nearly every critical decision made when building complex systems requires some element of prediction.

Karl Popper's hypothesis that the "act of prediction makes a difference in what is predicted" has split my brain down the middle here. And whilst walking through the forest yesterday evening, I stopped and contemplated this nugget of deterministic wisdom. I guess Karl kind of inspired me for this post. Many thanks to Karl.

Determinism is a broad and messy subject matter.

If I were to speak about this topic with my uncle, a project manager, Donald Trump, or some Greek philosopher, all of them would lecture for days.

So, I've written this article hesitantly.

For example, whether identifying the cost of risk for a new product feature to be integrated into an extensive, complex software system, or establishing policies for future natural disasters or terrorism - the steps towards deciding how to formulate a solution require some sort of predictive analysis.

But let's come back to managing, designing, and engineering systems.

Often, when designing and building systems at scale, some recurring anti-patterns and difficult situations plague nearly every product team working on complex systems design:

  • We need to predict how much effort is required for work items. If the "prediction" goes wrong, designers and other implementers will have wasted effort solving a problem that no one ever actually encounters (see the sketch after this list).
  • We need to understand whether, if a particular event or risk occurs in the future, our predicted solution will actually work when it does. And if the anticipated event is never encountered, the course of action we prepared for it only introduces accidental complexity.
  • If our predicted course of action or solution toward a potential risk is misaligned or not fit for purpose, our design and engineering teams need to pivot in another direction.
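To make the first anti-pattern concrete, here is a minimal sketch of why single-point effort predictions mislead. It assumes each work item can be given optimistic, most-likely, and pessimistic estimates (the task names and numbers below are entirely hypothetical); summing only the "most likely" values ignores the right-skewed tail that tends to dominate real delivery dates.

```python
import random

# Hypothetical work items: (optimistic, most likely, pessimistic) effort in days.
# These names and numbers are illustrative, not from any real project.
tasks = {
    "api-gateway": (2, 4, 12),
    "schema-migration": (1, 3, 10),
    "feature-flagging": (1, 2, 6),
    "load-testing": (2, 5, 15),
}

def simulate_totals(n_runs=10_000):
    """Monte Carlo totals: sample each task from a triangular distribution."""
    totals = [
        sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks.values())
        for _ in range(n_runs)
    ]
    return sorted(totals)

naive = sum(mode for _, mode, _ in tasks.values())
totals = simulate_totals()
print(f"Naive 'most likely' total: {naive} days")
print(f"Simulated median:          {totals[len(totals) // 2]:.1f} days")
print(f"Simulated 90th percentile: {totals[int(len(totals) * 0.9)]:.1f} days")
```

The simulated median already sits above the naive total because the distributions are right-skewed, and the 90th percentile is worse still. None of this makes the prediction "correct" - it just makes the uncertainty visible instead of hiding it behind a single number.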

Now things are getting rather complex

I am more or less talking about many different types of systems here.

But when it comes to managing complex systems, such as software - I am sceptical of those who apply a predictive top-down approach above and beyond other methods.

The well-known objection, provoked again by Karl Popper, is that the idea of "total knowledge" or "total self-knowledge" of some system is, in principle, incoherent - and so cannot ground accurate predictions and decisions.

Over the years, I have come to believe this is somewhat true within the realm of systems design as well.

How often do we see designers, engineers, or managers estimate risk on "gut instinct", only for that instinct to be proved wrong or totally wide of the mark within a year?

This is one of many interesting scenarios that come up when studying the importance of team cohesion, collaboration, coordination, and knowledge sharing within teamwork frameworks.

At face value, then, when designing and building complex systems, there are situations in which a team of individuals simply has to accept spontaneity.

It's well known that when building complex systems or designing or planning engineering projects, individuals or groups cannot be self-consciously aware of all possible end-states ahead of time.

And even if teams were to attain some level of accuracy - it's often not enough - or as Popper conveys, there are "too many dynamic multivariate coefficients to predict consequences or developments with absolute certainty".

Exploiting systemic surprise

Going further, Popper claims that knowledge of a "systemic reality" is widely dispersed and cannot be reduced to codified "reason" or "plans".

When analysing centrally planned economies and their eventual collapse, Friedrich von Hayek reiterated the same concept throughout his economic systems theories at the end of the 1980s.

And as the small list of anti-patterns above conveys - making a "systemic reality" conform to a "central all-seeing-eye plan" is pretty damn hard.

Perhaps this is my proper Keynesian English self coming out? Uh oh.

Many a Keynesian understands that a capitalist economy will not run smoothly for indefinite periods, no matter how one sets the variables.

So, to make progress when managing complex interactive systems - especially in facilitating team cohesion and predictable decision-making - tradeoffs ought to occur.

Now, most of the tradeoffs occur when handling the flow of knowledge and information across distributed workstreams.

Numerous leading social scientists have asserted that organisational knowledge, and the tradeoffs and decisions made based on such knowledge, should be regarded as a "strategic asset".

And I think there is a growing awareness of the value of gathering, locating, capturing, and sharing collective knowledge and expertise when designing and building complex systems.

So, that's cool 👍

Change management

Often the best recipe for managing change and risk in complex environments is through mutual explanation, conversation and knowledge sharing.

What I like to call "mutual knowledge adjustment".

That is, finding methods to capture the minor, iterative adjustments in understanding and reconciliation that, over time, help product teams more accurately predict or estimate future system impacts.

Estimating how systems will evolve, and where risks or issues will occur in the future, can only be done when teams embrace dynamical complexity and the changing correlations across workstreams.
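As a minimal sketch of why those correlations matter, consider a handful of workstreams whose overruns are coupled through a shared factor (the budget and distributions below are hypothetical). Estimating each stream in isolation - the independent case - badly understates the chance of a system-wide surprise once coupling grows.

```python
import random

def breach_probability(correlation, n_streams=5, n_runs=10_000, budget=5.0):
    """Estimate P(total overrun > budget) across coupled workstreams.

    Each workstream's overrun is a standard normal; `correlation` between
    streams is induced via a shared common factor (one simple coupling model).
    """
    breaches = 0
    for _ in range(n_runs):
        common = random.gauss(0, 1)
        total = sum(
            correlation**0.5 * common + (1 - correlation)**0.5 * random.gauss(0, 1)
            for _ in range(n_streams)
        )
        if total > budget:
            breaches += 1
    return breaches / n_runs

for rho in (0.0, 0.3, 0.7):
    print(f"correlation={rho:.1f} -> P(total overrun > budget) = "
          f"{breach_probability(rho):.3f}")
```

With the same per-stream estimates, the aggregate breach probability climbs by roughly an order of magnitude as the correlation rises - which is exactly the dynamic a workstream-by-workstream prediction misses.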

How to do this is already a growing research area, with multiple overlaps across different scientific and engineering disciplines.

Regardless, it's imperative to have modern process management tools in place to enable product teams to embrace and extract dynamic information about the current state of a complex system's design.

Yet, quite often, many organisations with complex teams struggle with this aspect - a kind of weird product-team pragmatism gets in the way.

Pragmatic systems design

Pragmatic systems design integrates mechanisms that help teams or individuals gradually build knowledge of why and where systemic change comes from, and the contexts in which such risks or changes were created.

There is growing interest in moving away from mechanistic, first-principles approaches to systems design and maintenance, thanks to state-of-the-art tooling, towards more dynamic "drift-flux" approaches - enabling teams to cope with unforeseen change and continuous decision-making.

Now, don't get me wrong. The objectives of what I have described in this article are wide-ranging.

But no doubt, the ability to orchestrate change to a system's design across complex team workstreams is contingent on the relationships with the people involved in implementing the change.

The notion of mechanical top-down approaches toward systems design - particularly in software - and placing too much weight on rigorous predictive analysis will increasingly become a trend of "yesterday" across many industries over the next decade.