Causality Is Not A Way To Make Sense Of The World
Or rather, it’s the pattern behind many different ways to make sense of the world, not the one specific way it is often claimed to be.
As humans in an incredibly complex world, we have to find ways to make sense of it, to filter the many, many ways things influence each other into a small-enough subset for us to actually do something with. We can’t make decisions unless we have a model of reality that is small enough for us to make predictions with, small enough for us to have a reasoned guess at what will happen when we take one of the many actions potentially available to us.
Causality is one such way to reduce the number of things we need to take into account, at least when viewed in the context of decision making. It does so by segmenting the world into causes and effects, and telling us that if we want more or less of the effects, we need only worry about the causes. But causality by itself is too vast to be useful: every event has uncountable effects, and also uncountable causes.
Some of those causes are present, but we can’t do anything with them, or use them to make any other decisions. The famous meteorologist Edward Norton Lorenz1 made this point with his oft-quoted example: “a butterfly flapping its wings in Brazil can produce a tornado in Texas”. The cause is there; the event would not have happened the way it did if not for the butterfly, but we… can’t really do anything with this information. No potential action can be eliminated from consideration based on the information available about butterfly populations in Brazil.
Niklas Luhmann2, a German sociologist of the last century, argued that causality as a concept therefore cannot on its own narrow the space of potential actions enough to act, and theorised that it only becomes useful as a tool for this once it is paired with other specific ways of reducing and managing complexity, such as applying axioms to gain “objective truth”, or adding and applying values, roles and norms.3
In doing so, we separate the causes “worth caring about” from the “noise”, despite all of them being, objectively, causes of what we are examining. Where this becomes exciting is that these additional ways of managing complexity are very much not uniform across people, demographics, fields, domains, and so on, and so different groups arrive at different subsets of causes that are “worth caring about”.
This naturally creates tension. For a lawyer, the most important causes might be about legal exposure and liability; for a psychologist, they might instead be about the effects on the involved persons’ mental health; and for an engineer, about how the mechanics of it all fit together. People from certain cultural backgrounds might focus more on individual action and reasoning, others on collective performance.
To tie this to Resilience Engineering and incident analysis, this is where Root Cause Analysis runs aground. Its fundamental proposition, finding the “one cause worth caring about”, is impossible, and so it instead creates contention about how truth is constructed in social systems. If the system demands a Root Cause, it will get one, but what gets elected as that cause becomes a social cage match about making your values and sense-making relevant to others, rather than being at all concerned with learning from the incident.
The Resilience/Human Factors/Safety community came to this conclusion entirely without invoking obscure, untranslated books by deceased German sociologists, and instead advocates figuring out what happened and gathering as much information about it as possible. This cleverly shortcuts the entire problem: rather than having one group perform the complexity reduction on behalf of others who might not agree with their ways of reducing it, the analysis simply presents what actually happened. That way, readers can apply their own heuristics, values and rules, find insight in the incident analysis, learn from it, and hopefully help prevent similar incidents from occurring.
1. Who did a lot of really interesting things and founded predictive meteorology, and in doing so caused a lot of research and results to happen, one of which was the coining of the term “Butterfly Effect” after that same quote. If you want a nerd rabbit-hole, his work in mathematics, computers and systems theory is an interesting one. ↩
2. If you’ve been around me recently, you will probably know too much about him already; I have not yet shut up about how fascinating his work is. ↩
3. Niklas Luhmann, Macht im System, p. 44. Untranslated, from the original German. ↩