The development of pu-software - blog 3
The third blog in the series "The gap between system and reality", about why working with computers is so frustrating.
The gap between system and reality
The behaviour of cows is unpredictable, as you could read in my previous blog "Cows and not-knowing".
And when that behaviour leads to changes in the ecosystem, the ecosystem is unpredictable too.
That's what I learned in my internship.
After my internship I kept seeing the same pattern.
The behaviour of individual people is unpredictable, especially when a lot of people interact with each other.
This leads to a completely unpredictable society.
It goes beyond the well-known chaos theory.
Chaos theory says that a limited number of causes can create complex, unpredictable effects when those causes act non-linearly (interact with each other).
Our society does not have a limited number of non-linear causes; it has millions.
Our society is more than chaotic: you could call it deep complexity.
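The chaotic part of this is easy to see in a classic toy model. The sketch below (my own illustration, not from this blog series) uses the logistic map: one simple, deterministic, non-linear rule, where a tiny difference in the starting value grows into a completely different trajectory.

```python
# Logistic map: x_next = r * x * (1 - x).
# One simple, deterministic rule; with r = 4 it behaves chaotically,
# so tiny differences in the starting value are amplified at every step.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2000001)  # differs by one ten-millionth

# After 50 steps the two runs have drifted far apart,
# even though the rule itself is trivially simple.
print(abs(a[-1] - b[-1]))
```

And that is the point of the text above: this is *one* non-linear cause behaving unpredictably. A society has millions of them, interacting.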
Our minds work best in simple, linear circumstances.
There we can analyse, research and understand a situation, find its causes, build machines, or build hierarchical organisations.
Our minds work far less well in complex, non-linear situations.
We have developed a bias for what we do understand.
We tend not to see what we do not understand; we are a bit blind to it.
It has brought us a lot. We now can create computers for example.
Computers are simple. They follow simple rules, and they are very good at it and very fast.
In a clear, linear situation that is properly analysed and then well programmed, computers are very useful.
But computers ignore non-linear complexity: they contain a simplified version of the processes in our reality, a version in which the non-linearity is flattened.
It actually is a good thing that computers function in such a simple way. It is what makes them so powerful.
Problems only arise when they have to function within the deep complexity of a society.
When people do not fit the computer's labels, but depend on them.
When computer systems communicate with each other, but use slightly different definitions.
When billions of computers are tied together and a problem in one system can spread to the others.
Then a gap arises, between the simplified world of computer software and the deep complexity of society.
The gap can be bridged somewhat, by admitting that it exists and by making it visible when it occurs.
Computer systems can be built in such a way that people can indicate when they see the gap appearing.
By storing doubt about the functionality and the quality of a system, it is possible to make the gap visible.
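What could "storing doubt" look like in practice? A minimal sketch, assuming one possible design of my own (not the author's actual proposal): every stored value gets room for free-text doubts, so a user can record that reality did not quite fit the label instead of silently squeezing it in.

```python
# A minimal sketch of storing doubt alongside data (my own illustration,
# not a design from this blog series): each field can carry notes from
# users who saw the gap between the system's label and reality.

from dataclasses import dataclass, field

@dataclass
class RecordField:
    name: str
    value: str
    doubts: list = field(default_factory=list)  # free-text notes from users

    def add_doubt(self, note: str) -> None:
        """Record that reality did not quite fit this value."""
        self.doubts.append(note)

    @property
    def certain(self) -> bool:
        """A field is only 'certain' while nobody has registered doubt."""
        return not self.doubts

# Usage: the label "household: single" does not fit a co-housing group.
household = RecordField("household", "single")
household.add_doubt("Applicant lives in a co-housing group; 'single' is a poor fit.")
print(household.certain)  # False: the gap is now visible in the data itself
```

The design choice that matters here is not the code but the principle: the system admits, in its own data, that its labels are a simplification.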
The gap between system and reality is manageable when systems and users take into account that it exists.
How can that be done? I'll tell you in blog 4.