Entropy, Uncertainty and Causality

Diego Tsutsumi
7 min read · Mar 23, 2021

This article talks about how awareness of three existing scientific concepts can help you make better decisions in business, improve the quality of your relationships, and serve whatever other uses you come up with. These concepts appear in many scientific fields; although I mention their usage in science, I'm intentionally not calling for a scientific usage here, but rather using them to promote a different way to view life situations.

These concepts are not new. To some people they might be well known, to others not, so let me introduce them first and then we can relate and link them back to real-life situations.

Entropy


Entropy is originally defined in thermodynamics as a measure of the disorder of a system: the higher the entropy, the higher the level of randomness and disorder among its particles. Entropy is often generalized to other fields to refer to the level of chaos and disorder.

A formal mathematical generalization of entropy is used in Information Theory to model noisy communication channels and compression algorithms. You have probably seen an analog or digital TV displaying noise because its antenna wasn't properly set; wireless communication channels show erratic behavior when transmitting information from one side to the other.

Compression algorithms, on the other hand, like the *.zip generators on your computer, treat the entropy of the information as its "real information content", the "surprise" or the "unexpected", in order to get rid of the redundant bytes.
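
To make "surprise" concrete, here is a small sketch of Shannon's entropy formula (the function name and example data are mine, not part of any compression tool): redundant data scores near zero bits per byte and compresses well, while varied data scores high and resists compression.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: low for redundant data,
    high for 'surprising' data that is hard to compress."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A single repeated byte carries no surprise at all: zero entropy.
print(shannon_entropy(b"aaaaaaaaaaaaaaaa"))  # 0.0

# All 256 byte values equally present: maximum surprise for bytes.
print(shannon_entropy(bytes(range(256))))  # 8.0
```

A zipper can throw away almost everything in the first input but almost nothing in the second; that is the entropy acting as the "real information content".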

Other informal generalizations of entropy are made in diverse topics to refer to mess, clutter or disorganization. For instance, "this music has too much entropy in it", or "person X adds too much entropy to meetings". Many of those usages intend to "look smart"; that's not the intent here. I am going to use the concept later on to make good use and sense out of it, so stick with me!

Disorder, randomness and unpredictability are features of entropy that fit well into Probability Theory. Probability distributions, random processes and sampling allow us to understand, model and predict entropy; that's why it is the framework used to explain it.

I don’t want to get too technical, and I promise entropy is the most technical one, so let me jump to the next concept!

Uncertainty


The easiest way of explaining uncertainty is by explaining its opposite, certainty. When someone is certain of something, she believes she has all the knowledge and information to assert that thing. I can tell it was certainly sunny yesterday because I went to the beach to sunbathe (how I wish I could!).

Uncertainty, on the other hand, is the acknowledgement that I don’t have enough information and knowledge to assert anything. #warning People can be certain of something and still be wrong about it. You can imagine there are tons of advantages to acknowledging a lack of knowledge: when we take it into account we can reason differently about situations and take different actions that generally produce better outcomes.

Now, certainty is not black and white; we are able to infer answers we don’t fully know from other things we might know. I might not know 100% that my wife would like a red purse as a gift, but I know many other things that make me 80% certain she won’t like it (I hope she is not reading this! :P). At other times we might not even infer, but rather rely on whatever beliefs we have to make those predictions.

One of the good frameworks used to model uncertainty in the sciences is Bayesian Statistics, which again uses probability theory as its main basis. Bayesian statistics is all about having a first belief about something and updating this belief with new observations (which I love, and it’s very related to my other article Building on the Top of Yourself). Depending on how relevant the new observations are, you might choose to drastically change the current belief, not change it at all, or something in between. It’s definitely a powerful concept used to build A.I. systems.
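
As a toy illustration of that update cycle (the numbers, the purse scenario and the function are invented for this sketch; a Beta-Binomial model is just the simplest textbook choice): the prior encodes the first belief, and each observation shifts it without erasing it.

```python
# Minimal Bayesian updating sketch with a Beta-Binomial model.
def update_belief(prior_alpha: int, prior_beta: int,
                  successes: int, failures: int) -> tuple[int, int]:
    """Return the posterior Beta parameters after new observations."""
    return prior_alpha + successes, prior_beta + failures

# Prior belief: fairly sure my wife dislikes red purses, mean = 2/(2+8) = 0.2
alpha, beta = 2, 8

# New evidence: she admires red purses in 3 out of 4 shop windows.
alpha, beta = update_belief(alpha, beta, successes=3, failures=1)

# The belief moved toward the evidence, but the prior still pulls it down.
print(alpha / (alpha + beta))  # posterior mean = 5/14, roughly 0.36
```

Notice the behavior the paragraph describes: strong evidence would drag the belief far, weak evidence barely moves it, and with no evidence the prior stands.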

Causality


Causality is the interpretation of happenings in a cause-effect way. For example, when I say "if I stop working I stop receiving money", I’m understanding that the money in my account is an effect of my work (the cause). We can use cause-effect pairs to interpret and understand how everything works around us.

This is what business people do when creating services and products: they work out the causes of the problems their customers have, so that they can solve them. This is what scientists do when creating theories: Isaac Newton came up with a set of cause-effect pairs to explain how classical mechanics works, where gravity fields cause forces (the effect) on matter close to one another.

I’m sure you’ve already heard something similar to "fix the root cause and everything is sorted". The quest for the root cause of problems is very common, and often difficult, in businesses; it requires a lot of data, investigation, gut feeling and rationale. That’s also what detectives use when they are figuring out crimes: finding the source of a money laundering scheme can be very challenging.

The famous sentence "correlation does not imply causation" is enough to say that statistics alone cannot deal with causality. Currently, there is a lot going on in businesses and the scientific community to combine statistics with other causal frameworks to build better AI systems (I recommend the book "The Book of Why" by Judea Pearl, super interesting).

That’s a topic for another time, but the truth is that the world can be interpreted as happenings of causes and effects. Many times we do it unconsciously, but I find it important to be aware of our unconscious cause-effect interpretations so we can change them as we evolve.

Why are these concepts relevant at all? How are they connected? Good questions. The point is that they refer to completely different things, and our main mistake is to mix them up unconsciously! We usually misjudge, misunderstand and confuse the three of them, leading to all sorts of problems in life, such as in personal and professional relationships. I personally believe this is a big source of problems in business, and here is why.

Uncertainty is generally confused with Entropy. I had a business partner once who believed that all our failures came from "this market is too hard to enter and predict, no one can sell in it, we need a different strategy". In fact, looking back at it, I’m now convinced that it was our lack of knowledge that made it hard to sell. The dynamics of that market were not particularly "messy" or "risky"; our lack of knowledge in the company made us fail.

In other words, the entropy of the market was not high; our uncertainty was. Another way to view this confusion is to look at the two classical distinct interpretations of probability, frequentist and Bayesian. If I state "there is a 60% chance that we succeed", that could be a degree of belief (uncertainty, Bayesian) or a sales metric (entropy, frequentist). We generally don’t disambiguate between the two because it’s all about the percentage, and not what it means.
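
To see how easily the two readings collapse into one number, here is a deliberately simple sketch (the sales figures and the Beta(3, 2) prior are made up for illustration): the same "60%" can come out of counted data or out of a belief with no data behind it.

```python
# Two ways the same "60% chance of success" can arise (hypothetical numbers).

# Frequentist (entropy-flavored): a long-run frequency measured from data.
sales_outcomes = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]  # past deals: 1 = won
frequentist_p = sum(sales_outcomes) / len(sales_outcomes)

# Bayesian (uncertainty-flavored): a degree of belief, e.g. a Beta(3, 2)
# prior chosen before seeing any sales data, whose mean is also 0.6.
bayesian_p = 3 / (3 + 2)

print(frequentist_p, bayesian_p)  # 0.6 0.6
```

Same number on the slide, completely different meanings: one can only shrink by the market becoming less random, the other can shrink simply by learning more.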

This confusion generally leads to wrong causal models. Our course of action in that case led us to fail because our understanding of cause and effect in that context was wrong. In a very basic and simplified way, we thought that a subscription business model (cause) would make sales increase (effect), and that turned out not to be true.

The same applies to personal relationships! People are chaotic and erratic, we are often unpredictable, and adding uncertainty to it makes things even worse. But let me give you another example: a manager is leading a meeting with other employees and at some point he says, "The team did not meet the company standards last month; we need to be more attentive to the standards. Next week there will be a training to make sure we all know how to proceed". These words alone carry a bunch of assumptions about the team and a causal model: the training (cause) will improve the standards (effect).

Asserting a cause-effect model about people to their faces is a potential source of conflict; in this case the team might feel bad and internalize conflicts that will influence future relationships, even if the assumptions are true. People in the team might not know enough about themselves and are probably not aware of their entropy and uncertainty levels, but they have their own causal models.

By being aware of the three concepts, the manager could approach it differently and say: "From previous months' evidence I can tell something happened and our standards level has been oscillating (entropy increased). There are other factors to take into account that I still don't know (uncertainty), like employees' workload balance, machines and work conditions, which you can help me with. But what I think we need is more training to improve our standards (explaining the cause-effect model)".

You could generalize this to personal relationships too. I really think it’s tough to be aware of this confusion in daily life, though at the same time the most brilliant people I’ve gotten to know are the ones who clearly separate these concepts in their minds. So I want to finish with some conclusions to let us move forward:

  • If you ever think "there is a chance of this happening", ask yourself if the chance exists because you don’t know enough (uncertainty) or because there is intrinsic randomness (entropy) in the happening. It’s also useful to know that entropy doesn’t change when you acquire new knowledge.
  • Be aware of the cause-effect models you are making; even unconscious ones strongly guide the actions of our daily lives.
  • Approach professional and personal relationships aware that you and other people have your own entropy (mood, etc.), uncertainties and cause-effect models.


Diego Tsutsumi

I am an engineer passionate about Artificial Intelligence and what its applications can do to change the world for the better.