Entropy For Dummies: How To Do It The Easy Way

In this essay on entropy for dummies, I will focus on a generalised framework that enables the reader not only to easily understand the scientific notion of entropy, but also to benefit from that knowledge in practical, real-life applications.

In my previous essay on entropy, I covered Claude Shannon’s framework for entropy (information theory). While that framework is still useful for practical applications, it requires the reader to have a fundamental understanding of how logarithms work.

As opposed to that framework, the one that I will be presenting in this essay requires that the reader only be familiar with grade-school-level math and logic. Without any further ado, let us begin.


A Game with a Bag of Coins

Let us start with a thought experiment. I give you a bag containing one hundred coins. Each of these coins has two sides: heads and tails. Now, I request you to exuberantly shake the bag and then unload the whole lot onto the floor.

As you can imagine, this would cause the coins to randomly bump against each other, roll over, get flipped, etc., before they land on the floor. Regardless of what happens, each of these coins only has two possible outcomes: heads or tails.

Now, what if I told you that all 100 coins landed heads? You would be surprised, right? Ask yourself why. Why would that surprise you? You might answer that it is an unlikely event. But what makes it an unlikely event?

The answer to that question is key to understanding (and benefiting from) the scientific notion of entropy. So, let us dive into it a little deeper.

Entropy for Dummies: Heads Vs. Tails

The first step to understanding why you’d be surprised is to recognise that there is only one way in which a set of 100 coins can be arranged such that each shows heads: every single coin must land heads up. It might sound like I’m stating the obvious here, but bear with me.

Heads Vs. Tails — Illustrative art created by the author

Now, consider a scenario where after you unload the bag, 99 coins turn up heads and one coin lands tails. You would still be surprised, but a little less surprised than the case with all heads (and zero tails). Why? Let’s go through the same exercise.

There are 100 different ways in which a set of 100 coins can be arranged such that there is exactly one tail and 99 heads: one combination where the first coin lands tails, another where the second coin lands tails, and so on. Now that you know this, let’s keep going.

There are 4950 different ways in which a set of 100 coins can be arranged such that there are 2 tails and 98 heads. Here’s something even more interesting. What if I told you that after you unloaded your bag of 100 coins, 5 of them landed tails and 95 of them landed heads?

You’d still be surprised, but not nearly as surprised as when I told you that all 100 landed heads. Can you guess how many ways a set of 100 coins can be arranged such that there are 5 tails and 95 heads? Around 75 million ways.
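
If you would like to check these counts yourself, here is a minimal Python sketch. It relies only on the standard-library function math.comb, which counts the number of ways of choosing k items out of n; the variable names are purely illustrative.

import math

TOTAL_COINS = 100

# Count the distinct arrangements of 100 coins containing exactly k tails.
for k in (0, 1, 2, 5):
    arrangements = math.comb(TOTAL_COINS, k)
    print(f"{k} tails, {TOTAL_COINS - k} heads: {arrangements:,} arrangements")

# Output:
# 0 tails, 100 heads: 1 arrangements
# 1 tails, 99 heads: 100 arrangements
# 2 tails, 98 heads: 4,950 arrangements
# 5 tails, 95 heads: 75,287,520 arrangements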


Expectation Vs. Surprise

When we expect something and it happens, we are, by definition, not surprised. So, for something to surprise us, it has to defy expectations. In other words, it has to be unlikely or improbable.

In the context of the bag of coins, notice how there is an inverse relationship between your surprise at the outcome and the number of combinations in which you can arrange the corresponding result.

For instance, there is only one way to arrange 100 heads. If all 100 coins landed heads, you would be very surprised. There are around 75 million ways to arrange 5 tails and 95 heads.

If 5 tails resulted after you unloaded the bag, you would be less surprised. In short, the higher the number of possible combinations for your result, the lower your level of surprise.

Notice also that your level of surprise depends only on the overall ratio of heads to tails. You are not interested in the result of any individual coin. For instance, it would not surprise you if I told you that the 31st coin landed tails and the 50th coin landed heads.

Now that we have clarified the relationship between the expectation and surprise of the outcome of our little game, we are finally ready to tackle entropy.

Entropy for Dummies — How to Do it the Easy Way?

Imagine that I told you that after you unloaded your bag of 100 coins, 50 of them landed heads and 50 landed tails. Would you be surprised? Maybe, but you would be significantly less surprised than if I told you that all of them landed heads or that only 5 of them landed tails.

Now, in how many ways can a set of 100 coins be arranged such that 50 of them show heads and 50 show tails? Ready for it? Here is the number of ways to do this: 100,891,344,545,564,193,334,812,497,256.

In other words, landing 50 tails and 50 heads is roughly one hundred billion billion billion times more likely than getting all heads (which can be done in only one combination). This is the reason why you would be surprised if I told you that all of the coins landed heads!
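
For the curious, that figure (and the rough ratio) can be reproduced with the same math.comb sketch from earlier; again, this is only an illustration.

import math

ways_all_heads = math.comb(100, 0)     # zero tails (all heads): exactly 1 arrangement
ways_fifty_fifty = math.comb(100, 50)  # 50 tails, 50 heads

print(ways_fifty_fifty)                    # 100891344545564193334812497256
print(ways_fifty_fifty // ways_all_heads)  # ~1.01 x 10^29, i.e. roughly one hundred billion billion billion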

The key point to note here is that each possible outcome has a different number of combinations. Some outcomes have more combinations than others, and this makes them more likely than the others.

We also tend to expect the outcomes with more combinations (like 50 heads/50 tails), and we are surprised by outcomes that have fewer combinations (like all heads).

The scientific notion of entropy for a given outcome configuration is just the number of possible combinations in which that outcome can occur (or be expressed).

For example, the outcome of all heads has an entropy of 1, whereas the outcome of 50 heads/50 tails has an entropy of about one hundred billion billion billion. The lower the entropy, the higher our surprise, and vice versa. Now that we know this, let us see how we may use it in real life.
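
Under the simplified definition used in this essay, entropy is nothing more than a count of equivalent arrangements. A tiny Python sketch makes this explicit; the function name coin_entropy is just an illustrative choice, not standard terminology.

import math

def coin_entropy(total_coins: int, tails: int) -> int:
    """Entropy of an outcome, counted here simply as the number of
    equivalent arrangements (combinations) that produce that outcome."""
    return math.comb(total_coins, tails)

print(coin_entropy(100, 0))   # all heads -> entropy 1 (biggest surprise)
print(coin_entropy(100, 50))  # 50 heads, 50 tails -> entropy ~1.01 x 10^29 (least surprise)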


How to Use Entropy in Real Life

Let us now move away from coins (which have only two states) to molecules (which have more states). All of a sudden, we are able to predict which configurations are more likely than others.

If you think about it, we are (generally) not interested in how each individual molecule behaves, but in how a bunch of them behave. Since molecules have more possible states, some configurations have vastly more combinations than others, and those are the configurations we are most likely to observe.

Combining this property with the notion of entropy, we are able to predict the behaviour of a collection of molecules (like water bodies). Of course, there are physics and thermodynamics concepts that link entropy with how molecules behave. These links are beyond the scope of this essay (I will cover them in a future essay).

But for now, it is sufficient if you understand that we can use entropy to predict how a group of random entities can behave. This also involves some mathematics and statistics, which I will also be covering in a future essay.

Final Comments

We started out with a thought experiment involving a random unloading of 100 coins. Then, we established a relationship between our expectation/surprise level and the outcome of the experiment.

Furthermore, we established that some outcomes are more likely (expected) because they feature many possible configurations. In contrast, some outcomes are less likely (surprising) because they feature fewer possible configurations.

Entropy for dummies — Illustrative art created by the author

We established that entropy is a measure of the number of possible configurations (equivalent combinations) that a particular outcome has. Finally, we extended the notion of entropy from coins to molecules and to predicting the physical behaviour of collections of molecules such as water bodies.

As a final note, I find it interesting that we, as human beings, are somehow attracted to low entropy states. A part of this realisation lies in the fact that high entropy states are expected; they are normal; nothing special; dime-a-dozen. A low entropy state represents something non-random; something that demands causal curiosity; something that begs questions.

Picture a highly cluttered desk with untidy stacks of paper and dusty books — a high entropy state. Now, picture a neat and tidy desk with books and paper-stacks meticulously arranged on the far ends — a low entropy state. Which image draws your interest and curiosity more?


Reference and credit: Brian Greene.
