Chaos In Perception: The Subjective Nature Of Entropy - A block of ice on the left; a glass of water at the centre; a steam engine on the right. Ice has lower entropy than water, and water has in turn lower entropy than steam.

After reading my essay on How To Use Entropy Like A Pro In Thermodynamics, reader and patron Mark Copenhaver asked me to address the subjective nature of entropy. This set me thinking, and I believe that I have some insights to offer on this topic.

Let us start with the premise of subjectivity. Can a scientific measure such as entropy be subjective? Is science not about being objective? At a high level, these are the questions that I will be tackling in this essay.

But at a lower level, it turns out that this topic leads to a set of highly interesting meta-discussions about the properties of entropy. Without further ado, let us begin.


The Fundamentals of Entropy — Revisited

Why don’t we start by considering the example that I discussed in Entropy For Dummies? We shake a bag of 100 fair coins exuberantly and unload them onto the floor. Now, consider a scenario where all 100 coins land heads.

Chaos In Perception: The Subjective Nature Of Entropy — Coins spilling from a green bag of money. The text ‘Heads Vs. Tails’ is written beside the bag.
Heads Vs. Tails — Illustrative art created by the author

The number of combinations that lead to 100 heads is just one (each coin lands heads). There exists no other such combination. Next, consider a scenario where we end up with 99 heads and 1 tail. There are 100 combinations that lead to this “state”: the first coin could be the one that lands tails, or the second coin, and so on.

Finally, let us consider a scenario where we end up with 50 heads and 50 tails. Any guesses on how many combinations are possible for this “state”? The answer is: 100,891,344,545,564,193,334,812,497,256.

This disproportionate number tilts the chances in its favour. In other words, when we unload the bag of 100 coins, an exact 50–50 split is 100,891,344,545,564,193,334,812,497,256 times more “likely” than a 100–0 split, and the overall result is overwhelmingly likely to land close to an even split. In this case, the 50–50 split would be a high entropy state (at the limit), whereas the 100–0 split would be a low entropy state (at the limit).
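
If you would like to verify these counts for yourself, here is a minimal Python sketch (my own addition, not part of the original coin example) that uses the standard-library binomial coefficient:

```python
from math import comb  # binomial coefficient, available in Python 3.8+

# Number of coin arrangements ("combinations") behind each result for 100 fair coins
print(comb(100, 0))   # 100 heads, 0 tails  -> 1
print(comb(100, 1))   # 99 heads, 1 tail    -> 100
print(comb(100, 50))  # 50 heads, 50 tails  -> 100891344545564193334812497256
```

The three numbers match the counts quoted above.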

In this example, entropy measures the number of combinations in which a particular result can occur. This makes for a reasonable start to revisiting the fundamentals of entropy. However, we could do even better. Cue Ludwig Boltzmann.

The Boltzmann Framework for Entropy

In the Boltzmann framework, when we talk about a scenario where we end up with 99 heads and 1 tail, we are talking about a “macro” state. When we talk about potential combinations that lead to this state, we are talking about “micro” states. When the first coin lands tails, it represents a microstate. When the second coin lands tails, it represents yet another microstate.

In other words, a macrostate with 99 heads and 1 tail has 100 possible microstates; a macrostate with 98 heads and 2 tails has 4950 microstates. And so on.

In this framework, entropy is nothing but a measure of the number of microstates per macrostate (formally, Boltzmann defined it as proportional to the logarithm of this count). We are still dealing with the same phenomenon, but we are now defining entropy in terms of microstates and macrostates.
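
To make this concrete, here is a small Python sketch (my own illustration, with the Boltzmann constant set to 1 for simplicity) that counts the microstates W behind a few macrostates of our 100 coins and computes the corresponding Boltzmann entropy S = k · ln W:

```python
from math import comb, log

def microstates(total_coins: int, tails: int) -> int:
    """Number of microstates W behind the macrostate 'exactly this many coins show tails'."""
    return comb(total_coins, tails)

def boltzmann_entropy(w: int) -> float:
    """Boltzmann entropy S = k * ln(W), with the constant k set to 1 here."""
    return log(w)

for tails in (0, 1, 2, 50):
    w = microstates(100, tails)
    print(f"{100 - tails} heads / {tails} tails: W = {w}, S = {boltzmann_entropy(w):.2f}")
```

The 100–0 macrostate has a single microstate and therefore zero entropy, while the 50–50 macrostate, with its astronomically many microstates, tops the list.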

I have heard that this framework helps people wrap their heads around the numbers better. So, going forward, I will be sticking with it. Having revisited the fundamentals using the Boltzmann framework, we are now ready to discuss the subjective nature of entropy.


Chaos in Perception: When Combinations Mislead

I mentioned in the introduction that Mark Copenhaver requested me to address the subjective nature of entropy. While this is true, he did more than that. It is just that we had not set the stage to address his concerns yet. Here is a quote from Mark Copenhaver’s original comment:

“In the coin example, we think that the observed state has higher probability than all heads but that’s only when we choose to observe classes of state rather than specific states.

Whatever the actual final state of the coins, that state has precisely the same probability as the all-heads state.”

Let me try to unpack what Mark means here. So far, when considering the microstates that lead to a particular macrostate, we have been indifferent to the outcome of each individual coin. For instance, consider the macrostate with 99 heads and 1 tail. In counting the microstates for this macrostate, we treat every coin as an equally valid candidate for being the one that lands tails.

However, Mark is questioning the very premise. Why is it that we choose the macrostate such that it allows for all these microstates? Would it not be possible to “define” a macrostate such that it considers only the scenario where the first coin lands tails and all other coins land heads? In such a case, the number of possible combinations would be just 1.

In fact, in this paradigm, ALL macrostates would have just one possible combination; each macrostate is also a microstate. And if we chose to compute the entropy, every one of these “states” would come out with exactly one microstate and, therefore, exactly the same entropy. So, what gives?
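
Before we answer that, it may help to see Mark’s point in code. The toy Python sketch below (my own illustration, shrunk to 10 coins so that every outcome can be enumerated) computes the entropy of the most populous macrostate under two different labelling schemes applied to the very same set of outcomes:

```python
from itertools import product
from collections import Counter
from math import log

# All 2**10 = 1024 possible outcomes (microstates) of tossing 10 coins
outcomes = list(product("HT", repeat=10))

# Scheme 1: label each outcome by its number of heads (the usual macrostates)
by_head_count = Counter(outcome.count("H") for outcome in outcomes)

# Scheme 2: label each outcome by its exact sequence (every macrostate is its own microstate)
by_exact_sequence = Counter(outcomes)

# Boltzmann entropy (k = 1) of the most populous macrostate under each scheme
for name, grouping in (("by head count", by_head_count), ("by exact sequence", by_exact_sequence)):
    w = max(grouping.values())
    print(f"{name}: largest W = {w}, S = {log(w):.2f}")
```

Under the exact-sequence labelling, every “macrostate” contains exactly one microstate, so every state ends up with the same entropy and the measure stops distinguishing anything at all.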

The Subjective Nature of Entropy — Revealed

To make sense of what is going on here, let us go back to the origins of entropy. The problem that Boltzmann and co. were trying to solve, the one that eventually led to the concept of entropy, was improving the efficiency of the steam engine.

Consider the comparative illustration below. It shows qualitatively that a glass of liquid water has more entropy than a block of frozen water (ice), and that the steam expelled by a steam engine has, in turn, more entropy than the glass of liquid water. In this case, temperature and the state of matter are the main factors driving the difference.

Chaos In Perception: The Subjective Nature Of Entropy — A block of ice on the left; a glass of water at the centre; a steam engine on the right. Ice has lower entropy than water, and water has in turn lower entropy than steam.
Entropy: ice < water < steam — Illustrative art created by the author

Expressed otherwise, “we” define the concept of entropy such that its macrostates bundle together enough microstates to differentiate between ice, water, and steam. Why did Boltzmann and co. do this? Well, it helped them understand the problem of the thermodynamic efficiency of the steam engine better.

And, because of the magic of statistics and the law of large numbers, the scale we used for these macrostates helped us “predict” which macrostates were more likely without using heavy computational resources.
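
As a rough illustration of that statistical “magic” (again my own sketch, not part of the original argument), we can compute directly how much of the probability piles up near the even split:

```python
from math import comb

total_outcomes = 2 ** 100  # every possible sequence of 100 fair coin tosses

# Fraction of outcomes whose head count lands within 10 coins of a 50-50 split
near_even = sum(comb(100, heads) for heads in range(40, 61))
print(f"P(40 <= heads <= 60) = {near_even / total_outcomes:.4f}")
```

Roughly 96 percent of all possible outcomes land within just ten coins of the even split, which is why the coarse “number of heads” macrostates are so predictive even though they ignore the identity of each individual coin.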

If I were to generalise this situation, nature does not care about entropy. As far as nature is concerned, no entropic macrostate exists; only the current microstate exists. “We” define what the macrostate is. By our very nature, we choose macrostates that make sense for our human scales, because this helps us solve our day-to-day engineering and scientific problems.

This, in turn, makes the very notion of entropy subjective and relative! Before you shoot the messenger here, let me also show you that the subjective and relative nature of entropy is actually a feature and not a bug.


The Entropic Reservoir

One of the most powerful realisations that we derive from the notion of entropy is the following fact:

“Systems with low entropy tend towards states of higher entropy until entropic equilibrium is reached.”

This realisation helps us understand (and cope with) the dreaded arrow of time, why things break, why we age, etc. But hidden deep underneath this realisation is also a very subtle but very powerful framework: the entropic reservoir.

We, as human beings, are endlessly and romantically obsessed with low entropic states. A beautiful glass jar is more attractive to us than a shattered glass jar; a well-oiled productive machine is more attractive than scattered pieces of scrap metal; a charged battery is more attractive than a discharged battery. You get the picture.

One could argue that the very life-purpose of most human beings revolves around the act of reducing the entropy of systems. Are you trying to invent something that solves a problem? You are trying to reduce entropy. Are you trying to improve the efficiency of a process? You are trying to reduce entropy.

In order to reduce the entropy of systems, we (necessarily) draw from reservoirs of low(er) entropy. Are you trying to blow glass? You need a concentrated source of heat that can sustain the same temperature range over a prolonged period of time. Are you trying to recharge a battery? You need a source of precise electrical output that forces the electrons to flow against their electrochemical gradient.

Without these “reservoirs” of lower entropy, we could not do useful work. And that is precisely why we invented the notion of entropy: useful work.

To do useful work, a subjective scale for entropic macrostates is useful, not a hindrance. All you need to do is gauge the entropy of two systems and use the lower entropic system in some way to reduce the entropy of the higher entropic system, performing something useful for humanity in the process.

But then again, this begs the question: If we use up all the reservoirs of low entropy, what would we have left? Are we not playing some sort of game of roulette here?

The Big Bang and the Heat Death of the Universe

Physicists have theorised that our universe began with the big bang. This would have been the lowest entropic state of our universe, which helps us make sense of how the entropy of our universe has only been increasing ever since.

Even though the average entropy of the universe is constantly increasing, there exist local systems of “relatively” lower and “relatively” higher entropy. Arguably, “life” itself is one such lower entropic system.

Fast forward to the present day: we make use of excellent low entropy reservoirs like fossil fuels to do useful work. By burning fossil fuels, we are increasing the entropy of these systems and are able to reduce the entropy of other systems that benefit our relatively short lives (like transportation).

But as we all know, these low entropic reservoirs are limited. As fossil fuels run scarce, where do we turn to (as a species)? The best low entropy reservoir we know of, of course: The sun! It is not going to matter in your lifetime or mine, but the entropy of the sun is increasing too. In due time, it will also “die”.

Physicists theorise that the average entropy of the entire universe will keep increasing until it hits an equilibrium state known as the heat death of the universe. The term “heat death” here might be misleading. “Heat” refers to the irreversible heat loss from thermodynamic processes. In simple terms, heat death means that no more useful work would be possible. This means the universe would no longer feature local systems with lower or higher entropy.

To tie this back to the subjective nature of entropy, at this stage, the universe would feature a macrostate that is also the microstate. At least, that is what physicists theorise. But I can think of at least one alternate future.

The Future of the Universe

This essay has been a roller coaster ride; we started with tossing coins and have now reached the death of our universe. This makes things too sad for my liking. So, I would like to end on a brighter note. Let us once again go back to what drove Boltzmann and co. to discover entropy: the steam engine.

To compute entropy, we set the scales of the macrostates to something that made sense to us. These are human scales. What physicists theorise presently is based on these scales. In other words, we are extrapolating human scales to a universal scale. It is prudent to assume that we are missing information between the two scales (and entropy, in the statistical view, is precisely a measure of the information we are missing about a system’s exact microstate).

Fractal art – Cross stitch curve (video and application created by the author)

In my eyes, there is a possibility that the relative nature of entropy IS the norm. What if this relativity of entropy is fractal? Sure, it could be that “our” universe is headed for heat death on “our” scales. But for a higher dimensional being (like future artificial intelligence), it could be yet another local entropic state that just needs some “working” from another lower entropy system.

I strongly feel that we know too little to be talking about the death of our universe. If we know too little, we might as well be more optimistic about the (distant) future. It helps us stay humble and do our best to perform useful work in our limited lives, for we are blessed low entropic systems!


