Chaos In Perception: The Subjective Nature Of Entropy
Published on July 29, 2023 by Hemanth
--
After reading my essay on How To Use Entropy Like A Pro In Thermodynamics, reader and patron Mark Copenhaver requested me to address the subjective nature of entropy. This set me thinking, and I believe that I have some insights to offer on this topic.
Let us start with the premise of subjectivity. Can a scientific measure such as entropy be subjective? Is science not about being objective? On a high level, these are the questions that I will be tackling in this essay.
But on a lower level, it turns out that this topic leads to a set of highly interesting meta discussions about the properties of entropy. Without any further ado, let us begin.
Why don’t we start by considering the example that I discussed in entropy for dummies? We shake a bag of 100 fair coins exuberantly and unload them onto the floor. Now, consider a scenario where all 100 coins land heads.
Heads Vs. Tails — Illustrative art created by the author
The number of combinations that leads to 100 heads is just one (each coin lands heads). There exists no other such combination. Next, consider a scenario where we end up with 99 heads and 1 tail. There are 100 combinations that lead to this “state”: The first coin could land tails or the second coin could land tails, and so on.
Finally, let us consider a scenario where we end up with 50 heads and 50 tails. Any guesses on how many combinations are possible for this “state”? The answer is: 100,891,344,545,564,193,334,812,497,256.
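These counts are easy to verify in code. A quick sketch, using Python's built-in `math.comb` to count the arrangements for each scenario:

```python
from math import comb

# Microstate counts for macrostates of 100 fair coins,
# given by the binomial coefficient C(100, k) for k tails.
all_heads = comb(100, 0)     # 100 heads, 0 tails
one_tail = comb(100, 1)      # 99 heads, 1 tail
fifty_fifty = comb(100, 50)  # 50 heads, 50 tails

print(all_heads)     # 1
print(one_tail)      # 100
print(fifty_fifty)   # 100891344545564193334812497256
```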
This disproportionate number tilts the odds heavily in its favor. In other words, when we unload the bag of 100 coins, a 50–50 split is 100,891,344,545,564,193,334,812,497,256 times more likely than the all-heads result. In this sense, the 50–50 split is a high entropy state, whereas the 100–0 split is a low entropy state.
In this example, entropy measures the number of combinations in which a particular result can occur. This makes for a reasonable start to revisiting the fundamentals of entropy. However, we can do even better. Enter Ludwig Boltzmann.
The Boltzmann Framework for Entropy
In the Boltzmann framework, when we talk about a scenario where we end up with 99 heads and 1 tail, we are talking about a “macro” state. When we talk about potential combinations that lead to this state, we are talking about “micro” states. When the first coin lands tails, it represents a microstate. When the second coin lands tails, it represents yet another microstate.
In other words, a macrostate with 99 heads and 1 tail has 100 possible microstates; a macrostate with 98 heads and 2 tails has 4950 microstates. And so on.
In this framework, entropy is a measure of the number of microstates per macrostate. (Formally, Boltzmann defined entropy as proportional to the logarithm of this count: S = k_B · ln W, where W is the number of microstates.) We are still dealing with the same phenomenon, but we are defining entropy in terms of microstates and macrostates.
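As a minimal sketch of this framework (working in units where Boltzmann's constant k_B = 1, so entropy is simply the natural log of the microstate count):

```python
from math import comb, log

def boltzmann_entropy(n_coins: int, n_tails: int) -> float:
    """Entropy of a coin-toss macrostate, in units of k_B: S = ln(W),
    where W is the number of microstates (coin arrangements)."""
    w = comb(n_coins, n_tails)  # microstates for this macrostate
    return log(w)

print(boltzmann_entropy(100, 0))   # all heads: ln(1) = 0.0
print(boltzmann_entropy(100, 1))   # 99 heads, 1 tail: ln(100), about 4.6
print(boltzmann_entropy(100, 50))  # 50-50 split: ln(C(100,50)), about 66.8
```

The logarithm is what makes entropy additive across independent systems, which is why Boltzmann's formula uses ln(W) rather than W itself.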
I have heard that this framework helps people wrap their heads better around the numbers. So, going forward, I will be sticking with this framework. Having revisited the fundamentals using the Boltzmann framework, we are now ready to discuss the subjective nature of entropy.
Chaos in Perception: When Combinations Mislead
I mentioned in the introduction that Mark Copenhaver requested me to address the subjective nature of entropy. While this is true, he did more than that. It is just that we had not set the stage to address his concerns yet. Here is a quote from Mark Copenhaver’s original comment:
“In the coin example, we think that the observed state has higher probability than all heads but that’s only when we choose to observe classes of state rather than specific states.
Whatever the actual final state of the coins, that state has precisely the same probability as the all-heads state.”
Let me try and unpack what Mark means here. So far, when considering microstates that lead to a particular macrostate, we have been indifferent to the outcome of each coin. For instance, consider the macrostate with 99 heads and 1 tail. In considering the microstates for this macrostate, we give equal preference to the possibility that each coin could land a tail.
However, Mark is questioning the very premise. Why is it that we choose the macrostate such that it allows for all these microstates? Would it not be possible to “define” a macrostate such that it considers a scenario where the first coin lands a tail and all other coins land heads? In such a case, the number of possible combinations would be just 1.
In fact, in this paradigm, ALL macrostates would have just one possible combination; every macrostate is also a microstate. And if we choose to compute the entropy, every one of these "states" would get the same value: one microstate each (or zero on Boltzmann's logarithmic scale, since ln 1 = 0). So, what gives?
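Mark's point can be made concrete in code. The microstates themselves are fixed; what changes is how we group them into macrostates, and the entropy we compute depends entirely on that grouping. A hypothetical sketch contrasting the two conventions:

```python
from math import comb, log

N = 100  # coins

# Convention 1: macrostate = "number of tails" (the usual coarse-graining).
# The macrostate with k tails contains C(N, k) microstates.
def entropy_by_tail_count(k: int) -> float:
    return log(comb(N, k))  # S = ln(W), in units of k_B

# Convention 2: macrostate = the exact sequence of outcomes.
# Every macrostate contains exactly one microstate, so W = 1 for all of them.
def entropy_by_exact_sequence() -> float:
    return log(1)  # always 0: no macrostate is "more likely" than another

print(entropy_by_tail_count(50))   # about 66.8 under convention 1
print(entropy_by_exact_sequence()) # 0.0 -- entropy vanishes under convention 2
```

Same coins, same physics, two different entropies: the difference lies entirely in the choice of macrostate.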
The Subjective Nature of Entropy — Revealed
To make sense of what is going on here, let us go back to the origins of entropy. The problem that Boltzmann and co. were trying to solve, and which eventually led to the discovery of entropy, was improving the efficiency of the steam engine.
Consider the comparative illustration below. It describes qualitatively that a glass of liquid water has more entropy than a block of frozen water (ice), and steam expelled by a steam engine, in turn, has more entropy than the glass of liquid water. In this case, temperature and state of matter could be the main causal factors.
Entropy: ice < water < steam — Illustrative art created by the author
Put differently, "we" define the concept of entropy such that its macrostates allow for enough microstates to differentiate between ice, water, and steam. Why did Boltzmann and co. do this? Well, it helped them better understand the problem of the thermodynamic efficiency of the steam engine.
And, because of the magic of statistics and the law of large numbers, the scale we used for these macrostates helped us “predict” which macrostates were more likely without using heavy computational resources.
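The "magic of statistics" here is just the law of large numbers in action. A quick simulation (a sketch, using Python's `random` module with a fixed seed for reproducibility) shows that repeated shakes of the bag cluster tightly around the 50–50 macrostate:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible
trials = 10_000

# Count heads in each "shake" of 100 fair coins.
head_counts = [sum(random.randint(0, 1) for _ in range(100))
               for _ in range(trials)]

# Almost every shake lands within ten coins of the 50-50 split...
near_half = sum(1 for h in head_counts if 40 <= h <= 60)
print(near_half / trials)  # typically above 0.95

# ...while the 100-0 macrostate essentially never appears
# (its probability is 1 in 2^100 per shake).
print(head_counts.count(100))
```

No heavy computation was needed: the sheer disproportion in microstate counts does the predicting for us.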
If I were to generalise this situation, nature does not care about entropy. As far as nature is concerned, no entropic macrostate exists; only the current microstate exists. “We” define what the macrostate is. By our very nature, we choose macrostates that make sense for our human scales, because this helps us solve our day-to-day engineering and scientific problems.
This, in turn, makes the very notion of entropy subjective and relative! Before you shoot the messenger here, let me also illustrate to you that the subjective and relative nature of entropy is actually a feature and not a bug.
The Entropic Reservoir
One of the most powerful realisations that we derive from the notion of entropy is the following fact:
“Systems with low entropy tend towards states of higher entropy until entropic equilibrium is reached.”
This realisation helps us understand (and cope with) the dreaded arrow of time, why things break, why we age, etc. But hidden deep underneath this realisation is also a very subtle but very powerful framework: the entropic reservoir.
We, as human beings, are endlessly and romantically obsessed with low entropic states. A beautiful glass jar is more attractive to us than a shattered glass jar; a well-oiled productive machine is more attractive than scattered pieces of scrap metal; a charged battery is more attractive than a discharged battery. You get the picture.
One could argue that the very life-purpose of most human beings revolves around the act of reducing the entropy of systems. Are you trying to invent something that solves a problem? You are trying to reduce entropy. Are you trying to improve the efficiency of a process? You are trying to reduce entropy.
In order to reduce the entropy of systems, we (necessarily) draw from reservoirs of lower entropy. Are you trying to blow glass? You need a concentrated source of heat that can sustain the same temperature range over a prolonged duration. Are you trying to recharge a battery? You need a source of precise electrical output that forces the electrons to flow against their electrochemical gradient.
Without these “reservoirs” of lower entropy, we could not do useful work. And that is precisely why we invented the notion of entropy: useful work.
To do useful work, a subjective scale for entropic macrostates is useful, not a hindrance. All you need to do is gauge the entropy of two systems, and use the lower-entropy system in some way to reduce the entropy of the higher-entropy system and perform something useful for humanity.
But then again, this raises the question: if we use up all the reservoirs of low entropy, what would we have left? Are we not playing some sort of game of roulette here?
The Big Bang and the Heat Death of the Universe
Physicists have theorised that our universe began with the big bang. This would have been the lowest-entropy state of our universe, which helps us make sense of why the entropy of our universe has only been increasing ever since.
Even though the average entropy of the universe is constantly increasing, there exist local systems of “relatively” lower and “relatively” higher entropy. Arguably, “life” itself is one such lower entropic system.
Fast forward to the present day: we make use of excellent low entropy reservoirs like fossil fuels to do useful work. By burning fossil fuels, we increase the entropy of these systems and are able to reduce the entropy of other systems that benefit our relatively short lives (like transportation).
But as we all know, these low entropy reservoirs are limited. As fossil fuels run scarce, where do we turn (as a species)? The best low entropy reservoir we know of, of course: the sun! It is not going to matter in your lifetime or mine, but the entropy of the sun is increasing too. In due time, it will also "die".
Physicists theorise that the average entropy of the entire universe will keep increasing until it hits an equilibrium state known as the heat death of the universe. The term “heat death” here might be misleading. “Heat” refers to the irreversible heat loss from thermodynamic processes. In simple terms, heat death means that no more useful work would be possible. This means the universe would no longer feature local systems with lower or higher entropy.
To tie this back to the subjective nature of entropy, at this stage, the universe would feature a macrostate that is also the microstate. At least, that is what physicists theorise. But I can think of at least one alternate future.
The Future of the Universe
This essay has been a roller coaster ride; we started with tossing coins and have now reached the death of our universe. This makes things too sad for my liking. So, I would like to end on a brighter note. Let us once again go back to what drove Boltzmann and co. to discover entropy: the steam engine.
To compute entropy, we set the scales of the macrostates to something that made sense to us. These are human scales. What physicists theorise presently is based on these scales. In other words, we are extrapolating human scales to a universal scale. It is prudent to assume that we are missing information between the two scales (and entropy itself can be understood as a measure of missing information).
In my eyes, there is a possibility that the relative nature of entropy IS the norm. What if the relativistic nature of entropy is fractal? Sure, it could be that “our” universe is headed for heat death in “our” scales. But for a higher dimensional being (like future artificial intelligence), it could be yet another local entropic state that just needs some “working” from another lower entropy system.
I strongly feel that we know too little to be talking about the death of our universe. If we know too little, we might as well be more optimistic about the (distant) future. It helps us stay humble and do our best to perform useful work in our limited lives, for we ourselves are blessed, low-entropy systems!
If you’d like to get notified when interesting content gets published here, consider subscribing.