Imagine that you are working in a silent room. You have been there for a while now, when suddenly, a colleague shows up and rings a loud bell. You are so overwhelmed by the sudden “noise” that you cover your ears to protect yourself.
Now imagine a different scenario, where you are working in the same room listening to thumping music. The same colleague walks in and rings the same loud bell. But this time around, you are not so bothered by the “minor inconvenience” and greet your colleague with a puzzled smile.
What was different about the two scenarios? Think about it. Objectively speaking, the loud bell makes the same noise when measured in decibels. If so, shouldn’t your reaction to its sound be the same? Why does the background music or lack thereof play a role in your reaction?
As you might have guessed, there is a link between this question and entropy. More specifically, this phenomenon is tied to the notion of the entropic cycle. We will come back to this discussion a bit later in this essay.
For now, let us begin with a proper introduction to the notion of entropic cycles via thermodynamic systems.
Thermodynamic Systems, Entropy, and Heat
In my essay on entropy and heat (the hidden connection behind time flow), I established the following:
“…for heat to flow from the iron to my hand, entropy needs to flow from the iron to my hand as well. Conversely, for entropy to flow from one body to another, heat needs to flow as well. In short, entropy rides heat flow (and vice versa).”
Entropy and heat are very closely connected. One does not flow without the other. To get a grasp of this relationship further, let us consider our favourite thermodynamic system once again, the steam engine. At the end of each “cycle”, the piston returns to its original position.
In Chaos in perception, I covered the Boltzmann framework for entropy, where entropy is nothing but a measure of the number of microstates compatible with a given macrostate. So, if the steam engine’s piston returns to its original position (the same macrostate), its entropy must return to the same value it had at the start of the previous cycle.
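To make this concrete, here is a minimal sketch of the Boltzmann relation S = k·ln(W), where W is the number of microstates compatible with a macrostate. The microstate counts below are illustrative placeholders, not values for a real engine:

```python
import math

# Boltzmann constant in joules per kelvin (CODATA exact value)
K_B = 1.380649e-23

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k_B * ln(W) for a macrostate with W microstates."""
    return K_B * math.log(microstates)

# Two macrostates with the same number of compatible microstates have the
# same entropy. This is why a piston returning to the same macrostate at
# the end of a cycle amounts to an entropy reset.
s_start_of_cycle = boltzmann_entropy(10**20)  # illustrative count
s_end_of_cycle = boltzmann_entropy(10**20)    # same macrostate again
assert s_start_of_cycle == s_end_of_cycle
```

A macrostate with a single compatible microstate (W = 1) has zero entropy, and entropy grows only logarithmically with the microstate count.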
What this means is that the steam engine operates in cycles and its entropy is reset at the end of each cycle. But here is a more interesting observation. This behaviour is not unique to the steam engine and exists in every system out there.
The Human Body
We are making a bit of a conceptual jump here. But bear with me. Your body is a thermodynamic system just like the steam engine (just much more complex). To prove this, let us simply compare the two systems on a macro scale. The steam engine takes in fuel, burns it, uses the energy to move the locomotive, and then expels exhaust and excess heat.
The human body takes in food, water, and air, converts them into energy, sustains life, and then excretes the waste. But there is an often-overlooked component here: your body gives off excess heat as well. Have you ever seen human beings under a thermal camera?
Furthermore, without expelling or radiating this excess heat, the body could not maintain a stable temperature. In fact, let us linger on this point for a moment. A stable body temperature means that the body sustains the same macrostate. Put differently, the body goes through high-frequency cycles (much higher in frequency than the steam engine’s) that reset to an equivalent macrostate each time.
And guess what happens to the entropy at the end of each such cycle? That is right; it gets reset. From my observation, this phenomenon seems to have a fractal nature to it: the human body features cycles at fractally nested scales.
Your body temperature is maintained by high-frequency thermodynamic cycles, whereas other systems are maintained by low-frequency sleep cycles, and so on. You feel tired before you go to sleep and feel fresh when you wake up. In both instances, the entropic state is reset. What is the by-product? Heat!
And therein lies the beauty of the rhythms of entropy. To appreciate this further, we need to touch upon one more subtle but significant point.
The Entropic Reset
Be it the steam engine or the human body, any thermodynamic system operates in cycles, where the entropy is reset at the end of each cycle. But why is there excess heat?
Let us go back to the steam engine for a moment. It receives energy (heat) from the burning fuel and releases heat to the environment. At the end of this process, its entropy returns to the original value.
In other words, the same amount of entropy flows in and the same amount of entropy flows out. But a higher quantity of heat flows in, whereas a lower quantity of heat flows out. Why is this?
Sure, the body/steam engine uses a bit of this energy to sustain its macrostate (and thereby reduce entropy). But still, why does the entropy flow during the fuel-consumption phase (input) require more heat flow than the entropy flow during the exhaust phase (output)?
The answer lies in the temperature delta. The temperature difference between the burning fuel and the engine is much higher than that between the exhaust and the environment. In entropy and heat, I explained how two identical room heaters, one in a cold room and one in a hotter room, would lead to a higher entropy increase and a lower entropy increase respectively for the same amount of heat supplied (if you wish for a more elaborate account, check out that essay).
Let us focus on the effect of the temperature delta on entropy. Does this effect ring a bell? It should, because this notion is tied to the discussion we started at the beginning of this essay.
Noise or Minor Inconvenience? — The Missing Link
When you are sitting in a silent room, a loud bell increases the entropy significantly. In comparison, when you are listening to thumping music in the same room, the same loud bell leads to a smaller entropy increase (since the music already holds the room in a higher entropic state).
This is why you are more sensitive in the former situation than the latter. For some reason, we human beings are attracted to lower entropy states. We use packets of focused high-quality heat/energy to reduce the entropy of systems, and then expel the low-quality excess waste heat to the environment.
In fact, this realization can be extended to life in general. It is indeed strange to consider life as an artifact of entropy. But evidence suggests that nature optimizes for life because “life” is a very efficient way to increase the global entropy of systems.
Earth as an Entropic System
Even though we strive to reduce the entropy of systems we care about (like air-conditioned rooms or nuclear energy), this ultimately increases the entropy of our immediate environment, the earth. When we look at things from this perspective, we come to the grave realization that climate change is unavoidable.
We may try our best to slow it down by optimizing our energy consumption and increasing our efficiency by doing more useful work for the same input energy. However, climate change is a question of “when” rather than “if”.
If it is not in our generation, it will be in some other. It is indeed quite chilling to think that nature designed us (all of life) to cause climate change (increase entropy) as part of our systemic outcomes, whilst giving us the illusion of decreasing entropy in our local environments.
As I see it, it is impossible to make technological progress without contributing to climate change. We may affect the rate of change at best. But we do not have the power or understanding (yet) to completely halt change.
Final Comments
We started in a silent room and have ended up with unavoidable climate change. I would say that we have covered a lot in this essay. But underneath all the details, there is a simple underlying essence here.
Entropy and heat are closely connected. Even though entropy rides heat, a higher temperature differential causes more entropy flow than a lower temperature differential for the same amount of heat flow.
This phenomenon, coupled with the fact that thermodynamic systems that reduce entropy locally (like the steam engine or the human body) do so in thermodynamic cycles, leads to a necessary increase in global entropy. The local benefit comes at the cost of the global regression, if you will. This is how we went from a silent room to unavoidable climate change over a few hundred words.
Considering the very little information we have gathered about entropy so far in our human venture, I cannot help but wonder about the underlying natural purpose of life. If it were indeed a very efficient mechanism to maximise entropy, why don’t we see more life in our solar system? I know this is a bit of an anti-climax, but I end this essay with more questions than answers. Such is the topic of entropy.
If you’d like to get notified when interesting content gets published here, consider subscribing.
Further reading that might interest you:
- Why Do You See Mirrors Flipping Words?
- The Thrilling Story Of Calculus
- Why Do We Really Use A 12-Hour Clock
If you would like to support me as an author, consider contributing on Patreon.