How To Use Entropy Like A Pro In Thermodynamics

To use entropy like a pro in thermodynamics, you need not be a physicist or an engineer. All you need is a strong qualitative framework, which is exactly what this essay is all about. To be fair, we are cheating a little bit by leaving out the quantitative side of things.

At the same time, you would be surprised to realise how far this robust framework can take you. So much so that the title's claim that you would be able to use entropy like a pro is no exaggeration!

This essay is actually a follow-up to my essays on the origins of entropy and entropy for dummies. In entropy for dummies, I covered a generalised non-technical framework for the concept of entropy. In the origins of entropy, I covered how the very notion of entropy was originally discovered in the field of thermodynamics.

Using these concepts as the basis, I will be presenting a specific non-technical framework for thermodynamic entropy in this essay.

So, for maximum benefit, I recommend that you read those precursors before you read this essay. But rest assured that this essay stands on its own as well; it's just that the combination is likely more effective.

With all the formalities out of the way, let us get started with the discussion straight away!


The Fundamental Challenge of Physics

Be it Einstein’s relativistic physics or Newton’s classical mechanics or the later developments in quantum mechanics, one thing has remained true about the venture of physics:

The fundamental laws of physics do not differentiate between what we call the future and what we call the past!

If you look into the mathematics of the fundamental physical laws describing state transformations, you will notice this rather trivial-sounding point. But its implications are not so trivial.

Think about it. If some physical law allows for a certain transformation to happen in one direction (mathematically speaking), what prevents the exact same transformation from happening in reverse? If this is too abstract for you, let me give you an example.

Imagine that you are watching your favourite sport in a stadium. At the top of the main stand, you see two clocks. One clock displays the time elapsed since the game began, and another clock displays the time remaining before the game ends.

Akin to the clock that displays elapsed time, we are used to seeing a cat push a glass jar down from a table and shatter it.

Use entropy like a pro! — Illustrative art created by the author

But, unlike the clock that counts down the remaining time, we are not used to seeing broken glass pieces rise against gravity back onto the table and reassemble themselves into a pristine glass jar in front of the cat. Why is this so?

When addressing this question, note that I am not going into a fictional discussion on time travel or anything of that sort. I am solely interested in understanding why this seeming time-directionality exists.

Well, the answer to this question lies at the heart of the framework that I am about to present. Not only that, but the said framework also enables us to predict the behaviour of certain complex thermodynamic systems. To begin understanding this framework, let us once again turn to a simple scenario.


Why is Thermodynamics Relevant?

Imagine that you are boiling water in your kitchen. As you picture this scenario in your head, let us say that the boiling process produces steam that slowly starts spreading in your kitchen.

Now, what did the steam in your imagination look like? Was it spread uniformly throughout your kitchen? Or was it hovering in the shape of a perfect sphere right above your kitchen counter?

Spherical steam — Illustrative art created by the author

I bet that it was closer to the former than the latter. But then, the wicked question emerges again:

Why don’t you see a perfectly spherical knot of steam right above your kitchen counter?

This question and the question about the glass jar are both closely linked. The link is entropy! You see, both questions ask about the possibility of low-entropy states.

I’ll get ahead of myself and reveal partial answers to these questions. It is indeed possible for a broken glass jar to reassemble itself as it rises against gravity to the top of the table. Similarly, it is indeed possible for a perfectly spherical knot of steam to form above your kitchen counter. But both these events are highly improbable.

Notice the difference between what is possible and what is improbable.

How to Use Entropy Like a Pro in Thermodynamics? — The Basics

Without realising it, we have already laid out a strong basis for thermodynamic entropy. Now, it is just a matter of slowly building on top of it. Let us switch scenarios: imagine that you are seated calmly in your room and are reading a book.

Let us focus on your breathing as you read. Your body is concerned only about a few things when it comes to breathing:

1. Are you inhaling a sufficient volume of oxygen?

2. Is the air temperature optimal?

3. Is the air pressure equalised by the Eustachian tubes in your ears?

Let us be clear. If there were an insufficient volume of oxygen in your room, you would suffocate. If the air temperature were too high, your lungs would burn. Similarly, if the air pressure were too high, your eardrums would burst. Since we don’t want any of that to happen, we would like to gauge and predict the state of the air in your room.

In my essay on entropy for dummies, I explained how we are not concerned by the state of each and every coin when we toss a hundred coins; we are merely interested in the ratio of heads versus tails.

Similarly, your body is NOT concerned about what each and every molecule of air/oxygen/nitrogen in your room is doing. Yet, the three properties of volume, temperature, and pressure capture a whole bunch of aggregate molecular behaviours that are relevant for your breathing — we have Boltzmann, Maxwell, and co. to thank for these awesome discoveries.

The real beauty of thermodynamics (in my opinion) is that it explores how volume, temperature, and pressure interact with entropy.


How to Use Entropy Like a Pro in Thermodynamics? — The Real Deal

If you remember the coin-flip-experiment from entropy for dummies, you will recollect that entropy refers to the number of possible equivalent configurations that lead to the same macro-state.

For example, a macro-state featuring 1 tail and 99 heads has 100 equivalent configurations (the tail could be the first coin or the second coin, and so on).
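If you would like to check this kind of counting yourself, here is a tiny Python sketch. It only uses the standard binomial coefficient from the coin-flip framework; the specific macro-states are just illustrative choices.

```python
from math import comb

# Number of equivalent configurations (micro-states) behind a few 100-coin macro-states
print(comb(100, 1))   # 1 tail, 99 heads  -> 100 configurations
print(comb(100, 50))  # 50 tails, 50 heads -> roughly 1.01 x 10^29 configurations
print(comb(100, 0))   # 0 tails, 100 heads -> exactly 1 configuration
```

The lopsided count for the 50/50 split is exactly why you almost always see a roughly even mix when you toss a hundred coins.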

Heads Vs. Tails — Illustrative art created by the author

Now, why don’t we apply the same stream of logic to one of the scenarios we have covered in this essay? Let us go back to your kitchen and imagine that there is indeed a spherical knot of steam hovering above your kitchen counter.

For this scenario to occur, we have to somehow constrain the water (H₂O) molecules to this small sphere’s volume. In other words, the total number of possible configurations of all water molecules to achieve this macro-state is low.

Compare that to the scenario where the steam spreads evenly throughout the kitchen. Since we have much more volume to play with, we can achieve the same macro-state with a much larger set of possible configurations. The same molecule can exist in different parts of the room, yet contribute to a macro-state with the same volume, temperature, and pressure.

In simpler terms, the bigger the volume, the higher the entropy. The higher the entropy, the more likely the macro-state is.
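To get a feel for how brutally these numbers scale, here is a minimal sketch. The kitchen volume, sphere volume, and molecule count are made-up, order-of-magnitude numbers, and the model naively treats each molecule as wandering independently; it is only meant to show the trend.

```python
import math

V_kitchen = 30.0    # kitchen volume in cubic metres (assumed)
v_sphere  = 0.004   # a grapefruit-sized sphere, about 4 litres (assumed)
N         = 1e22    # a modest puff of steam, order-of-magnitude guess (assumed)

# If each molecule roams the kitchen independently, the probability that
# all N of them sit inside the small sphere at the same instant is (v/V)**N.
log10_probability = N * math.log10(v_sphere / V_kitchen)
print(log10_probability)  # about -3.9e22, i.e. a probability of 1 in 10**(3.9e22)
```

That is the precise sense in which the spherical knot of steam is possible, yet absurdly improbable.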

Now, let us say that the volume and pressure are fixed, and play with the temperature. In layman’s terms, temperature is a measure of the average speed of a bunch of molecules: if the average speed is high, the temperature is also high.

If we imagine the steam spreading at a lower temperature, the number of possible molecular configurations shrinks. This is because, with less total energy to share, a speed increase in one molecule has to be balanced by a speed decrease in another. We are also limited at the lower end by something known as absolute zero. This means that we cannot keep reducing the speeds of water molecules beyond a certain limit.

In contrast, if the average temperature is much higher, we have much higher molecular speeds to play with, and hence the number of possible equivalent configurations for the same macro-state goes up significantly. Unlike the lower bound of absolute zero, temperature (as we know it) has no upper limit! I have covered this topic in more detail in my essay on why temperature has no upper limit.

As far as thermodynamics is concerned, the higher the temperature, the higher the entropy. The higher the entropy, the more likely the macro-state is.
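For the temperature trend, here is a similarly hedged sketch. For a classical ideal gas at a fixed volume and molecule count, the accessible region of velocity space grows roughly like T to the power of 3N/2, so even a one-degree warming multiplies the configuration count by an astronomical factor. Steam is not an ideal gas, and the molecule count below is invented, so treat this purely as an illustration of the trend.

```python
import math

N = 1e22  # same made-up puff of steam as before (assumed)

def log_multiplicity_ratio(T1, T2, n=N):
    """Natural log of how many times more velocity configurations are accessible
    at temperature T2 than at T1, for a classical ideal gas at fixed N and V.
    The accessible phase-space volume scales roughly like T**(3N/2)."""
    return 1.5 * n * math.log(T2 / T1)

# Warming the steam from 373 K to 374 K multiplies the count by e raised to ~4e19.
print(log_multiplicity_ratio(373.0, 374.0))
```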

As far as the pressure of the steam is concerned, for a fixed volume and temperature, it is directly proportional to the number of water molecules. The more water molecules we have, the more of them bounce against the walls, against your hands, and so on. So, the higher the number of steam molecules, the higher the number of possible configurations that achieve the same macro-state.

Similar to volume and temperature, the higher the pressure, the higher the entropy. The higher the entropy, the more likely the macro-state is.
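If you ever do want a peek at the quantitative side we are deliberately skipping, all three trends drop out of one formula: the Sackur-Tetrode equation for an ideal monatomic gas. Steam is neither monatomic nor ideal, and the numbers below are invented, so read this strictly as a trend check rather than a calculation about your kitchen.

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
h   = 6.62607015e-34    # Planck constant, J*s
m   = 18 * 1.66054e-27  # rough mass of one H2O molecule, kg (assumed)

def sackur_tetrode(N, V, T):
    """Entropy (J/K) of an ideal monatomic gas: N particles, volume V (m^3), temperature T (K)."""
    thermal = (2 * math.pi * m * k_B * T / h**2) ** 1.5
    return N * k_B * (math.log((V / N) * thermal) + 2.5)

base = sackur_tetrode(N=1e24, V=30.0, T=373.0)          # made-up kitchen-sized reference state
print(sackur_tetrode(N=1e24, V=60.0, T=373.0) > base)   # bigger volume      -> higher entropy (True)
print(sackur_tetrode(N=1e24, V=30.0, T=500.0) > base)   # higher temperature -> higher entropy (True)
print(sackur_tetrode(N=2e24, V=30.0, T=373.0) > base)   # more molecules     -> higher entropy (True)
```

All three qualitative rules from this section show up as the three True comparisons.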


Use Entropy Like a Pro — Final Comments

To round up what we have covered so far: if you have a small kitchen and/or a lower temperature and/or a smaller number of water molecules, you will encounter lower entropy. Conversely, a bigger kitchen and/or a higher temperature and/or a larger number of water molecules will lead to higher entropy.

That is all you need to know to use entropy like a pro! Sure, physicists and engineers quantify every single variable we just covered (including entropy). But we can get far by using just a qualitative framework.

Higher entropy states are by their very nature highly likely. So, you can expect them to occur by default. In other words, you can qualitatively predict the behaviour of complex thermodynamic systems using this qualitative framework.

Note that low-entropy states are improbable, not impossible! So, if you encounter what appears to be a low-entropic state, you should be sceptical. More often than not, it is a result of some deterministic cause (like human intervention, etc.).

Use entropy like a pro!! — Illustrative art created by the author

You could, in fact, see a perfectly spherical knot of steam above your kitchen counter. There is a (very, very small) probability that this happened just by chance. Brian Greene, who was my inspiration for writing this essay, says the following about such a scenario:

“That could be the explanation (that the spherical knot of steam happened by chance). But I’d bet my life it isn’t.”

— Brian Greene.

I’m with Greene on the ‘explanation’ part. But I would never bet my life that it isn’t due to chance. My experiences and encounters with entropy have trained me to think this way. Sometimes, the randomness of nature (and of everything, really) fools you in surprisingly random ways (higher-order randomness).

In this realm, things can get complicated really fast. So, I’ll save that discussion for another day. For now, I hope you found this simple yet robust qualitative framework for thermodynamic entropy useful. Keep an eye on this space for more on this topic in the future!


Reference and credit: Brian Greene.
