The origins of entropy represent a monumental and humbling period in the history of science. Entropy today is a powerful, generalized concept that finds applications in fields as diverse as physics, information theory, and biology.
However, back when scientists first conceived the notion of entropy, we were not aware of its far-reaching implications. Put succinctly, entropy had humble beginnings; beginnings very much worth revisiting.
In my previous essay on entropy for dummies, I presented a layman-friendly, non-technical framework for entropy. In this essay, I will be covering the story of entropy's origins and how the concept has evolved over the years. This will be fun. Let us begin.
The Industrial Revolution and the Steam Engine
Once human beings figured out the superiority of mechanized labor over manual labor, the industrial revolution spread like wildfire. By the time it was in full swing (mid-1800s), the steam engine had established itself as the central technology driving production efficiency.
The steam engine was a magical device that used the pressure of hot steam to push a piston down a snugly designed barrel. Once the steam expanded and cooled down inside the barrel, the piston returned once again to its original position. As soon as the piston reached its original position, a fresh dose of hot steam was ready to push the piston down the barrel again.
This cycle caused a uniaxial to-and-fro motion of the piston. Clever inventions such as gears enabled the engineers and scientists of the time to convert this motion into rotational motion of the wheels of, say, locomotives.
However, beyond its profound usefulness, the steam engine presented a stiff challenge to the scientists of the day. You see, up to 95% of the heat produced by burning wood/coal (to produce the steam) was lost to the environment; a very inefficient fuel-utilisation process.
This spurred a few key scientists to chase efficiency: they started looking for ways to burn less raw material and extract more steam from it. And this venture directly led to the origins of entropy.
The Origins of Entropy: Thermodynamics
A French mathematician named Lazare Carnot published a work titled "Fundamental Principles of Equilibrium and Movement" in the early 1800s. In the next few decades, one particular point made by this work became well-known as Carnot's theorem.
It stated that in any machine, the accelerations and shocks of the moving parts represent losses of "moment of activity", that is, losses of useful work. Based on this, L. Carnot claimed that perpetual motion was impossible (we will revisit this point later in this essay). Although this point sounds trivial, it forms the basis of the now-famous second law of thermodynamics and entropy.
L. Carnot eventually died in exile in 1823. But the very next year, his son Sadi Carnot, whilst living on half-salary with his brother, published a revolutionary book titled "Reflections on the Motive Power of Fire." Well, you wouldn't call this book revolutionary if you looked at its sales figures back in the day; not many people cared about it.
But eventually, this book became one of the founding blocks of the field of thermodynamics. One of the crucial concepts that S. Carnot developed in this book was thermodynamic reversibility. He visualized an ideal heat engine which used heat to do some work, and then reversed the work-cycle to reproduce the heat.
Later, fellow scientists figured out that any real-world heat engine can never realise Carnot's thermodynamic reversibility. Therefore, any real-world engine would necessarily be less efficient; another crucial insight towards the discovery of entropy.
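In modern terms (a bound that later thermodynamicists distilled from this insight, stated here only as a hedged illustration), the reversible engine sets a hard ceiling on the efficiency of any real engine operating between a hot reservoir at temperature T_hot and a cold one at T_cold:

```latex
% Carnot bound (temperatures in kelvin); real engines always fall short of it
\eta_{\text{real}} \;\le\; \eta_{\text{Carnot}} = 1 - \frac{T_{\text{cold}}}{T_{\text{hot}}}
```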
The Discovery of Entropy: A Tour de Force in Thermodynamics
Building on the foundation laid by L. Carnot and S. Carnot, Rudolf Clausius formally established the mathematical relationship between heat and work. Crucially, he developed the notions of interior work and exterior work. Think about it this way:
Exterior work is the work done by the hot steam to push the piston, which in turn drives the wheels of the train.
Interior work is the work done by the steam (H₂O) molecules constantly bouncing against each other (this need not necessarily help drive the wheels).
Building further upon this logic, Clausius drew upon the notion of the mechanical equivalent of heat. In 1862, via clean mathematical derivations, he showed that when a machine undergoes a state change from one temperature to another, it involves an equivalence-value that captures (what later came to be known as) irreversible heat loss.
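In modern notation (a later distillation, not Clausius's original 1862 wording), that equivalence-value is the quantity we now write as dS, with the accompanying inequality capturing the irreversibility of real cycles:

```latex
% Entropy change for a reversible heat exchange, and the Clausius inequality
dS = \frac{\delta Q_{\text{rev}}}{T},
\qquad
\oint \frac{\delta Q}{T} \leq 0 \quad \text{for any real (irreversible) cycle}
```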
In 1865, Clausius himself named this irreversible heat loss "entropy".
"I propose that S be taken from the Greek words, 'en-tropie' (intrinsic direction). I have deliberately chosen the word entropy to be as similar as possible to the word energy…"
– Rudolf Clausius
Some scientists think that he chose the letter "S" to denote entropy in honour of Sadi Carnot. In any case, this is how entropy was born. But the story of its origins does not end there.
The Origins of Entropy: Statistical Thermodynamics
Let us now get back to our steam engine problem. Can we somehow increase its efficiency with mathematical precision? Well, there is a slight challenge when it comes to how the steam's pressure generates work.
Take Newton's laws of motion for instance. When we compute the trajectory of a ball thrown in the air based on the forces acting on it, or a planet's orbit based on the forces it experiences, we treat the entire ball or the entire planet as one unitary particle. In other words, we don't care about what each and every molecule in the ball or the planet is doing or experiencing.
But in the case of the work done by the steam, we cannot ignore each and every molecule. This is because it is precisely the rapid bouncing of these molecules that creates the pressure needed. But regardless of how clever we are or how much computational power we possess, it is simply not possible to compute the states for trillions and trillions of steam molecules.
That's where Ludwig Boltzmann steps in to save the day! It turns out that all we needed was a change in perspective. A large number of steam molecules is not a problem, but an asset. Boltzmann developed a statistical mechanical evaluation of entropy.
He saw entropy as a measure of statistical "mixed-up-ness" or disorder. If you are new to statistics, the simple version is that large collections of data points yield surprising simplifications.
For instance, it is not easy to predict when you would die. But given a countryâs population, it is statistically possible to predict how many people would die in a normal year. This is how your insurance company makes money. Large collections tend to feature statistical regularities that are not present on an individual level.
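Here is a minimal sketch of Boltzmann's idea (my own illustration with a made-up helper, using coin flips as stand-ins for molecules): the entropy of a macrostate grows with the number of microstates W that realise it, via S = k·ln(W), so "mixed-up" macrostates dominate simply because there are overwhelmingly more ways to arrange them.

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant in joules per kelvin

def boltzmann_entropy(n_particles: int, n_heads: int) -> float:
    """Entropy S = k_B * ln(W) of the macrostate with `n_heads` heads,
    where W counts the microstates (arrangements) that realise it."""
    w = comb(n_particles, n_heads)  # number of microstates for this macrostate
    return K_B * log(w)

# With 100 coin-flip 'molecules', the evenly mixed macrostate (50 heads) has
# vastly more microstates, and hence more entropy, than a fully ordered one.
print(boltzmann_entropy(100, 50))   # ~9.2e-22 J/K
print(boltzmann_entropy(100, 100))  # 0.0 J/K (only one way to be fully ordered)
```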
Although convincing, Boltzmann's statistical approach was opposed by many scientists. But in 1905, Albert Einstein stepped in and used a similar approach to explain how pollen grains experience jittery motion when they are suspended in water. With this, most naysayers relented.
The Origins of Entropy: Information Theory
While studying the statistical nature of information loss over telephone signals at Bell Labs, Claude Shannon developed a measure of uncertainty whose mathematical formulation was remarkably similar to the thermodynamic notion of entropy. Back then, he was not even aware of the thermodynamic concept.
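As a small sketch (my own, not Shannon's original derivation, and the function name is just for illustration), his measure of uncertainty for a discrete set of outcome probabilities is H = -Σ p·log₂(p), which is structurally the same sum-over-states expression as statistical-mechanical entropy:

```python
from math import log2

def shannon_entropy(probabilities: list[float]) -> float:
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain (1 bit per toss); a heavily biased coin
# carries far less uncertainty, since its outcome is rarely a surprise.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.99, 0.01]))  # ~0.08
```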
Shannon initially wanted to call this uncertainty just "information", but was unsure. When he visited the mathematician John von Neumann to discuss this topic, von Neumann suggested that Shannon name this term "entropy". He is believed to have said the following to Shannon on this topic:
"In the first place, your uncertainty function has been used in statistical mechanics under that name, so it already has a name.
In the second place, nobody knows what entropy really is, so in a debate, you will always have the advantage."
– John von Neumann.
I have written a detailed essay on this information-theoretic notion of entropy. Check it out if you are interested.
Author's Adventures
It is not by happenstance that one develops a deep interest in a topic such as entropy. In my case, I am constantly involved with the technical notion of entropy.
I have fond memories of sleepless nights working on mini-discoveries of my own that were inspired by the notion of perpetual-motion machines. And entropy lies at the heart of such applications.
Unfortunately, because of the secretive nature of my work, I am not able to share any technical details. This is perhaps why I choose to write essays like these that spread the beauty and profoundness of the concept of entropy.
I feel that entropy has given me so much. Spreading more love for the topic is the least I can do in return.
Final Comments
Today, entropy has evolved into a mainstream notion. It has become synonymous with "disorder", "randomness", etc. However, in this essay, we saw the humble origins of entropy: it all began with the steam engine.
Even though I have done my best to cover the origins of entropy in this essay, in no way have I done justice to the broad and deep implications of this concept. It lies at the heart of why time flows in one direction. It also explains why there are very few genetic errors in living beings (fewer than statistical thermodynamics would predict).
I could go on and on with other fascinating applications and implications of entropy. However, that would be beyond the scope of this essay. You can rest assured that I will be covering these topics in future essays!
Reference: Brian Greene.
If you'd like to get notified when interesting content gets published here, consider subscribing.
Further reading that might interest you:
- Why Is The Hot Hand Fallacy Really A Fallacy?
- How Imagination Helps You Get Good At Mental Math?
- How To Easily Outperform High IQ Individuals.
If you would like to support me as an author, consider contributing on Patreon.