
Organised Chaos

Topic: Entropy
by Prasoon, 2019 Cohort

Rudolf Clausius coined the term entropy in 1865. According to him, in any thermodynamic process there is always some degradation of energy, and entropy is the property of a system that measures this degradation in a dynamic or spontaneous process. Ludwig Boltzmann later phrased an alternative, statistical definition: a system tends to move towards its most probable state, and keeps doing so until it reaches the state of maximum probability. It follows that the entropy of a real, irreversible system either stays constant or increases, but never decreases. The following example illustrates this statistical definition:

Let’s take the example of two solids, A and B, that together hold a total energy worth 8 quanta (packets). These quanta can be distributed as 8 in solid A and 0 in solid B, or 7:1, 6:2, 5:3, or 4:4 (solid A : solid B). Each distribution corresponds to a number of microstates (distinct arrangements of the quanta), and the more microstates a distribution has, the more probable it is; the even 4:4 split has the most microstates and is therefore the most likely to occur.
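
For readers who want to check this counting argument, here is a minimal sketch in Python using the Einstein-solid multiplicity formula, which counts the ways of placing q quanta among n oscillators; the choice of three oscillators per solid is an assumption for illustration, not something fixed by the example above.

```python
from math import comb

def multiplicity(q, n):
    # Number of ways to distribute q quanta among n oscillators: C(q + n - 1, q)
    return comb(q + n - 1, q)

N = 3   # oscillators per solid (an assumed value for illustration)
Q = 8   # total quanta shared between solid A and solid B

for q_a in range(Q + 1):
    q_b = Q - q_a
    omega = multiplicity(q_a, N) * multiplicity(q_b, N)
    print(f"A:B = {q_a}:{q_b}  microstates = {omega}")
```

Running it shows the count rising from 45 microstates for the 8:0 split to a peak of 225 for the 4:4 split, which is why the even distribution is the most probable.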

Consider a cup of solid ice. It is highly organised: its molecules are not vibrating or colliding fast enough to generate much heat; instead they are closely and tightly packed, so the ice has little randomness, little disorder, and therefore very low entropy. Now leave the same cup of ice at room temperature. It starts melting because of its surroundings: energy tends to flow from hotter regions to colder ones. As the solid ice melts into liquid water, the water molecules are not as organised as the ice crystals were; they vibrate more, collide more, and carry more heat, and they keep absorbing energy until they reach thermal equilibrium with the surroundings and become stable. So the entropy (randomness/disorder) has increased, and it keeps increasing until the ice has completely melted and the liquid has reached the temperature of the surroundings.
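
To put a rough number on this increase, here is a minimal sketch of the entropy gained by melting ice at its melting point, using the classical relation ΔS = Q/T with Q = m·L_f; the 10 g mass is an assumed example value.

```python
# Entropy gained by ice melting at its melting point: dS = Q / T, with Q = m * L_f
m   = 0.010       # kg of ice (assumed example mass, roughly one ice cube)
L_f = 334_000     # J/kg, latent heat of fusion of water
T   = 273.15      # K, melting point of ice

dS = m * L_f / T
print(f"Entropy gained by the melting ice: {dS:.1f} J/K")  # ~12.2 J/K
```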

So, we cannot understand entropy properly if we ignore any one of these factors: heat, the kinetic energy of the molecules, their structural arrangement, and the interaction with the surroundings. We also know that in order to do useful work, a system has to be ordered, which leads to another intriguing definition: entropy measures the amount of energy that is unavailable to do useful work. The Big Bang offers a good illustration. At the start, an enormous amount of energy was concentrated together, but as energy tends to dissipate and spread out evenly, planets, stars, asteroids, moons and so on were formed. A star is born with huge nuclear energy of its own; planets have volcanoes. But without any additional supply of energy, their days are numbered. Eventually they will die, releasing all the energy they possess, and we observe their entropy increase (a star forms from cool gases coming together, then expands and expands as its temperature rises, so we observe more disorder and randomness) because less and less energy remains available to do useful work, which in this case means supplying heat.
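
To make the “unavailable energy” idea concrete, here is a minimal sketch based on the Carnot limit (brought in here as an illustration; it is not derived in this article): of heat Q drawn from a hot reservoir, at most Q(1 - T_cold/T_hot) can become useful work, and the remainder, T_cold·ΔS, stays unavailable. The heat and temperature values are assumed example numbers.

```python
# Of heat Q drawn from a reservoir at T_hot, with surroundings at T_cold,
# the Carnot limit lets at most Q * (1 - T_cold / T_hot) become useful work.
# The rest, T_cold * (Q / T_hot) = T_cold * dS, is the unavailable energy.
Q      = 1000.0   # J of heat drawn (assumed example value)
T_hot  = 600.0    # K, hot reservoir temperature (assumed)
T_cold = 300.0    # K, temperature of the surroundings (assumed)

dS          = Q / T_hot        # entropy drawn from the hot reservoir, J/K
unavailable = T_cold * dS      # energy that cannot become work
available   = Q - unavailable  # maximum useful work
print(f"Available work: {available:.0f} J, unavailable energy: {unavailable:.0f} J")
```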

Entropy relates directly to the probability of an energy configuration. When the energy is most spread out (the 4:4 distribution in the two-solid example), we observe maximum entropy. Hence low entropy means the energy is concentrated, and high entropy means it is evenly spread out.
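
This link between probability and entropy can be made quantitative with Boltzmann’s relation S = k_B ln Ω, applied here to the two-solid example, again assuming three oscillators per solid as in the earlier sketch.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(q, n):
    # Microstates for q quanta among n oscillators: C(q + n - 1, q)
    return comb(q + n - 1, q)

# Compare a concentrated 8:0 split with the evenly spread 4:4 split
# (3 oscillators per solid assumed, as before).
omega_concentrated = multiplicity(8, 3) * multiplicity(0, 3)  # 45 microstates
omega_spread       = multiplicity(4, 3) * multiplicity(4, 3)  # 225 microstates

print(k_B * log(omega_concentrated))  # lower entropy: energy concentrated
print(k_B * log(omega_spread))        # higher entropy: energy most spread out
```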

Explore this topic further

Return to Entropy in the Primer

Disclaimer

This content has been contributed by a student as part of a learning activity.
If there are inaccuracies or opportunities for significant improvement on this topic, feedback on how to improve the resource is welcome.
You can improve articles on this topic as a student in "Unravelling Complexity", or by emailing your amendments to: Chris.Browne@anu.edu.au
