Entropy of Thermodynamic Systems

The second law of thermodynamics brings us to the concept of entropy, introduced by the scientist Rudolf Clausius around 1850. The idea grows out of two observations: heat always flows spontaneously from hot regions to cold regions, and for energy to be converted into work, there must always be some dissipation. Entropy is the quantity that keeps track of this dissipated, unusable energy.

Let us now look at the entropy of a thermodynamic system. Before defining entropy, here is a question.
What do you do in your classroom when a teacher is present, and when there is no teacher?
When the teacher is in the room, you sit silently and stay disciplined, hardly moving at all; there is no disorder or randomness in the classroom. When the teacher is away, everyone moves about and the discipline disappears; now there is disorder and randomness in the classroom.

This is the simple idea of randomness or disorder, and entropy is nothing but a measure of the randomness or disorder of a system.


WHAT IS ENTROPY?
"Entropy is defined as a measure of the randomness or disorder of a thermodynamic system". It’s simple, it is just a measure of how much randomly the molecules are moving in a System.
• In solids, the molecules are arranged in an orderly way, so there is little randomness and solids have the lowest entropy.
• In gases, the molecules move rapidly throughout the container, so there is the most randomness and gases have the highest entropy.
• The entropy of liquids lies between that of solids and gases.
Therefore, the greater the randomness or disorder of a system, the higher its entropy. Entropy reflects the tendency of the universe to become more disordered.

"The movement of molecules is known as randomness or disorder".
The entropy is denoted by the alphabet “S”. The unit of entropy is J/K.
The Fact
We cannot measure the exact entropy of any system; we can only measure the change in entropy (∆S) of the system.
The formula for change in entropy is given by the equation; 
                   ➩ ∆S = ∆Q/T
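To see how this formula works in practice, here is a minimal Python sketch for a familiar case, ice melting at 0 °C. The latent-heat value is an approximate textbook figure used for illustration, not measured data.

```python
# Minimal sketch: entropy change when heat is added at (nearly) constant temperature,
# using delta_S = delta_Q / T. The values below are approximate, for illustration only.

Q = 334_000.0   # heat absorbed by 1 kg of melting ice, in joules (approx. latent heat of fusion)
T = 273.15      # melting temperature, in kelvin (heat is added at essentially constant T)

delta_S = Q / T  # entropy change of the ice, in J/K

print(f"Entropy change of the melting ice: {delta_S:.1f} J/K")  # about 1222.8 J/K
```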

Entropy In Second Law Of Thermodynamics
Here is the entropy statement of the second law of thermodynamics.
"The entropy of an isolated system (such as the universe) always increases and can never decrease." This is because every natural process proceeds in the direction in which the randomness or disorder increases.
                        ➩ ∆S(universe) ≥ 0
The second law also states that whenever a system interacts with its surroundings and, through this interaction, becomes more ordered, its surroundings must become more disordered. In other words, if the entropy of the system decreases, the entropy of the surroundings increases, and conversely, if the entropy of the surroundings decreases, the entropy of the system increases.

Examples Of Spontaneous Processes
1. Melting of ice
2. Cooling of a hot cup of coffee
3. Water always flowing from a higher level to a lower level
4. A gas spreading out to occupy the entire volume of its container
5. Air leaking out of a balloon on its own
These processes occur on their own, so they are known as spontaneous processes. For all of them, the entropy of the universe increases.

Another Way To Define Entropy
From the second law of thermodynamics, we found that the complete conversion of heat into work is not possible in a continuous process. Work is produced only by the ordered motion of molecules, whereas the disordered motion of molecules cannot produce work.
Thus, the energy content of a system can be divided into two parts:
• Available energy, which under ideal conditions (orderly motion of molecules) may be completely converted into work.
• Unavailable energy, which is rejected as waste (disordered motion of molecules) and cannot be converted into work.

The part of a system's energy that cannot be converted into work (the unavailable energy, tied to the disordered motion of the molecules) is what entropy keeps track of.
That is why we sometimes say: "Entropy is a measure of the unavailability of a system's internal (thermal) energy, i.e. a measure of the system's thermal energy that is incapable of doing work."
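As a rough illustration of this split, here is a minimal Python sketch that divides a quantity of heat into available and unavailable parts, assuming an ideal reversible engine running between a hot source and a cold sink. The temperatures and the heat input are illustrative assumptions.

```python
# Minimal sketch: splitting a heat input into "available" and "unavailable" energy,
# assuming an ideal reversible engine between a hot source and a cold sink.
# All numbers are illustrative assumptions.

Q_in   = 1000.0  # heat drawn from the hot source, in joules
T_hot  = 600.0   # source temperature, in kelvin
T_cold = 300.0   # sink (surroundings) temperature, in kelvin

available   = Q_in * (1 - T_cold / T_hot)  # maximum work obtainable, even in the ideal case
unavailable = Q_in * (T_cold / T_hot)      # heat that must be rejected as waste
# Note: unavailable = T_cold * (Q_in / T_hot), i.e. the sink temperature times the
# entropy drawn from the hot source.

print(f"Available energy (maximum work): {available:.0f} J")   # 500 J
print(f"Unavailable energy (waste heat): {unavailable:.0f} J")  # 500 J
```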

For Example
Heat is a form of energy that is incapable of doing work. Therefore, the more heat added to (or transferred from) a system, the higher (or lower) its entropy will be.
 
Second Law Equation For Entropy
The second law of thermodynamics can be written as an equation (formula) as follows;
                        ➩ ∆S(universe) > 0
          ➩ ∆S(system) + ∆S(surrounding) > 0
This entropy equation is very important because it tells us whether a process will occur on its own, i.e. whether it is spontaneous or not.
But what is a spontaneous process?
"A spontaneous process is a process that occurs on its own, without any external help." We can easily find out whether a process is spontaneous by calculating the entropy change of the universe, because for all spontaneous processes the entropy of the universe increases, according to the entropy statement of the second law of thermodynamics.
     ➩ ∆S(universe) > 0, Spontaneous Process
     ➩ ∆S(universe) < 0, Non-spontaneous Process
     ➩ ∆S(universe) = 0, Reversible Process
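Here is a minimal Python sketch of this spontaneity test for the simplest case, heat flowing from a hot body to a cold body. The temperatures and the amount of heat are illustrative assumptions.

```python
# Minimal sketch: testing spontaneity with delta_S(universe).
# Heat Q leaves a hot body and enters a cold body; both temperatures are assumed
# to stay roughly constant. All numbers are illustrative assumptions.

Q      = 500.0   # heat transferred, in joules
T_hot  = 400.0   # temperature of the hot body, in kelvin
T_cold = 300.0   # temperature of the cold body, in kelvin

dS_hot      = -Q / T_hot       # the hot body loses heat, so its entropy decreases
dS_cold     = +Q / T_cold      # the cold body gains heat, so its entropy increases
dS_universe = dS_hot + dS_cold

print(f"dS(universe) = {dS_universe:+.3f} J/K")
print("Spontaneous" if dS_universe > 0 else "Non-spontaneous")
# dS(universe) = +0.417 J/K, so heat flowing from hot to cold is spontaneous.
```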


Mathematical Relation
Entropy is an intrinsic property of a substance; it is not affected by the external position of a system or by its motion relative to other systems. It is a state function. Let us now see how the entropy of a system increases and what factors affect it.
(1) We know that entropy is a measure of the system's thermal energy that is incapable of doing work. Heat is just such a form of energy, so the more heat added to (or transferred from) a system, the higher (or lower) its entropy will be. Thus,
              ➩ ∆S (↑↑) ∝ Q (↑↑)
(2) The amount of heat added to a system is only a partial measure of its entropy increase; the temperature at which the heat is added also matters. For the same amount of heat, the higher the temperature of the system, the smaller the increase in its entropy, and the lower the temperature, the greater the increase. Thus,
              ➩ ∆S (↑↑) ∝ T (↓↓)
A quantitative description of entropy must take into account both the heat transferred and the temperature level at which it is transferred.
In combined form, the mathematical expression for the entropy change is;
                   ➩ ∆S = ∆Q/T
When the temperature is constant, the entropy change is simply the total heat added divided by that constant temperature;
                   ➩ ∆S = Q/T
Thus, we can say that the entropy change of a thermodynamic system is directly proportional to the amount of heat added to the system and inversely proportional to the temperature of the system.
Heat added to a lower-temperature system: greater entropy increase.
Heat added to a higher-temperature system: smaller entropy increase.
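A quick Python sketch makes this temperature dependence concrete: the same amount of heat is added at several assumed temperatures, and the resulting entropy change shrinks as the temperature rises.

```python
# Minimal sketch: the same heat raises entropy more at a low temperature than at a
# high temperature, since delta_S = Q / T. The numbers are illustrative assumptions.

Q = 1000.0  # heat added, in joules

for T in (250.0, 500.0, 1000.0):  # assumed system temperatures, in kelvin
    print(f"T = {T:6.1f} K  ->  delta_S = {Q / T:5.2f} J/K")
# T =  250.0 K  ->  delta_S =  4.00 J/K
# T =  500.0 K  ->  delta_S =  2.00 J/K
# T = 1000.0 K  ->  delta_S =  1.00 J/K
```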


Important Note
The total change in entropy satisfies;
                   ➩ (∆S)total ≥ 0
(The entropy change is zero at equilibrium, i.e. whenever the process becomes reversible.)
This mathematical statement of the second law of thermodynamics confirms that "Every process proceeds in such a direction that the total entropy change associated with it is always positive and the limiting value of zero is achieved only by a reversible process".

Entropy is an extensive property, which means that its value depends on the amount of matter present in the system. A highly ordered (less chaotic) system has low entropy, and vice versa. The SI unit of entropy is the joule per kelvin (J/K).

The SI unit of specific entropy (entropy per unit mass) is J/(kg·K), and that of molar entropy (entropy per unit amount of substance) is J/(mol·K).

An isolated thermodynamic system always tends toward the state of maximum entropy.
There are several methods to calculate the entropy of a system. Two of the most common are the statistical (Boltzmann) formula and the thermodynamic formula for a reversible isothermal process.
The statistical (Boltzmann) formula is;
                 ➩ S = kB·ln(W)
    where,
          kB is the Boltzmann constant (1.381×10⁻²³ J/K)
          W is the number of possible microstates
For a reversible isothermal process, the formula is;
                 ➩ ΔS = ΔQ / T
    where,
          ΔQ is the heat transferred reversibly
          T is the absolute temperature (in kelvin)
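Here is a minimal Python sketch of both formulas; the microstate count, heat, and temperature are illustrative assumptions.

```python
import math

# Minimal sketch of the two formulas above. All values are illustrative assumptions.

# (1) Statistical (Boltzmann) formula: S = k_B * ln(W)
k_B = 1.381e-23           # Boltzmann constant, in J/K
W   = 1.0e24              # assumed number of possible microstates
S   = k_B * math.log(W)   # math.log is the natural logarithm
print(f"S = k_B ln(W) = {S:.3e} J/K")        # about 7.632e-22 J/K

# (2) Reversible isothermal (constant-temperature) formula: delta_S = delta_Q / T
Q = 2000.0                # heat transferred reversibly, in joules
T = 350.0                 # constant absolute temperature, in kelvin
print(f"delta_S = Q / T = {Q / T:.2f} J/K")  # about 5.71 J/K
```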


FAQs
Does Entropy Depend Upon Pressure?
Entropy decreases as pressure increases (at constant temperature). According to Boyle's law, when the pressure increases the volume decreases, so the particles come closer together and are less spread out, which lowers the entropy, as the sketch below illustrates.
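As a rough illustration, here is a minimal Python sketch using the standard ideal-gas result ΔS = −nR·ln(P2/P1) at constant temperature (a result not derived in this article); the values are illustrative assumptions.

```python
import math

# Minimal sketch of the pressure dependence, using the standard ideal-gas result
# delta_S = -n * R * ln(P2 / P1) at constant temperature (not derived in this article).
# All values are illustrative assumptions.

n  = 1.0      # amount of gas, in moles
R  = 8.314    # universal gas constant, in J/(mol*K)
P1 = 1.0e5    # initial pressure, in Pa
P2 = 2.0e5    # final (higher) pressure, in Pa

delta_S = -n * R * math.log(P2 / P1)
print(f"delta_S = {delta_S:+.2f} J/K")  # about -5.76 J/K: entropy falls as pressure rises
```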
Is Entropy Proof Of Global Warming?
Entropy is a function of temperature, so an increase in temperature leads to an increase in entropy. An increase in entropy means that more energy is spread out, which is associated with global warming.
Can Entropy Be Negative?
The entropy of a system cannot be negative, but a change in entropy can be negative. For example, during condensation the entropy of the system decreases, because a gas is turned back into a lower-entropy liquid; the heat released raises the entropy of the surroundings, so the total entropy of the universe still increases.
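Here is a minimal Python sketch of the condensation example, using an approximate latent heat of vaporization for water; the numbers are illustrative.

```python
# Minimal sketch: entropy change of 1 kg of steam condensing to liquid water at its
# boiling point. Heat leaves the system at roughly constant temperature, so
# delta_S(system) = -Q / T. The latent-heat value is approximate, for illustration only.

Q = 2_256_000.0  # heat released by 1 kg of condensing steam, in joules
T = 373.15       # condensation temperature, in kelvin

delta_S_system = -Q / T
print(f"delta_S(system) = {delta_S_system:.0f} J/K")  # about -6046 J/K (entropy decreases)

# The surroundings absorb this heat at a slightly lower temperature, so their entropy
# rises by more than 6046 J/K and delta_S(universe) remains positive.
```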



Hope you have found this article helpful!!
Do you have suggestions? Please write in comment box!!!
Feel free to comment if you have any queries!!
