PhysicsLAB Resource Lesson
2nd Law of Thermodynamics and Entropy

The Zeroth Law of Thermodynamics establishes the fact that objects in thermal equilibrium are at the same temperature. In essence, the 1st Law of Thermodynamics is merely a restatement of the Law of Conservation of Energy. But what defines the direction of the flow of heat? …the direction of the flow of energy? … the relationship between available energy and the amount of work done by a system? The answer is the arrow of time. Time is not reversible. Time only moves forward and its direction is governed by the 2nd Law of Thermodynamics.
 
Three versions of the 2nd Law of Thermodynamics
 
1. Rudolf Clausius' statement (1854) says that under natural conditions heat will always flow from a high temperature to a low temperature.
 
According to Clausius' statement, it would be impossible to operate the system shown below without external work being done on the system when T1 < T2. Heat does not flow spontaneously from a cold temperature to a hot temperature.
 
[Figure: heat flowing unaided from a cold reservoir (T1) to a hot reservoir (T2) - not possible without external work; image courtesy of WebMIT]
 
2. The Kelvin-Planck statement of the 2nd Law says that it is impossible to construct an engine which, operating in a cycle, will produce no other effect than the extraction of heat from a reservoir and the performance of an equivalent amount of work.
 
According to the Kelvin-Planck statement, it would be impossible to operate the engine shown below (T2 < T1) without some energy being exhausted to a cold reservoir.
 
[Figure: an engine extracting heat from a single reservoir and converting it entirely into work - not possible; image courtesy of WebMIT]
 
3. The total entropy of the universe will always remain constant or increase, but never decrease. Entropy is therefore a measure of the unavailability of a system's thermal energy to do useful work between two well-defined temperature reservoirs.
 
Entropy
 
A reversible process is one in which the state variables (P, V, T, and U) are well-defined throughout the process. Reversible processes are slow and require that the system remain in a continuous state of equilibrium throughout the transition. If a reversible process were operated in reverse, the system would return to its original conditions with no net changes to either the system or its surroundings.
 
Entropy, S, is also defined as a state variable. Any two gases which have equivalent values for their macroscopic properties P, T, V, and U would also have equivalent values for S. Furthermore, since entropy is a state variable, any closed cycle (whether it is reversible or irreversible) will have a net change in entropy equal to zero. However, the consequent entropy changes to the surrounding environment would differ between a reversible and an irreversible cycle starting and stopping at the same position.
 
If an engine's real efficiency, e = 1 - Qc/Qh, is set equal to its ideal (Carnot) efficiency, e = 1 - Tc/Th, then we can obtain the expression

Qc/Qh = Tc/Th

which can be rearranged to become

Qc/Tc = Qh/Th

This ratio (ΔQ/T)reversible was defined by Clausius as entropy.
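To make the algebra concrete, here is a minimal Python sketch (the reservoir temperatures and heat input are made-up values, not from the lesson) confirming that an ideal engine satisfies Qh/Th = Qc/Tc:

```python
# A numeric check (made-up values) that an ideal engine obeys
# the Clausius relation Qh/Th = Qc/Tc.

T_hot, T_cold = 600.0, 300.0   # reservoir temperatures in K (hypothetical)
Q_hot = 1000.0                 # heat drawn from the hot reservoir in J (hypothetical)

# For an ideal engine, 1 - Q_cold/Q_hot = 1 - T_cold/T_hot,
# so the exhausted heat must be:
Q_cold = Q_hot * (T_cold / T_hot)

print(Q_hot / T_hot)           # 1.667 J/K
print(Q_cold / T_cold)         # 1.667 J/K -- the two ratios (ΔQ/T) match
```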
 
If the addition of heat is needed to maintain a constant temperature, for example in a reversible (ideal) isothermal process, then the change in the system's entropy is given as ΔS = +ΔQin/T. In this formula the temperature is in Kelvin and remains constant, and the units for entropy are J/K.
 
The change in entropy can also be calculated for isovolumetric and isobaric processes in which the temperature changes by evaluating the following integral:

ΔS = ∫ dQ/T

where dQ = nCvdT for isovolumetric processes and dQ = nCPdT for isobaric processes. Evaluating the integral yields

ΔS = nCv ln(Tf/Ti) and ΔS = nCP ln(Tf/Ti), respectively.
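As an illustration, the following Python sketch evaluates these logarithmic results for a monatomic ideal gas; the mole number and temperatures are assumed example values, not taken from the lesson:

```python
import math

# Example values (not from the lesson): 1.0 mol of a monatomic ideal gas
# heated from 300 K to 600 K.
R = 8.314                  # gas constant, J/(mol*K)
n = 1.0                    # moles (hypothetical)
Cv = 1.5 * R               # molar specific heat, constant volume (monatomic)
Cp = 2.5 * R               # molar specific heat, constant pressure
Ti, Tf = 300.0, 600.0      # initial and final temperatures in K (hypothetical)

dS_isovolumetric = n * Cv * math.log(Tf / Ti)   # ≈ +8.64 J/K
dS_isobaric      = n * Cp * math.log(Tf / Ti)   # ≈ +14.4 J/K
print(dS_isovolumetric, dS_isobaric)
```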

There is no change in entropy during a reversible adiabatic process since no heat is exchanged with the system.
 
 
Refer to the following information for the next three questions.

Let's examine the change in entropy that would occur during an isothermal expansion of a monatomic gas from 1.0 liters to 1.25 liters.  Suppose 1000 J of energy is added to change the volume while continuously maintaining a constant temperature of 23ºC.
 Express the temperature in Kelvin.

 How much heat was added?

 What was the change in entropy during this process?
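For reference, here is a short Python sketch of the calculation above, using the lesson's given values and the common 273-point conversion to Kelvin:

```python
# The lesson's isothermal-expansion example, worked in Python.

T = 23 + 273            # 23ºC expressed in Kelvin -> 296 K
Q_in = 1000.0           # heat added while the temperature stays constant (J)

dS = Q_in / T           # ΔS = +ΔQin/T for an isothermal process
print(T, dS)            # 296 K and ΔS ≈ +3.4 J/K
```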

Other variations on this calculation involve the change in a system's entropy when a substance is melting/freezing or vaporizing/condensing since those processes occur at specific temperatures.
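As a hedged illustration of such a phase-change calculation (the 0.10 kg mass is a made-up value, while the latent heat and melting point of water are standard figures), consider melting a small sample of ice:

```python
# Melting 0.10 kg of ice at its melting point.

m = 0.10                # mass of ice in kg (hypothetical)
Lf = 3.33e5             # latent heat of fusion of water, J/kg
T = 273                 # melting point of ice in K

dS = m * Lf / T         # ΔS = Q/T with Q = mLf, since T is fixed while melting
print(dS)               # ≈ +122 J/K gained by the melting ice
```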
 
 
Refer to the following information for the next five questions.

Suppose that a heat engine has a hot reservoir of 600 K and a cold reservoir of 300 K. The gas expands and lifts a 200 kg mass 1.0 meter.
 
If 6500 J of heat is added, then what is the change in entropy for the entire system during this process?
 
 
Step 1: What is the decrease in the entropy of the hot reservoir?

 Step 2: How much effective work (increase in PE) was accomplished during this process?

 Step 3: How much heat was exhausted during this process?

 Step 4: What was the change in the entropy of the cold reservoir?

 Step 5: What was the net change in entropy for the system?
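The following Python sketch works through the five steps with the problem's given values, assuming g = 9.8 m/s²:

```python
g = 9.8                          # acceleration due to gravity, m/s^2

T_hot, T_cold = 600.0, 300.0     # reservoir temperatures (K)
Q_in = 6500.0                    # heat added from the hot reservoir (J)
m, h = 200.0, 1.0                # lifted mass (kg) and height (m)

dS_hot = -Q_in / T_hot           # Step 1: ≈ -10.8 J/K
W = m * g * h                    # Step 2: 1960 J of effective work (increase in PE)
Q_out = Q_in - W                 # Step 3: 4540 J exhausted to the cold reservoir
dS_cold = Q_out / T_cold         # Step 4: ≈ +15.1 J/K
dS_net = dS_hot + dS_cold        # Step 5: ≈ +4.3 J/K, a net increase
print(dS_hot, W, Q_out, dS_cold, dS_net)
```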

Statistical Mechanics
 
In general, spontaneous processes cause ordered systems to become disordered; that is, they result in an increase in the system's entropy. Suppose there is a container having two identical compartments in which a sample of gas is initially confined to only one side - its entropy is VERY LOW. If the partition between the sides is removed, the likelihood that the gas molecules will all remain in only one compartment is virtually null. The gas molecules will randomly spread to fill the entire container, with each compartment eventually holding roughly half of the original gas - its entropy is large. It is now much more difficult to locate a specific gas molecule.
 
In the 1870s, a probabilistic definition of Clausius' entropy was introduced by Ludwig Boltzmann. The equation he introduced was

S = kB ln Ω

where Ω is the number of ways in which a given state can be achieved - the equilibrium state being the one with the largest number of combinations - and kB is Boltzmann's constant, which equals the ratio R/NA = 1.38 x 10^-23 J/K. Boltzmann's statistical approach to the study of thermodynamics is now called statistical mechanics.
 
For example, suppose 50 coins are arranged so that they are all initially "heads-up." Then they are gathered up and tossed back onto the table. There is virtually no likelihood that they will all once again land heads-up. Instead, the collection of coins will be disordered, with various numbers of heads and tails after each subsequent toss. But there will be patterns in the results, which we will examine in the next example.
 
Refer to the following information for the next five questions.

Suppose that you are given 4 colored coins: red, green, blue, and brown. When these four coins are tossed randomly, there are a total of 2^4 = 16 possible outcomes.
 
The combinations of coins can be organized into 5 "states" - all heads, 3 heads (1 tail), 2 heads (2 tails), 1 head (3 tails), and no heads (4 tails).
 
 
Before looking at the entropy values for this situation, we first need to determine the number of ways, Ω, in which each of these 5 states can be achieved.
 
 all heads

 3 heads, 1 tail

 2 heads, 2 tails

 1 head, 3 tails

 0 heads, 4 tails

Using the values for Ω, determine the entropy (degree of disorder) of tossing all 4 coins and having all 4 land heads. This is equivalent to asking for the entropy of all four coins landing tails.

Determine the entropy (degree of disorder) of tossing all four coins and getting 2 heads and 2 tails.
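A small Python sketch (using kB = 1.38 x 10^-23 J/K, as given above) that tallies Ω for each of the five states and the corresponding Boltzmann entropies:

```python
import math

# Ω for each state is the binomial coefficient C(4, number of heads).
kB = 1.38e-23                    # Boltzmann's constant, J/K

for heads in range(4, -1, -1):
    omega = math.comb(4, heads)  # 1, 4, 6, 4, 1 ways
    S = kB * math.log(omega)
    print(heads, omega, S)

# All heads (or, equivalently, all tails): Ω = 1, so S = kB ln 1 = 0.
# 2 heads and 2 tails: Ω = 6, so S = kB ln 6 ≈ 2.5e-23 J/K, the largest value.
```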



Thus entropy can be considered either as a macroscopic function of the gas, like the other state variables P, V, T, and U, or as a probabilistic microscopic function calculated at the molecular level. Whichever way it is determined, Boltzmann showed that the two interpretations are equivalent.
 
But what about the arrow of time? At the beginning of the lesson we stated that entropy represents the unavailability of a system's thermal energy to do useful, mechanical work. We also learned that the 2nd Law of Thermodynamics restricts changes in entropy to (at best) remaining the same, but more likely increasing. Once there is no longer a spontaneous direction for energy to flow - that is, once the entire universe reaches thermal equilibrium and entropy reaches its maximum value - life (or any process requiring energy) will cease. Entropy cannot be reversed. It is the arrow of time.
 




