General Chemistry is a free introductory textbook on chemistry. See the editorial for more information.
Probability and Entropy
Author: John Hutchinson
There is a subtlety in our conclusion that deserves closer examination. We have concluded that all possible arrangements of molecules are equally probable. We have further concluded that mixing occurs because the final mixed state is overwhelmingly probable. Placed side by side, these statements appear to contradict each other. To see why they do not, we must analyze them carefully. By an "arrangement" of the molecules, we mean a specification of the location of each and every molecule. We have assumed that, due to random molecular motion, each such arrangement is equally probable. In what sense, then, is the final state "overwhelmingly probable"?

Recall the system illustrated in figure 1, where we placed three identical blue marbles into ten spaces. We calculated before that there are 120 unique ways to do this. If we ask for the probability of the particular arrangement in subfigure 1.1, the answer is thus 1/120, and the same probability applies to each of the other possible arrangements, according to our model. However, if we instead ask for the probability of observing a "mixed" state (with no drop), the answer is 112/120, whereas the probability of observing an "unmixed" state (with a drop) is only 8/120. Clearly, the probabilities are not the same when we consider the less specific characteristics "mixed" and "unmixed".

In chemistry we are virtually never concerned with microscopic details, such as the locations of specific individual molecules. Rather, we are interested in more general characteristics, such as whether a system is mixed or not, or what its temperature or pressure is. These properties of interest to us are macroscopic. We therefore refer to a specific arrangement of the molecules as a microstate and to each general state (mixed or unmixed, for example) as a macrostate. All microstates have the same probability of occurring, according to our model. Nevertheless, the macrostates have widely differing probabilities.
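The counting above can be checked directly by enumeration. The sketch below, a minimal illustration, lists every way to place three identical marbles into ten spaces in a row; it assumes (consistent with figure 1) that an "unmixed" arrangement, a drop, means the three marbles occupy adjacent spaces.

```python
from itertools import combinations

# Enumerate every placement of 3 identical marbles into 10 spaces in a row.
arrangements = list(combinations(range(10), 3))
total = len(arrangements)  # C(10, 3) = 120 microstates

# Assumption: a "drop" (unmixed state) means three adjacent spaces,
# matching the arrangements shown in figure 1.
def is_drop(positions):
    a, b, c = sorted(positions)
    return b == a + 1 and c == b + 1

unmixed = sum(1 for arr in arrangements if is_drop(arr))
mixed = total - unmixed

print(total, unmixed, mixed)  # 120 8 112
```

Each of the 120 microstates is equally probable, yet 112 of them share the macroscopic property "mixed", which is why the mixed macrostate is observed with probability 112/120.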
We come to an important result: the probability of observing a particular macrostate (e.g., a mixed state) is proportional to the number of microstates with that macroscopic property. For example, from figure 1, there are 112 arrangements (microstates) with the "mixed" macroscopic property. As we have discussed, the probability of observing a mixed state is 112/120, which is obviously proportional to 112. Thus, one way to measure the relative probability of a particular macrostate is by the number of microstates W corresponding to that macrostate. W stands for "ways", i.e., there are 112 "ways" to get a mixed state in figure 1. Now we recall our conclusion that a spontaneous process always produces the outcome with greatest probability. Since W measures this probability for any substance or system of interest, we could predict, using W, whether the process leading from a given initial state to a given final state was spontaneous by simply comparing probabilities for the initial and final states. For reasons described below, we instead define a function of W,

S = k ln W     (equation 1)

called the entropy, which can be used to make such predictions about spontaneity. (Here k is a proportionality constant which gives S appropriate units for our calculations.) Notice that the more microstates there are, the greater the entropy is. Therefore, a macrostate with high probability (e.g., a mixed state) has a large entropy. We can now restate our earlier deduction: a spontaneous process produces the final state of greatest entropy. (With the modifications developed below, this statement becomes the Second Law of Thermodynamics.)

It would seem that we could use W directly for our calculations, making the new function S unnecessary. However, the following reasoning shows that W is not a convenient function for calculations. Consider two identical glasses of water at the same temperature. We expect the value of any physical property for the water in the two glasses to be twice the value of that property for a single glass. For example, if the enthalpy of the water in each glass is H_{1}, then the total enthalpy of the water in the two glasses together is H_{total} = 2H_{1}. Thus, the enthalpy of a system is proportional to the quantity of material in the system: if we double the amount of water, we double the enthalpy. In direct contrast, consider the corresponding calculation involving W. The number of microstates of the macroscopic state of one glass of water is W_{1}, and likewise the number of microstates of the second glass of water is W_{1}. However, if we combine the two glasses, the number of microstates of the total system is the product W_{total} = W_{1} × W_{1}, which does not equal 2W_{1}. In other words, W is not proportional to the quantity of material in the system. This is inconvenient, since the value of W then depends on whether the two systems are considered together or separately. (If it is not clear that we should multiply the W values, consider the simple example of rolling dice.
The number of states for a single die is 6, but for two dice the number is 6 × 6 = 36, not 6 + 6 = 12.) We therefore need a new function S(W) such that, when we combine the two glasses of water, S_{total} = S_{1} + S_{1}. Since S_{total} = S(W_{total}), S_{1} = S(W_{1}), and W_{total} = W_{1} × W_{1}, our new function S must satisfy the equation

S(W_{1} × W_{1}) = S(W_{1}) + S(W_{1})

The only function which satisfies this requirement is the logarithm, which has the property that ln(x × y) = ln x + ln y. We conclude that an appropriate state function which measures the number of microstates in a particular macrostate is S = k ln W, equation 1.
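The additivity argument can be verified numerically. The sketch below uses a toy value for W_{1} (an illustrative assumption, not a physical count) and the standard value of Boltzmann's constant for k; the point is that the logarithm turns the product of microstate counts into a sum of entropies.

```python
import math

# Boltzmann's constant in J/K; any positive k would preserve the argument.
k = 1.380649e-23

# Toy microstate count for one glass of water (illustrative only).
W1 = 1e6

S1 = k * math.log(W1)            # entropy of one glass, S = k ln W
S_total = k * math.log(W1 * W1)  # entropy of the two glasses combined

# ln(W1 * W1) = ln W1 + ln W1, so S is additive even though W is not:
print(math.isclose(S_total, 2 * S1))  # True

# The dice analogy: 6 states for one die, 6 * 6 = 36 for two dice,
# and ln(36) = ln(6) + ln(6).
print(math.isclose(math.log(36), 2 * math.log(6)))  # True
```

This is exactly the behavior we demanded of S: doubling the amount of material doubles the entropy, just as it doubles the enthalpy.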

