Lectures on Physics has been derived from Benjamin Crowell's Light and Matter series of free introductory textbooks on physics. See the editorial for more information.
See also: Entropy and evolution  
Entropy

We would like to have some numerical way of measuring the grade of energy in a system. We want this quantity, called entropy, to have the following two properties:

(1) Entropy is additive. When we combine two systems and consider them as one, the entropy of the combined system equals the sum of the entropies of the two original systems. (Quantities like mass and energy also have this property.)

(2) The entropy of a system is not changed by operating a Carnot engine within it.
It turns out to be simpler and more useful to define changes in entropy than absolute entropies. Suppose as an example that a system contains some hot matter and some cold matter. It has a relatively high grade of energy, because a heat engine could be used to extract mechanical work from it. But if we allow the hot and cold parts to equilibrate at some lukewarm temperature, the grade of energy has gotten worse. Thus putting heat into a hotter area is more useful than putting it into a cold area. Motivated by these considerations, we define a change in entropy as follows:

ΔS = Q/T

[change in entropy when adding heat Q to matter whose temperature is T; ΔS is negative if heat is taken out]
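The definition ΔS = Q/T can be checked numerically for the hot-and-cold example above. The sketch below (with illustrative numbers, not values from the text) lets a quantity of heat Q flow out of a hot region and into a cold one, and adds up the two entropy changes using the additivity property:

```python
# Entropy change when heat Q flows from a hot region to a cold one,
# using the definition Delta-S = Q / T (temperatures in kelvin).
# The numerical values below are illustrative, not from the text.

Q = 1000.0      # joules of heat transferred
T_hot = 500.0   # kelvin, temperature of the hot region
T_cold = 250.0  # kelvin, temperature of the cold region

dS_hot = -Q / T_hot    # hot region loses heat, so its entropy drops
dS_cold = +Q / T_cold  # cold region gains heat, so its entropy rises

# Additivity: the entropy change of the whole system is the sum.
dS_total = dS_hot + dS_cold
print(dS_total)  # -2.0 + 4.0 = +2.0 J/K: the total entropy goes up
```

Because the same Q is divided by a smaller temperature on the cold side, the gain there always outweighs the loss on the hot side, which is why letting heat flow downhill degrades the grade of energy.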
A system with a higher grade of energy has a lower entropy.
The example above involved a closed system, and in it the total entropy either increased or stayed the same; it never decreased. Here are two examples of schemes for decreasing the entropy of a closed system, with explanations of why they don't work.
Observations such as these lead to the following hypothesis, known as the second law of thermodynamics: The entropy of a closed system always increases, or at best stays the same: ΔS ≥ 0. At present my arguments to support this statement may seem less than convincing, since they have so much to do with obscure facts about heat engines. A more satisfying and fundamental explanation for the continual increase in entropy was achieved by Ludwig Boltzmann, and you may wish to learn more about Boltzmann's ideas from my book Simple Nature, which you can download for free. Briefly, Boltzmann realized that entropy is a measure of randomness at the atomic level, and randomness doesn't spontaneously change into organization. To emphasize the fundamental and universal nature of the second law, here are a few examples.
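The second law can also be checked for the earlier lukewarm-equilibration example, where the temperatures change continuously as heat flows. Integrating dS = mc dT/T for each mass gives ΔS = mc ln(T_final/T_initial). The sketch below uses two equal masses of water with illustrative temperatures (the numbers are assumptions, not from the text):

```python
import math

# Two equal masses of water, one hot and one cold, equilibrate at the
# average temperature. Integrating dS = m*c*dT/T for each mass gives
# Delta-S = m*c*ln(T_final/T_initial). All numbers are illustrative.

m = 1.0         # kg of water in each sample
c = 4180.0      # J/(kg*K), specific heat of water
T_hot = 350.0   # K, initial temperature of the hot sample
T_cold = 290.0  # K, initial temperature of the cold sample
T_final = (T_hot + T_cold) / 2  # 320 K: equal masses, same c

dS_hot = m * c * math.log(T_final / T_hot)    # negative: this sample cooled
dS_cold = m * c * math.log(T_final / T_cold)  # positive: this sample warmed

dS_total = dS_hot + dS_cold
print(dS_total > 0)  # True: consistent with the second law, dS >= 0
```

The hot sample's entropy decrease is smaller in magnitude than the cold sample's increase, so the total comes out positive, exactly as the second law requires.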

