Lectures on Physics has been derived from Benjamin Crowell's Light and Matter series of free introductory textbooks on physics. See the editorial for more information.


We would like to have some numerical way of measuring the grade of energy in a system. We want this quantity, called entropy, to have the following two properties:

(1) Entropy is additive. When we combine two systems and consider them as one, the entropy of the combined system equals the sum of the entropies of the two original systems. (Quantities like mass and energy also have this property.)

(2) The entropy of a system is not changed by operating a Carnot engine within it.

Entropy can be understood using the metaphor of a water wheel. Letting the water levels equalize is like letting the entropy maximize. Taking water from the high side and putting it into the low side increases the entropy. Water levels in this metaphor correspond to temperatures in the actual definition of entropy.

It turns out to be simpler and more useful to define changes in entropy than absolute entropies. Suppose as an example that a system contains some hot matter and some cold matter. It has a relatively high grade of energy, because a heat engine could be used to extract mechanical work from it. But if we allow the hot and cold parts to equilibrate at some lukewarm temperature, the grade of energy has gotten worse. Thus putting heat into a hotter area is more useful than putting it into a cold area. Motivated by these considerations, we define a change in entropy as follows:

ΔS = Q/T

[change in entropy when an amount of heat Q is added to matter at temperature T; ΔS is negative if heat is taken out]

A system with a higher grade of energy has a lower entropy.
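The definition ΔS = Q/T (heat added divided by absolute temperature) can be checked with a short numerical sketch. The temperatures and the amount of heat below are invented purely for illustration; the point is only that moving heat from a hot object to a cold one makes the total entropy go up, because the same Q is divided by a smaller T on the receiving side:

```python
# Entropy change from the definition Delta_S = Q / T.
# All numbers are made up for illustration.
T_hot = 400.0   # kelvin
T_cold = 300.0  # kelvin
Q = 1.0         # joules of heat conducted from the hot object to the cold one

dS_hot = -Q / T_hot    # the hot object loses heat, so its entropy drops
dS_cold = Q / T_cold   # the cold object gains the same heat at a lower temperature
dS_total = dS_hot + dS_cold

print(dS_total)  # positive: heat conduction raises the total entropy
```

Because 1/300 is bigger than 1/400, the cold side's entropy gain outweighs the hot side's entropy loss, and dS_total comes out positive.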

Entropy is additive

Entropy isn't changed by a Carnot engine

Entropy increases in heat conduction

Entropy is increased by a non-Carnot engine

A book sliding to a stop

The examples above involved closed systems, and in all of them the total entropy either increased or stayed the same; it never decreased. Here are two examples of schemes for decreasing the entropy of a closed system, with explanations of why they don't work.

Using a refrigerator to decrease entropy?

Maxwell's daemon

Observations such as these lead to the following hypothesis, known as the second law of thermodynamics:

The entropy of a closed system always increases, or at best stays the same: ΔS ≥ 0.
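The borderline case ΔS = 0 is realized by a Carnot engine: its efficiency, 1 − T_cold/T_hot, is exactly the value at which the entropy dumped into the cold reservoir cancels the entropy drawn from the hot one. A quick arithmetic check (the reservoir temperatures and heat input below are invented for illustration):

```python
# A Carnot engine between two reservoirs has efficiency 1 - T_cold/T_hot
# and leaves the total entropy unchanged. Numbers are for illustration only.
T_hot, T_cold = 500.0, 300.0  # kelvin
Q_hot = 1000.0                # joules drawn from the hot reservoir

W = Q_hot * (1 - T_cold / T_hot)  # mechanical work output
Q_cold = Q_hot - W                # waste heat dumped into the cold reservoir

dS_total = (-Q_hot / T_hot) + (Q_cold / T_cold)
print(dS_total)  # 0.0: the Carnot cycle neither creates nor destroys entropy
```

Any engine less efficient than this dumps extra heat into the cold reservoir, making Q_cold/T_cold larger than Q_hot/T_hot and hence ΔS > 0, in agreement with the second law.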

At present my arguments to support this statement may seem less than convincing, since they have so much to do with obscure facts about heat engines. A more satisfying and fundamental explanation for the continual increase in entropy was achieved by Ludwig Boltzmann, and you may wish to learn more about Boltzmann's ideas from my book Simple Nature, which you can download for free. Briefly, Boltzmann realized that entropy was a measure of randomness at the atomic level, and randomness doesn't spontaneously change into organization. To emphasize the fundamental and universal nature of the second law, here are a few examples.

Entropy and evolution

The heat death of the universe

Hawking radiation

Last Update: 2010-11-11