Entropy

Entropy is a measure of how uncertain we are about something. In particular, the Shannon entropy $\sigma$ is the minimum number of bits needed, on average, to identify an element out of an ensemble.

$$ \sigma = -\sum_i p_i \lg p_i. $$

Here $p_i$ is the probability of the $i$th state.
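
As a quick illustration, here is a minimal Python sketch of this formula (the `shannon_entropy` helper is a hypothetical name, not from any particular library):

```python
import math

def shannon_entropy(probs):
    """Minimum average number of bits to identify one state: -sum p_i lg p_i."""
    # Zero-probability states contribute nothing (p lg p -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per outcome
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits, more predictable
```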

In thermodynamics we have an enormous number of microstates and usually don't know their probability distribution. We therefore assume the worst case, in which each of the $\Gamma$ microstates is equally likely, $p = 1/\Gamma$. This uniform distribution maximizes the entropy: it tells us the least about which microstate the system is in.
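
To see the maximization claim numerically, compare a uniform distribution against a skewed one over the same $\Gamma = 8$ states (reusing the sketch above; the specific skewed distribution is an arbitrary choice for illustration):

```python
import math

def shannon_entropy(probs):
    # Shannon entropy in bits; zero-probability terms contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

gamma = 8
uniform = [1 / gamma] * gamma                       # p = 1/Gamma for every state
skewed = [0.5] + [0.5 / (gamma - 1)] * (gamma - 1)  # one state favored

print(shannon_entropy(uniform))  # lg(8) = 3.0 bits, the maximum
print(shannon_entropy(skewed))   # ~2.40 bits, strictly less
```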

Thermodynamic entropy $S$ is defined as

$$ S = -k \sum_{i=1}^{\Gamma} \frac{1}{\Gamma} \ln \frac{1}{\Gamma} = k \ln \Gamma. $$

Here $\Gamma$ is the multiplicity: the number of microstates compatible with a given macrostate. Entropy is extensive and a state function. For a monatomic ideal gas, $S(U,V,N)$ is given by the Sackur-Tetrode equation.
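
As a sketch, the Sackur-Tetrode result can be evaluated numerically; the function below and the helium-at-room-temperature inputs are illustrative assumptions, not fixed by the text above:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s

def sackur_tetrode(U, V, N, m):
    """S(U, V, N) for N atoms of a monatomic ideal gas, each of mass m (kg)."""
    return N * k_B * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

# One mole of helium at roughly room temperature and atmospheric pressure.
N = 6.02214076e23           # atoms
m = 6.6464731e-27           # mass of a helium-4 atom, kg
T = 300.0                   # K
V = N * k_B * T / 101325.0  # ideal-gas volume at 1 atm, m^3
U = 1.5 * N * k_B * T       # internal energy of a monatomic ideal gas

print(sackur_tetrode(U, V, N, m))  # ~126 J/K
```

The result, about 126 J/K per mole, is close to the measured molar entropy of helium, which is part of why the Sackur-Tetrode equation was an early success of statistical mechanics.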