Entropy

Entropy is a measure of how uncertain we are about something. In particular, Shannon entropy represents the minimum number of bits needed to identify an element out of an ensemble, on average:

$$H = -\sum_i p_i \log_2 p_i$$

Here $p_i$ is the probability of the $i$th state. In thermodynamics we have an enormous number of microstates and usually don't know their probability distribution. We assume a worst case where each of the $\Omega$ microstates has $p_i = 1/\Omega$. This maximizes entropy because it gives us the least information about which microstate we are in. Thermodynamic entropy is defined as

$$S = k_B \ln \Omega$$

where $\Omega$ is the multiplicity, or the number of microstates compatible with a given macrostate. Entropy is extensive and a state function. For a monatomic ideal gas, $S$ can be found from the Sackur-Tetrode equation.
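
As a quick check on these definitions, here is a minimal Python sketch. The eight-state toy ensemble, the skewed distribution, and the helium numbers are illustrative choices, not from the text; the Sackur-Tetrode formula is written in its standard per-atom form using the thermal de Broglie wavelength.

```python
import numpy as np

# Physical constants (CODATA values)
k_B = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J*s

def shannon_entropy(p):
    """Shannon entropy in bits: H = -sum_i p_i * log2(p_i), with 0*log(0) = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # zero-probability states contribute nothing
    return -np.sum(p * np.log2(p))

# Toy ensemble with Omega = 8 microstates (an illustrative choice).
omega = 8

# Equal probabilities p_i = 1/Omega maximize the entropy: H = log2(Omega).
uniform = np.full(omega, 1.0 / omega)
print(shannon_entropy(uniform), np.log2(omega))  # 3.0  3.0

# Any non-uniform distribution over the same 8 states tells us more about
# which microstate we are in, so H is strictly lower.
skewed = np.array([0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03])
print(shannon_entropy(skewed))                   # ~2.22 bits < 3.0

# Thermodynamic entropy of the same ensemble: S = k_B * ln(Omega).
print(k_B * np.log(omega))                       # ~2.87e-23 J/K

# Sackur-Tetrode entropy per atom of a monatomic ideal gas:
#   S / (N k_B) = ln(V / (N * lambda^3)) + 5/2,
# where lambda is the thermal de Broglie wavelength.
def sackur_tetrode_per_atom(T, P, m):
    lam = h / np.sqrt(2 * np.pi * m * k_B * T)   # thermal de Broglie wavelength
    v_per_atom = k_B * T / P                     # V/N from the ideal gas law
    return k_B * (np.log(v_per_atom / lam**3) + 2.5)

# Helium at 300 K and 1 atm: about 15.2 k_B per atom, i.e. ~126 J/(mol K),
# close to helium's tabulated standard molar entropy.
m_He = 6.6465e-27  # mass of a helium atom, kg
print(sackur_tetrode_per_atom(300.0, 101325.0, m_He) / k_B)  # ~15.2
```

Note how the uniform case ties the two definitions together: with $p_i = 1/\Omega$, Shannon entropy reduces to $\log_2 \Omega$, which is $S = k_B \ln \Omega$ up to a change of logarithm base and units.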