An active revision of cond-mat/9711074 in the Los Alamos archives

*Abstract*

Statistical physics since Shannon, building on the 19th-century work of
Gibbs, has shown that physical units for temperature kT defined
statistically (via 1/T = dS/dE) are energy per ``nat'' of information
uncertainty. Consequences of this for heat capacities are explored here
for quadratic systems, as well as for systems in which equipartition has
little meaning. We show for *any* system
that total thermal energy E over kT (an integral or average heat
capacity when T>0) is the log-log derivative of multiplicity with respect to
energy, as well as (for *all* b) the number of base-b units of information lost
about the state of the system per b-fold increase in the amount of thermal
energy therein. Similarly the work-free instantaneous heat capacity C_v/k is
a ``local version'' of this log-log derivative equal (for example) to bits of
information lost per 2-fold increase in *temperature*. This makes C_v/k
independent of both: (i) the energy zero, unlike E/kT, and (ii) one's choice of
the Lagrange multiplier for energy (e.g. kT versus 1/kT) to within a constant,
explaining why its usefulness may go well beyond the detection of phase
changes and quadratic modes. From UMStL-CME-94a09pf.
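A minimal numerical sketch of the abstract's two claims, using an illustrative system of my own choosing (not one worked in the paper's text): for a monatomic ideal gas of N atoms, multiplicity scales as Omega(E) ~ E^(3N/2), so the log-log derivative d ln(Omega)/d ln(E) should equal E/kT = 3N/2, and the entropy gained per 2-fold rise in temperature, measured in bits, should likewise equal C_v/k = 3N/2. The names `ln_omega` and `loglog_derivative` are my own:

```python
import math

N = 10  # number of atoms (illustrative choice)

def ln_omega(E):
    """ln multiplicity for the toy gas, up to an E-independent constant."""
    return (3 * N / 2) * math.log(E)

def loglog_derivative(f, x, h=1e-6):
    """Numerical d f / d ln(x) via a central difference in ln(x)."""
    lnx = math.log(x)
    return (f(math.exp(lnx + h)) - f(math.exp(lnx - h))) / (2 * h)

# E/kT as the log-log derivative of multiplicity with respect to energy:
E = 1.7  # arbitrary energy value (units drop out of the derivative)
print(loglog_derivative(ln_omega, E))   # ~15.0 = 3N/2

# Bits of state information lost per 2-fold rise in temperature.
# Since E is proportional to T for this system, doubling T doubles E:
bits = (ln_omega(2 * E) - ln_omega(E)) / math.log(2)
print(bits)                             # 15.0 = C_v/k
```

Here the quadratic case makes both quantities coincide; the abstract's point is that the log-log-derivative and bits-per-doubling readings remain meaningful even where equipartition does not.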

July 2002 PDF version of the paper, with color figures reorganized and (thanks to RevTeX4) incorporated directly into the text.