``Ignorance of codes, as complement to excitations, by masking their culpability favors replicators that often fail organisms. Old cultures and old money often share this blindspot. Awareness of complementarity can help avoid spinouts e.g. you correct differently when you know that the problem is the frequency not the displacement, or the idea not the people.'' (Otmef Namuh, 2003)

``. . . . I am conscious of the danger of a bias. It may happen to me, as to others, that a meditation which has long been dwelt on shall assume an unreal importance; and that a method which has for a long time been practised shall acquire an only seeming facility. It must remain for others to judge how far my attempts have been successful, and how far they require to be completed, or set aside, in the future progress of the science.'' (William Rowan Hamilton, "On a General Method of Expressing the Paths of Light, and of the Planets, by the Coefficients of a Characteristic Function," Dublin University Review and Quarterly Magazine, 1, p795-826, 1833): this thanks to E. F. Taylor, 2003

"Cv in bits" and other complex-system informatics

Contents: [Cv in Bits] [Thermal Roots] [Ala Tribus] [IFZX] [Simplex] [Other Stuff]

What's new?

Toward a simplex model of layered-niche networks

P. Fraundorf, Complexity 13/6 (2008) 29-39 (arXiv:physics/0603068)

Gene "expression" translates molecule strings (replicable nucleic acid codes) into the proteins and enzymes needed to run a cell. Bacterial cells (prokaryotes, i.e. cells without a nucleus) have fewer ways to control which codes they express than do the eukaryotic cells of plants and animals. In bacteria, what's there as code may get expressed whether it's appropriate or not. That's why only eukaryotes are able to do cool things like assemble a bobcat or human from a single cell.

Idea "expression" in the same sense translates memetic codes, e.g. a blueprint, into behaviors or possibly even into a building. Newspapers, for example, transcribe stories for distribution while the translation itself is done by their readership, one sentence at a time. This similarity between genes and ideas is not just metaphorical. To wit, the 2nd law of thermodynamics applies to both types of expression by prescribing a minumum energy cost for each step in the process. Just as going digital with molecule codes made a big difference for one-celled lifeforms, going digital with idea codes could do the same for us (if it doesn't do us in first).

The talk below* addresses this question: How might we better attune idea-expression in metazoan communities to multiple layers of community structure? Consider this in light of the above-mentioned conclusion from molecular biology, namely that informed gene-expression in eukaryotes is crucial to the development and sustainable operation of individual metazoans. With genes as with ideas, an ability to adapt the codes expressed in light of inputs from more than one level of organization seems crucial, e.g. to embryo development as well as to recovery from injury. Bacteria and other prokaryotic microbes are in that sense our genetic ideologues, less able to attune their songs to issues on more than one scale. They are also more likely to be "all that's left" when we're confronted with decreasing thermodynamic availability, i.e. less free energy per capita and/or an unfriendly environment. /pf

* Slides (IE works best) from a 15 May 2007 talk at UIUC (UCS 2007) on simplex models of layered niche-networks.

Credits: The draft figure above right is a reduced-size collage of much larger images made available on the following web sites: upper left, upper right, lower left, lower right.

Thermal roots of correlation-based complexity

Bayesian maxent lets one integrate thermal physics and information theory points of view in the quantitative study of complex systems. Since net surprisal (a free energy analog for measuring ``departures from expected'') allows one to place second law constraints on mutual information (a multi-moment measure of correlations), it makes a quantitative case for the role of reversible thermalization in the natural history of invention, and suggests multiscale strategies to monitor standing crop as well. It prompts one to track evolved complexity starting from live astrophysically-observed processes, rather than only from evidence of past events. Various gradients and boundaries that play a role in availability flow, ranging from the edge of a wave-packet to the boundary between idea-pools, allow one to frame wide-ranging correlations (including that between a phenomenon and its explanation) as delocalized physical structures.

Note: Published in Complexity 13 No 3 (Jan/Feb 2008) 16-26, and online in EarlyView of Complexity on Dec 12, 2007. The draft paper is here.
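For readers who want the measure itself: net surprisal is the Kullback-Leibler divergence of an observed distribution from the expected ambient one, and mutual information is the special case in which the observed distribution is a joint one and the ambient is the product of its marginals. A minimal numerical sketch, with made-up placeholder distributions:

    import numpy as np

    def net_surprisal_bits(p, p_ambient):
        """Kullback-Leibler divergence D(p||p_ambient): the net surprisal
        of observing p where p_ambient was expected, in bits."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(p_ambient, dtype=float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

    # a system biased away from its uniform "ambient" expectation
    print(net_surprisal_bits([0.7, 0.1, 0.1, 0.1], [0.25] * 4))  # ~0.64 bits

Multiplying such bit counts by kT ln 2 converts them into the minimum available work needed to establish the departure, which is where the second law constraints come from.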

Correlation thermodynamics at work: The simplest illustration of reversible thermalization is perhaps the "N=1"-atom version of the Szilard isothermal compressor shown in the figure to the right above, put to use as a vacuum-pump memory. It allows one to put its single atom into the desired side of a bi-partitioned container EITHER with help from a piston plus kT ln 2 of available work, OR with help from knowledge (about which side the atom's on) used to direct an arbitrarily low-energy rotation to the desired orientation. This provides a most transparent example of the process of reversible thermalization, wherein the work of compression (whose energy escapes to the ambient as heat) yields one bit of mutual information (for those outside the container, namely, valid data on the location of the gas atom within).
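The energy bookkeeping checks out in one line, under the usual textbook assumptions (single ideal-gas atom, isothermal quasi-static compression from $V_0$ to $V_0/2$):

    W = -\int_{V_0}^{V_0/2} P\,dV = kT \int_{V_0/2}^{V_0} \frac{dV}{V} = kT \ln 2 ,

while the outside observer's uncertainty about the atom's location drops by $\Delta S = k \ln 2$, i.e. by exactly one bit. The work thermalized per bit of mutual information gained is thus $kT \ln 2$, matching the Landauer figure quoted earlier on this page.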


Net surprisals ala Tribus:
correlations from reversible thermalization

Sun 15 Feb 2004: for the International Conference on Complex Systems, May 2004, Boston MA

Note: The submitted paper (#392) is here

The Bayesian vision of net surprisals underlying the connection between energy and information, put forward by Myron Tribus four decades ago, has new life today. One example is widespread application of mutual information to the study of correlated codes, quantum computing, and nonlinear dynamics. We show here that net surprisals can also help students quantify finite departures from the ambient in second law terms, and offer a framework for tracking hierarchical correlations in complex systems (along with the replicable codes used to nurture those correlations).

In a nutshell, one might say that developments (since Tribus' early papers) indicate: (i) that net-surprisal is indeed a robust measure of correlation with "second law teeth", (ii) that thermodynamic information engines literally power the natural history of invention, and (iii) that complementarity* between replicable codes** & steady-state excitations*** means that balanced awareness requires dual perspectives e.g. the perspectives of both genes & their phenotypes, or the perspectives of both ideas & the people that nurture them.

The first of these items (net-surprisal's utility) was well-grounded in the maximum entropy work of E. T. Jaynes, which helped inspire Tribus' contributions, even while the presently burgeoning practical applications of mutual information (a special case of net-surprisal) were still unimagined. The second of these items (correlation thermodynamics) required first that physical entropy be made non-extensive by considering systems with correlated subsystems, i.e. systems for which mutual information about subsystem states is available. Jaynes' maxent formalism, applicable to both classical and quantum mechanical systems, likewise laid a rigorous foundation for this. The last item (code-excitation complementarity) is likely just a metaphor. Incentive for recognizing it comes not from a mathematical argument, but from the fact that consideration of gene perspectives, along with organism perspectives, is now accepted as crucial to understanding evolution (cf. Dawkins' work in the 1970's), just as the perspective of idea replication has become crucial to understanding the development of culture (cf. McLuhan's work in the 1960's, as well as more recent work e.g. by Blackmore on the perspective of memes).
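A short numerical check of the non-extensivity point above (the joint distribution below is a made-up toy for two correlated binary subsystems):

    import numpy as np

    def H_bits(p):
        """Shannon entropy of a probability distribution, in bits."""
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    joint = np.array([[0.4, 0.1],
                      [0.1, 0.4]])     # correlated pair of subsystems
    S_A = H_bits(joint.sum(axis=1))    # marginal entropy of subsystem A
    S_B = H_bits(joint.sum(axis=0))    # marginal entropy of subsystem B
    S_AB = H_bits(joint)               # joint entropy
    print(S_A + S_B - S_AB)            # mutual information ~0.28 bits > 0

Since S_AB = S_A + S_B - I(A;B), any nonzero mutual information makes the joint entropy strictly less than the sum of its parts, i.e. non-extensive.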

A heuristic argument may also provide insight into code-excitation complementarity. Organisms thermalize available work while updating correlations in their environment. Lack of care for codes results in irreversible thermalization (i.e. an out-of-control burn), while too much care for specific codes results in their failure to evolve (i.e. in failure to see that the correlations those codes offer are updated to reflect changes in their environment). Thus, for example, successful accomplishment of large undertakings may require significant compromise between equally aggressive idea-focused, and people-focused, problem-solving teams.

* analogous to the mathematical complementarity between time & frequency (see the numerical sketch following these notes)
** which in practice rule time (survive) by retaining correlations
*** which in practice put energy to work creating correlated subsystems
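Because note * leans on the time-frequency analogy, a small numerical illustration may help (the pulse widths are chosen arbitrarily): squeezing a Gaussian pulse in time broadens its Fourier spectrum, with the product of rms widths pinned near the 1/(4 pi) ~ 0.0796 uncertainty floor.

    import numpy as np

    def rms_width(axis, density):
        """rms width of a (possibly unnormalized) density along an axis."""
        density = density / density.sum()
        mean = (axis * density).sum()
        return np.sqrt(((axis - mean) ** 2 * density).sum())

    t = np.linspace(-50.0, 50.0, 4096)                  # time samples
    f = np.fft.fftshift(np.fft.fftfreq(t.size, d=t[1] - t[0]))

    for sigma in (0.5, 1.0, 2.0):
        pulse = np.exp(-t**2 / (4 * sigma**2))          # |pulse|^2 has rms width sigma
        spectrum = np.fft.fftshift(np.fft.fft(pulse))
        product = (rms_width(t, np.abs(pulse)**2) *
                   rms_width(f, np.abs(spectrum)**2))
        print(sigma, product)                           # ~0.0796 every time

Narrowing a description on one axis forces its complement to spread on the conjugate axis; the code-excitation complementarity above is invoked in that spirit.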

Sketch of some ``temporally-stacked'' layers of correlation-based complexity: The table below lists what some might argue are purely physical examples of phenomena stackable in a common thermodynamic context. They stack in the sense that each set of structural and process correlations is predicated on the prior existence of the level before. Some mathematical tools, like network analysis, allow one to put Bayesian inference to use nicely at the more complex levels (further down the list), while the tools of thermal physics (with deep Bayesian roots) are of more use at the less complex levels where information losses (e.g. due to irreversible heat flows) are noticeable on the total energy landscape. With a few unifying examples at each level, this might help folks make useful connections that are not presently obvious in their information environment. For example, the complex-systems concepts of emergence and symmetry-breaking may turn out to have both deep roots and everyday consequences as well.

new-level drivers             | boundaries                           | emerging correlations
stable nuclei                 | voltage gradients                    | neutral matter
density fluctuations          | gradients in gravitational potential | forming galaxies
interstellar cloud collapse   | radial temperature variation         | spin up & stellar ignition
orbital accretion of dust & gas | radial pressure variation          | planetary differentiation & geocycles
geothermal & solar gradients  | compositional variation              | biomolecular cycles
biological cells              | bilayer membranes & cell walls       | chemical communication, microbial symbioses & differentiation
biofilms & live tissues       | organ surfaces                       | skeletal, respiratory, digestive, & nervous systems
metazoans                     | individual skins                     | pair bonds & redirected aggression
reproductive bargains         | family gene pool boundaries          | social hierarchies & politics, ritualized available work
cultures & belief systems     | meme pool boundaries                 | sciences & diversity protocols


Information physics: From energy to codes

This is an evolving collection of notes for educators on how the tools of thermal physics are grounded in gambling theory, and how in that context they are providing clues to more complex things going on around us than 19th-century thermodynamicists might have imagined. The new tools have been guiding error-correction in communication lines and data-compression methods for half a century, but they are now moving into applications relevant to much more than the future of computing. They may in fact someday provide a common physical framework for considering such disparate (and sometimes competing) issues as conservation of available work, maintenance of genetic diversity within and between species, and the future of cultural diversity in the presence of constraints imposed by increasingly rapid means of communication and transportation. The approach, which shifts focus to the role of reversible thermalization in the natural history of invention, provides useful context and limits for a wide diversity of systems, while augmenting (rather than cartoonifying) the value of detailed knowledge about the way any given instance of correlation-based complexity works.
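To make the gambling-theory grounding concrete, here is a minimal sketch of maxent's "best guess" a la Jaynes (the four level energies and the mean-energy constraint are invented for illustration): given only an average energy, the least-biased distribution over states is the exponential (Boltzmann) one, whose multiplier beta can be found by bisection.

    import numpy as np

    E = np.array([0.0, 1.0, 2.0, 3.0])   # hypothetical level energies (arb. units)
    target_mean = 1.2                    # the one thing we claim to know

    def mean_energy(beta):
        """Average energy of the maxent guess p_i proportional to exp(-beta*E_i)."""
        w = np.exp(-beta * E)
        return (E * w).sum() / w.sum()

    lo, hi = -50.0, 50.0                 # bracket; mean_energy decreases with beta
    for _ in range(200):                 # bisect for the constraint-matching beta
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if mean_energy(mid) > target_mean else (lo, mid)

    beta = 0.5 * (lo + hi)
    p = np.exp(-beta * E)
    p /= p.sum()
    print(beta, p, (E * p).sum())        # recovers target_mean; 1/beta plays the role of kT

Everything thermodynamic then follows from betting least presumptuously on "the dice", subject to what is actually known.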


Sound Bytes: maxent's best guess | information engines | life power stream | excitations and codes

02 Feb 2003: working draft PDF (19 pages, 58 refs, 7 figures, TOC, RevTeX4, suggestions for improvement invited) release notes

27 Jan 2003: the spiffy hyperlinked v3 PDF from the archive at Cornell (formerly Los Alamos)

eprint citation: P. Fraundorf, "Information physics: From energy to codes" (2003), arXiv:physics/9611022


Abstract: We illustrate in terms familiar to modern-day science students that: (i) an uncertainty slope mechanism underlies the usefulness of temperature via its reciprocal, which is incidentally around 42 [nats/eV] at the freezing point of water; (ii) energy over kT and differential heat capacity are ``multiplicity exponents'', i.e. the bits of state information lost to the environment outside a system per 2-fold increase in energy and temperature respectively; (iii) even awaiting description of ``the dice'', gambling theory gives form to the laws of thermodynamics, availability minimization, and net surprisals for measuring finite distances from equilibrium, information content differences, and complexity; (iv) heat and information engine properties underlie the biological distinction between autotrophs and heterotrophs, and life's ongoing symbioses between steady-state excitations and replicable codes; and (v) mutual information resources (i.e. correlations between structures, e.g. a phenomenon and its explanation, or an organism and its niche) within and across six boundary types (ranging from the edges of molecules to the gap between cultures) are delocalized physical structures whose development is a big part of the natural history of invention. These tools might offer a physical framework to students of the code-based sciences when considering such disparate (and sometimes competing) issues as conservation of available work and the nurturing of genetic or memetic diversity.
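Item (i) of the abstract is easy to check numerically (the constant below is the CODATA value of Boltzmann's constant in eV/K):

    import math

    k_B_eV = 8.617333262e-5           # Boltzmann constant, eV/K
    T_ice = 273.15                    # freezing point of water, K
    slope = 1 / (k_B_eV * T_ice)      # uncertainty slope 1/kT = dS/dE
    print(slope)                      # ~42.5 nats of state information per eV
    print(slope / math.log(2))        # ~61.3 bits per eV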



Fig. 4 (Left) Life's Energy Flow: The top half represents some of the primary processes involved with energy flow, while the bottom half illustrates significant physical repositories for life's energies, and paths for conversion of energy (and net surprisal) from one form to another. Fig. 5 (Right) Life's Stores of Availability: Horizontal bars represent inward-looking correlations, while vertical bars represent outward-looking correlations. This breakdown seems to work reasonably well to categorize by domain both the types of correlations found, and the types of codes (e.g. genetic or memetic replicators) used to help maintain them.


Heat capacity in bits

This note discusses how the developing awareness of entropy, as a lack of mutual information about a system's state, might be used by introductory thermodynamics teachers (in various disciplines) to simplify and deepen student understanding in the context of applications in computer science and molecular biology. It provides suggestions that complement the growing collection of thermal physics texts built on the statistical approach, and in particular explores the way in which heat capacities may be expressed directly in terms of measured uncertainty (i.e. bits, bytes, etc.). It suggests that analogous capacities for other conserved quantities might prove useful as well. For example, the ``volume capacity'' of an ideal gas is just the number of gas molecules that it contains.
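In that spirit, a small sketch of what "heat capacity in bits" means operationally, assuming the heat capacity C stays roughly constant across a doubling of temperature: integrating dS = C dT/T from T to 2T gives C ln 2 in J/K, i.e. C/k_B bits of state information shed to the surroundings per two-fold temperature increase.

    k_B = 1.380649e-23    # Boltzmann constant, J/K
    N_A = 6.02214076e23   # Avogadro's number, 1/mol
    R = k_B * N_A         # gas constant, J/(mol K)

    # bits per molecule per temperature doubling = (C per molecule)/k_B = C_molar/R
    for name, C_molar in [("monatomic ideal gas, Cv = 3R/2", 1.5 * R),
                          ("liquid water, Cp ~ 75.3 J/(mol K)", 75.3)]:
        print(name, ":", round(C_molar / R, 2), "bits/molecule per doubling of T")

The same bookkeeping gives the "volume capacity" remark its units: an ideal gas's entropy changes by one bit per molecule per two-fold change in volume, so its capacity in bits-per-doubling is just N.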


Sound Bytes: Cv in bits | law zero w/teeth | thermochapter possibility puzzles | life's uncertainty slopes

Copyright 2003 American Association of Physics Teachers. This article PDF may be downloaded for personal use only. Any other use requires prior permission of the author and the American Association of Physics Teachers. It appeared in the November 2003 issue of American Journal of Physics, and is citable as "Heat capacity in bits" by P. Fraundorf (2003) Amer. J. Phys. 71 (11) 1142-1151.

02 Jun 2003 revised draft (13 pages, 32 refs, 4 figures, RevTeX4) much like that in AJP.

eprint citation: P. Fraundorf, "Heat Capacity in Bits" (1999), arXiv:cond-mat/9611074



American Association of Physics Teachers (AAPT) Summer Meeting 2003
Title: Heat Capacity in Bits; Author: Phil Fraundorf; Paper: FL07; Room: L/M; Time: 2:30pm August 6

Some slides (or a PDF with notes) from the talk.

Abstract: The developing awareness of entropy, as lack of correlation between system and environment, can be used by intro-physics teachers to deepen student insight into widening applications of information physics, e.g. in molecular biology and computer science. That means clarifying the central role of uncertainty (and information measures) in thermal physics, and defining temperature as an energy derivative (the reciprocal of an uncertainty slope) rather than as something proportional to energy per molecule. Using detailed examples of familiar systems like water and the ideal gas, we show how, in the process, heat (and other) capacities emerge in natural units as multiplicity exponents, measured for example in bits of mutual information loss per two-fold increase in temperature. For more on this see http://www.umsl.edu/~fraundor/hcapbits.html



This page is http://www.umsl.edu/~fraundor/hcapbits.html. Although there are many contributors, the person responsible for errors is P. Fraundorf. This site is hosted by the Department of Physics and Astronomy (and Center for Molecular Electronics) at UM-StL.