### The second law and growth of complexity

Many biological systems seem driven to have high complexity, often accompanied by low entropy. Many processes can drive these increases, including natural selection, auto-catalysis, constructive neutral evolution, and embryogenesis. The second law of thermodynamics allows these complexity-increasing processes, since biological systems are open. However, it may be that the second law in fact *drives* these processes, i.e., drives biological systems to have low entropy. This possibility is known as "Schrödinger's paradox".

Ultimately, one would like to address Schrödinger's paradox in terms of dynamical systems theory. To be more precise, let p(z) refer to the phase space density of a system over phase space position z, and let H be the Hamiltonian governing its dynamics. The goal is then to understand which characteristics of H determine whether the resultant dynamics of p has an attractor throughout which p has high complexity, and more generally how H determines the dynamics of the complexity of p.
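For reference, the dynamics of p referred to here is the standard flow of a phase space density under a Hamiltonian, given by Liouville's equation:

```latex
\frac{\partial p(z, t)}{\partial t} = \{H, p\},
```

where $\{\cdot,\cdot\}$ is the Poisson bracket. (Since a strictly Hamiltonian flow preserves phase space volume, the attractors discussed here presumably involve the system's coupling to its environment, consistent with the openness of biological systems noted above.)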

To illustrate this dynamical systems perspective in a biological context, suppose our system is initially described by a p within the basin of attraction of a high-complexity attractor, and suppose the system experiences an external shock knocking it to another point in the basin that has lower complexity. After the shock, the system would increase its complexity back toward the value it had before the shock. Examples of this arguably include asteroid impacts, volcanic eruptions, etc., that cause a mass extinction, thereby reducing the complexity of the terrestrial biosphere, after which the biosphere's complexity grows back. Note though that if the shock were big enough to knock the system completely out of the basin of attraction, then the system would "die", and not increase its complexity back to what it was.

The goal then is to investigate how the Hamiltonian of a system determines its high-complexity attractors, and in particular how its thermodynamic properties do so.

To start to address this we need a framework that encompasses both thermodynamics and a formalization of "complexity". A natural choice is Shannon's information theory, since, as Jaynes showed, it can be used to express statistical physics (thermodynamics), and it would seem to provide a possible measure of complexity. However, Shannon entropy is often dismissed as a measure of complexity, since it assigns very different values to gases and crystals, while both are typically viewed as non-complex. This would seem to rule out the use of information theory to analyze Schrödinger's paradox.
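The objection can be made concrete with a toy calculation (a minimal sketch, not from any particular paper): a "gas" spread uniformly over many microstates gets maximal Shannon entropy, while a "crystal" concentrated on a single configuration gets zero, even though both are intuitively non-complex.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

N = 1000  # number of microstates in a toy state space

gas = [1.0 / N] * N                # "gas": probability spread uniformly over microstates
crystal = [1.0] + [0.0] * (N - 1)  # "crystal": probability concentrated on one configuration

print(shannon_entropy(gas))      # log(1000), about 6.9 nats (maximal)
print(shannon_entropy(crystal))  # 0 nats (minimal)
```

So if complexity were simply identified with Shannon entropy, the gas would count as maximally complex and the crystal as minimally complex, rather than both counting as simple.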

However, like so much work with entropy, this dismissal is based on using a uniform prior in the definition of entropy. In a recent paper I have shown that if one is careful in specifying the prior distribution in the entropy measure, the resulting measure actually assigns a low complexity to both crystals and gases. This opens the possibility of using information theory to address Schrödinger's paradox.
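To see in miniature how the choice of prior changes what a measure assigns to a crystal, consider relative entropy D(p‖q) to a prior q. This is only a toy illustration of the role of the prior, not the measure constructed in the paper: under the (usually implicit) uniform prior the crystal looks maximally "special", while under a prior already concentrated on ordered configurations it does not.

```python
import math

def relative_entropy(p, q):
    """KL divergence D(p || q) = sum_i p_i * log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

N = 1000
crystal = [1.0] + [0.0] * (N - 1)  # all probability on one ordered configuration

uniform_prior = [1.0 / N] * N                        # the implicit prior in the usual dismissal
peaked_prior = [0.99] + [0.01 / (N - 1)] * (N - 1)   # a prior concentrated on ordered states

print(relative_entropy(crystal, uniform_prior))  # log(1000), about 6.9: crystal looks "special"
print(relative_entropy(crystal, peaked_prior))   # log(1/0.99), about 0.01: crystal is unremarkable
```

The point is only that the value assigned to one and the same distribution depends entirely on the prior, which is why being careful about the prior can change the verdict on crystals and gases.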