EPL **93** (2011) 20006

## A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions

^{1} Section for Science of Complex Systems, Medical University of Vienna - Spitalgasse 23, A-1090, Austria, EU

^{2} Santa Fe Institute - 1399 Hyde Park Road, Santa Fe, NM 87501, USA

^{a} E-mail: thurner@univie.ac.at

Received: 21 September 2010

Accepted: 12 January 2011

Motivated by the hope that the thermodynamical framework might be extended to strongly interacting statistical systems —complex systems in particular— a number of generalized entropies have been proposed in the past. So far the understanding of their fundamental origin has remained unclear. Here we address this question from first principles. We start by observing that many statistical systems fulfill a set of three general conditions (Shannon-Khinchin axioms, K1–K3). A fourth condition (separability) holds for non-interacting, uncorrelated or Markovian systems only (Shannon-Khinchin axiom, K4). If all four axioms hold, the Shannon theorem provides a unique entropy, $S = -\sum_i p_i \ln p_i$, *i.e.* Boltzmann-Gibbs entropy. Here we ask about the consequences of violating the 4th axiom while assuming the first three to hold. By a simple scaling argument we prove that under these conditions *each* statistical system is characterized by a unique pair of scaling exponents (*c*, *d*) in the large size limit. These exponents define equivalence classes for all interacting and non-interacting systems and parametrize a unique entropy, $S_{c,d} \propto \sum_i \Gamma(1+d,\, 1 - c\ln p_i)$, where $\Gamma(a,b)$ is the incomplete Gamma function. It covers all systems respecting K1–K3. A series of known entropies can be classified in terms of these equivalence classes. The corresponding distribution functions are special forms of Lambert-$W$ exponentials containing —as special cases— Boltzmann, stretched exponential and Tsallis distributions (power laws) —all widely abundant in Nature. In the derivation we assume trace-form entropies, $S = \sum_i g(p_i)$, with *g* some function; however, more general entropic forms can be classified along the same lines. This is, to our knowledge, the first *ab initio* justification for generalized entropies. We discuss a physical example displaying two sets of scaling exponents depending on the external parameters.
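The entropy discussed above, $S_{c,d} \propto \sum_i \Gamma(1+d,\, 1 - c\ln p_i)$, contains the familiar cases as limits, which can be checked numerically. The sketch below (helper names such as `g_cd` are ours, not from the paper, and only the unnormalized proportional form is used) verifies two special cases that follow from elementary identities of the incomplete Gamma function: for $(c,d)=(1,1)$, $\Gamma(2, 1-\ln p_i) = (2-\ln p_i)\,p_i/e$, so the sum is an affine function of the Boltzmann-Gibbs entropy; for $(c,d)=(q,0)$, $\Gamma(1, 1-q\ln p_i) = p_i^q/e$, the Tsallis kernel.

```python
import numpy as np
from scipy.special import gamma, gammaincc


def upper_inc_gamma(a, x):
    """Upper incomplete Gamma function Γ(a, x).

    scipy's gammaincc is the *regularized* upper incomplete Gamma,
    Γ(a, x)/Γ(a), so we multiply the normalization back in.
    """
    return gammaincc(a, x) * gamma(a)


def g_cd(p, c, d):
    """Unnormalized (c,d)-entropy kernel: sum_i Γ(1+d, 1 - c ln p_i).

    Illustrative helper (our naming), evaluating only the proportional
    form quoted in the abstract; zero-probability states are dropped
    following the 0·ln 0 = 0 convention.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return upper_inc_gamma(1 + d, 1 - c * np.log(p)).sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p = rng.dirichlet(np.ones(8))  # a random probability vector

    # (c, d) = (1, 1): the sum equals (2 + S_BG)/e, with S_BG the
    # Boltzmann-Gibbs entropy -- affine in S_BG, as expected.
    S_BG = -(p * np.log(p)).sum()
    assert np.isclose(g_cd(p, c=1, d=1), (2 + S_BG) / np.e)

    # (c, d) = (q, 0): the sum equals (1/e) * sum_i p_i**q,
    # i.e. it is proportional to the Tsallis kernel.
    q = 0.7
    assert np.isclose(g_cd(p, c=q, d=0), (p**q).sum() / np.e)
```

Both checks pass for any probability vector, since they rest only on the identities $\Gamma(2,x)=(1+x)e^{-x}$ and $\Gamma(1,x)=e^{-x}$; normalization constants (which fix the proportionality) are deliberately left out here.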

PACS: 05.20.-y – Classical statistical mechanics / 02.50.Cw – Probability theory / 05.90.+m – Other topics in statistical physics, thermodynamics, and nonlinear dynamical systems (restricted to new topics in section 05)

*© EPLA, 2011*