A short account of a connection of power laws to the information entropy

Yaniv Dover*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



We use the formalism of the maximum principle of Shannon's entropy to derive the general power-law distribution function, based on what appears to be a reasonable physical assumption: the requirement of a constant mean "internal order" (Boltzmann entropy) of a complex, self-interacting, self-organized system. Since the Shannon entropy is equivalent to the Boltzmann entropy under equilibrium, non-interacting conditions, we interpret this result as the complex system exploiting its internal interactions and its non-equilibrium state to keep the equilibrium Boltzmann entropy constant on average, thereby gaining a survival advantage over less ordered systems, i.e., hinting towards an "Evolution of Structure". We then demonstrate the formalism with a toy model that explains the power laws observed in city populations and show how Zipf's law emerges as a natural special point of the model. We also suggest further directions for the theory.
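The maximum-entropy step in the abstract can be illustrated numerically. The sketch below (an assumption-laden toy, not the paper's exact formalism) uses the standard Lagrange-multiplier result: maximizing the Shannon entropy H = -Σ p_i ln p_i subject to normalization and a fixed mean of ln x (a "constant mean internal order" constraint) gives p(x) ∝ x^(-λ). We verify this by perturbing the power law in a direction that preserves both constraints and checking that the entropy strictly decreases; λ = 1 corresponds to the Zipf special case mentioned in the abstract.

```python
import numpy as np

# Discrete "sizes" x = 1..5 and a power-law distribution p ∝ x^(-lambda).
# lambda = 1 is the Zipf special case.
x = np.arange(1.0, 6.0)
lam = 1.0
p = x ** (-lam)
p /= p.sum()

def entropy(q):
    """Shannon entropy H = -sum q_i ln q_i."""
    return -np.sum(q * np.log(q))

# Build a perturbation direction v with sum(v) = 0 and sum(v * ln x) = 0,
# so p + eps*v keeps both the normalization and the mean-log constraint.
# Such directions form the null space of the 2x5 constraint matrix A.
A = np.vstack([np.ones_like(x), np.log(x)])
_, _, Vh = np.linalg.svd(A)
v = Vh[-1]  # a unit vector in the null space of A

for eps in (1e-3, -1e-3):
    q = p + eps * v
    assert np.all(q > 0)
    # Same constraints as the power law ...
    assert abs(q.sum() - 1.0) < 1e-12
    assert abs(np.dot(q, np.log(x)) - np.dot(p, np.log(x))) < 1e-12
    # ... but strictly lower entropy: the power law is the maximizer.
    assert entropy(q) < entropy(p)

print("power law maximizes entropy under the mean-log constraint")
```

Because entropy is strictly concave and the power law satisfies the first-order (Lagrange) conditions, any feasible perturbation lowers the entropy at second order, which is what the assertions confirm.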

Original language: American English
Pages (from-to): 591-599
Number of pages: 9
Journal: Physica A: Statistical Mechanics and its Applications
Issue number: 3-4
State: Published - 15 Mar 2004


  • Dynamical systems
  • Information theory
  • Power laws
  • Self-organizing systems
  • Statistical physics


