## Abstract

A measure called physical complexity is established and calculated for a population of sequences, drawing on statistical physics, automata theory, and information theory. It quantifies the information stored in an organism's genome: using Shannon entropy to estimate the randomness in the genome, it measures the information a population has acquired by evolving in its environment. It is computed as the difference between the maximal entropy of the population and its actual entropy in that environment, the latter estimated by counting the number of fixed loci in the population's sequences. Until now, physical complexity has only been formulated for populations of sequences of equal length. Here, we investigate an extension that supports variable-length populations. Building on this, we construct a measure of the efficiency of information storage, which we later use to understand clustering within populations. Finally, we investigate our extended physical complexity through simulations, showing it to be consistent with the original. Crown Copyright (C) 2011 Published by Elsevier B.V. All rights reserved.
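As a minimal sketch of the calculation the abstract describes for the equal-length case: physical complexity is the maximal entropy of the population minus its actual entropy, with the actual entropy estimated per site. The function name and the use of maximum-likelihood per-site entropies (rather than a strict count of fully fixed loci) are illustrative assumptions, not the paper's exact estimator.

```python
import math
from collections import Counter

def physical_complexity(population, alphabet_size=4):
    """Estimate physical complexity of a population of equal-length sequences.

    Complexity = H_max - H_actual, where H_max = L * log2(alphabet_size)
    and H_actual sums the per-site Shannon entropies observed in the
    population. A fixed locus contributes zero entropy, so fixed sites
    contribute fully to the complexity.
    """
    n = len(population)
    length = len(population[0])
    h_max = length * math.log2(alphabet_size)
    h_actual = 0.0
    for site in range(length):
        counts = Counter(seq[site] for seq in population)
        h_actual += -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h_max - h_actual

# Sites 0-2 are fixed (zero entropy); only site 3 varies.
pop = ["ACGT", "ACGA", "ACGT", "ACGC"]
print(physical_complexity(pop))  # H_max = 8 bits, H_actual = 1.5 bits -> 6.5
```

In this toy population, the three fixed loci each contribute their full 2 bits to the complexity, while the variable fourth site reduces it by its observed entropy of 1.5 bits.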

| | |
|---|---|
| Original language | English |
| Pages (from-to) | 3732-3741 |
| Number of pages | 10 |
| Journal | Physica A: Statistical Mechanics and its Applications |
| Volume | 390 |
| Issue number | 21-22 |
| DOIs | |
| Publication status | Published - 15 Oct 2011 |

## Keywords

- Complexity
- Entropy
- Clustering
- Evolution
- Population
- Dependence