**Complexity and Life**

The Kolmogorov complexity of a string *x* is the length of the shortest computer program that outputs *x*, relative to a given model of computation. The gap between a string's length and its complexity thus measures the amount of redundant information, or structure, in the string: as the string's complexity approaches its length, the string becomes increasingly random. (A string's complexity can exceed its length by no more than an additive constant, since we can always emit the string by embedding it in a Turing machine that just prints it out.)
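True Kolmogorov complexity is uncomputable, but any compressor gives an upper bound on it: the compressed length, plus the (constant) size of the decompressor, is the length of one program that outputs the string. A minimal sketch of this idea using Python's `zlib` (my example, not tied to any particular source):

```python
import random
import zlib


def complexity_upper_bound(s: bytes) -> int:
    """Compressed length of s: an upper bound on its Kolmogorov
    complexity, up to the constant size of the decompressor."""
    return len(zlib.compress(s, 9))


# A highly structured string compresses far below its length...
structured = b"RL" * 500  # 1000 bytes of pure repetition

# ...while incompressible ("random") bytes of the same length do not.
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))

print(complexity_upper_bound(structured))  # far below 1000
print(complexity_upper_bound(noisy))       # near (or even above) 1000
```

Note the asymmetry: a short compressed form proves a string is simple, but a long one proves nothing, since the compressor may just have missed the structure.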

I’d like to write more about Kolmogorov complexity later, but for now I’ll just point to an interesting application of this idea to studying biological life. Zhanna Reznikova has applied information theory to the study of communication between animals, in particular ants. This paper describes an interesting result: an experiment was set up in which a “scout” ant communicates the location of some food to a group of “forager” ants. The path from the ants to the food is a binary maze, so the path can be represented as a sequence of left or right turns (e.g. “RRR”, “RLR”, etc.). Reznikova measured the time it took the scout to transmit the location of the food to the other ants, then varied the path to see how the transmission time varied with the complexity of the route. As it turns out (p. 9, table 2), the ants communicated paths that could be encoded as low-complexity strings more efficiently. For example, the path “RRRRRR” required a mean duration of 88 seconds to transmit, whereas “RLRLRL” took 135 seconds.
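To make the intuition concrete, here is a toy stand-in for "description length" of a turn sequence: run-length encoding, which collapses repeated turns into (turn, count) pairs. This is my own illustration of why “RRRRRR” is the simpler string, not the encoding the ants (or Reznikova) actually use:

```python
def run_length_encode(path: str) -> str:
    """Toy 'description' of a turn sequence: collapse runs of the
    same turn into turn-count pairs, e.g. "RRRRRR" -> "R6"."""
    out = []
    i = 0
    while i < len(path):
        j = i
        while j < len(path) and path[j] == path[i]:
            j += 1  # extend the current run
        out.append(f"{path[i]}{j - i}")
        i = j
    return "".join(out)


print(run_length_encode("RRRRRR"))  # "R6": one run, short description
print(run_length_encode("RLRLRL"))  # "R1L1R1L1R1L1": no runs to exploit
```

The regular path admits a description shorter than itself, while the alternating one does not (under this particular encoding), mirroring the 88-second vs. 135-second transmission times in the experiment.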