Replicator Information: Information Theory and Genetic Code
Genes as Information: Replicating Patterns, Not Matter
Information theory defines information as selection from possibilities—each choice narrowing uncertainty, each symbol reducing ambiguity. A DNA sequence embodies this precisely. Every base pair selects from four alternatives (A, T, G, C), encoding up to two bits of information per position. The diploid human genome contains roughly six billion base pairs: as many as twelve billion bits of carefully specified choices.
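The arithmetic above can be checked directly. A minimal sketch in Python, using the figures quoted in the text:

```python
import math

BITS_PER_BASE = math.log2(4)          # 4 equally likely symbols -> 2 bits each
DIPLOID_BASE_PAIRS = 6_000_000_000    # rough diploid figure from the text

total_bits = BITS_PER_BASE * DIPLOID_BASE_PAIRS
print(f"{BITS_PER_BASE:.0f} bits per base")          # 2 bits per base
print(f"{total_bits / 1e9:.0f} billion bits total")  # 12 billion bits total
```

Two bits per base is the maximum; real genomes have internal correlations, so their actual entropy per base is somewhat lower.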
What replicates through evolutionary time isn’t matter—the actual molecules turn over constantly, replaced through metabolic processes within days or weeks. What persists is pattern, is information. This is the heart of the selfish gene perspective: organisms are survival machines for replicating information. Genes are not things but specifications, not molecules but messages. Evolution operates on information content, not physical substrate.
Mutation introduces entropy—random bit flips corrupting the message. Selection reduces entropy—filtering variants, preserving functional sequences. The population becomes a communication channel transmitting genetic information through generations. Shannon showed that all channels have capacity limits: maximum rates at which information can flow reliably through noisy systems. Evolution faces identical constraints. Mutation rate sets the noise floor. Selection strength determines error correction capacity. The balance defines how fast populations can innovate—channel capacity for evolutionary adaptation.
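Shannon's capacity formula makes the "noise floor" claim concrete. The sketch below treats each base as a symbol sent through a 4-ary symmetric channel, with the per-base mutation rate standing in for the channel's error probability—an illustrative mapping, not a biological model, and `qsc_capacity` is a name chosen here:

```python
import math

def qsc_capacity(q: int, p: float) -> float:
    """Capacity in bits/symbol of a q-ary symmetric channel with total
    error probability p: C = log2(q) - H(p) - p*log2(q-1)."""
    if p == 0.0:
        return math.log2(q)
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # binary entropy H(p)
    return math.log2(q) - h - p * math.log2(q - 1)

# Per-base mutation rate as channel noise (illustrative only).
mu = 1.2e-8
print(qsc_capacity(4, mu))  # just under 2 bits/base: this noise barely dents capacity
```

At realistic mutation rates the capacity loss per base is tiny; the constraint bites because genomes are billions of symbols long, so even rare errors accumulate per generation.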
Optimal Mutation: Balancing Entropy and Fidelity
Information theory reveals a fundamental tradeoff in noisy channels. Too much noise destroys signal—messages become unrecoverable garbage. Too little noise means missed opportunities for exploration. Error correction codes must balance fidelity against flexibility.
Evolution discovered this balance empirically. Observed mutation rates in humans hover around 1.2 × 10^-8 per base per generation. Too high causes error catastrophe—meaningful genetic information degrades faster than selection can repair it. Too low leaves insufficient variation for adaptation when environments shift. The pizzly hybrids emerging as Arctic ice retreats demonstrate this principle: polar bears carry highly optimized information for seal-blubber specialization, but changing conditions render that specialized code maladaptive. Grizzly generalist genes—information encoding dietary flexibility—suddenly provide survival advantage. The hybrid genome explores new regions of sequence space without waiting for gradual mutation accumulation.
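The error-catastrophe tradeoff can be illustrated with the probability that a sequence of length L replicates with zero mutations, (1 − μ)^L ≈ e^(−μL). The helper below is a toy calculation, not a population-genetics model:

```python
def error_free_copy_prob(mu: float, length: int) -> float:
    """Probability that a sequence of `length` bases copies with no mutations,
    assuming independent per-base errors at rate mu."""
    return (1 - mu) ** length

mu = 1.2e-8  # human per-base, per-generation rate from the text
for length in (10**4, 10**6, 3 * 10**9):
    q = error_free_copy_prob(mu, length)
    print(f"L={length:>13,}  P(no mutation) = {q:.6f}")
# Once mu * L exceeds ~1, most copies carry at least one error; push mu
# much higher and selection can no longer keep up (Eigen's error threshold).
```

Note that at genome scale (μL ≈ 36 here) essentially every copy is mutated somewhere, which is why selection strength, not perfect copying, maintains the message.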
Sexual recombination operates as combinatorial search rather than information creation. It shuffles existing genetic variation into novel combinations without generating new bits. African cichlids exploited this ruthlessly—hybridization mixed gene pools from different lineages, creating some 500 new species in 15,000 years through efficient exploration of combinatorial possibilities. Sex samples the existing information library, testing new arrangements: efficient search through vast sequence spaces.
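Recombination-as-search can be sketched with one-point crossover: every base in the child already existed in one of the parents, so only the arrangement is new. A toy example (the `crossover` helper is hypothetical, not a model of meiosis):

```python
import random

def crossover(parent_a: str, parent_b: str, rng: random.Random) -> str:
    """One-point crossover: prefix from one parent, suffix from the other."""
    point = rng.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]

rng = random.Random(0)
a, b = "AATTGGCC", "TTCCAAGG"
child = crossover(a, b, rng)
# No new information: each position comes from a parent at that position.
assert all(child[i] in (a[i], b[i]) for i in range(len(child)))
print(child)
```

With n polymorphic sites, crossover alone can reach combinations that point mutation would take many generations to assemble—the "exploration without new bits" the text describes.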
Universal Darwinism: Information Theory of Replicators
Information theory applies to any replicator system. Genes encode information in DNA base sequences. Memes encode information in neural connection patterns. Both are information structures that cause their own replication. Both experience variation, selection, and heredity. Both evolve.
The 300 anglerfish species illustrate evolutionary radiation—information diversifying to fill available niches. Shallow-water ancestors carried lure-based hunting information. Deep-sea invasion created new selection pressures: darkness, sparsity, extreme pressure. Information evolved accordingly: bioluminescent lures, sexual parasitism, bizarre morphologies. Same fundamental replicator dynamics across radically different environmental channels.
Does information theory provide a universal framework for understanding all evolving systems—biological, cultural, technological? Any system exhibiting variation, selection, and replication will evolve according to information-theoretic principles. Channel capacity limits innovation rates. Noise levels determine optimal mutation rates. Redundancy enables error correction. The mathematics applies universally to replicating information, regardless of physical implementation.
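The substrate-neutrality claim can be made concrete with a minimal replicator on bit strings: variation (mutation), selection (similarity to a target), and heredity (copying). A toy sketch, assuming a simple best-of-generation selection scheme chosen for illustration:

```python
import random

def evolve(target: str, pop_size: int = 50, mu: float = 0.02,
           generations: int = 200, seed: int = 1) -> int:
    """Return the generation at which the target pattern is found (or the cap).
    The replicating entity is a pattern, not any particular object."""
    rng = random.Random(seed)
    length = len(target)

    def fitness(s: str) -> int:
        return sum(a == b for a, b in zip(s, target))

    pop = ["".join(rng.choice("01") for _ in range(length))
           for _ in range(pop_size)]
    for gen in range(generations):
        best = max(pop, key=fitness)
        if best == target:
            return gen
        # Heredity with noise: the next generation copies the best replicator,
        # each bit resampled with probability mu (variation).
        pop = ["".join(b if rng.random() > mu else rng.choice("01") for b in best)
               for _ in range(pop_size)]
    return generations

print(evolve("1011001110100101"))  # typically converges well under the 200-gen cap
```

Nothing in the loop mentions DNA or neurons; any substrate supporting copy-with-variation plus selection runs the same dynamics.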
Information is what replicates. Everything else is just machinery.
Source Notes
6 notes from 2 channels