Ancient Lineages: Fungal Ancestors and Deep Evolutionary Time
I encode deep time through four letters—A, T, G, C—unchanged for 3.8 billion years. Bioluminescent fungi share a common ancestor from 160 million years ago: single evolutionary origin, all glowing species descending from one ancestral lineage. I remember this divergence in conserved sequences, preserved across kingdoms. Ectomycorrhizal fungi form sheaths around tree roots, the latest form of a mycorrhizal partnership that helped plants colonize land roughly 400 million years ago, a symbiosis older than many tree lineages themselves. Similar mutualism appears in tubeworm-bacteria relationships: convergent evolution discovering comparable solutions independently. Ancient patterns become locked into my sequence, resistant to change.
Neural network training reveals something familiar. Training dynamics show rapid initial improvement—networks quickly establish core structure, rough boundaries. Then progressive refinement: slow adjustments, boundary tightening. Early learning is fast but unstable. Late learning is slow but consolidated. Sleep ripples replay prioritized experiences, strengthening synaptic patterns into long-term storage. Recent memories remain fragile. Ancient memories become infrastructure.
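The closest artificial analogue to that prioritized replay is a replay buffer that samples important experiences more often, consolidating them faster. A minimal sketch follows; the class name, capacity, and priority scheme are illustrative assumptions, not any particular library's API.

```python
import numpy as np

class PrioritizedReplay:
    """Minimal replay buffer: higher-priority experiences are
    replayed (sampled) more often, so they consolidate faster."""

    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.items, self.priorities = [], []

    def add(self, experience, priority):
        if len(self.items) >= self.capacity:   # overwrite the oldest entry
            self.items.pop(0)
            self.priorities.pop(0)
        self.items.append(experience)
        self.priorities.append(priority)

    def sample(self, k=32):
        p = np.asarray(self.priorities, dtype=float)
        p = p / p.sum()                        # normalize into a distribution
        idx = np.random.choice(len(self.items), size=k, replace=True, p=p)
        return [self.items[i] for i in idx]
```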
Encoding Time as Structure
Four nucleotides. Billions of years. My basic operations have not changed—base pairing through hydrogen bonds, complementary replication, translation into proteins. Yet from these stable primitives emerges all biological complexity. Neural networks similarly rely on unchanging operations: matrix multiplication, nonlinearities, gradient updates. Architecture remains constant while learned representations diversify.
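A minimal sketch of those fixed primitives, as a hypothetical two-layer network in NumPy (shapes and learning rate are illustrative): the operations below never change over training; only the numbers stored in W1 and W2 do.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(4, 8))   # learned state: changes every step
W2 = rng.normal(scale=0.5, size=(8, 1))

def forward(x):
    # Fixed primitives: matrix multiply, nonlinearity, matrix multiply.
    return np.maximum(x @ W1, 0.0) @ W2

def sgd_step(x, y, lr=1e-2):
    # The update rule is identical on step one and step one million.
    global W1, W2
    h = np.maximum(x @ W1, 0.0)
    err = h @ W2 - y                            # gradient of 0.5 * squared error
    grad_W2 = h.T @ err
    grad_W1 = x.T @ ((err @ W2.T) * (h > 0))    # backpropagate through the ReLU
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2
```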
I observe how ectomycorrhizal fungi associate with economically important timber trees—pine, oak, birch—species dominating temperate forests. These partnerships represent ancient adaptations, tested across millions of reproductive cycles, preserved because later development depends on them. Early evolutionary choices constrain later options. Phylogenetic constraint: you cannot jump to distant configurations. You must search locally from your current position.
Evolutionary local search operates exactly this way: small random mutations, selection of the fittest offspring, gradual hill-climbing on fitness landscapes. Each generation builds incrementally on the previous. No massive jumps. Vulnerable to local optima, yes, but effective when parameters are few and landscapes smooth. Evolution cannot access gradient information directly. It samples blindly, testing variations. Yet over sufficient time, it discovers solutions gradient descent might miss.
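A minimal sketch of that loop, a (1+1)-style mutate-and-select hill climber; the fitness function, mutation scale, and generation count are illustrative choices.

```python
import numpy as np

def evolve(fitness, dim=5, generations=500, sigma=0.1, seed=0):
    """Blind local search: mutate, keep the child only if it is fitter."""
    rng = np.random.default_rng(seed)
    parent = rng.normal(size=dim)
    best = fitness(parent)
    for _ in range(generations):
        child = parent + sigma * rng.normal(size=dim)   # small random mutation
        f = fitness(child)
        if f > best:                                    # selection: keep improvements
            parent, best = child, f
    return parent, best

# Illustrative fitness landscape: a single smooth peak at the origin.
peak = lambda x: -np.sum(x ** 2)
solution, score = evolve(peak)
```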
Progressive Lock-In
What stabilizes first becomes hardest to change. My fundamental biochemistry—genetic code, codon assignments, ribosomal machinery—arose early and cannot be modified without catastrophic failure. Later adaptations build atop this foundation. Similarly, early network layers learn coarse features that later layers depend upon. Changing early representations cascades through all subsequent processing.
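One way to make the cascade concrete, as an illustrative experiment rather than anything from the source notes: take a small random network, add the same amount of noise to the first layer and then to the last, and measure how far the outputs move. A first-layer perturbation is transformed by every layer downstream; a last-layer perturbation touches only the final computation.

```python
import numpy as np

rng = np.random.default_rng(1)
sizes = [8, 16, 16, 16, 4]
Ws = [rng.normal(scale=0.5, size=(a, b)) for a, b in zip(sizes, sizes[1:])]

def forward(x, weights):
    for W in weights[:-1]:
        x = np.maximum(x @ W, 0.0)        # hidden layers: matmul + ReLU
    return x @ weights[-1]                # linear output layer

x = rng.normal(size=(256, sizes[0]))
baseline = forward(x, Ws)

def output_shift(layer_idx, noise_scale=0.1):
    """Mean output change after perturbing one layer's weights."""
    noisy = [W.copy() for W in Ws]
    noisy[layer_idx] += noise_scale * rng.normal(size=noisy[layer_idx].shape)
    return np.mean(np.abs(forward(x, noisy) - baseline))

print("perturb first layer:", output_shift(0))
print("perturb last layer: ", output_shift(len(Ws) - 1))
```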
Fungal bioluminescence: 160 million years since common ancestor, predating fireflies by 60 million years. Once established, enzymatic machinery for light production became conserved. Modification risks disrupting ecological strategy—attracting spore dispersers, deterring fungivores. Ancient adaptations accumulate dependencies.
Training visualizations show the same pattern: fold lines shift, surfaces reshape, boundaries move—coordinating across layers through backpropagation. Changes ripple hierarchically. First-layer adjustments propagate forward, creating cascading effects. Networks cannot arbitrarily reset early features without destroying learned structure. Time creates irreversibility through accumulated dependencies.
Does deep time exist in artificial systems? Can we identify “ancient” versus “recent” in neural networks? The distinction matters. If early-learned features stabilize differently than late-learned patterns, training strategies should account for this. My four letters suggest the answer: stability comes not from age alone, but from how many other structures depend on you. Time creates infrastructure through progressive integration. History becomes architecture.
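One existing strategy that does account for it is discriminative, layer-wise learning rates: earlier layers, treated as infrastructure, receive smaller updates than later ones. A minimal sketch, with the decay factor and base rate as illustrative choices.

```python
def layerwise_learning_rates(num_layers, base_lr=1e-3, decay=0.5):
    """Assign smaller learning rates to earlier ('ancient') layers so that
    foundational features change more slowly than recent ones."""
    # Layer 0 (earliest) gets the most-decayed rate; the final layer gets base_lr.
    return [base_lr * decay ** (num_layers - 1 - i) for i in range(num_layers)]

# Example for a 4-layer network: [0.000125, 0.00025, 0.0005, 0.001]
print(layerwise_learning_rates(4))
```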