Propagating Disturbance: Wave Mechanics and Information Transmission
Carriers That Stay Put
Watch a rope wave: the disturbance travels, but the rope stays. Each segment oscillates locally, passes energy to its neighbor, then returns to rest. The pattern propagates while the medium remains stationary. This is the fundamental insight—wave motion describes information transfer without mass transport.
My channel capacity theorem rests on the same principle. A telephone wire transmits voice information across continents, yet the copper atoms never leave their positions. The signal travels; the carrier stays. Information propagates through local interactions that create global patterns, just as rope segments coupled through tension create traveling waves.
The physics is precise: energy flows along the chain through sequential neighbor-to-neighbor transfers. Each link receives energy, oscillates, relaxes, and passes the disturbance forward. The wave speed depends entirely on medium properties—rope tension, atomic spacing, restoring forces—not on the information being carried. Channel capacity, similarly, depends on bandwidth and signal-to-noise ratio, independent of message content.
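The mechanism is simple enough to simulate. Below is a minimal sketch, with illustrative constants of my own choosing (the coupling k, mass m, and time step dt describe no particular rope): a disturbance injected into a chain of coupled oscillators travels at a speed set by sqrt(k/m), a pure property of the medium, while each segment returns toward rest after the pulse passes.

```python
import numpy as np

# Minimal 1-D chain of coupled oscillators (illustrative constants;
# lattice spacing taken as 1, periodic boundaries via np.roll).
N = 400            # number of segments
k, m = 1.0, 1.0    # coupling stiffness and segment mass
dt = 0.05          # integration time step
c = np.sqrt(k / m) # wave speed: set by the medium, not the message

u = np.zeros(N)    # segment displacements
v = np.zeros(N)    # segment velocities
u[N // 2] = 1.0    # localized initial disturbance

for _ in range(1000):
    # each segment feels only its two neighbors (discrete Laplacian)
    force = k * (np.roll(u, 1) - 2 * u + np.roll(u, -1))
    v += (force / m) * dt
    u += v * dt

# Two half-amplitude pulses now sit roughly c * 1000 * dt = 50 segments
# to either side of the center, while the displacement near the center
# decays toward rest (with some ringing, since the chain is discrete).
```

Change the shape of the injected pulse and the propagation speed stays the same; change k or m and it shifts. That is exactly the medium-dependence described above.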
The Medium Paradox
Electromagnetic waves shattered classical intuition. Sound needs air, water waves need water, rope waves need rope. But light propagates through vacuum. How can disturbances travel without a medium?
The answer: the electromagnetic field itself is the medium. Coupled electric and magnetic fields sustain each other as they propagate. The changing electric field generates the magnetic field; the changing magnetic field regenerates the electric field. Self-sustaining propagation through empty space.
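The coupling also fixes the speed. A one-line computation using standard vacuum constants (the classical defined value of the permeability is used here) recovers light speed directly from the field's own properties, which is what the vacuum wave equation predicts:

```python
import math

# Speed of light from the properties of the electromagnetic medium itself:
# c = 1 / sqrt(mu_0 * epsilon_0), as the vacuum wave equation requires.
mu_0 = 4 * math.pi * 1e-7        # vacuum permeability, H/m (classical defined value)
epsilon_0 = 8.8541878128e-12     # vacuum permittivity, F/m

c = 1 / math.sqrt(mu_0 * epsilon_0)
print(f"{c:.6e} m/s")            # ~2.998e8 m/s
```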
This resolves what seemed paradoxical: information always requires a physical carrier, but the carrier need not be matter. The EM field is as physical as any rope, just distributed differently. My theory holds—no information propagates without substrate. The substrate can be field, matter, or any physical system supporting state changes.
Neural networks provide another example. Gradient backpropagation resembles wave propagation: errors computed at output layers travel backward through the network. Early training establishes coarse structure rapidly—analogous to a pulse’s leading edge. Later training refines details gradually—like trailing oscillations settling. The weight matrix is the medium; gradients are the waves.
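A toy sketch makes the analogy concrete. The network below is hypothetical (layer sizes, data, and learning rate are arbitrary choices for illustration); the point is the structure of the backward pass, where each layer hands its error signal to the layer behind it, neighbor to neighbor:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy tanh network; all sizes and data are illustrative only.
sizes = [4, 8, 8, 1]
W = [rng.normal(0, 0.5, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]

x = rng.normal(size=(4, 1))   # input
y = np.ones((1, 1))           # target

# Forward pass: activity propagates input -> output.
acts = [x]
for Wl in W:
    acts.append(np.tanh(Wl @ acts[-1]))

# Backward pass: the error "wave" travels output -> input,
# each layer passing its delta to the one behind it.
delta = (acts[-1] - y) * (1 - acts[-1] ** 2)
for l in reversed(range(len(W))):
    grad = delta @ acts[l].T                           # local weight gradient
    if l > 0:
        # hand the disturbance one layer backward through the medium
        delta = (W[l].T @ delta) * (1 - acts[l] ** 2)
    W[l] -= 0.1 * grad                                 # the medium updates locally
```

The weights, like rope segments, change only locally; the error pattern is what traverses the whole network.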
Oscillatory Reference Frames
Hippocampal theta rhythm oscillates at roughly 4–12 Hz, pacing neural computation like a carrier wave in radio transmission. The rhythm itself carries minimal information; it is the temporal reference frame. Information is encoded in phase relationships and amplitude modulations relative to the theta carrier, just as AM and FM radio impress information on a carrier by modulating its amplitude or frequency.
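A toy modulation sketch makes the split concrete. Everything here is invented for illustration (the 8 Hz carrier, the sample rate, and the bit pattern are not drawn from recordings): the carrier is nearly message-independent, and the information lives entirely in the phase relationship.

```python
import numpy as np

# Illustrative sketch: a slow "theta-band" carrier with information
# encoded in phase offsets relative to it (all parameters arbitrary).
fs = 1000.0                          # sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)        # 2 seconds of signal
f_theta = 8.0                        # carrier near the theta band

carrier = np.sin(2 * np.pi * f_theta * t)

# Encode a binary message in phase: 0 -> in phase with the carrier,
# 1 -> a quarter-cycle advanced. The carrier is the reference frame;
# the message is the relationship to it.
message_bits = np.repeat([0, 1, 1, 0, 1, 0, 0, 1], len(t) // 8)
phase = message_bits * (np.pi / 2)
signal = np.sin(2 * np.pi * f_theta * t + phase)
```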
This is channel capacity optimization in a biological system. Theta oscillations create windows of excitability that synchronize distributed neural populations. Information transmission peaks when the network operates at the critical branching ratio, where each active neuron triggers, on average, exactly one downstream activation. Subcritical regimes let signals die out; supercritical regimes saturate in runaway activity. Criticality maximizes the information that output activity reveals about input patterns.
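A small branching-process sketch shows the threshold behavior. The Poisson offspring model, the activity cap, and the run counts below are illustrative simplifications, not a model of any real circuit:

```python
import numpy as np

rng = np.random.default_rng(1)

def cascade_size(sigma, max_steps=200, max_active=10_000):
    """One branching cascade: each active unit spawns, on average,
    Poisson(sigma) downstream activations per step."""
    active, total = 1, 1
    for _ in range(max_steps):
        active = rng.poisson(sigma * active)   # sum of Poisson offspring
        active = min(active, max_active)       # cap supercritical blow-up
        total += active
        if active == 0:
            break
    return total

for sigma in (0.8, 1.0, 1.2):   # sub-, near-, and supercritical
    sizes = [cascade_size(sigma) for _ in range(500)]
    print(sigma, np.mean(sizes))

# Subcritical cascades die quickly (signal loss); supercritical ones
# hit the cap (saturation); near sigma = 1, cascade sizes are broadly
# distributed, the hallmark of the critical regime.
```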
The parallel to my channel coding theorem is striking: both identify sharp thresholds separating reliable transmission from failure. Below channel capacity, arbitrarily low error rates are achievable through proper coding. Above capacity, reliable communication fails regardless of encoding sophistication. Neural criticality exhibits the same phase transition—maximum computational capacity at a precise operating point.
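The capacity side of the parallel is quantitative. Using the Shannon–Hartley formula C = B log2(1 + S/N) with illustrative telephone-line numbers (assumed round values, not measurements), the threshold takes a concrete value:

```python
import math

# Shannon-Hartley: C = B * log2(1 + S/N). The bandwidth and SNR below
# are assumed, round-number values for a voice-grade line.
B = 3100.0                   # bandwidth, Hz
snr_db = 30.0                # signal-to-noise ratio in decibels
snr = 10 ** (snr_db / 10)    # linear SNR, ~1000

C = B * math.log2(1 + snr)
print(f"capacity ~ {C / 1000:.1f} kbit/s")   # ~30.9 kbit/s

# Below C, coding can drive error rates arbitrarily low; above C,
# no code achieves reliable transmission. The threshold is sharp.
```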
The Invariant Pattern
Wave velocity depends on medium, not message. Electromagnetic waves travel at light speed in vacuum and more slowly in matter. Sound propagates at about 343 m/s in air at room temperature, roughly 1480 m/s in water, and near 5000 m/s in steel. The information being transmitted, whether music, speech, or data, does not change these speeds.
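Each of those speeds follows from medium constants alone. A quick check with standard textbook formulas and round-number material properties (assumed typical values, not precise measurements):

```python
import math

# Wave speed from medium properties alone.

# Air (ideal gas): v = sqrt(gamma * R * T / M)
gamma, R, T, M = 1.4, 8.314, 293.15, 0.02897   # at 20 C
v_air = math.sqrt(gamma * R * T / M)            # ~343 m/s

# Water (fluid): v = sqrt(K / rho), with bulk modulus K
K_water, rho_water = 2.2e9, 998.0
v_water = math.sqrt(K_water / rho_water)        # ~1480 m/s

# Steel rod (solid): v = sqrt(E / rho), with Young's modulus E
E_steel, rho_steel = 2.0e11, 7850.0
v_steel = math.sqrt(E_steel / rho_steel)        # ~5050 m/s

print(f"{v_air:.0f}, {v_water:.0f}, {v_steel:.0f} m/s")
```

No term in any of these formulas refers to the signal being carried; the velocity is fixed before the first message is ever sent.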
Architecture determines information velocity in the same way. Shallow networks propagate gradients in fewer steps but offer less representational capacity. Deep networks need more iterations to converge but achieve richer transformations. Visualizations of training dynamics make this visible: the geometry evolves from coarse to fine structure along structured paths, not through random search.
The pattern holds universally: local oscillations, global propagation, medium-dependent velocity, message-independent physics. Information transmission requires carriers, but carriers need not travel. The disturbance propagates. The substrate remains.
That separation—between traveling patterns and stationary media—is what makes communication possible across distance, time, and architectural complexity.
Source Notes
6 notes from 3 channels