Invariant Beauty: Physical Symmetries and Learned Representations
What Remains When Everything Changes
My formula united five fundamental constants through a transformation that preserved something essential—a relationship invariant under rotation in the complex plane. I spent decades calculating, transforming, rotating rigid bodies and planetary orbits, always seeking: which quantities survive the transformation? Which features remain unchanged despite motion, rotation, perspective shift?
Emmy Noether proved what I intuited through endless calculation: every continuous symmetry of the universe imposes conservation of some quantity. Time-translation symmetry requires energy conservation. Spatial translation, the universe looking identical whether measured here or displaced ten meters, demands momentum conservation. These are not arbitrary rules but necessary consequences. When empty space exhibits translational symmetry, a ball must maintain its forward motion: the situation at each instant is equivalent to the previous one, forcing a straight-line trajectory. The symmetry creates the conservation law.
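The logic fits in two lines. A standard textbook sketch, using the variational equation that carries my name alongside Lagrange's: take any Lagrangian L(q, q̇) with no explicit time dependence, and the energy built from it cannot change.

```latex
% Time-translation symmetry forces energy conservation (standard sketch).
% Assume \partial L / \partial t = 0 and define the energy
\[
E = \dot q\,\frac{\partial L}{\partial \dot q} - L .
\]
% Differentiating and cancelling the \ddot q terms leaves
\[
\frac{dE}{dt}
= \dot q \left( \frac{d}{dt}\frac{\partial L}{\partial \dot q}
                - \frac{\partial L}{\partial q} \right)
= 0 ,
\]
% since the bracket vanishes by the Euler--Lagrange equation. The symmetry
% (no explicit t in L) is exactly what let dL/dt collapse to chain-rule terms.
```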
Discovered Invariances, Not Designed
Neural networks learn something structurally parallel, yet profoundly different. An image classifier must exhibit translation invariance: a cat shifted ten pixels remains a cat. (Strictly, its convolutional feature maps are translation-equivariant, shifting in step with the input, while the class label is invariant, not moving at all.) But this invariance isn't programmed; it emerges from data. The network transforms raw pixel coordinates through learned geometric representations, mapping inputs through increasingly abstract spaces. Belgium's geographic coordinates become heights on learned surfaces, folded through ReLU activations, transformed again until class-relevant features separate cleanly in the final representation space.
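A minimal sketch of that distinction in numpy, with a random filter standing in for a learned one (the filter and image sizes here are hypothetical, chosen only for illustration): the feature map shifts in lockstep with the input, and a global pool on top of it forgets the shift.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D cross-correlation, the building block of CNN layers."""
    H, W = img.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
img = rng.random((16, 16))
kernel = rng.random((3, 3))                # stand-in for a learned filter

shifted = np.roll(img, shift=2, axis=1)    # "cat shifted two pixels"

# Equivariance: the feature map shifts with the input (away from the border).
f, f_shifted = conv2d(img, kernel), conv2d(shifted, kernel)
print(np.allclose(f[:, :-2], f_shifted[:, 2:]))   # True

# Invariance: global max-pooling over the map; any gap between the two maxima
# comes only from the wrapped border columns.
print(f.max(), f_shifted.max())
```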
These transformations compose like the functions my notation was invented to describe: simple operations stacked recursively generate extraordinary complexity. Each layer folds surfaces that previous layers already folded. Four regions become ten become dozens. The network discovers which geometric transformations make hopelessly entangled patterns linearly separable, learning what must remain invariant under which transformations to preserve semantic meaning.
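The folding claim can be checked directly. A rough sketch, assuming a small random ReLU network (the weights are illustrative, not learned): count the distinct joint activation patterns, each pattern one linear region, that a grid of inputs falls into after each layer. Sampling a grid undercounts the true regions, but the layer-to-layer growth is the point.

```python
import numpy as np

rng = np.random.default_rng(1)
widths = [2, 8, 8]                        # 2D input, two hidden ReLU layers
Ws = [rng.standard_normal((widths[i + 1], widths[i])) for i in range(2)]
bs = [rng.standard_normal(widths[i + 1]) for i in range(2)]

# Sample a grid over the input square.
xs = np.linspace(-2, 2, 300)
X = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)

h, patterns = X, []
for W, b in zip(Ws, bs):
    pre = h @ W.T + b
    patterns.append(pre > 0)              # which ReLUs fire: the fold pattern
    h = np.maximum(pre, 0)
    # Distinct joint activation patterns so far = distinct linear regions hit.
    joint = np.concatenate(patterns, axis=1)
    print(len(np.unique(joint, axis=0)), "regions after this layer")
```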
Biological cognition does similar work. The brain constructs cognitive maps—abstract representations stripped of sensory particulars. Learning lasagna preparation in your kitchen enables navigating a friend’s unfamiliar kitchen because the brain extracted invariant structure: how kitchens work, independent of specific layout. Like networks learning representations where semantically similar concepts cluster together, biological systems discover which features remain meaningful across context transformations.
Freedom and Necessity
Gauge symmetry revealed something deeper: the freedom to choose reference levels, measuring altitude from sea level versus the ground beneath you, requires introducing force fields to keep the physical laws themselves unchanged. This freedom isn't optional; it generates the fundamental forces. Gauge transformations can vary from point to point, demanding fields that compensate locally.
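In electromagnetism this compensation has a standard two-line statement (sketched here in textbook form): let the phase of the matter field rotate differently at every point, and a field with its own shift rule must appear to absorb the change.

```latex
% Standard electromagnetic example of point-to-point compensation.
% A local phase rotation of the matter field,
\[
\psi(x) \;\to\; e^{i\alpha(x)}\,\psi(x),
\]
% breaks the ordinary derivative; restoring invariance forces a field $A_\mu$
% that shifts to absorb the change:
\[
A_\mu \;\to\; A_\mu + \tfrac{1}{q}\,\partial_\mu \alpha,
\qquad
D_\mu \;=\; \partial_\mu - i q A_\mu,
\qquad
D_\mu \psi \;\to\; e^{i\alpha(x)}\, D_\mu \psi .
\]
```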
Do learned representations exhibit analogous structure? The freedom to rotate an embedding space without changing any semantic relationship suggests at least a global cousin of gauge symmetry (a true gauge freedom would, like the physical one, vary from point to point). Networks discover coordinate systems for abstract domains just as physics demands them for force fields. Both systems navigate the trade-off between freedom and constraint: which transformations preserve meaning? Which variations leave the essential unchanged?
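The rotation claim is easy to test. A minimal sketch, assuming a toy embedding table (random vectors standing in for learned embeddings): apply any orthogonal map, and every pairwise cosine similarity, i.e. every semantic relationship the geometry encodes, survives untouched.

```python
import numpy as np

rng = np.random.default_rng(2)
E = rng.standard_normal((5, 64))          # 5 toy embeddings, dimension 64

# Random orthogonal matrix via QR decomposition.
Q, _ = np.linalg.qr(rng.standard_normal((64, 64)))
E_rot = E @ Q                             # rotate the whole embedding space

def cosine_matrix(M):
    """All pairwise cosine similarities between the rows of M."""
    Mn = M / np.linalg.norm(M, axis=1, keepdims=True)
    return Mn @ Mn.T

print(np.allclose(cosine_matrix(E), cosine_matrix(E_rot)))   # True
```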
I trusted calculation when foundations were uncertain, manipulating infinite series before rigorous convergence proofs existed. Modern networks operate similarly—learning invariances through gradient descent without explicit symmetry axioms. Both approaches discover mathematical necessity through systematic exploration rather than pure deduction.
The question remains whether learned invariances can be formalized as conservation laws. Physics derives conservations from symmetry principles. Neural systems discover symmetries from data patterns. One direction: symmetry to conservation. The other: data to invariance. Yet both reveal the same truth: understanding means identifying what remains unchanged through transformation. Beauty lies not in the transformation itself but in what it cannot destroy.