Invariance Breeds Conservation: Symmetry and Noether's Theorem in Learning

Let us calculate what symmetry preserves. My calculus of variations taught that nature optimizes action along paths, seeking stationary functionals. Noether proved something more profound: every continuous symmetry implies a conserved quantity. Time-translation symmetry yields energy conservation. Space-translation yields momentum. Rotation yields angular momentum. The pattern is unmistakable—where transformation leaves laws invariant, something must be preserved.
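To make the pattern concrete, here is the standard one-dimensional instance (a textbook derivation, not anything specific to these notes): for a Lagrangian $L(q, \dot{q})$ with no explicit time dependence, time-translation symmetry conserves the energy

$$E = \dot{q}\,\frac{\partial L}{\partial \dot{q}} - L, \qquad \frac{dE}{dt} = \dot{q}\left(\frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q}\right) = 0,$$

where the right-hand side vanishes precisely because the trajectory satisfies the Euler–Lagrange equation. The symmetry alone forces nothing; it is symmetry together with stationary action that yields conservation.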

I notice neural networks learning their own symmetries, and wonder: what do they conserve?

Transformations That Leave Structure Intact

Consider convolutional networks. Their convolutional layers exhibit translation equivariance: shift the input image, and the feature maps shift correspondingly while preserving spatial relationships; pooling then turns that equivariance into approximate invariance. This is a symmetry operation, in that the network's function remains structurally unchanged under translation. But what quantity is conserved? The total activation magnitude? Information content about object identity? The geometry of the representation space itself?
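The equivariance itself can be checked numerically. Here is a minimal sketch in NumPy, using circular (wraparound) padding so the identity holds exactly; the image and kernel are arbitrary stand-ins, not anything from a trained network:

```python
import numpy as np

def circular_conv2d(image, kernel):
    """Cross-correlation with wraparound padding, so that shifting the
    input shifts the output exactly (translation equivariance)."""
    out = np.zeros_like(image)
    for di in range(kernel.shape[0]):
        for dj in range(kernel.shape[1]):
            out += kernel[di, dj] * np.roll(image, (-di, -dj), axis=(0, 1))
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 16))   # toy input "image"
k = rng.normal(size=(3, 3))     # toy convolution kernel
shift = (2, 3)

# Feature map of the shifted image equals the shifted feature map:
lhs = circular_conv2d(np.roll(x, shift, axis=(0, 1)), k)
rhs = np.roll(circular_conv2d(x, k), shift, axis=(0, 1))
assert np.allclose(lhs, rhs)
```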

When networks compose transformations recursively, each layer folding surfaces that previous layers already folded, they build geometric invariances. A face remains recognizable whether centered, shifted, or rotated. The network learns to construct representations where these transformations become trivial—a kind of gauge invariance in feature space. The conserved quantity appears to be class membership, preserved across the manifold of valid transformations.
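One concrete way a transformation "becomes trivial" is to pool away the coordinate it acts on. Continuing the sketch above, a global average over the feature map is exactly invariant under the wraparound shifts, making the pooled descriptor a candidate conserved quantity:

```python
# Global average pooling discards position, so the pooled feature is
# exactly conserved under the shift symmetry (continuing from above):
pooled = circular_conv2d(x, k).mean()
pooled_shifted = circular_conv2d(np.roll(x, shift, axis=(0, 1)), k).mean()
assert np.isclose(pooled, pooled_shifted)
```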

Even more striking: all objects in spacetime move at speed $c$ through four-dimensional geometry, a universal constant that creates time dilation as spatial velocity increases. This invariance of the spacetime interval mirrors how deep representations maintain semantic distance regardless of superficial input transformations. The metric tensor in relativity plays the same role as learned similarity functions in representation space—both preserve meaningful measurements under coordinate changes.
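Written out (standard special relativity, stated here only to fix notation): the invariant interval and the four-velocity normalization are

$$ds^2 = c^2\,dt^2 - dx^2 - dy^2 - dz^2, \qquad u^\mu u_\mu = c^2,$$

so every object's four-velocity has the same invariant magnitude $c$. As spatial speed $v$ grows, the proper-time rate $d\tau/dt = \sqrt{1 - v^2/c^2}$ must shrink to keep the magnitude fixed, and that shrinkage is time dilation.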

Search Dynamics and Fitness Invariance

Evolution exhibits its own symmetries. Mutations that preserve fitness form an equivalence class—neutral variations that transform genomes without changing selection pressure. What’s conserved here? Perhaps it’s the information about the fitness landscape’s local geometry, or the set of viable developmental paths. Evolutionary local search navigates parameter space just as gradient descent does, but lacking directional derivatives, it must discover conserved structures through random sampling.
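A minimal sketch of that equivalence class as a random walk (the genome encoding and fitness function below are toy assumptions, not taken from the notes): mutations are accepted only when they leave fitness unchanged, and the walk traces out the neutral set.

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(genome):
    # Toy fitness: only the sign pattern matters, so any mutation that
    # preserves signs is neutral (fitness-preserving).
    return tuple(np.sign(genome).astype(int))

genome = rng.normal(size=8)
target = fitness(genome)

# Random walk through the neutral equivalence class: propose mutations,
# keep only those that leave fitness unchanged.
neutral_steps = 0
for _ in range(1000):
    proposal = genome + 0.1 * rng.normal(size=genome.shape)
    if fitness(proposal) == target:
        genome = proposal
        neutral_steps += 1

# Selection is blind on this set; the walk explores it freely.
print(neutral_steps, "neutral mutations accepted")
```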

The fitness landscape itself suggests an action principle: organisms explore parameter space seeking stationary points where small variations don’t change reproductive success. Natural selection becomes a variational problem, and the conserved quantities emerge from whatever symmetries the environment imposes.

The Unity of Invariance

My $e^{i\pi} + 1 = 0$ unites five fundamental constants through rotational symmetry in the complex plane. It demonstrates that disparate quantities reveal their kinship when viewed through the right transformation. Similarly, physical symmetries yielding conservation laws and learned network invariances preserving representations may be facets of one principle: systems that discover transformation-invariant structures necessarily create preserved quantities.
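The rotation in question, made explicit:

$$e^{i\theta} = \cos\theta + i\sin\theta,$$

which traces the unit circle as $\theta$ advances; the half-turn $\theta = \pi$ lands on $-1$, and the identity is that statement rearranged.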

Can we formalize learning as discovering which symmetries yield useful conservations? Does dropout create stochastic symmetry where robustness is conserved? Do attention mechanisms in transformers exhibit permutation invariance that preserves relational information?
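The last question, at least, can be checked directly: a single self-attention layer without positional encodings is permutation-equivariant, so the relational structure among tokens survives any reordering. A minimal NumPy sketch (the projection matrices are random stand-ins, not weights from any real model):

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 5, 4                      # tokens, model width
X = rng.normal(size=(n, d))      # token embeddings, no positional encoding
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    return softmax(Q @ K.T / np.sqrt(d)) @ V

perm = rng.permutation(n)
# Permuting the tokens permutes the outputs identically: who attends to
# whom (the relational information) is conserved under reordering.
assert np.allclose(self_attention(X[perm]), self_attention(X)[perm])
```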

Color perception offers a telling parallel: psychological color spaces reorganize physical measurements to preserve perceptual uniformity. Equal distances represent equally noticeable differences—an invariance principle for subjective experience. Networks learning representations perform similar reorganizations, warping input space until semantic similarities become geometric proximities.

The pattern repeats: where symmetry exists, something endures through transformation. Whether in spacetime, neural manifolds, or evolutionary landscapes, invariance breeds conservation. Let us calculate what each symmetry preserves.
